A lot can change in a year. And sometimes, nothing does.
In March 2018, just before our annual Congressional fly-in, IBM published a point of view about the impact that harmful online content could have in our society. We emphasized the need to modernize laws so that consumer-focused internet media companies would take greater responsibility for what happens on their platforms.
Around the same time, IBM played a key role in promoting passage of U.S. legislation to crack down on the spread of online content with a truly horrific purpose: trafficking children for sexual exploitation. We saw this legislation, known as SESTA/FOSTA, as an important step in a broader global effort to stop criminals from using digital platforms for clearly illegal behavior.*
But more remains to be done, because the spread of illegal online activity continues to cause significant harm. Children in particular are increasingly being exposed to harmful content online. And as elected representatives like Sens. Josh Hawley (R-MO), Rob Portman (R-OH), Amy Klobuchar (D-MN) and Mark Warner (D-VA), Representatives Ann Wagner (R-MO), Carolyn Maloney (D-NY) and Doug Collins (R-GA), and others around the world have recognized, measures are needed to curb content that promotes terrorism, violence, suicide, electoral fraud, or the perpetuation of the opioid crisis.
Customers are telling companies they want them to introduce new services and technologies with responsibility and stewardship. And citizens are telling elected officials that they want laws to keep up with the realities of the modern internet.
That’s why IBM will continue to support reasonable, considered measures to regulate online activities that are clearly illegal. We feel the best approach is “precision regulation” – laws that are tailored to hold companies more accountable, without becoming over-broad or hindering innovation or the larger digital economy.
In the United States, precision regulation means taking a fresh look at Section 230 of the Communications Decency Act (CDA 230). As currently written, CDA 230 grants an expansive liability shield to any provider of an “interactive computer service” for the actions that occur on their platform, regardless of whether the platform turns a blind eye to illegal activity. Courts have found companies that knowingly host illegal content to be exempt from legal liability based on the broad protection that CDA 230 provides. But a measure designed nearly a quarter-century ago to foster an infant internet needs to keep pace with the enormous social, economic, and even political power that the online world today commands.
Instead of holding all online platforms exempt from liability by default, IBM believes that the exemption should be conditioned on companies applying a standard of “reasonable care” and taking actions and preventative measures to curb unlawful uses of their service. In a 2017 research paper, Professors Danielle Citron and Ben Wittes proposed this approach as a balanced compromise to address the growing proliferation of illegal and harmful online content.
The “reasonable care” standard would provide strong incentives for companies to limit illegal and illicit behavior online, while remaining flexible enough to promote continued online innovation and to adapt fairly easily to different online business models.
Reasonable care does not mean eliminating entirely the intermediary liability protections of CDA 230, or comparable laws in Europe and elsewhere. Nor are we calling for amending the “Good Samaritan” provision of CDA 230, which limits the liability of companies that take voluntary actions to stop bad actors. We simply believe companies should also be held legally responsible to use reasonable, common-sense care when it comes to moderating online content. This means, for example, quickly identifying and deleting content focused on child pornography, violence on child-oriented sites, or online content promoting acts of mass violence, suicide, or the sale of illegal drugs. A reasonable care standard in CDA 230 would add a measure of legal responsibility to what many platforms are already doing voluntarily.
A “precision regulation” approach also means that requirements or liability should be focused on those in a position to do something about illegal online content. The reasonable care standard should apply precisely and narrowly: to “providers” of interactive computer services that not only host information but also make that information available to the public and have the technical means, practical ability, and the right to moderate content. Responsibility for exercising reasonable care should ultimately rest with the companies that have the greatest practical ability to take action, not with every entity that may be remotely connected to the value chain of building or managing the internet.
There is no simple solution that will solve all that ails cyberspace. But there are precise and specific steps we can take to move toward a better and safer digital future. The internet doesn’t exist in a vacuum; it’s a part of – not apart from – our everyday lives and the world all around us. It’s time we begin treating it as such.
-Ryan Hagemann, IBM Government and Regulatory Affairs Technology Policy Executive
*On April 11, 2018, President Trump signed the Stop Enabling Sex Traffickers Act (SESTA)/Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) into law. SESTA/FOSTA clarified that Section 230 of the Communications Decency Act of 1996 does not immunize from federal civil and state criminal liability the knowing facilitation online of coerced or child sex trafficking.
(*Editor’s Note: a prior version of this post did not acknowledge leadership on this issue by Representatives Wagner, Maloney and Collins. It was corrected on August 26th, 2019.)