The Federal Communications Commission is opening a 45-day public comment period on a Commerce Department petition for review of the liability protections provided to web hosts of user-generated content, kicking off both a regulatory process and a debate over whether this is the right venue for addressing disinformation and related cybersecurity issues.
Reforming Section 230 “protections to deal with disinformation might seem tempting to legislators looking for solutions, but I'm very skeptical of this approach,” said Matthew Waxman, a professor at Columbia Law School who specializes in national security law. “Disinformation is a broad category that's tough to define, and resulting liability for its harms is very uncertain. So this would be a blunt instrument, with risks of its own, to deal with a complex set of problems.”
Stewart Baker, of counsel at Steptoe & Johnson and a former assistant secretary for policy at the Department of Homeland Security, said there’s “no doubt” Section 230 of the Communications Decency Act is “ripe for reform,” but suggested regulatory and legislative efforts on that score probably won’t include extensive work on disinformation.
He said the “big platforms … have been aggressive about trying to find and suppress disinformation by foreign governments,” but that there is a need for better collaboration and information sharing between the tech and social media industry and the U.S. government.
The FCC on Monday was poised to issue a notice requesting comment on the Commerce petition, which asked the commission to clarify:
- Whether, and to what degree, Section 230 of the Communications Decency Act provides protection for social media’s content moderation decisions
- The conditions under which content moderation and editorial decisions by social media companies shape content to such a degree that Section 230 no longer protects them
- Social media companies’ disclosure obligations with respect to their content moderation practices
FCC Chairman Ajit Pai said in a statement Monday: “Today, the FCC's Consumer and Governmental Affairs Bureau will invite public input on the Petition for Rulemaking recently filed by the Department of Commerce regarding Section 230 of the Communications Decency Act of 1996. Longstanding rules require the agency to put such petitions out for public comment ‘promptly,’ and we will follow that requirement here.”
Pai said, “I strongly disagree with those who demand that we ignore the law and deny the public and all stakeholders the opportunity to weigh in on this important issue. We should welcome vigorous debate—not foreclose it. The American people deserve to have a say, and we will give them that chance. Their feedback over the next 45 days will help us as we carefully review this petition.”
A cyber dimension to 230 reform?
A Senate Commerce subcommittee held a hearing last week on Section 230 reform, with some witnesses suggesting creation of a panel patterned after the Cyberspace Solarium Commission to examine Section 230 issues.
That’s not in the immediate works, but the Senate hearing demonstrated that support for and opposition to Section 230 reform don’t divide neatly along partisan lines, and that there is skepticism about reform as a way to address disinformation.
Kiersten Todt of the Cyber Readiness Institute told Inside Cybersecurity that a re-examination of the law needs to be tightly focused.
“The issue to address with Section 230 is disinformation and misinformation,” Todt said. “The objective of a revision is to ensure more accurate and truthful content is placed on these platforms. Look at what Twitter did -- they were late to the game but at least they got there -- it acknowledged that factually incorrect data was not constructive. This is not about free speech but accurate content. Distribution of disinformation is a national security risk.”
Columbia’s Waxman told Inside Cybersecurity that “foreign disinformation campaigns remain a huge threat and the United States is still working to develop the tools to combat them.”
Waxman said, “One reason why it's hard to combat disinformation through regulation is because while we need to protect our democracy from foreign interference, in our democratic system we don't generally believe the government should be policing speech and expression.”
Baker, the former DHS official, said combating disinformation probably isn’t a top agenda item for either an FCC or congressional Section 230 reform effort, which is more likely to focus on transparency around platforms’ decision-making in moderating content and ensuring “good faith” by operators.
He noted that “trying to get at disinformation becomes complicated once you get past Russia, China and maybe Iran. One person’s disinformation is another person’s news; it depends on your perspective.”
On the other hand, he said, there remains a need for the major social media platforms to engage with U.S. agencies on what they’re seeing in the disinformation realm.
The companies have been “surprisingly stand-offish,” Baker said. “The state of coordination is not healthy right now, but that’s not a Section 230 problem, it’s a problem with the culture of Silicon Valley and that the law doesn’t allow easy information sharing.” – Charlie Mitchell (firstname.lastname@example.org)