Congress goes after Big Tech for pandemic misinformation

Members focused on how to tweak Section 230 to make social media platforms more accountable for disinformation posted by users.

Erin Covey
June 24, 2020, 8 p.m.

With the coronavirus driving a spike in potentially fatal misinformation, such as the viral "Plandemic" conspiracy-theory video promoting falsehoods about the virus, House Energy and Commerce Committee members voiced sharp concerns Wednesday about social-media platforms' failure to address deliberate disinformation.

“People can die if there’s misinformation out there about COVID-19,” Democratic Rep. Lisa Blunt Rochester said at Wednesday’s hearing. “People can die if violence is incited.”

But broader concerns about social-media platforms' response to disinformation fell along typical partisan lines: Democrats focused on President Trump's tweets about voting by mail and protests, while Republicans argued that Twitter's warning labels on those tweets constituted censorship.

Much of the hearing focused on whether and how Section 230 of the Communications Decency Act, the law that shields online platforms from legal liability for users' speech, could be amended to push platforms to better address disinformation and to be more transparent about their content guidelines and algorithms.

“We can all agree that better transparency regarding how these internal guidelines are determined … is needed,” GOP Rep. Brett Guthrie said in his opening statement.

University of California, Berkeley professor Hany Farid argued that platforms' business models are the root reason they have failed to act, citing a Wall Street Journal report that Facebook executives acknowledged their algorithms were inflaming division.

“The core poison ... is the business model,” Farid said. “The business model is that when you keep people on the platform, you profit more, and that is fundamentally at odds with our societal and democratic goals.”

He noted that, short of stronger federal regulation, advertisers could pull their spending if platforms fail to address the issue.

Neil Fried, former chief Republican communications and technology counsel to the committee, testified that additional regulation wasn’t necessary if Congress updated Section 230 to hold online platforms to the "duty of care" standard that other businesses are held to.

“We don’t want to stifle innovation—we don’t have to,” Fried said. “So let’s take the regulation off the table.”

While both Joe Biden and Trump have called for Section 230 to be revoked, the witnesses agreed that the law should be amended rather than repealed outright.

“We really do need 230,” said George Washington University professor Spencer Overton. “We want Black Lives Matter, we want the tea party, we want a variety of grassroots organizations to be able to participate and post their material without fear that the platforms feel like they’re going to be sued.”
