As efforts by social-media platforms to hold back a flood of COVID-19 disinformation start showing signs of strain, some lawmakers on Capitol Hill are preparing to take action.
House Homeland Security Committee Vice Chair Lauren Underwood announced on Monday her plan to introduce new legislation focused on the threat online disinformation poses to public health.
“Vaccine hesitancy and misinformation are already at dangerous levels in America. The dissemination of false narratives that stoke baseless fears about vaccination threatens the health of children, seniors, pregnant women, cancer patients, and other vulnerable populations,” Underwood said during a committee-sponsored virtual forum with fellow Democratic Rep. Elissa Slotkin and two disinformation experts, who attended in part to advise the lawmakers on crafting the bill.
“Now, during a pandemic on a scale we’ve not seen in a century, these trends are more troubling than ever,” Underwood said. “The stakes are truly life-and-death.”
Democratic committee spokesperson Adam Comis told National Journal the legislation would direct the Homeland Security Department to coordinate and improve federal research into disinformation that targets public health and safety, as well as educate state and local governments about ongoing or suspected public-health disinformation campaigns. Comis said the bill should not require significant federal funding, and that it should be formally introduced within the next few weeks.
One stakeholder in touch with the committee about the legislation, who requested anonymity due to the sensitivity of ongoing discussions, said the bill’s current iteration may also require the department to brief Congress on any campaigns spreading public-health disinformation as they unfold.
The congressional effort comes as the spread of disinformation related to COVID-19 accelerates online. According to The New York Times, a 26-minute video clip taken from a discredited documentary entitled “Plandemic” was viewed more than 8 million times on YouTube before the company took it down.
In the clip, Judy Mikovits, a discredited scientist, falsely claims that vaccines damage people’s immune systems and that wearing a mask “literally activates your own virus.”
That last claim, in particular, was a bridge too far for Facebook’s content moderators. Monika Bickert, Facebook’s head of content policy, said during a Tuesday press call that the claim that wearing a mask activates COVID-19 violated the company’s rules against misinformation that contributes to the risk of imminent physical harm. Bickert said the clip was removed from Facebook’s platforms, along with hundreds of thousands of pieces of COVID-19 content deemed potentially harmful.
Facebook CEO Mark Zuckerberg said around 50 million posts related to the pandemic have been flagged with a misinformation warning by the company’s third-party fact checkers. Zuckerberg said those fact-checking flags caused users not to click on the offending content 95 percent of the time.
Twitter, facing its own tide of COVID-19 disinformation, is also taking action. On Monday the company announced it would tack labels and warning messages onto tweets containing disputed or misleading information about the virus or the public-health response.
Disinformation experts welcome the takedowns but say simply removing or flagging false content isn’t enough to tackle the problem. They say a push to educate the public about the topic at hand, as well as broader efforts to inform the public about how disinformation spreads online, is also required.
Renée DiResta, an expert on disinformation at the Stanford Internet Observatory, suggested to Underwood and Slotkin on Monday that the lawmakers look to the actions undertaken by the State Department’s Global Engagement Center, currently tasked with coordinating counterterrorism messages to foreign audiences, as they craft a new bill on public-health disinformation.
DiResta also pointed to last year’s “pineapple on pizza” social-media campaign, conducted by DHS’s Cybersecurity and Infrastructure Security Agency, as an example of how the government can effectively communicate with the public about disinformation.
“One of the real questions is how do you produce PSAs and come up with legislative structures that are incorporating some of the efforts that have already begun to take place,” DiResta said.
Underwood’s planned legislation isn’t Congress’s first bite at the disinformation apple. Last fall, a group of House Democratic lawmakers led by Rep. Jim Langevin introduced the Digital Citizenship and Media Literacy Act, which would create a $20 million Education Department fund to educate citizens on digital literacy and spotting disinformation. Companion legislation was introduced by Sen. Amy Klobuchar last summer, with several other Democratic senators cosponsoring the effort.
But no Republican lawmakers have signed onto either bill. Nina Jankowicz, a disinformation expert at the Wilson Center also tapped by Underwood to provide input on the legislation, said the issue of online disinformation continues to be riven by partisan divisions.
President Trump has repeatedly refused to acknowledge that online-disinformation efforts sponsored by the Russian government were designed to boost his 2016 electoral prospects. And Jankowicz says any federal effort to respond to disinformation around COVID-19 will likely be handicapped by the administration’s lack of focus on the issue.
“So much could be different with more willingness from the executive branch—if there were a ‘disinformation czar’ in the [National Security Council] or at the Cabinet, who was bringing the interagency folks together to talk about these issues and making sure that not only the Department of State and [the Defense Department] is in the room, but the Department of Education as well, discussing these really critical issues,” she said.
“Congress is doing what they can, but a lot of this work is undermined by the politicization of the whole issue of disinformation.”