Sunday, July 18, 2021

Facebook is Driving Misinformation Leading to Low Vaccination Rates

President Joe Biden has warned that the spread of Covid-19 misinformation on social media is "killing people".  He was responding to a question from a reporter about the alleged role of "platforms like Facebook" in spreading falsehoods about vaccines and the pandemic.  "They're killing people," Mr Biden told reporters at the White House on Friday. "The only pandemic we have is among the unvaccinated."

The White House has been increasing pressure on social media companies to tackle disinformation. Facebook claims that it is taking "aggressive action" to protect public health.

U.S. health officials have warned that the country's current spike in Covid-19 deaths and infections is almost exclusively hitting unvaccinated communities. Earlier on Friday, Rochelle Walensky, director of the US public health body Centers for Disease Control and Prevention (CDC), told reporters: "There is a clear message that is coming through: this is becoming a pandemic of the unvaccinated."

“There’s about 12 people who are producing 65 percent of anti-vaccine misinformation on social media platforms,” White House Press Secretary Jen Psaki said. “All of them remain active on Facebook, despite some even being banned on other platforms, including ones that Facebook owns.”

Facebook has faced criticism for its moderation, and misleading content about the pandemic is still widely available on its platforms.  In March, a report said anti-vaccine activists on Facebook, YouTube, Instagram and Twitter had reached "more than 59 million followers, making these the largest and most important social media platforms for anti-vaxxers".

Earlier this year, NPR reported on how misinformation from right-wing sources is fueled by Facebook. Cybersecurity For Democracy, an NYU-based research group studying online misinformation, wanted to know how different types of news sources engaged with their audiences on Facebook. After the Capitol insurrection, engagement on the platform spiked, and researcher Laura Edelson said her team noticed a troubling pattern. "The thing was, most of that spike was concentrated among the partisan extremes and misinformation providers. And when I really sit back and think about that, I think the idea that on a day like that, which was so scary and so uncertain, that the most extreme and least reputable sources were the ones Facebook users were engaging with, is pretty troubling."

But this wasn't just one day of high engagement. A new study from Cybersecurity For Democracy found that far-right accounts known for spreading misinformation are thriving on Facebook, drawing more likes, shares and other forms of user engagement than other kinds of accounts. "It's almost twice as much engagement per follower among the sources that have a reputation for spreading misinformation," Edelson said. "So, clearly, that portion of the news ecosystem is behaving very differently."

The research team used CrowdTangle, a Facebook-owned tool that measures engagement, to analyze more than 8 million posts from almost 3,000 news and information sources over a five-month period.  Each source was then evaluated on whether it had a history of spreading misinformation or conspiracy theories. What Edelson and her colleagues discovered is what some Facebook critics — and at least one anonymous executive — have been saying for some time: that far-right content is just more engaging. In fact, the study found that among far-right sources, those known for spreading misinformation significantly outperformed non-misinformation sources. 
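To make the comparison concrete, here is a minimal sketch in Python of how an "engagement per follower" figure could be computed from a CrowdTangle-style export of posts. The column names, file name, and misinformation labels are assumptions for illustration only; they are not the study's actual schema or pipeline.

```python
import csv
from collections import defaultdict

# Hypothetical input: a CSV export with one row per post.
# Assumed columns (illustrative, not the study's real schema):
#   source        - name of the page that published the post
#   followers     - the page's follower count
#   interactions  - likes + shares + comments + reactions for the post
#   misinfo_label - "misinformation" or "non-misinformation", assigned from
#                   external ratings of each source's track record
totals = defaultdict(lambda: {"interactions": 0, "followers": 0})

with open("posts_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        label = row["misinfo_label"]
        totals[label]["interactions"] += int(row["interactions"])
        totals[label]["followers"] += int(row["followers"])

# Engagement per follower: total interactions divided by total follower reach,
# compared across the two groups of sources.
for label, t in totals.items():
    rate = t["interactions"] / t["followers"] if t["followers"] else 0.0
    print(f"{label}: {rate:.4f} interactions per follower")
```

On data like the study's, a roughly two-to-one gap between the two groups would correspond to the "almost twice as much engagement per follower" finding Edelson describes above.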

Facebook has repeatedly promised, but has seemingly failed, to address the spread of conspiracies and misinformation on its site. Edelson has called on Facebook to be transparent about how it tracks impressions and promotes content: "They can't say their data leads to a different conclusion but then not make that data public. I think what's very clear is that Facebook has a misinformation problem," she said. "I think any system that attempts to promote the most engaging content, from what we can tell, will wind up promoting misinformation."
