At this point it is obvious that Facebook is content to continue permitting the spread of extremist disinformation and organizing on its social media platform, while paying lip service to its responsibilities and taking hollow half-measures to correct the problem—largely because its revenue stream is so powerfully dependent on the features fueling the phenomenon. Facebook's profits, as whistleblower-provided evidence has established, are fundamentally built on creating social division and real-world strife.
Facebook's undeniable role in helping facilitate the bloody campaign by Myanmar’s military against its Rohingya minority population is already well-established. Now, a fresh report from Nick Robins-Early of Vice details how it is replicating those results in Ethiopia, where military leaders and their authoritarian supporters are unleashing genocidal violence in the midst of an ongoing civil war.
The Facebook model for engendering social chaos for profit, with which we have all become familiar, is on full display in Ethiopia. Just as occurred in Myanmar, the nation's military leaders have leveraged the spread of disinformation on Facebook to encourage ethnic violence against a regional minority population and to organize lethal attacks against them. And just as it has everywhere else, the social media giant is exerting minimal effort to correct the abuse of its platform—doing next to nothing to slow the looming genocide.
Last summer, a video went viral on Facebook showing a man telling a large crowd of people that anyone who associates with certain ethnic minorities is “the enemy.” It was re-posted multiple times before the platform removed it. The same account that called for Kassa’s arrest also appeared to celebrate the Fano, a notorious Amhara militia, for carrying out an extrajudicial killing. That post remained online for months. Another account with over 28,000 followers posted an instructional video on how to use an AK-47 with a caption suggesting every Amhara should watch it. The post has been up since April and has nearly 300,000 views. In September, a local media outlet published unproven allegations on Facebook that members of the ethnic Qimant minority were responsible for a shooting. That same day, a government-aligned militia and mob attacked a Qimant village, looting and burning down homes. Months later, the post remained on Facebook.
Facebook’s claim to have hired moderation staff for Ethiopia is pathetic. Moderation and fact-checking there is in fact handled by a group of volunteers who send Facebook spreadsheets of posts to investigate, and who frequently have to explain to Facebook staffers why content on their platform is dangerous. “They completely lack context,” researcher Berhan Taye told Vice. “Every time we talk to them, they’re asking for context. That’s been a big issue—they don’t understand what’s happening in the country.” The company also routinely ignores researchers when they point out violent or hateful content, telling them that the posts don’t violate Facebook policies. “The reporting system is not working. The proactive technology, which is AI, doesn’t work,” Taye said.
If this sounds familiar, it should. When the Myanmar military used fake Facebook accounts to organize ethnic-cleansing violence against the Rohingya, Facebook allowed the posts to remain online until The New York Times published an account of the platform’s culpability in the genocidal violence. An independent fact-finding commission by the United Nations Human Rights Council found that both the specific violence and the ethos that fostered it were spread readily on Facebook.
The Facebook model of engendering social chaos for profit has already had its effect in the United States—an effect that came home to roost at the Capitol on Jan. 6. The company’s own internal reports acknowledge that much of the extremism of that day, particularly disinformation about the 2020 election, and its violence, including the siege on Congress, was spread and organized on Facebook.