‘Disinfo kills’: protesters demand Facebook act to stop vaccine falsehoods


Activists descended on Facebook’s Washington headquarters on Wednesday to demand the company take stronger action against vaccine falsehoods spreading on its platform, covering the area in front of Facebook’s office with body bags that read “disinfo kills”.

The day of protest, which comes as Covid cases surge in the US, has been organized by a group of scholars, advocates and activists calling themselves the “Real” Oversight Board. The group is urging Facebook’s shareholders to ban so-called misinformation “superspreaders” – the small number of accounts responsible for the majority of false and misleading content about the Covid-19 vaccines.

“People are making decisions based on the disinformation that’s being spread on Facebook,” said Shireen Mitchell, a member of the Real Facebook Oversight Board and founder of Stop Online Violence Against Women. “If Facebook is not going to take that down, or if all they’re going to do is put out disclaimers, then fundamentally Facebook is participating in these deaths as well.”

In coordination with the protest, the Real Oversight Board has released a new report analyzing the spread of anti-vaccine misinformation on Facebook during the company’s most recent financial quarter. The report and protest also come as Facebook announced its financial earnings for that same quarter, logging its fastest growth since 2016.

The report references a March study from the Center for Countering Digital Hate (CCDH) that found a small group of accounts – known as the “disinformation dozen” – is responsible for more than 73% of anti-vaccine content across social media platforms, including Facebook. That report recently drew attention from the White House, and Joe Biden has condemned Facebook and other tech companies for failing to take action.

Facebook banned misinformation about vaccines from the platform in February 2021, but critics say many posts slip through the platform’s filters and reach audiences of millions without being removed.

At Facebook’s Washington DC headquarters, activists lay body bags that read “disinfo kills”. Photograph: Eric Kayne/AP

Facebook has also introduced a number of rules relating to Covid-19 specifically, banning posts that question the severity of the disease, deny its existence, or argue that the vaccine carries more risks than the virus. Still, the Real Oversight Board found that such content has often been able to remain on the platform and even make its way into the most-shared posts.

According to the Real Oversight Board’s report, a large share of the misinformation about the Covid vaccines comes from a few prolific accounts and continues to be among the platform’s best-performing and most widely shared content. The group analyzed the top 10 posts on each weekday over the last quarter and found that the majority originated from just five identified “superspreaders” of misinformation.

“When it comes to Covid disinformation, the vast majority of content comes from an extremely small group of highly visible users, making it far easier to combat it than Facebook admits,” the board said, concluding that Facebook is “continuing to profit from hate and deadly disinformation”.

The group has called on Facebook to remove the users from the platform or alter its algorithm to disable engagement with the offending accounts. A Facebook spokesman said the company disputes the statistic that 65% of vaccine misinformation comes from just 12 people.

“We permanently ban pages, groups and accounts that repeatedly break our rules on Covid misinformation, and this includes more than a dozen pages, groups and accounts from these individuals,” he said.

The spokesman added that Facebook has removed more than 18m pieces of Covid misinformation and flagged more than 167m pieces of content, connecting users to its Covid-19 information center.

“We remain the only company to partner with more than 80 fact-checking organizations covering over 60 languages, using AI to scale those fact-checks against duplicate posts across our platform,” he said.

Congress has also taken note of the spread of vaccine misinformation on Facebook and other platforms, with the Democratic senator Amy Klobuchar introducing a bill that would target platforms whose algorithms promote health misinformation related to an “existing public health emergency”.

The bill, called the Health Misinformation Act, would remove in such cases the protections provided by Section 230, the internet law that shields platforms from being sued over content posted by their users.

“For far too long, online platforms have not done enough to protect the health of Americans,” Klobuchar said in a statement on the bill. “These are some of the biggest, richest companies in the world, and they must do more to prevent the spread of deadly vaccine misinformation.”
