
Did Facebook Withhold a Report That Made It Look Bad? It Seems So.

The most-viewed story in the withheld report suggested that a coronavirus vaccine could lead to death.

A few days ago, Facebook released a "first-ever" report on the content most viewed by people in the United States over the last quarter.

"Today, we’re releasing the first in a series of reports that will give an overview of the most widely viewed content in News Feed, starting with domains, links, Pages, and posts in the U.S.," said a blog post on Facebook's site.

However, The New York Times revealed that the company had already produced a similar report for the first quarter of 2021, one that it ultimately decided not to share because its contents might make Facebook look bad.

The New York Times obtained a copy of the report and revealed that the most-viewed link in the first quarter was a headline that read, “A ‘healthy’ doctor died two weeks after getting a COVID-19 vaccine; CDC is investigating why.” The headline came from an article that was published by The South Florida Sun Sentinel and republished by The Chicago Tribune.

You may wonder why this is a problem. What does a most-viewed article have to do with Facebook? Well, it turns out Facebook has long struggled with users promoting vaccine hesitancy and posting content that contradicts expert consensus. So much so that many have claimed that an unconscionable amount of Facebook's revenue comes from fake news.

In fact, President Biden even alleged that the company allowed misinformation about coronavirus vaccines to flourish. And other White House officials have said that many Americans are likely reluctant to get a vaccine, at least in part, because of false or misleading information they have read on Facebook.

And misleading information on Facebook is abundant.

In May of 2021, the Center for Countering Digital Hate (CCDH) and the Anti-Vax Watch found that just 12 people were behind the majority of misleading COVID-19 anti-vaccine posts and comments on social media.


The report found that up to 65 percent of anti-vaccine content posted or shared on Facebook and Twitter between just February 1 and March 21 could be traced back to the "Disinformation Dozen," a nickname the researchers gave the 12 individuals responsible for spreading these messages. On Facebook alone, 73 percent of all anti-vaccine content was found to come from members of the Disinformation Dozen.

So, did Facebook withhold that report? “We considered making the report public earlier but since we knew the attention it would garner, exactly as we saw this week, there were fixes to the system we wanted to make,” Facebook spokesman Andy Stone said in a statement to The Verge.

Stone also addressed further questions about the report in a series of tweets.
