Facebook Fed Researchers Flawed Data, Caught by Its Own Transparency Report

Facebook blamed the 'inaccuracies' on a technical error.
Loukia Papadopoulos

In August 2021, Facebook released its first-ever "Widely Viewed Content Report," covering the content most viewed by people in the United States during the previous quarter. It was later revealed that the company had already produced a similar report for the first quarter of 2021 but ultimately decided not to publish it because its contents might have made Facebook look bad.

Now, misinformation researchers are discovering that the firm also gave them incomplete and flawed data, according to a report by The New York Times.

For the past couple of years, Facebook has ostensibly been giving researchers access to its data so they can track the spread of misinformation on its platform.

The New York Times, however, reported that Fabio Giglietto, an associate professor at the University of Urbino, compared the data handed over to academics with the report Facebook published publicly and found that the two did not match. Giglietto was the first to spot the discrepancy, but not the last.

Other researchers soon found the same inaccuracies. In an email seen by The New York Times, Facebook spokesperson Mavis Jones blamed the inaccuracies on a "technical error" and said the firm was "working swiftly to resolve" it, though a fix could take weeks given the sheer volume of data to be processed. Facebook also told the researchers that only the data on U.S. users was affected, emphasizing that data on users outside the U.S. was correct.

However, Facebook has also been known to interfere with the work of misinformation researchers. In August, Engadget reported that Facebook disabled the accounts associated with the NYU Ad Observatory project, which used a browser extension to collect information on political ads.

Laura Edelson, the project's lead researcher, then told Engadget that Facebook was interfering with her team because its "work often calls attention to problems on its platform." The question then becomes: Who will police Facebook if researchers can't even get accurate data?
