New AI Helps Spot Scientists Overlooked by Wikipedia, and It Turns Out Most Are Female

The system, referred to as "the industry's first machine-generated knowledge base," can also automatically draft Wikipedia-style entries for the missing scientists.

Last month, US politicians were furious when artificial intelligence (AI) criminal-spotting technology was revealed to carry potentially dangerous racial biases. This Friday, however, a new AI tool was introduced in the hope that it can help correct bias.

The new technology was discussed in a company blog post by John Bohannon, director of science at AI startup Primer. In the post, Bohannon outlined his firm's latest machine learning system, Quicksilver, created to help correct Wikipedia's many omissions of scientists.

Female scientists overlooked

What the system inadvertently uncovered was that most of these overlooked scientists were women. Once this bias was identified, Primer went one step further, collaborating with 500 Women Scientists, a non-profit that supports women in STEM, to use Quicksilver in Wikipedia edit-a-thons aimed at improving the site's coverage of women in science.

But how did this impressive tool, which Primer calls "the industry's first machine-generated knowledge base for scientists," come to identify these omissions? First, Quicksilver was fed data on 30,000 scientists, including their Wikipedia articles, their Wikidata entries, and over 3 million sentences of news coverage describing their work.

Once that information was ingested, Primer's team fed in the names and affiliations of 200,000 authors of scientific papers. Within a single day, Quicksilver determined that 40,000 of those authors had no corresponding Wikipedia entry.
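Primer has not published Quicksilver's entity-matching code, but the core lookup, checking whether a name already has an English Wikipedia article, can be sketched against the public MediaWiki API. Everything below (the missing_on_wikipedia helper, the 50-title batching, the sample names) is an illustrative assumption, not Primer's implementation; real disambiguation of 200,000 authors would also have to handle namesakes and disambiguation pages.

```python
import requests

API_URL = "https://en.wikipedia.org/w/api.php"  # public MediaWiki API endpoint

def missing_on_wikipedia(names):
    """Return the subset of names with no English Wikipedia article.

    A toy stand-in for Quicksilver's matching step: it only checks
    whether a page with that exact title exists.
    """
    missing = []
    # The API accepts up to 50 titles per request, joined with "|".
    for i in range(0, len(names), 50):
        batch = names[i:i + 50]
        resp = requests.get(API_URL, params={
            "action": "query",
            "titles": "|".join(batch),
            "redirects": 1,  # follow redirects to existing pages
            "format": "json",
        })
        pages = resp.json()["query"]["pages"]
        # Non-existent pages come back flagged with a "missing" key.
        missing += [p["title"] for p in pages.values() if "missing" in p]
    return missing

print(missing_on_wikipedia(["Marie Curie", "A Made-Up Scientist"]))
```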

The tool also flagged important information missing from existing entries. But Quicksilver did not stop there.

The system went on to automatically draft Wikipedia-style entries for the omitted scientists from the information it had ingested. Hoping to encourage editors to add these overlooked scientists to the online encyclopedia, Primer published 100 of the Quicksilver-generated drafts online.
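Primer has not released Quicksilver's draft generator, which builds its sentences from mined news coverage. Purely as an illustration of the idea, a far simpler template-based sketch could turn structured facts into a stub draft; the draft_stub helper, its fields, and the sample scientist below are hypothetical, not Primer's method.

```python
def draft_stub(facts):
    """Render a minimal Wikipedia-style stub from structured facts.

    Purely illustrative: Quicksilver drafts from text mined out of
    news coverage, not from a fixed template like this one.
    """
    sentences = [
        f"{facts['name']} is a {facts['field']} at {facts['affiliation']}."
    ]
    for finding in facts.get("known_for", []):
        sentences.append(f"{facts['surname']} is known for {finding}.")
    return " ".join(sentences)

print(draft_stub({
    "name": "Jane Doe",  # hypothetical scientist, not a real entry
    "surname": "Doe",
    "field": "computational biologist",
    "affiliation": "Example University",
    "known_for": ["work on protein folding models"],
}))
```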

A possible solution to Wikipedia's gender bias

Could Primer's impressive self-flagging, self-writing, self-updating knowledge base be the answer to Wikipedia's gender bias issues? The encyclopedia's gender imbalance is one of the most frequent criticisms leveled at the site, and it has even been acknowledged with its very own "Gender bias on Wikipedia" entry.

The problem has been mostly attributed to the low percentage of female Wikipedia contributors. A 2008 survey found that fewer than 13% of the site's editors worldwide were women, and a 2011 follow-up survey revealed that the number had further decreased to a mere 9%.

Primer is very aware of this ongoing predicament and its potentially damaging future implications. "As it becomes more and more essential to the world, biased and missing information on Wikipedia will have serious impacts," writes Bohannon.

Bohannon believes Quicksilver is ready to support "the human editors of the most important source of public information" through machine learning. "To solve the recall problem of human-generated knowledge bases, we need to superpower the humans," he concludes.

Via: Primer
