Facebook (now known as Meta) has reportedly developed an AI tool aimed at improving the accuracy of citations on the left-wing “encyclopedia” Wikipedia.
Digital Trends reports that Facebook’s AI division, Meta AI, has developed a machine learning model that can automatically scan hundreds of thousands of citations across Wikipedia at once to detect whether they support the claims they are attached to, improving the accuracy of the website. In a blog post on the project, Facebook clarified that Wikimedia and Meta are not partnering on it and that the research is not being used to automatically update any content on Wikipedia.
Fabio Petroni, research tech lead manager on Meta AI’s FAIR (Fundamental AI Research) team, said in a statement: “I think we were driven by curiosity at the end of the day. We wanted to see what was the limit of this technology. We were absolutely not sure if [this AI] could do anything meaningful in this context. No one had ever tried to do something similar [before].”
The machine learning model was trained on a dataset of 4 million Wikipedia citations. The tool analyzes the source linked in a citation and cross-references it with the claim the citation is meant to support.
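Meta has not released the model, but the verification step described here resembles a textbook natural language inference (NLI) check: does the cited source entail the claim? The sketch below is a rough, hypothetical illustration using an off-the-shelf NLI model from the Hugging Face transformers library; the model choice and example texts are assumptions, not Meta’s actual system.

```python
from transformers import pipeline

# Hypothetical stand-in: an off-the-shelf natural language inference (NLI)
# model takes the place of Meta's citation-verification model, which is not public.
verifier = pipeline("text-classification", model="roberta-large-mnli")

claim = "The Eiffel Tower was completed in 1889."
cited_source = (
    "Construction of the Eiffel Tower finished in 1889, "
    "in time for the Exposition Universelle in Paris."
)

# The source text is the premise, the Wikipedia claim is the hypothesis.
# ENTAILMENT means the cited source actually supports the claim;
# NEUTRAL or CONTRADICTION would flag the citation for review.
result = verifier({"text": cited_source, "text_pair": claim})
print(result)  # e.g. [{'label': 'ENTAILMENT', 'score': 0.98}]
```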
Petroni commented: “There is a component like that, [looking at] the lexical similarity between the claim and the source, but that’s the easy case. With these models, what we have done is to build an index of all these webpages by chunking them into passages and providing an accurate representation for each passage … That is not representing word-by-word the passage, but the meaning of the passage. That means that two chunks of text with similar meanings will be represented in a very close position in the resulting n-dimensional space where all these passages are stored.”
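What Petroni describes is commonly known as dense passage retrieval: passages are encoded as vectors so that texts with similar meanings land near each other in the vector space, regardless of exact word overlap. The snippet below is a minimal sketch of that idea using the sentence-transformers library; the encoder model and toy passages are illustrative assumptions, not Meta’s actual index.

```python
from sentence_transformers import SentenceTransformer, util

# Assumed off-the-shelf sentence encoder; Meta's actual passage encoder
# and index are not public.
model = SentenceTransformer("all-MiniLM-L6-v2")

# "Chunking webpages into passages": a toy index of three passages.
passages = [
    "The Eiffel Tower was completed in 1889 for the World's Fair in Paris.",
    "Gustave Eiffel's firm designed and built the tower in the late 1880s.",
    "The Louvre is the world's most-visited art museum.",
]
claim = "The Eiffel Tower opened in 1889."

# Encode the claim and every passage into the same n-dimensional vector space.
passage_vecs = model.encode(passages, convert_to_tensor=True)
claim_vec = model.encode(claim, convert_to_tensor=True)

# Cosine similarity ranks passages by meaning rather than exact wording.
scores = util.cos_sim(claim_vec, passage_vecs)[0]
for passage, score in sorted(zip(passages, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {passage}")
```

In a sketch like this, the two Eiffel Tower passages would score well above the Louvre passage even though neither repeats the claim word for word, which is the behavior Petroni is describing.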
The tool can not only spot fraudulent citations but also suggest better references for existing ones. Still, there is much work to do before it is usable, says Petroni. “What we have built is a proof of concept. It’s not really usable at the moment. In order for this to be usable, you need to have a fresh index that indexes much more data than what we currently have. It needs to be constantly updated, with new information coming every day.”
Breitbart News has frequently reported on Wikipedia’s left-wing bias. Most recently, Wikipedia’s editors feverishly edited the site’s article on recession to better reflect Biden administration talking points.
Breitbart News reporter Allum Bokhari wrote:
Editors of the leftist-dominated online encyclopedia are pushing a definition of “recession” that is unusually broad and favors the Biden administration’s claims that no recession has occurred. This definition, from the National Bureau of Economic Research (NBER), claims that a recession is a “significant decline in economic activity spread across the market, lasting more than a few months.”
What was until recently the broad consensus on the definition of a recession — two consecutive quarters of negative GDP growth — remains at the top of the page, but editors have been attempting to remove it. This definition is also described as the United Kingdom’s definition.
The article continues to note, further down the page in a section on the definition, that “in a 1975 New York Times article, economic statistician Julius Shiskin suggested several rules of thumb for defining a recession, one of which was ‘two down quarters of GDP.’”
Read more at Digital Trends.
Update — Facebook clarified in a blog post that “Wikimedia and Meta are not partnering on this project. The project is still in the research phase and not being used to automatically update any content on Wikipedia.” This article has been edited to reflect the clarification.
Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship. Follow him on Twitter @LucasNolan