A team of researchers has developed a new AI system aimed at improving the credibility of Wikipedia citations. The system, called SIDE (Semantic Inference for Document Evaluation), works by checking how well the sources already cited support the claims they are attached to and by proposing new sources to back up the information presented.
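At its core, this is a retrieve-and-rank problem: given a claim from an article, find candidate sources and score how well each one supports it. The sketch below is a deliberately simplified, hypothetical illustration of that idea using bag-of-words cosine similarity; SIDE itself relies on far more sophisticated neural retrieval and verification models, and the URLs and snippets here are invented for demonstration only.

```python
# Toy illustration of ranking candidate sources by how well they match a claim.
# This is NOT SIDE's actual method; it only demonstrates the ranking concept.
import math
import re
from collections import Counter


def tokenize(text: str) -> Counter:
    """Lowercase the text and count word occurrences (bag of words)."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))


def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def rank_sources(claim: str, candidates: dict[str, str]) -> list[tuple[str, float]]:
    """Rank candidate source snippets by textual similarity to the claim."""
    claim_vec = tokenize(claim)
    scored = [(url, cosine_similarity(claim_vec, tokenize(text))) for url, text in candidates.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)


if __name__ == "__main__":
    claim = "The Eiffel Tower was completed in 1889 for the World's Fair in Paris."
    candidates = {  # hypothetical URLs and snippets, for illustration only
        "https://example.org/eiffel-history": "Construction of the Eiffel Tower finished in 1889, in time for the Paris World's Fair.",
        "https://example.org/paris-food": "Paris is famous for its bakeries, cafes and restaurants.",
    }
    for url, score in rank_sources(claim, candidates):
        print(f"{score:.2f}  {url}")
```

In practice, a system like SIDE would replace the word-overlap score with learned models trained on Wikipedia's existing citations, but the overall shape of the task (score candidates, surface the best-supported one) is the same.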
In a recent study, participants preferred SIDE's suggested citations over the original ones 70 per cent of the time. The system also surfaced the reference Wikipedia already cites as its top suggestion in nearly 50 per cent of cases, and its recommendations were preferred over those of human annotators 21 per cent of the time.
SIDE's capabilities are currently limited to web page references, while Wikipedia incorporates many other forms of citation, including books, scientific articles, and multimedia content. Moreover, because anyone can add references to Wikipedia, the data the system draws on might introduce biases.
SIDE is also dependent on its training data and could reflect the biases of the people who built and trained it; the data used to train and evaluate its models may be limited in that regard. Still, using AI to streamline fact-checking, or at least to support it, could prove useful.
The sources for this piece include an article in Engadget.