Wikipedia is one of the world's most widely read websites, covering information on almost any subject. One might wonder how such a vast resource stays accurate, well organized, and up to date. It isn't only the work of millions of human editors; artificial intelligence plays a major role behind the curtain.
Because Wikipedia hosts more than 6.6 million articles in English and roughly 59 million across all languages, human editors cannot monitor every page for errors or vandalism. Most of the changes made each day are harmless, but AI is still essential to keeping the site credible. One of the most important tools in this effort is ORES, a machine-learning service released in 2015.
ORES lets editors quickly identify damaging or poorly constructed edits, so false information stays on the site for as short a time as possible. More than 100,000 edits are made each day, and ORES scores the likelihood that each one is helpful or damaging based on patterns learned from millions of earlier edits. Although not perfect, ORES has significantly reduced the time editors spend reviewing and addressing problematic changes, strengthening community oversight while keeping the resulting content trustworthy.
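To make that workflow concrete, here is a minimal sketch of how a patrolling tool might ask ORES to score a single edit. The endpoint and response shape follow ORES's documented v3 REST API, but treat the exact fields as an assumption, and note that the revision ID below is a made-up placeholder.

```python
import requests

# Hypothetical revision ID, used purely for illustration.
REV_ID = 1234567890

# ORES exposes a public REST API; this reflects the documented v3 endpoint.
url = f"https://ores.wikimedia.org/v3/scores/enwiki/?models=damaging&revids={REV_ID}"

response = requests.get(url, timeout=10)
response.raise_for_status()
data = response.json()

# Navigate to the damaging model's probability estimates for this revision.
score = data["enwiki"]["scores"][str(REV_ID)]["damaging"]["score"]
p_damaging = score["probability"]["true"]

# Patrolling tools typically sort or filter by this probability rather than
# trusting the binary prediction alone.
label = "damaging" if score["prediction"] else "good"
print(f"Prediction: {label} (p(damaging) = {p_damaging:.2f})")
```

In practice, patrol queues rank thousands of such scores per hour, surfacing only the edits most likely to need a human look.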
Beyond detecting harmful edits, Wikipedia's AI tools also help human editors improve article quality. Machine-learning models identify sections that need expansion and suggest sources that would add value to a particular subject. For example, the system may flag articles that have few or no citations or that otherwise fall short of quality standards. With millions of articles on the site, AI triages what requires human editors' attention, and this pairing of human oversight with AI recommendations helps Wikipedia maintain its credibility.
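Wikipedia does not publish its quality models as a simple script, but the core idea can be sketched: scan an article's wikitext and flag sections that contain no citations. The heuristic below is a deliberately crude illustration of the concept, not the site's actual algorithm.

```python
import re

def flag_uncited_sections(wikitext: str) -> list[str]:
    """Return headings of level-2 sections that contain no <ref> citations.

    A toy heuristic for illustration; real quality models such as ORES's
    articlequality model use far richer features than citation counts.
    """
    # Split on level-2 headings (== Heading ==), keeping the heading text.
    parts = re.split(r"^==\s*(.*?)\s*==\s*$", wikitext, flags=re.MULTILINE)
    # parts alternates: [lead, heading1, body1, heading2, body2, ...]
    return [
        heading
        for heading, body in zip(parts[1::2], parts[2::2])
        if "<ref" not in body
    ]

sample = """Intro text.<ref>A source.</ref>
== History ==
Some uncited claims.
== Reception ==
Well received.<ref>Another source.</ref>
"""
print(flag_uncited_sections(sample))  # ['History']
```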
While human editors write most of Wikipedia's content, bots still create certain kinds of articles, chiefly those on species, geographic places, and historical events, where structured data helps ensure accuracy. For instance, Lsjbot has generated over 2.7 million articles on the Swedish Wikipedia by querying taxonomic databases and geographic records to produce accurate, informative entries. Some worry that AI will eventually replace human editors, but these bots edit only highly structured information, where the risk of error is low.
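Lsjbot's actual code is not reproduced here, but the general pattern behind bot-written articles is straightforward: pull fields from a structured database record and pour them into a fixed template. A toy sketch, with invented field names and sample data:

```python
# Template-based stub generation, the general pattern behind bots like
# Lsjbot. The record below is invented sample data; a real bot would
# query a taxonomic database and handle missing fields carefully.
STUB_TEMPLATE = (
    "'''{name}''' is a species of {group} in the family {family}. "
    "It was first described by {author} in {year}."
)

def generate_stub(record: dict) -> str:
    """Fill the fixed article template from one structured record."""
    return STUB_TEMPLATE.format(**record)

record = {
    "name": "Examplea fictitia",  # hypothetical species for illustration
    "group": "beetle",
    "family": "Examplidae",
    "author": "Smith",
    "year": 1902,
}
print(generate_stub(record))
```

Because every generated sentence comes straight from a vetted database field, this kind of bot has little room to introduce factual errors, which is exactly why the community tolerates it for structured topics.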
AI complements the work of Wikipedia's editors rather than replacing them: humans handle the critical thinking, while repetitive, mundane tasks are delegated to the software. This augmentation model has helped produce the relevant, precise, and trustworthy content behind Wikipedia's success to date.
In short, Wikipedia's use of artificial intelligence points toward a genuine partnership between technology and human expertise. By pairing algorithmic efficiency with human judgment, the project continues to deliver content that remains relevant, accurate, and trustworthy for millions of readers around the world.