The research shows that Wikipedia's bots have been interacting with each other in ways that sometimes lead to unpredictable outcomes.
“These bots are not supposed to override each other. They use the same technology and should be able to communicate with each other in a much more civilized way,” Taha Yasseri said.
Software robots have been maintaining Wikipedia ever since the project was launched.
When asked whether Wikipedia provided all the data used in the study, Dr. Yasseri said that the encyclopedia's entire edit history is recorded and can easily be accessed.
“The data is available, but we are talking about terabytes of data, so it took us quite a while to go through all these revisions. In some of our language editions almost half of the edits are done by bots, and the consequences of these edits could be huge given the massive amount of edits being made by bots.”
When asked how these bots function, where they come from, and whether anyone wishing to edit Wikipedia articles can create their own bots, he said that creating a bot requires a certain degree of experience.
However, the programmers who create the bots use different scripts, and these differences eventually lead to such conflicts.
“There are no regular editors looking after the work being done by these bots, and this is one of the reasons for the conflicts we see between different bots. The main reason for conflicts is the lack of central supervision of bots,” Taha Yasseri said.
“Having one bot doing everything would be against the whole idea of Wikipedia where articles are edited by people from all over the world, not by people from organizations.”
“The openness of the system also makes these problems unavoidable, but we opted for an open system,” Dr. Taha Yasseri said in conclusion.