The decline and fall of Wikipedia?
Today, Wikipedia is the sixth most widely used website in the world. Every month, 10 billion pages are viewed on the English version alone. The whole thing is run by volunteers who generally work under pseudonyms.
When Wikipedia launched in 2001, it wasn’t intended to be an information source in its own right. Jimmy Wales, a former financial trader, and Larry Sanger, a philosophy PhD, started the site to boost Nupedia, a free online encyclopedia founded by Wales that relied on contributions from experts. Sanger and Wales hoped Wikipedia, where anyone could start or modify an entry, would rapidly generate new articles that experts could then finish up.
When they saw how enthusiastically people welcomed the encyclopedia that anyone could edit, Wales and Sanger quickly made Wikipedia their main project. By the end of its first year it had more than 20,000 articles in 18 languages. It was growing so fast that in 2003 Wales formed the Wikimedia Foundation to operate the servers and software that run Wikipedia and to raise money to support them.
But control of the site’s content remained with the community of contributors known as Wikipedians, who developed sophisticated tools for producing and maintaining entries.
However, now Wikipedia and its ambition to “compile the sum of all human knowledge” are in trouble. The volunteer workforce that built the English-language Wikipedia—and must defend it against vandalism, hoaxes, and manipulation—has shrunk by more than a third since 2007 and is still shrinking, according to an MIT Technology Review story headlined “The Decline of Wikipedia.”
The researchers say the main source of the problems is the loose collective running the site today, which is estimated to be 90 percent male, and which operates a crushing bureaucracy that deters newcomers who might increase participation in Wikipedia and broaden its coverage.
Among the significant problems that aren’t getting resolved is the site’s skewed coverage: its entries on Pokemon and female porn stars are comprehensive, but its pages on female novelists or places in sub-Saharan Africa are sketchy, some analysts say.
The latest trouble to hit the site came when its volunteer editors uncovered a major ring of “sockpuppets,” or bogus user accounts, that were allegedly editing articles on behalf of paying clients. That’s a serious problem, because Wikipedia articles are supposed to be unbiased and not promotional.
According to the Daily Dot, these sockpuppet accounts were linked to one original bogus account with the username Morning277, which had been active since November 2008. To make the articles look more credible, several different accounts working together would often write and edit them, inserting vague citations to mainstream news accounts.
As a result of the investigation, administrators have shut down some 250 accounts linked to the sockpuppet ring.
The Daily Dot’s Simon Owens traced several of the deleted pages to a company called Wiki-PR, a firm that offers to help clients “tell your story on Wikipedia.” Its services include “page creation and editing,” “page management,” and even “crisis editing,” in which Wiki-PR pledges to “directly edit your page using our network of established Wikipedia editors and admins.”
But Wiki-PR is defending itself, telling the Wall Street Journal that “the ‘PR’ in ‘Wiki-PR’ is a misnomer”—it’s simply a “research and writing firm” that helps clients ensure that articles are neutral and accurate.
But how could this happen at all? MIT Technology Review’s Tom Simonite says that the number of active editors on the site has been declining for about the past five years.
Editing-for-pay has been a divisive topic inside Wikipedia for many years, since it can lead to biased and promotional articles, and it’s clear that the current team struggles to maintain Wikipedia’s standards of quality. On the other hand, the investigation shows that the site’s admins remain as committed as ever to keeping it clean.
The biggest risk, then, may be that the sockpuppet affair gets too much attention, as the MIT Tech Review story suggested. Wikipedia may not be in decline, but it will be hard-pressed to improve significantly until it can figure out how to attract more talented and dedicated editors with different interests and backgrounds from around the world. The question is, who will do the paying?
The project’s most active volunteers introduced a raft of new editing tools and bureaucratic procedures intended to combat the bad edits. They created software that allowed fellow editors to quickly survey recent changes and reject them or admonish their authors with a single mouse click. They set loose automated “bots” that could reverse any incorrectly formatted changes or those that were likely to be vandalism and dispatch warning messages to the offending editors.
The tough new measures worked. Vandalism was brought under control, and hoaxes and scandals became less common.
But those tougher rules and the more suspicious atmosphere that came along with them had an unintended consequence. Newcomers to Wikipedia making their first, tentative edits—and the inevitable mistakes—became less likely to stick around. The number of active editors on the English-language Wikipedia peaked in 2007 at more than 51,000 and has been declining ever since as the supply of new ones got choked off. This past summer only 31,000 people could be considered active editors. Over the same period, the proportion of edit reversions made by automated tools rather than by humans grew.
Plus, a 2011 survey by the Wikimedia Foundation suggested that being an active editor already required a significant time commitment. Of 5,200 Wikipedians from all language editions of the project, 50 percent contributed more than one hour a day, and 20 percent edited for three or more hours a day.
Even though Wikipedia has far fewer active editors than it did, the number and length of its articles continue to grow. What matters is quality. When Google’s search engine puts Wikipedia content into a fact box to answer a query, or Apple’s Siri uses it to answer a question, the information is presented as authoritative. Google users are invited to report inaccuracies, but only if they spot and then click an easy-to-miss link to “feedback/more info.” Even then, the feedback goes to Google, not to Wikipedia itself.
Though Wikipedia is still a great resource, it’s high time users treated it with a grain of salt.