Wikipedia uncovered

Some call it a miracle of the information age. Lauded by science journals, wealthy tycoons, national newspapers and government ministers, in the space of six years Wikipedia has become one of the most widely consulted knowledge resources in the world. Leapfrogging rivals such as Britannica, it has become the online encyclopedia. The fact that it isn’t a professional endeavour but the product of thousands of volunteers only makes this all the more amazing.

Yet, Wikipedia doesn’t earn universal admiration. It’s been described as a cult; a faith-based encyclopedia; a glorified repository for trivia. Indeed, there’s evidence of a Wiki-backlash. In February, an influential contributor – or Wikipedian as they’re known – was caught lying about his academic credentials. Then in April, one of the site’s founders, Larry Sanger, described Wikipedia as “broken beyond repair”, listing problems ranging from “serious management problems, to an often dysfunctional community, to frequently unreliable content”. With scandals in its history and criticism rising, Wikipedia’s Utopian veneer is fading.

So what’s the truth? Even the project’s guiding father, Jimmy Wales, describes it as “a work in progress” with “mistakes that haven’t been caught yet”. Yet even Wikipedia’s critics have to admire the breadth of content (1.8 million-plus articles in the English-language edition and counting) and admit that at least parts of it are excellent. And would Sanger have been so critical were he not pushing Citizendium, his spin on the same idea?

Time for some answers. We’re going to examine how articles in Wikipedia are constructed, how the facts are checked and how vandalism is prevented. We’ll investigate the community and ask whether these people can be trusted. We’re even going to put the site to the test. Whatever you think about Wikipedia, prepare to change your mind.

How can you get involved with Wikipedia? Read our hands-on guide at www.pcpro.co.uk/links/154wiki.

How Wikipedia works

To get a handle on what makes Wikipedia unique, consider how things work on a more traditional encyclopedia – MSN Encarta. Throughout the year, the site’s expert editorial team lists articles that should be included and those that need reworking. They schedule ahead, read around and then commission academics and experts in the relevant fields. The resulting drafts will be edited and scrupulously fact-checked, looking for signs of bias or omission. Before it goes online, the article will be signed off by everyone involved. It’s slow and bureaucratic, but, to a degree, it guarantees quality, accuracy and reliability.

On Wikipedia, the process is very different. Anyone can go online and edit anything. Spot a mistake? Well, just click on the edit link at the top of every article, make the changes, then save to the live encyclopedia. And if there isn’t an article already? Just make one. It’s not a top-down project such as Encarta, but a bottom-up, functioning anarchy.

Still, as experienced Wikipedians will know, the trick isn’t making changes, it’s keeping them. Up to 5,000 new pages are deleted every day, most because they’re silly experiments or outright vandalism. Some will be removed immediately by the privileged users known as Admins. Others will be nominated on the Wikipedia: Articles for Deletion page. If a consensus emerges that the entry doesn’t meet Wikipedia’s criteria for inclusion, or that it merely extends another piece, it can be deleted or merged as necessary.
Most edits will come under similar scrutiny. Even though there’s no formal peer review process, your work is likely to be checked by other editors, who may post comments on an article’s talk page, rework your edit, or simply revert the page to the state it was in before you arrived. Hard-core Wikipedians will have a long watchlist of articles they take special interest in, and some will cruise the “Recent Changes” page. Experienced Wikipedians may have custom tools that alert them to particular changes.
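
For the curious, reverting leaves a recognisable fingerprint: a restored page is byte-for-byte identical to an earlier revision, so its content hash matches one already in the history. Here’s a rough Python sketch of that idea – the revision data and the `find_reverts` helper are invented for illustration, not part of MediaWiki itself:

```python
import hashlib

def find_reverts(revisions):
    # revisions: ordered list of (editor, wikitext) tuples, oldest first.
    # A revision is flagged as a revert when its content hash matches an
    # earlier, non-adjacent revision -- i.e. the page was restored to a
    # previous state after someone changed it.
    seen = {}      # content hash -> index of first revision with that content
    reverts = []
    for i, (editor, text) in enumerate(revisions):
        h = hashlib.sha1(text.encode("utf-8")).hexdigest()
        if h in seen and seen[h] < i - 1:
            reverts.append((i, editor, seen[h]))
        seen.setdefault(h, i)
    return reverts

history = [
    ("Alice",  "Paris is the capital of France."),
    ("Vandal", "Paris smells of cheese."),
    ("Bob",    "Paris is the capital of France."),  # Bob restores revision 0
]
print(find_reverts(history))  # [(2, 'Bob', 0)]
```

One click of the “undo” or rollback button does essentially this in reverse: it re-saves the old content as a new revision.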

The Wikipedia recipe

For a clear example of how the collaborative magic of Wikipedia works, click the History tab on an article, and you’ll probably be amazed at how many edits and reverts have gone into what you see onscreen. Some will be in response to vandalism and some to sort out poor writing, grammatical errors or typos, but a surprising number will be because the edit didn’t conform to Wikipedia’s strict criteria.

In Wikipedian theory, the sum of all this collaboration and consensus should be continual growth and improvement, every article making a butterfly transformation from graceless “stub” (a short, unpolished article) to a “gold standard” Featured Article (FA). The FAs are meant to exemplify the site’s structure and style, as well as being reliable and stable.

The problem with this process is its glacial speed. At the time of writing, only 1,393 of the 1.8 million articles in the English edition have made the grade. At the current rate of progress, Wikipedians estimate that it will take thousands of years before the majority of entries reach FA status. What’s more, there are no guarantees that the FA you look at today is of the same quality or reliability as the one that was elevated to FA status originally. They’re critiqued from time to time, and 373 FAs have been demoted during Wikipedia’s lifetime. “While initially poor articles may indeed improve over time, initially superior ones will degrade, with all tending to middling quality and subject to random fluctuations,” Robert McHenry, former editor of Encyclopedia Britannica, said in an article on Wikipedia.
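
That “thousands of years” figure isn’t mysterious – it falls out of simple arithmetic, assuming promotions continue at their historical average. A back-of-envelope sketch, using the numbers quoted above (the constant-rate assumption is ours):

```python
# Back-of-envelope check of the "thousands of years" claim, assuming the
# rate of promotion to Featured Article status stays at its average so far.
articles_total = 1_800_000   # English-language articles at time of writing
featured       = 1_393       # articles that have reached FA status
years_elapsed  = 6           # Wikipedia's age at time of writing

promotions_per_year = featured / years_elapsed        # roughly 232 FAs a year
majority_target     = articles_total / 2              # 900,000 articles
years_needed = (majority_target - featured) / promotions_per_year
print(round(years_needed))   # a figure in the low thousands of years
```

At around 232 promotions a year, the answer comes out close to 3,900 years – and that optimistically assumes no new articles are ever created.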

The wisdom of crowds?

The fact is that Wikipedia’s approach has its upsides and downsides. On the one hand, the system has enabled Wikipedia to grow from zero to 5 million articles (in all languages) in a six-year period – an incredible achievement. On the other, countless man-hours are spent fighting vandalism or engaged in nitpicking and infighting. At worst, this takes the shape of so-called “edit wars”, where two or more editors take turns to edit or revert each other’s work.

It’s difficult to say how rife these problems are, but commentators such as blogger Jason Scott and Citizendium’s Sanger have painted a picture of a community wrapped in its own systems, culture and jargon, and fearful of outsiders, particularly experts. They have a point. Wales himself has talked about a hard-core group of dedicated volunteers, where less than 2% of the users make nearly 75% of the edits.
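
Wales’ figure is a simple concentration statistic: rank users by edit count and see what share of all edits the top slice accounts for. A quick Python sketch – the `counts` distribution below is made up for illustration; the real numbers live in Wikipedia’s database:

```python
def top_share(edit_counts, top_fraction=0.02):
    # edit_counts: number of edits per user. Returns the share of all
    # edits made by the top `top_fraction` of users (at least one user).
    ranked = sorted(edit_counts, reverse=True)
    k = max(1, int(len(ranked) * top_fraction))
    return sum(ranked[:k]) / sum(ranked)

# Invented distribution: a couple of power users, a long tail of
# drive-by editors -- the shape Wales describes, not real data.
counts = [450, 300] + [2] * 98
print(f"{top_share(counts):.0%}")  # prints 79% on this toy data
```

Even this crude model shows how a tiny, dedicated core can dominate the edit count.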

In fairness, the community disparages belligerent activity, often with a touch of humour (see Wikipedia: No Climbing the Reichstag dressed as Spider-Man, or Wikipedia: No Angry Mastodons for details), but it does go on. The problem is that edit wars aren’t necessarily won by the most logical argument, but often through sheer persistence. As long as each editor can produce enough citations or rules to support their case, the winner is the one who fights longest, or who can persuade other users or Admins to take their side.
There are also those who monitor their watchlists a little too closely. “There are some people who are extremely persistent and stubborn, and they have a lot of time on their hands,” Citizendium’s Sanger explains, and these people will “squat” on the page, then watch changes made with a specific agenda in mind. “What’s ironic about this,” Sanger suggests, “is that you have some people who have no compunctions about accusing other people of what they call POV-pushing [point-of-view pushing] – in other words, making biased edits – when what those people are actually trying to do is reduce the amount of bias that’s already in the article.”

Even the most ardent Wikipedian would admit the system has flaws. “You don’t have to go far to find mistakes,” says David Gerard, a volunteer media contact for Wikipedia in the UK, “but we work toward quality.” In Gerard’s view, the important thing that Wikipedia’s users need to understand is that “Wikipedia isn’t a finished product, but a live working draft.” In other words, a Wikipedia article needs to be read with a critical eye. “We don’t promise to think for the reader,” he continues. “If you see that, for example, your favourite pop star died in a bizarre toilet-related accident this morning, click the History tab to see if someone’s just having a bit of fun.”

The problem is that not everyone thinks it’s fun; not journalist John Seigenthaler (whose Wikipedia entry accused him of complicity in the death of JFK) nor golfer Fuzzy Zoeller (who sued the owner of an IP address for libel in response to claims made on his entry). The Wikimedia Foundation is certainly worried enough to implement a special function – WP: Office – to rapidly remove contentious material as a “courtesy” to angered public figures.

Human checks and balances

The job of protecting Wikipedia from vandals and maintaining peace in the articles and talk pages is, strictly speaking, the responsibility of every Wikipedian. However, certain users, dubbed “Admins”, have been granted special powers. They can protect, delete and undelete pages, block specific IP addresses from editing, or quickly revert pages in the event of vandalism. As having an elite goes against Wikipedia’s egalitarian grain, the site promotes the view that Admins aren’t privileged or special. “Just a normal user with a mop and a bucket,” as Wikipedia: What Adminship Is Not explains.

To some extent this is naive, and to know why you have to understand how key decisions are made. Wikipedia isn’t a democracy. Issues aren’t decided by votes, but by consensus. On articles where the debate becomes heated, editors discuss their concerns and attempt to reach a decision that all involved can abide by, but eventually someone has to make a decision – the Admin. True, the Admin’s decisions are accountable and there are arbitration boards, but in many cases the Admin simply makes a judgement call. It isn’t just a question of counting votes; they have to decide where the consensus lies.

This leaves a certain amount of power in the Admins’ hands. Read enough talk pages and it’s clear that some misuse their power. Take a look at the archives page on Wikipedia: Requests for Comment, and you can see allegation after allegation that Admin X blocked or banned users who disagreed with them, teamed up with Admin Y even though user Z was in the right, or used threats and wiki-lawyering – the interpretation of Wikipedia’s rules, as lawyers interpret laws, to their own ends – to push their own views. Some of these accusations are clearly motivated by spite or anger, but certainly not all.

Credentials? What credentials?

You might expect the Admins to be subject specialists or experts in their chosen field, but this often isn’t the case. While Encarta has professional editors with academic backgrounds, Admins are free to intervene in any subject, regardless of their qualifications – if they have any, that is. Wikipedia demands no credentials: Admins are Admins because they make substantial contributions to Wikipedia and because they’re trusted by the community. Even Wikipedia’s Gerard describes the community (tongue-in-cheek) as “a bunch of nerds who think writing an encyclopedia is really cool. We’re quite pleased the world likes the result.”

Here’s how it works: prospective Admins are nominated, frequently by themselves, on the Wikipedia: Requests for Adminship page. Again, consensus is key – the nomination will get postings of support, opposition and neutrality, and nominees are expected to answer any questions or comments. After seven days, a Bureaucrat (one of the 14 or so highest-ranking Wikipedians) decides where the consensus lies. There’s something wonderfully Utopian about this process; users are judged purely on the edits they’ve made and the work done. However, it leads to accusations that Wikipedia promotes cliques with shared interests.
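
As a toy illustration of that judgement call: Bureaucrats are widely understood to look for a supermajority of support, with something in the region of 75% often cited as the informal bar, though nothing so mechanical is written into policy. The threshold and the `rfa_outcome` function below are our own illustrative assumptions, not Wikipedia’s actual process:

```python
def rfa_outcome(support, oppose, neutral=0, threshold=0.75):
    # Toy model of a Requests for Adminship close. In reality a Bureaucrat
    # weighs the arguments, not just the numbers; the 75% bar here is an
    # illustrative convention, not a formal rule. Neutral comments are
    # traditionally left out of the ratio.
    voiced = support + oppose
    if voiced == 0:
        return "no consensus"
    ratio = support / voiced
    return "promote" if ratio >= threshold else "no consensus"

print(rfa_outcome(support=61, oppose=12, neutral=5))  # 61/73 ~ 84%: promote
print(rfa_outcome(support=30, oppose=20))             # 60%: no consensus
```

The point of the model is what it leaves out: a Bureaucrat can, and does, discount pile-on votes or bad-faith opposition, which is exactly where the human judgement – and the controversy – comes in.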

And what worries Wikipedia’s critics is there’s no way of knowing who the Admins are. Some user pages give personal details, but others give little more than a username and a vague list of interests. Cultural, religious or political biases can be difficult to ascertain as long as the Admin isn’t blatant in their promotion of a particular point of view. It also means users can pretend to be someone else entirely.

The famous example is Ryan Jordan. Under the username of Essjay, Jordan was an Admin and influential Bureaucrat, claiming to be a tenured professor of religion with a PhD in theology and a degree in canon law. In fact, he was a 24-year-old college drop-out, a fact that embarrassingly came to light in an interview with The New Yorker.

The flipside of this is that Wikipedia can be a hostile environment for genuine experts. Wikipedians dispute this: “Anyone claiming we’re anti-expert has to explain why we have as many as we do,” says Gerard. But pages such as Wikipedia: Expert Retention tell a different story; one of academics and professionals exhausted by cranks, anti-elitist grumbles and senseless wiki-lawyering.

Cleaning up its act

Wales is plainly aware that something must be done. He’s discussed new tools and features, including a development of the long-promised stable versions feature. This would let editors mark a particular version of a page as stable and make it the version browsers see first, while leaving all users free to edit a live version of the article. Also on the table is a procedure whereby editors who choose to disclose credentials go through a verification process. The same would go for users being considered for senior positions within the community.

Even though this last proposal is fairly watery (“the suggested verification approach isn’t mandated and no sanctions can be taken against an editor who chooses, for any reason, not to participate,” Wales’ essay on the policy reads), it has angered some Wikipedians. Many prefer the rival “Ignore All Credentials” proposal. On an encyclopedia obsessed with verification and neutrality, who cares about expertise?

The challenge for Wikipedia is this: how do you rein in the more disruptive impulses of the community without weighing it down with even more bureaucracy? As Stephen Bury, head of European & American Collections at the British Library, tells us, “Wikipedia is potentially a good thing – it provides a speedier response to new events, and to new evidence on old items.” Whether the world really needed another opinion-filled article on the Virginia shootings just hours after the event, though, is debatable.

A new approach

Maybe we don’t have to throw the baby out with the bathwater. Sanger’s Citizendium combines Wikipedia’s Wiki-based approach with a more rigorous policy on identity – users are known by their real names and have to be approved by Citizendium’s “constables”. Most interestingly, it ropes in qualified experts to guide discussions and settle debates. “You can’t have genuine consensus if people are constantly disagreeing with each other,” Sanger notes, explaining that at Citizendium, if participants can’t agree, “the issue won’t be decided by whomever is most persistent.”

The problem is that it will take time – maybe too much time – for Citizendium to achieve anything like Wikipedia’s critical mass of articles and editors. With only 2,000 articles in place, it doesn’t have the content to draw a large audience, and it may need a large audience to attract contributors.

So in lieu of something better, why not take a good long look at Wikipedia and celebrate it for what it is, while making damn sure we know what it isn’t. “Sensible criticism can only do Wikipedia good,” says Gerard. “We have plenty of growing pains – no-one outside is more painfully aware of Wikipedia’s defects than those of us inside.” But the key thing for Gerard is that “every obvious criticism of Wikipedia happens all the time, and we deal with it – trolls, vandals, POV pushers, obsessive nutters, obnoxious idiots, people who mistake it for a video game and so on. We deal with this in the normal course of events, and the site remains good enough to use.”

Bury agrees. For him, the problem isn’t so much the reliability of Wikipedia’s content as the way in which it’s used. “It’s already become the first port of call for the researcher,” Bury says, before noting that this is “not necessarily problematic, except when they go no further.” The trick to using Wikipedia is to understand that “just because it’s in an encyclopaedia (free, web or printed) it doesn’t mean it’s true. Ask for evidence for this, that or the other. And contribute, especially if you disagree.”
