Let’s begin with the positive: YouTube has recognised it has a problem, and that’s widely regarded as the first step on the road to recovery. This particular problem is that its best-performing videos are often conspiracy theory-filled garbage, polluting the minds of impressionable young viewers who are growing up under the impression that all the answers can be found online.

So that’s a good start. After you’ve recognised you have a problem, the next step is to seek help, right? But where does a $75 billion company like YouTube turn for help? If your answer was to its even richer parent company Google, then you’re about to be sorely disappointed. Rather, YouTube has turned to non-profit Wikipedia – yes, the same Wikipedia that is often found begging for donations.
The development was announced by YouTube CEO Susan Wojcicki at SXSW this week and will involve “information cues” popping up alongside videos with dubious content, with a link to Wikipedia so that viewers can learn more. For example, a video claiming the moon landing was a hoax would carry a link to the Wikipedia article explaining in detail how the landings were accomplished and why they would have been very hard to fake.
It’s, at best, a very weak solution. Here’s why.
1. Video consumers aren’t necessarily big readers
If you’re on YouTube, you want to watch video. And if you do want to read text, the comments section will quickly disabuse you of that impulse.
What I’m saying is that while it would be a gross oversimplification to say YouTube watchers aren’t big readers, on average reading is unlikely to be their medium of choice. So how can a dense 20,000-word Wikipedia article compete with a snappy, smartly produced five-minute video?
Spoiler: it can’t.
2. YouTube can afford its own solutions
YouTube is, as I mentioned above, valued at $75 billion. Alphabet, which owns its parent company Google, is one of the world’s most valuable businesses. Wikipedia, while hardly poor, is still a non-profit run by volunteers.
For YouTube to simply pass the responsibility of educating its users onto a financially poorer website is both lazy and irresponsible. It can and should do better.
3. Wikipedia isn’t set up to handle an influx of conspiracy nuts
While Wikipedia has a hard-earned reputation for accuracy, it remains a resource that anyone can edit – and this means a lot slips through the net.
But that’s not even the worst problem here: if you point a bunch of conspiracy theorists to a website telling them they’re bonkers, is their reaction to change their mind or fight? Wikipedia might be able to withstand some vandalism on its articles, but a site the size of YouTube sending over its most paranoid members is something else entirely.
I should add at this point that the first Wikipedia heard about YouTube’s plan was when it was announced at SXSW. “Neither Wikipedia nor the Wikimedia Foundation are part of a formal partnership with YouTube,” the Wikimedia Foundation said in a statement to Gizmodo. “We were not given advance notice of this announcement.”
4. This doesn’t tackle the root of the problem
Trying to push people away from conspiracy-theory videos on YouTube is a noble ambition, but a more pertinent question is how they’re getting there in the first place. On this point, YouTube’s hands are far from clean.
It’s not just the content YouTube promotes, either. The system is designed to push viewers towards more and more extreme content, as this New York Times feature explains. “It seems as if you are never ‘hard core’ enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes,” writes the piece’s author Zeynep Tufekci, who experimented with a bunch of new accounts to see what YouTube would recommend. “Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.”
Yes, this is an algorithmic problem rather than a deliberate design choice, but while people are being nudged towards the fringes, YouTube remains far more part of the problem than the solution.
5. This can’t tackle every bit of propaganda on YouTube
But even assuming this solution were a good one, how would it work beyond the obvious examples? It’s easy to spot moon-landing conspiracy theories, but what about modern urban legends, or those that started on YouTube and have no wider debunking? Which Wikipedia article would these link to? What about videos that nonchalantly drop in fake news without ever flagging the topic in the title or description? How would YouTube know?
Guess what: most conspiracy theorists don’t put “conspiracy theory” in the title of their videos, because they believe they’re sharing the truth. Good luck catching every instance of “truth” when 65 years’ worth of video gets uploaded every day.
6. Conspiracy theorists don’t trust the likes of Google to tell the truth
One thing that tends to unite conspiracy theorists is their distrust of big corporations. Exactly why would someone susceptible to paranoid nonsense trust YouTube to fairly adjudicate what is true and what isn’t?
Weak, weak, weak
In short, as solutions go it’s better than nothing, but only just. A company with YouTube’s resources can and should do better – and not just when it comes to its embrace of fringe views.
It needs to crack down on a business model that rewards dangerous attention seekers with huge amounts of money. It needs to find a solution to the spread of extremist content that’s half as effective as its own porn-blocks. It needs to actually accept the responsibilities it has and police its own content effectively, not just tinker around the margins.
So far, only the risk of losing advertising dollars has prompted any action at all, and while governments talk tough about the platform, the truth is that they’re too weak and disorganised to do much about it. That’s deeply unfortunate, because if sprinkling a couple of Wikipedia links on a video is the best it can come up with, the company certainly won’t take the lead in regulating itself.