30 tech myths debunked
Macs don’t get viruses. We’ve all heard that “fact” a lot over the years, but as several recent outbreaks have painfully demonstrated, it isn’t true at all. Neither is the one about expensive cables. Nor the one about megapixel counts, or even the one where you can crash a plane with your mobile phone if you forget to push a button before take-off.
Whether they’re peddled by national newspapers, sold as truth by greedy retailers, or simply parroted down the pub on a Friday night, these tech myths are everywhere, and for all our sakes it’s time they were debunked. In this feature, we’ll round up the most common untruths in our industry, and for each one we’ll patiently explain the real facts. So the next time you hear one of these myths uttered as fact, you know what to do: get debunking.
“Bill Gates said 640K was enough”
Introducing the IBM PC back in 1981, the Microsoft boss is supposed to have declared that “640K ought to be enough for anybody”. It’s a quote that’s laughably short-sighted, and one Gates strenuously denies making. In a Bloomberg Business News Q&A in 1996, he wrote: “I keep bumping into that silly quotation attributed to me that says 640K of memory is enough. There’s never a citation; the quotation just floats like a rumour, repeated again and again.” When asked about it again in 2001, he said: “Do you realise the pain the industry went through while the IBM PC was limited to 640K? The machine was going to be 512K at one point, and we kept pushing it up. I never said that statement – I said the opposite of that.” It’s his word against the internet’s.
“Linus Torvalds invented Linux”
This is a mistake you’ll make only once, if you’ve ever had the misfortune to lunch with GNU Project founder Richard Stallman. Most people think of “Linux” as the entire operating system, whereas in fact it’s merely the kernel – the part of the OS that bridges the gap between hardware and software. The main userspace tools and libraries emerged from Stallman’s GNU Project, which is why he irritably insists that the OS be referred to as GNU/Linux. So, while Torvalds was indisputably the driving force behind the kernel that was named after him, he can’t really take credit for the entire operating system.
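You can see the kernel/userland split for yourself on a typical Linux desktop. As a rough sketch (assuming a system with GNU coreutils installed), the kernel identifies itself by one name while the everyday commands announce another project entirely:

```shell
# The kernel reports its own name - this is Torvalds' contribution
uname -s                    # prints "Linux"

# Everyday commands such as ls come from the GNU Project
ls --version | head -n 1    # prints "ls (GNU coreutils) ..."
```

Two names, one operating system – which is exactly Stallman’s point.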
“Microsoft stole the Windows UI from Apple”
In 1988, Apple filed a lawsuit against Microsoft claiming Windows used interface elements too close to those on the Mac. Even today, many still argue Microsoft stole its style from Apple, but it didn’t quite happen that way. Apple had actually licensed many UI elements to Microsoft for use in Windows 1, and only objected when more arrived in the slicker Windows 2. Apple argued the licence covered only one version of Windows; Microsoft argued otherwise. As Bill Gates explained at the time, “we’re saying that these graphic interface techniques, the ideas, are not copyrightable.” After five years the judge sided with Microsoft, and the rest is history.
“Android is an open source operating system”
Built on Linux and released under the Apache 2 software licence, Android is strictly speaking an open source project, in that anyone can download, modify and redistribute the code. However, although the current stable version is available for download, work on the next version takes place behind closed doors, and it’s only released when Google is ready. That goes against one of the principal tenets of open source software: community.
A study in late 2011 named Android the most “closed” of eight open source projects, stating that it “would not have risen to its current ubiquity were it not for Google’s financial muscle and famed engineering team”. Firefox co-creator Joe Hewitt agrees. “It kills me to hear the term ‘open’ watered down so much,” he wrote in 2010. “It bothers me that so many people’s first exposure to the idea of open source is an occasional code drop, and not a vibrant community of collaborators like I discovered ten years ago with Mozilla.” So, Android: a bit more open than iOS, but not nearly as open as it could be.
“Downloaded software should be cheaper than a boxed copy”
The snap judgement that digital downloads should be much cheaper isn’t always correct. Although physical costs such as manufacturing, shipping and shelf space are eliminated, new costs are introduced. There’s the cost of running the servers; there’s the fact that you pay the tax rate of the country in which the online store is based; and there’s the simple fact that developing software is expensive, and only a fraction of the purchase price goes on the box and manual.
“Software ‘bugs’ derive from a real instance of insects in a computer”
In 1947, operators of the Harvard Mark II, an early electromechanical computer funded by the US Navy, traced an error to a moth trapped in a relay. This, so the story goes, is what coined the terms “bug” and “debugging”. A nice tale, but in fact the use of the word has been recorded among engineers as far back as Thomas Edison in 1878, and there’s no doubt the Harvard operators were already familiar with its meaning, even if it perhaps wasn’t yet widely used in the computing world. As the log from that day – complete with the poor moth still taped to it – notes, “first actual case of bug being found”.