The sci-fi legends who shaped today’s tech

From the earliest days of Jules Verne and HG Wells, science fiction and technology have enjoyed a mutually beneficial relationship. Sci-fi stories and novels expressed humanity’s desire to conquer space, find new worlds or explore the ocean depths, and while we would probably have landed on the moon or launched deep-sea expeditions without them, these tales inspired those who made such giant leaps.

In turn, real-world technology has inspired the science-fiction writer. After all, it’s science fiction that charts what happens when humanity meets high technology, asking what will happen, where it will take us, and what we’ll find when we get there. This is as true of computer technology as it was of the space race – perhaps even more so.

The geek and hacker cultures that have powered so much of the PC and internet revolution are hugely sci-fi literate. The two camps have even swapped roles, with academics and software engineers becoming sci-fi writers, and the writers earning a name as futurologists.

In this feature, we’ll explore how science fiction has motivated trends and products in computing, and catch a glimpse of where this relationship might take us in the future.

Visions of the future

Does sci-fi really have that great an impact on the technology that emerges from the labs of the world’s biggest technology companies – labs so well funded (Microsoft alone spent $8 billion on research last year) that they can afford to scoop up the brightest talent emerging from MIT and beyond? Indeed it does, according to Bruce Hillsberg, director of storage systems at IBM Research in Almaden. For him, the value of science fiction is that it “paints visions of the future that cause people to think about possibilities beyond what is possible today”.

Hillsberg believes that the sheer number of hi-tech visionaries who are sci-fi fans, combined with fiction’s power to stimulate creative thought, means an interest in the genre can lead to real breakthroughs. “I don’t believe sci-fi necessarily sets the agenda for researchers,” said Hillsberg. “That is, I don’t think most researchers try to invent what they read about or see in movies. Rather, they try to move science or technology forward, and sci-fi can consciously or unconsciously help them think outside the box.”

History bears out his theory. Do a little digging and you’ll be surprised to find how many big names in the computing world are sci-fi fans: Apple’s Steve Wozniak, Netscape’s Marc Andreessen, Tim Berners-Lee, Google’s Sergey Brin and GNU creator Richard Stallman, to name only a few of the tech elite. Microsoft co-founder Paul Allen has even helped fund a museum of science fiction in Seattle.

To Hal and back

It wasn’t long into the history of computing that the sci-fi greats began to see technology’s potential. During the 1950s, Isaac Asimov wrote a sequence of stories featuring Multivac, a huge, artificially intelligent computer, culminating in the classic The Last Question – a tale that tracks the evolution of Multivac and the human race.

Asimov recognised that computers would grow both smaller and more powerful, with Multivac transforming from a sprawling giant into an entity that exists outside of space and time. He merely overestimated the timescale – Asimov thought it would take thousands of years for Multivac to shrink to a vaguely mobile form.

Computing owes an even greater debt to Asimov’s contemporary, Arthur C Clarke. In his work on the 1968 film and novel 2001: A Space Odyssey, Clarke created HAL, the model for all future dedicated, logical, mildly psychotic AI. In creating HAL, Clarke and director Stanley Kubrick sought guidance from Marvin Minsky, co-founder of the Artificial Intelligence Laboratory at MIT. In turn, the film would inspire a new generation of engineers and designers, including a young Rodney Brooks, who would go on to be director of that same institution.

In the book HAL’s Legacy: 2001’s Computer as Dream and Reality, Brooks describes the movie as “a revelation, because I grew up in a place without a lot of technology and I was really interested in AI and then to see that movie, it told me that there were other people in the world with the same sort of weird ideas that I had.” For Brooks, “the film really inspired me and pushed me to push my whole life towards Artificial Intelligence”.

Clarke also influenced the man who would go on to create the World Wide Web. In a 1997 interview with Time magazine, Tim Berners-Lee mentions a youthful fascination with Clarke’s 1964 short story Dial F for Frankenstein, where computers networked together pass a critical threshold and learn to think autonomously. In the interview, Berners-Lee makes it clear that he doesn’t see the web as the fulfilment of Clarke’s prophecy, but he does see it as having emergent properties with the potential to transform society – and 12 years later, he’s been proven right.

Virtual reality and viruses

Science fiction hasn’t only stimulated the beneficial side of computing. The origins of hacking and viruses can also be traced back to the pages of a novel.

British author John Brunner’s 1975 novel The Shockwave Rider is a fantastically prescient work, describing large-scale networks, phreaking, hacking and genetic engineering long before such things entered the mainstream consciousness. Not surprisingly, the book is widely acknowledged as an influential text for the nascent hacker movement. However, it was The Shockwave Rider’s description of a self-replicating program that could propagate across a network, destroying all bonds of secrecy, that had unintended results.

In the early 1980s, Xerox PARC researchers John F Shoch and Jon A Hupp created their own real-world equivalent: a small program designed to seek out idle CPU cycles on the network, but which rapidly grew beyond its creators’ original intentions. Brunner had called his program a “tapeworm”, and Shoch and Hupp adopted the shortened “worm” for their 1982 research paper, in honour of the book that inspired them.
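The concept itself is simple enough to sketch. What follows is purely illustrative – the host names, load figures and idle threshold are all invented, everything runs in a single process rather than across a real network, and it bears no relation to the actual PARC code – but it captures the worm’s central trick of hunting for idle machines and handing them useful work:

# A toy simulation of the Shoch/Hupp idea: a "worm" that farms work
# out to idle machines. Host names and load figures are invented,
# and nothing here touches a real network.
import random

# Pretend each workstation reports its current CPU load (0.0 to 1.0)
hosts = {f"alto-{n:02d}": random.random() for n in range(1, 9)}

IDLE_THRESHOLD = 0.2  # treat anything under 20% load as idle

def find_idle_hosts(load_by_host):
    """Mimic the worm's scan for machines with spare cycles."""
    return [name for name, load in load_by_host.items()
            if load < IDLE_THRESHOLD]

def dispatch(work_items):
    """Hand each work item to an idle host; defer the remainder."""
    idle = find_idle_hosts(hosts)
    for item, host in zip(work_items, idle):
        print(f"worm segment running {item!r} on idle host {host}")
    for item in work_items[len(idle):]:
        print(f"no idle host for {item!r}; retrying on the next sweep")

dispatch(["render-frame-1", "render-frame-2", "render-frame-3"])

Swap the simulated load figures for real machine probes and add self-replication, and you have the essence of what Shoch and Hupp built – and of every network worm, benign or otherwise, that followed.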

Vernor Vinge’s 1981 novella True Names was, if anything, even more prophetic than The Shockwave Rider, describing immersive worlds and aspects of internet culture and the geek mentality that seem eerily familiar today. Again, True Names was influential on hacker culture. In the Amazon.com listing for the book, you’ll find one Carnegie Mellon University alumnus mentioning that it was “an unofficial but necessary part of our education” to locate and read a copy.

However, it was with the Cyberpunk movement that sci-fi really met the computing world head-on. Spearheaded by William Gibson’s Neuromancer, Cyberpunk revelled in a synthesis of Raymond Chandler and Philip K Dick, taking stylistic cues from Ridley Scott’s Blade Runner – an adaptation of Dick’s Do Androids Dream of Electric Sheep? – and its combination of neon-lit future and urban decay.

Gibson coined the term “cyberspace” in Burning Chrome, one of the short stories that preceded his debut novel, and Neuromancer popularised it; this is where the concept of the internet as an alternate reality where lives can be transformed began.

Gibson helped focus the geek imagination on the possibilities of online worlds and global networks, and on the convergence of hi-tech and the human body – issues that would be further explored in The Matrix in 1999 (Gibson himself used the film’s title, “the matrix”, as another name for cyberspace).

The motherlode

Yet Neuromancer is far from being the most influential novel in the history of IT. For that, we need to look to Neal Stephenson’s 1992 book, Snow Crash. Stephenson isn’t only an excellent writer, but also a computing enthusiast and knowledgeable programmer – the sort of guy who plays with Mathematica in his spare time. Blessed with a big imagination and a sense of what makes the geek heart tick, Stephenson expanded on the ideas of Gibson and Vinge with a vision of how a virtual-reality-based internet, dubbed the Metaverse, might look. In doing so, he helped popularise the word “avatar” for an online persona, while inspiring two major Web 2.0 enterprises and an online world on a major games console.

The most obvious sign of Snow Crash’s influence is Linden Lab’s Second Life. While Linden Lab founder Philip Rosedale claims that the idea for his creation predates Snow Crash, he’s admitted that reading the novel helped crystallise it. “When Snow Crash came out, I was already really intent on the idea of creating a virtual world like Second Life,” Rosedale told the New York Times in 2007. “But Snow Crash certainly painted a compelling picture of what such a virtual world could look like in the near future, and I found that inspiring.”

Despite this, and the fact that chapters of the novel were made available within Second Life, Stephenson himself has always played down the link. “I have nothing negative to say about it,” Stephenson said in an interview with the Chicago Tribune, before explaining that “every hour I spend in a virtual reality is an hour I’m not spending reading Dickens or visiting Tuscany”.

The second real-life product spun from Snow Crash also becomes obvious once you recall the book’s vision of “Earth”, a 3D, global information tool that combines data feeds with geographical information:

“Hiro turns his attention to Earth. The level of detail is fantastic. The resolution, the clarity, just the look of it tells Hiro, or anyone else who knows computers, that this piece of software is some heavy shit. It’s not just continents and oceans. It looks exactly like the earth would look from a point in geosynchronous orbit directly above LA, complete with weather systems and vast spinning galaxies of clouds, hovering just above the surface of the globe, casting gray shadows on the oceans and polar ice caps, fading and fragmenting into the sea.”

Sound familiar? In 2006, Keyhole co-founder John Hanke, now employed by Google, told O’Reilly’s Where 2.0 conference that Google Earth had its origins in a conversation with some guys from Silicon Graphics. “They said, you know, ‘you’ve probably read Snow Crash’ and I said ‘yeah’ and I had, and they said ‘well, you know that thing that the character uses, that Earth thing, where it’s there in 3D and he can just dive in and get information’. They said ‘We can build that’.”

Finally, consider that Snow Crash was required reading for the team working on Microsoft’s Xbox 360. In fact, the console’s prime architect, J Allard, uses the name of the novel’s hero as his Xbox Live ID. You can easily see the book’s influence in the console’s emphasis on public profiles, renown (in the form of achievements and gamer points) and community features. And a quick look at Sony’s Home virtual world on the PlayStation 3 reveals that Snow Crash’s influence isn’t confined to Microsoft’s consoles.

Tomorrow’s prophets

Since Snow Crash, no novel has had quite the same impact on the computing world, and you might argue that sci-fi and hi-tech are drifting further apart. Sci-fi seems to be following the lead of Iain M Banks and Alastair Reynolds into tales of galactic exploration, all-knowing AIs and the technologically and genetically enhanced “post-human” condition. In fact, writers such as William Gibson and Charles Stross have spoken of the difficulties of writing near-future sci-fi in a world where the future accelerates towards us at such a prodigious rate.

As Stross told PC Pro: “Back in 2005 I began writing a novel, Halting State, about the future of MMOs and the gaming industry. It came out in 2007, and was set about a decade out – around 2018. I made some predictions, thinking that in ten years they’d either be laughable or they’d have come true. The weird bit? Most of them came true already, by 2009!”

Yet Stross believes that there’s still a relationship between sci-fi and real-world technology. “There’s definitely feedback going on,” he adds. “I get invited to tech conferences and get fan mail from readers who are interested in the ideas in my fiction – in some cases to the extent of basing business ideas on them.” Stross, like popular blogger and Wired pundit Cory Doctorow, now effectively doubles as a sci-fi author and futurologist, creating fictional worlds yet also helping to change what happens in real companies.

And this is probably science fiction’s biggest achievement. Chris Bishop, chief research scientist at Microsoft, doesn’t believe that sci-fi stories can be traced directly to goods coming off the production line, but he argues that it “often provides the mechanism that brings ideas to the public attention for the first time”.

“Take, for example, multitouch interactive displays,” he said. “Many people saw these for the first time in the Tom Cruise film Minority Report. The ideas go back many years before the film, but it took advances in processor power and display technology to allow these ideas to be turned into working devices. So science fiction does a good job of whetting people’s appetite for what might be possible.”

Yet the film wasn’t relying entirely on the imagination of its creators: John Underkoffler, technical advisor on the film, had been working on a similar gestural interface at MIT. It was only when Minority Report was shot, and others saw Underkoffler’s work on screen, that he was put in a position where he could, one day, turn it into a mass-market product. Sometimes the truth really is stranger than fiction.
