How tech is rewiring your brain

Bad news: all that technology we’ve been urging you to buy for the past 20 years is making you dumber. You can’t remember phone numbers any more, your body clock is shot and your hippocampus – the part of the brain that handles your sense of direction – is shrinking.

Not that you care. You probably haven’t even taken all of that in, because the constant chirps and distractions from your various computing devices have wrecked your attention span. You’re almost certainly skim-reading this article, especially if you’re reading it online, and the chances of you making it all the way to the end are negligible. Wait, is that your phone buzzing in your pocket? Relax: you’re not even carrying your phone. It’s just another of those phantom vibrations you’ve been getting lately.

Is this a problem? It doesn’t have to be. You don’t need to know where you’re going these days, because your smartwatch will buzz every time you need to change direction. Soon, you won’t even need to know how to drive: your Google car will get you home without you even having to put your hands on the wheel. Not that there will be a wheel: you’re not smart enough to be trusted with that.

Eventually you won’t need to know anything at all. In 20 years’ time, you’ll be able to connect your brain directly to the cloud and access all of the knowledge on the internet on demand. Technology has already rewired your brain; now you’re going to need to wire your brain to technology. That’s the theory, anyway. But does it have any basis in fact?

Brain damage

The debilitating impact of the internet, and of modern technology in general, has been a growing concern for some years. In 2008, the writer Nicholas Carr published a seminal essay in The Atlantic asking “Is Google making us stupid?”, in which he questioned whether the internet was shortening our attention spans and harming our intelligence. “Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory,” Carr wrote. “My mind isn’t going – so far as I can tell – but it’s changing. I’m not thinking the way I used to think.”

The internet, he added, was “chipping away my capacity for concentration and contemplation”, leaving him unable to read more than two or three pages at a stretch without losing his train of thought or moving on to something else.

Although Carr’s article was written before tablets and smartphones had taken off, he also foresaw how the internet (and apps) replacing functions such as maps, the telephone and television would make matters worse. “When the net absorbs a medium, that medium is recreated in the net’s image,” he said. “It injects the medium’s content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new email message, for instance, may announce its arrival as we’re glancing over the latest headlines at a newspaper’s site. The result is to scatter our attention and diffuse our concentration.”

By 2010, Carr’s theory had been turned into a Pulitzer Prize-nominated book, The Shallows, in which he expanded on this notion of modern technology altering the brain’s behaviour. “If, knowing what we know today about the brain’s plasticity, you were to set out to invent a medium that would rewire our mental circuits as quickly and thoroughly as possible, you would probably end up designing something that looks and works a lot like the internet,” he wrote.

Carr isn’t alone in linking internet activity to impaired brain function. Baroness Susan Greenfield, a distinguished British scientist and former director of the Royal Institution, has likened the effect of technology on the brain to climate change. “It’s pretty clear that the screen-based, two-dimensional world that so many teenagers – and a growing number of adults – choose to inhabit is producing changes in behaviour,” she wrote in The Daily Mail in 2009. “Attention spans are shorter, personal communication skills are reduced and there’s a marked reduction in the ability to think abstractly.” This summer, she published a book on the topic entitled Mind Change.

Although Carr and Greenfield’s claims may sound persuasive, there’s actually little in the way of hard evidence to back them up. In The Shallows, Carr cited a 2008 study undertaken at the University of California, Los Angeles (UCLA), which monitored the cognitive behaviour of 12 web newcomers and 12 experienced surfers. The initial brain scans showed distinct differences in the parts of the brain most heavily used by each group, but after only six days of exposure to the web, the newcomers’ patterns of brain activity were almost identical to those of the internet veterans.

“These findings point to the sensitivity of brain neural circuits to common computer tasks such as searching online, and constant use of such technologies has the potential for negative brain and behavioural effects, including impaired attention and addiction,” the researchers declared.

Yet the study’s own authors admitted that their results were far from conclusive. “The subject sample was small and not representative of the general population,” they noted. “Sampling bias or measurement error, or both, could explain these results.”

Greenfield, meanwhile, has been repeatedly challenged to present her findings about the impact of technology on the brain in a scientific, peer-reviewed paper, rather than in the popular press. “This isn’t really science at all,” Martin Robbins wrote recently in The Guardian’s The Lay Scientist column, echoing criticisms made by fellow debunker Ben Goldacre. “Greenfield’s work is bog-standard social commentary disguised as neuroscience.”

A new piece of research funded by the Department of Health may give Greenfield’s critics some of the scientific answers they crave. The Study of Cognition, Adolescents and Mobile Phones (SCAMP) is attempting to provide a clearer answer to whether constant exposure to electromagnetic fields has any impact on the cognitive behaviour of children. The study will track the behaviour of around 2,500 secondary-school pupils over the course of three years, in a bid to find out whether smartphones can damage (or even improve) language comprehension, attention spans and memory.

Exploring the brain’s GPS

Smartphones do have proven downsides, however. If you increasingly find it hard to get around without a smartphone in your hand, then you’re not alone: studies suggest that our growing reliance on satellite-navigation systems is having a negative effect on our own in-built sense of direction. 

British-American professor John O’Keefe won a Nobel Prize this year for locating the “brain’s GPS system” – but some scientists believe that this part of our brain is shrinking due to the easy availability of navigation systems in both our cars and our pockets.

O’Keefe was part of the team that identified the “place cells” in the hippocampus, the part of the brain that’s active when we’re trying to find our way. But a 2010 study from Canada’s McGill University – with which O’Keefe is affiliated – provided some startling results for GPS users.

Brain scans were taken of two groups of people: GPS users and those who relied on old-fashioned manual navigation. The study found that those accustomed to finding their own way around exhibited higher brain activity and a greater volume of grey matter in the hippocampus than those who had become dependent on their satnavs. The non-GPS users also scored better in a test for cognitive impairment that’s used to diagnose sufferers of Alzheimer’s disease. The hippocampus – which, aside from providing our sense of direction, also deals with memory – is one of the first areas of the brain to be affected by Alzheimer’s.

Their findings supported a 2008 study at UCL (where O’Keefe is based), which found that London taxi drivers – who are still required to memorise their way around the city – had an enlarged region in the hippocampus, compared to the average person.

Feeling the strain

While the jury may be out on whether the internet directly affects other elements of our cognition, there’s a stronger body of evidence that suggests technology is disrupting our sleep patterns, increasing our stress levels and even inducing signs of paranoia. Children, it seems, are particularly susceptible.

In addition to examining the effects of radio waves, the SCAMP study will also attempt to discover whether mobile-usage patterns are damaging children’s development. Dr Mireille Toledano from Imperial College London, who is leading the research, claims that two-thirds of children are taking their smartphones to bed with them, and that this may be responsible for sleep deprivation and other problems associated with poor attention spans.

A separate study of 32,000 GCSE students is investigating similar concerns. The trial will see some pupils start school an hour later than normal, to see whether a lie-in will improve exam results. In addition to examining whether teenagers are simply biologically predisposed to wake later than adults, the research will explore whether smartphones and tablets are interfering with sleep patterns.

“We’ll certainly be investigating the role of phones and other technology in sleep difficulties,” Colin Espie, professor of sleep medicine at the University of Oxford, told Alphr. “There have been several studies demonstrating a link between short sleep and increased media usage. For example, Arora et al (2014) showed that use of several different types of technology before bed was associated with difficulty falling asleep in teenagers. However, that study was cross-sectional, so it isn’t clear what was causing what.”

“Recent research from Australia suggests that media usage has a bidirectional association with short sleep, at least in young children. Magee and colleagues showed that the more time four-year-olds spent interacting with media, the shorter their sleep at age four and six years. The reverse association was also true, with short sleep predicting higher media usage two or four years later.”

It isn’t only children who suffer from late-night exposure to smartphone and tablet screens. Several studies, including one published earlier this year by the Harvard Medical School, have shown that the blue-tinged light emitted by devices such as smartphones and tablets suppresses the production of the sleep-inducing hormone melatonin in the brain. “If you have sleep problems, it’s a good idea to stop using devices at least an hour before you go to bed to give yourself time to wind down,” advises Professor Espie. Ironically, Espie is the co-founder of Sleepio, an online sleep-improvement programme with an iPhone app that’s designed to help users sleep more soundly.

Of course, you don’t need to be looking at a phone’s screen for it to disrupt your relaxation. Work-related emails and messages now invade employees’ personal time as a matter of course, with many feeling as if they’re on call around the clock. Germany has witnessed a 50% increase in the number of working days lost to psychological illnesses such as burnout over the past 12 years, leading the country’s labour minister, Andrea Nahles, to commission a study into the causes of work-related stress. The country’s metalworkers’ union has even drafted an “anti-stress act”, including a demand that workers should be protected from being “permanently reachable by modern means of communication”.

That anticipation of the phone buzzing to alert you to a new message has also given rise to a form of paranoia, with people feeling “phantom vibrations” from their mobile even when they’re not carrying it. A 2010 paper published in the British Medical Journal reported that up to three-quarters of medical professionals carrying a mobile phone or a pager had experience of these false alarms, with up to a fifth experiencing them daily.

“The sensations are better characterised as tactile hallucinations, in which the brain perceives a sensation that isn’t actually present,” the report stated. “Since the brain is anticipating a call, it misinterprets sensory input according to this preconceived hypothesis. The actual stimulus is unknown, but candidate sensations might include pressure from clothing, muscle contractions, or other sensory stimuli.”

The report concluded that the global impact of three-quarters of mobile users suffering such symptoms would be “substantial”. “If even a small proportion of users experience severe symptoms, then effective treatment will be required,” it added.

Automatic for the people

Computing devices may be adding to our workload in some ways, but in others they’re lightening it. Perhaps doing a little too much of the heavy lifting, in fact.

Nicholas Carr’s new book, The Glass Cage, explores the downsides of handing over complex tasks to computers, and the effect it’s having on human intelligence. Nowhere can this be seen more clearly than in the case of commercial pilots, who now “hold the controls for a grand total of three minutes” on a typical passenger flight, according to Carr. “The commercial pilot has become a computer operator. And that, many aviation and automation experts have come to believe, is a problem.”

As an example of this over-reliance on machines, he cites the fate of Air France Flight 447, which in 2009 crashed into the Atlantic Ocean killing all 228 passengers and crew on board. Prior to the incident, the plane’s autopilot system had disengaged, handing the controls back to the pilots – who then, according to the subsequent air accident investigation report, made several critical misjudgements that ultimately resulted in the plane stalling and crashing into the sea.

Carr cites studies that show a direct correlation between the time a pilot spends manually flying an aircraft and their ability to maintain airspeed control – a skill that’s critical when it comes to avoiding stalls. “It’s no mystery why automation degrades pilot performance,” he writes. “Like many challenging jobs, flying a plane involves a combination of psychomotor skills and cognitive skills – thoughtful action and active thinking.” Without regular practice, he notes, such skills are “particularly vulnerable to decay”.

Carr fears that the next big advance in automation, the self-driving car, will have an even more profound effect, stripping away one of the core leisure activities that makes us feel human. Although the modern car already relies upon a barrage of computerised controls to keep four wheels on the road, Google’s self-driving car prototype does away with manual controls completely, making the “driver” even more redundant than an aircraft pilot.

Speaking to Alphr, Carr suggests that the omission of a steering wheel might in fact represent something of a defeat for Google’s vision. “Google’s removal of all manual controls is actually a sign that the company’s goals for autonomous vehicles have become more modest,” he said. “Google found that the shift of responsibility back and forth between computer and person was dangerous, since the automation tended to reduce the person’s situational awareness. So the new, totally automated car is a very limited vehicle, which can’t go faster than 20 miles an hour and is designed for controlled situations, rather than the open road.”

Nevertheless, Carr still fears that the next generation of part-human, part-computer driven vehicles will present problems. “Computers are still a long way from matching people’s ability to make sense of the world and all the strange and unexpected events that take place in it,” he said. “The real challenge, given that we’re going to have semi-automated cars, not fully automated ones, is to make sure that the systems are designed in a way that keeps the human driver engaged in the task.”

At present, 95% of all road accidents involve some form of human error, according to The Royal Society for the Prevention of Accidents (RoSPA), and in 76% of road accidents a human is solely to blame. Cars kill more people than guns in the US – and while the Air France crash investigators certainly did point the finger of blame at the pilots, we’ll never know how many other potential air disasters have been averted as a result of the autopilot overriding human error. It’s easy to see why Google describes the self-driving car as “one of those 10x opportunities to save lives and make the world a better place”.

Knowing more or less

When it comes to the retrieval of knowledge, the human brain has already been comprehensively overtaken by technology. A very public tipping point was reached in 2011, when IBM’s Watson computer thrashed two human champions on the US quiz show Jeopardy!. The fact that a computer could access an almost infinite number of facts was never in doubt, but Watson’s ability to interpret the question and deliver the correct answer (or vice versa, under the peculiar rules of Jeopardy!) proved something of a rude awakening for one of the beaten contestants. “I felt like a Detroit factory worker of the 1980s seeing a robot that could now do his job on the assembly line,” said Ken Jennings, in a TED talk delivered earlier this year. “I felt like quiz-show contestant was the first job that had become obsolete under this new regime of thinking computers. And it hasn’t been the last.”

Despite that humbling experience, Jennings remains optimistic that humans won’t simply outsource their knowledge to Google, and rely on looking up facts that they’d previously have taken the effort to learn. In his talk he cited the example of the ten-year-old British girl who saved a beach full of people from drowning in the 2004 tsunami in Thailand, because a month earlier her geography teacher had taught her how to spot the warning signs. She wouldn’t have Googled “rapidly retreating tide” and “waves churning in the distance”, because she wouldn’t have known anything was wrong; she only knew there was a problem because she’d been told what to look for. Jennings quoted the 18th-century British theologian Samuel Parr in support of his argument: “It’s always better to know a thing than not to know it.”

The man machine

It may be too early to say for sure whether technology is metaphorically rewiring our brains, but what’s clear is that the relationship between the two is becoming increasingly nuanced – and the boundaries between what we want to keep in our heads and what we’re prepared to delegate to our devices and online services are increasingly negotiable.

Indeed, one man believes it’s inevitable that we’ll soon be literally rewiring our own grey matter, in order to supplement our own knowledge with search-engine results. In his own TED talk earlier this year, Ray Kurzweil – inventor of the CCD flatbed scanner, and now a bestselling author and director of engineering at Google – suggested that “today, you have a computer in your phone, but if you need 10,000 computers for a few seconds to do a complex search, you can access that for a second or two in the cloud. In the 2030s, if you need some extra neocortex, you’ll be able to connect to that in the cloud directly from your brain.

“Our thinking, then, will be a hybrid of biological and non-biological thinking,” he added, “but the non-biological portion is subject to my law of accelerating returns. It will grow exponentially.”

Not everyone is convinced. Nicholas Carr sees Kurzweil’s prediction of brain-power-on-demand as “awfully arrogant and rash”. “I think he’s making all sorts of unsupported assumptions,” he told us. “Kurzweil believes that at some point in the near future we’ll understand the mystery of human consciousness well enough to replicate the phenomenon in a computer.”

“Since we don’t understand consciousness, and don’t even understand what would be required to understand consciousness, any assumption that we’ll soon be able to either replace or expand the mind through computer processing is an expression of faith, not reason.”
