
Your brain on the internet: a response to Susan Greenfield

Brains are supposed to change in response to experiences; that’s a sign they’re working as they are designed to. Stephen Anthony

Whenever I hear dire predictions concerning the social impact of new technologies, I recall a similar prediction made nearly 2,500 years ago. In the Phaedrus, Plato recounts a myth according to which an Egyptian god approached King Thamus and offered him the gift of writing. But the wise king refused the gift, arguing that it would allow people to substitute the appearance of knowledge for its substance. If we rely upon writing to preserve our knowledge, our memory will weaken, and we will begin to mistake the living truth for its shadow, Plato suggests.

Today, of course, we rightly think of literacy as essential to education. But bearing in mind Plato’s warning should help us recall that fears concerning new technologies are a natural response to the unfamiliar. The unease generated by the strange causes us to think that it’s risky, and the reasons we offer are often rationalisations of that feeling.

When we hear Baroness Susan Greenfield talk about the dangers of the internet and of social media, we should recall these facts, and alter our responses accordingly. Because she is not a digital native, she likely finds these technologies more threatening than do younger people. But because the internet is transforming the world in ways that are unprecedented, a feeling of unease might arise in any of us.

Her claims that digital culture may cause negative changes in the brains of users – reducing attention spans, lowering empathy, and so on – sound plausible. But surface plausibility is not always a good guide to truth.

Baroness Susan Greenfield speaking at the National Press Club in 2010. AAP Image/Alan Porritt

In order to assess the genuine costs and benefits of digital culture, we need empirical evidence. This kind of evidence is hard to collect, because it is very difficult to isolate the causal factors at work in human populations. When we study animals, we can isolate them from other influences and carefully control the factors of interest; with human beings, we must rely on natural experiments, and natural experiments are usually very messy.

Suppose we compare the attention spans of people who are digital natives with those who are not: what drives any result we find? Since age will correlate with whether one is a digital native or not, is the effect an effect of age or of exposure to technology (there is independent evidence that younger people are more impulsive than older people)? Suppose we instead use groups matched for age, comparing those who had free access to computers when they were children (say) with those whose computer use was strictly limited. In that case, the effect might be the result of different parenting styles (permissive versus strict), or of other factors correlated with this particular difference in parenting (religiosity, say).

That isn’t to say that it’s impossible to identify the genuine causal influences at work in large human populations. It is to counsel even more caution than usual in interpreting existing scientific work. We can be confident that we have identified a cause only when we have a sufficient number of studies, using different methodologies and controlling for as many of the confounding factors as possible. The data must also be sufficiently long-term: costs and benefits may emerge only gradually. Right now, the available evidence is not alarming.

Perhaps multitasking is to blame for the internet’s alleged effect on cognitive processing. Michael Feagans

Consider the empathy finding Greenfield cites. Using self-report data, Sara Konrath and colleagues find that college students today are less empathic than previous generations. Suppose the finding holds up (remember this is one study, using one methodology, on one population; it may turn out that the result is spurious). Can we blame digital culture?

Perhaps, but there are a multitude of other factors that might be at work. Perhaps an increasing individualism is to blame, or an intensification of consumer culture, or economic insecurity. Alternatively, perhaps it’s now more acceptable to report a lack of empathy than previously; perhaps the reports index a willingness to admit to a lack of empathy, rather than a decline in empathy. It is worth noting that the supposed decline in empathy occurred at a time when violent crime was decreasing in the United States, which should surely lower our confidence in the finding.

Konrath’s study was well designed; its limitations are endemic to this kind of study. The same cannot be said for much of the existing work on internet addiction. Many findings are from badly designed, badly controlled studies, by researchers who seem more interested in publicity than truth.

Greenfield doesn’t cite any data on whether and how the internet is affecting cognitive processing. She just cites Eric Schmidt’s worry that the effects might be negative. They might, but the evidence is not persuasive.

There is evidence that multitasking leads to a lowering of performance, and it may be true that people tend to engage in more multitasking today than formerly, because they have tablets, smartphones and laptops constantly available. But, in that case, it is multitasking that is the problem, not digital media per se. What’s more, we don’t know whether heavy multitasking is caused by digital technology or just finds a ready outlet in it (today’s teenagers tweet while also browsing Facebook; perhaps their grandparents had the radio on while they read their comic books).

Raphael’s portrait of Plato. Wikimedia Commons

One particular worry is certainly wrongheaded. A number of people have worried that the internet exerts “an actual physical influence on the neurons and synapses in our brains”, as Nicholas Carr put it. The worry is wrongheaded because all experiences change the brain. If the brain did not change, we would not be capable of learning and storing memories. Brains are supposed to change in response to experiences; that’s a sign they’re working as they are designed to.

I started by mentioning Plato’s worry that literacy would weaken memory. As a matter of fact, Plato may not have been entirely wrong: there is evidence that people in preliterate cultures have better memories. It does not follow, however, that any such effect on short-term memory is simply a cost. It may be that writing did not merely provide us with an external memory store that was superior to brain-based memory (much more reliable, for one thing). By decreasing the burden on our memory, it may also have freed up brain-based processing resources for other tasks.

Something like that might be true for the internet too: it might be that insofar as iPhones take over the task of memory and time management, some brain-based capacities will atrophy and the neural real estate will be repurposed. Right now we have little reason to think that the costs of digital culture (if there are significant costs) outweigh the benefits, and no reason at all to think that these apparent costs will not bring unexpected benefits for our brains, as well as for our societies.

Of course, absence of evidence is not evidence of absence, but we have no reason to panic, and no reason to think that the challenge of unravelling the causal effects of internet use on minds is especially urgent.
