Do Digital Natives Learn Differently?

“Technology is rewiring young people’s brains.”

That’s a claim we hear a lot these days. It’s in the media, spoken by experts and pundits, and in the air, voiced by parents and teachers. Sometimes it’s uttered in alarm, by those concerned that children’s ability to learn and pay attention is being warped by the hours they spend in front of the computer. Sometimes it’s proclaimed in celebration, by others convinced that a generation of “digital natives” has developed new ways of absorbing and applying information.

In fact, research in cognitive science and psychology shows that both of these sentiments are misplaced. While it is true that our brains are to some extent “plastic”—that is, responsive to experience—it is also the case that there are biological constraints on how our brains operate. These constraints are universal, found across cultures and across generations. What follows is a brief primer on how attention and memory work, and how we can maximize their effectiveness.

When our minds are engaged in a task—reading a sentence, say, or solving a math problem—the information relevant to that task is held in our short-term memory. This mental holding space can contain only four to seven pieces of information at a time. It’s where we do our thinking, by combining information drawn from our environment (the textbook page or math worksheet we’re looking at) with information stored in our long-term memory. Unlike that of short-term memory, the capacity of long-term memory is essentially unlimited. How do we move information from short-term to long-term memory? Attention is key. We have to be paying attention to, and thinking about, a fact or a concept in order for it to be “encoded” in memory.

One common enemy of attention is multitasking. Young people report frequent media multitasking—texting, emailing, surfing the web, or updating Twitter and Facebook—while also doing schoolwork. And while they may think that they can do it effectively, research shows otherwise. In fact, studies led by Stanford University professor Clifford Nass demonstrate that the individuals who multitask the most are actually the worst at it. Multitasking is detrimental to learning for young people and adults alike, and for the same reason: the brain can’t really pay attention to more than one thing at a time. Rather, it switches its focus back and forth among tasks, making us slower and less accurate at all of them. Whether we’re learning from a computer or a book, it’s best to give the material our undivided attention. (And it’s best to follow a day of learning with a good night’s rest: sleep is when our brains “consolidate” the memories we’ve acquired while awake, discarding irrelevant material and moving important information into long-term storage.)

“Growing up digital” doesn’t change how we come to understand new information, either. Understanding happens when we process new information in terms of its meaning, rather than its surface features: thinking deeply about the themes of King Lear, for example, instead of registering simply that it was about three daughters and their aging father. And understanding happens when we connect new information to what we know already—for example, by using an analogy of water flowing through pipes to conceive of how an electrical circuit works.

The process of remembering, like understanding, has certain features that remain consistent across age and across experience. Decades of research on subjects ranging from elementary school-aged children to elderly adults, for example, have shown that “retrieval practice”—repeatedly calling up information from memory—helps us remember that material much better than simply reading and re-reading it. Another technique, called “spaced repetition,” has also proven universally effective: it involves exposing oneself to new information in short bursts spread out over time, rather than in one marathon study session.

Although the use of computers and other devices like tablets and smartphones is not changing the fundamental operations of young people’s brains, computerized instruction can be designed to work with these built-in features of the mind. Educational programs can promote retrieval practice by offering short quizzes, for example, and can expose users to new information on a spaced-out schedule calculated to produce maximum retention. They can facilitate a focus on meaning and on connecting old knowledge to new by, for example, allowing simulated science experiments to be performed onscreen, or by engaging students with an interactive historical timeline.
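To make the spacing idea concrete, here is a minimal sketch of how a quiz program might schedule reviews. The doubling rule, function names, and reset-on-failure behavior are illustrative assumptions, not any particular product’s algorithm; real spaced-repetition systems (such as the SM-2 family) use graded responses and per-item difficulty factors.

```python
from datetime import date, timedelta

def next_interval(interval_days: int, recalled: bool) -> int:
    """Simplified spacing rule (an assumption for illustration):
    double the gap after a successful recall, reset to one day
    after a failure, so forgotten items return quickly."""
    return interval_days * 2 if recalled else 1

def schedule(start: date, outcomes: list[bool]) -> list[date]:
    """Turn a sequence of recall outcomes (True = remembered)
    into concrete review dates, starting one day after `start`."""
    interval, current, dates = 1, start, []
    for recalled in outcomes:
        current = current + timedelta(days=interval)
        dates.append(current)
        interval = next_interval(interval, recalled)
    return dates
```

Under this toy rule, a learner who keeps succeeding sees an item less and less often—the short bursts spread out over time that the research describes—while a lapse pulls the item back onto tomorrow’s quiz.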

No, technology is not “rewiring” young people’s brains. This will come as a relief to some and a disappointment to others. But this reality does bring with it one significant advantage: a body of research on understanding, attention, and memory that can now be applied to a new generation of humans, not so different from the ones who came before.

Annie Murphy Paul is a book author, magazine journalist, consultant, and speaker. She is the author of “The Cult of Personality,” a cultural history and scientific critique of personality tests, and of “Origins,” a book about the science of prenatal influences. She is now at work on “Brilliant: The New Science of Smart,” to be published by Crown in 2013. A contributing writer for Time magazine, she writes a weekly column about learning for Time.com, and also blogs about learning at CNN.com, Forbes.com, MindShift.com, PsychologyToday.com and HuffingtonPost.com. She contributes to The New York Times Magazine, The New York Times Book Review, Slate, and O, The Oprah Magazine, among many other publications.

This article was commissioned by Amplify Education Inc. The views expressed are the author’s own and do not represent those of Amplify Education Inc.