
Is Google Ruining Your Memory? The Science of Memory in the Digital Age

By providing ubiquitous access to information, Google is changing not only what we remember, but how we remember. Image courtesy of Mindtech Sweden.

Is France larger or smaller than Transalpine Gaul? What is the source of the Danube River? Where, geographically, is Mont Blanc?

These questions, straight from an 1869 Ivy League entrance exam, are strikingly different from those on the SAT. Today, no college-readiness test would ask this kind of question. It seems frivolous to ask students to think back to geography class and try to remember whether their teachers ever mentioned the answers. If a 21st-century student ever needed to know, the reaction would be automatic: just Google it.

Science can explain why we have grown increasingly reliant on Internet search engines like Google. Groundbreaking psychology research is giving us insight into how modern technology affects our memories. It seems that pervasive access to information has not only changed what we remember; it has changed how we remember. At least that’s what Dr. Betsy Sparrow, Assistant Professor of Psychology at Columbia University, believes.


In a recent study published in Science, Sparrow and her colleagues performed four experiments that demonstrate how our brains have adapted to technology. In one experiment, researchers tested how well subjects remember information they expect to be able to access later, as people might with facts they know they could easily look up online. Subjects were given 40 pieces of interesting trivia: some were completely new facts, like “An ostrich’s eye is bigger than its brain,” and others were facts that subjects may have known in general but not in detail, such as “The space shuttle Columbia disintegrated during re-entry over Texas in Feb. 2003.” Each subject then typed the facts into a computer. Half the participants were told that the computer would save what was typed; the other half believed the entries would be erased.

After the reading and typing phases, all participants were asked to write down as many of the statements as they could remember. Subjects were substantially more likely to remember information if they believed they would not be able to find it later. The implications are far-reaching. For example, if a professor posts lecture slides on the Internet, students may be less apt to remember the material because they know they can look it up later if the need arises.

Google has invested millions of dollars in Glass, a device that overlays continuous information on what you see. As these and similar technologies become mainstream, access to the Internet will become even more ubiquitous. Image courtesy of the Associated Press.

Next, the researchers attempted to determine whether the Internet has become, in some sense, an external memory system for those who use it. This phenomenon, called transactive memory, is known to occur in long-term relationships, group work environments, and other situations where people rely on others to remember information for them. While we like to imagine human memory as having unlimited storage capacity, in truth we have evolved to offload information onto other people, like family and coworkers, and onto other media, like handwritten notes and books. Sparrow wanted to know whether we employ the Internet in the same way.

“If asked the question whether there are any countries with only one color in their flag,” Sparrow wrote, “do we think about flags — or immediately think to go online to find out?”

The results were surprising: after facing difficult trivia questions, subjects paid more attention to computer- and Internet-related words, taking longer to name the color in which words like “Google” were printed, a classic sign that a word is capturing attention. This suggests that our brains are primed to think about computers when we encounter questions we cannot answer on our own.

Sparrow’s two other experiments yielded interesting results as well. In one, Sparrow found that when we learn facts under the impression that we will not be able to easily look them up in the future, we become better at spotting differences between those facts and similar ones we are shown at a later time. In the other, when the researchers asked subjects to remember a trivia fact and which of five computer folders it was saved in, Sparrow was astonished to find that people were significantly better at recalling the folder than the fact itself. “That kind of blew my mind,” she said in an interview with the New York Times.

These results suggest that our memory patterns have indeed changed, but the Internet itself is not the sole culprit. Smartphones and tablets, too, have tremendously increased the ease and speed with which we can access information. Wearable computing is just around the corner as well: Google has invested millions of dollars in developing glasses with an integrated transparent digital display that augments reality by overlaying continuous information on what the user sees. If devices like these ever become as ubiquitous as smartphones, our society could be profoundly altered. College examinations today commonly test recall and comprehension of memorized facts. Perhaps one day, such tests will seem as outdated to our future counterparts as the geography questions on an 1869 entrance exam seem to us.

Dr. Betsy Sparrow of Columbia University is conducting groundbreaking research on how pervasive access to the Internet is changing human memory. Image courtesy of the Association for Psychological Science.

Sparrow’s work raises broader questions, too. Pervasive access to information is clearly making society better in some ways. Many argue that it leads to a more educated populace, more capable scientists, and better-informed political decisions. But at some point, society should examine itself. In adopting a mentality of constant information at our fingertips, are we leaving something important behind? When we hold less information in our own brains, do we diminish the potential for subconscious reasoning and human insight? Answers to these important questions remain elusive, but more work like Sparrow’s will hopefully lead us in the right direction.