Memory in the age of the Internet
Few people bother to memorize phone numbers these days. Earlier, people would jot down to-do lists or commit a set of errands to memory. Today, our cellphones beep and buzz at varying intervals, reminding us of a dental appointment or the due date for a child’s school fees. As our lives become more app-centred, the manner in which we negotiate the world is changing faster than we realize. And a faculty that is increasingly being relegated to devices is memory.
What are the consequences of this? Even as experts on both sides of the debate make their case, they agree that the Internet is altering our minds in subtle and profound ways.
In a paper titled Google Effects On Memory: Cognitive Consequences Of Having Information At Our Fingertips, published in the journal Science in 2011, professors Betsy Sparrow, Jenny Liu and Daniel Wegner describe an intriguing set of experiments conducted by them to understand how the Internet influences our memories.
In general, when confronted with a perplexing question, people are more inclined to turn to computers for answers. For example, when participants are asked to think of countries whose flags have a single colour, they tend automatically to think of accessing that information online. Further, when people know that they can access a particular piece of information later, they are less likely to remember it than if they are told that the information will be erased; instead, they remember where the information is stored. In fact, we are now so used to accessing any bit of trivia almost instantaneously that, as the authors write, “it can feel like going through withdrawal when we can’t find out something immediately”.
In his 2013 book Smarter Than You Think: How Technology Is Changing Our Minds For the Better, journalist Clive Thompson argues that humans have always relied on external sources to help them remember. Even before books and notepads, we relied on our parents, spouses and friends to “fill in” gaps in our memories. The idea of relying on external crutches is called “transactive memory” by cognitive psychologists. What the Internet has done is broaden the base of our transactive memory.
Philosopher and technologist David Weinberger argues in his 2012 book, Too Big to Know: Rethinking Knowledge Now That the Facts Aren’t the Facts, Experts Are Everywhere, And the Smartest Person in the Room Is the Room, that our relationship with knowledge itself is undergoing a tectonic shift. He makes the provocative claim that “experts” in the traditional sense are actually passé as there is too much to know. Further, the demarcations between disciplinary boundaries are growing more porous. He says that rather than emphasizing disciplinary knowledge, the new-age thinker needs to collect, synthesize and analyse knowledge from myriad sources and use it in meaningful ways.
But are we becoming dumber because we can always rely on our iPad or smartphone to dish out a factoid we can’t remember? Thompson argues to the contrary. He says that relying on machines can actually be beneficial. This is because when we type a term or question into a search engine, we end up with more information than we asked for. As a result, we often learn more about a subject than we intended.
While technology enthusiasts laud the benefits of the Internet, others, like writer Nicholas Carr, warn us of how our memories, and our very humanity itself, might be compromised by the Web. In his 2011 book The Shallows: What the Internet Is Doing to Our Brains, he says the Web has changed our conceptualization of memory and “people routinely talk about artificial memory as though it’s indistinguishable from biological memory”. But human memory functions very differently from the bits and bytes stored on silicon chips. Unlike a computer’s memory, which is fairly inert, Carr argues that every time we recall a memory, it changes the way it is stored in our brains, giving us “a new set of connections”. In fact, it is this dynamic state of our memories that contributes to our humanness. Furthermore, when we access information on the Web, we are presented with a surfeit of information, often irrelevant to our quest at hand. As a result, the Web “places more pressure on our working memory”, a term psychologists use to refer to the items we hold in consciousness at any given time. Because we are inundated with distracting information, we fail to consolidate what we browse on the Web into long-term memory.
Like all man-made inventions, from the humble kitchen knife to the motor car, digital tools and technology have both beneficial and baneful effects, depending on how the user engages with them. In their 2013 book The App Generation: How Today’s Youth Navigate Identity, Intimacy, And Imagination In a Digital World, educator Howard Gardner and his doctoral student Katie Davis draw a distinction between “app-enabling” and “app-dependent” uses of digital tools. The former help us extend ourselves, while the latter limit and dictate what we do. How we use the Internet and other digital tools determines whether we extend ourselves or become slaves of our devices. Used prudently, technology can free us from having to memorize details and trivia, since we know we can always access them when required. But we cannot relegate all forms of knowledge to smartphones and computers.
In his book, Gardner describes how he was approached by a confident college student after he gave a talk on education. The student asked him why we would need school when the “answers to all questions” were available on our devices. Gardner responded, “Yes, the answers to all questions…except the important ones.”
Aruna Sankaranarayanan is the founder and director of Prayatna, a centre for children with learning difficulties.