I love
Nicholas Carr's book. It mixes studies and science with stories, asides, and discussions of philosophers and other great thinkers. It reminded me of reading a Bill Bryson book: you get the facts painlessly. It also presents a strong argument for keeping kids (and everyone) off-line when they work, though I'm still unlikely to convince them to actually turn off Facebook. Reading the bare bones here doesn't do it justice, but here's what I don't want to forget about my memory:
The Medium is the Message
He quotes McLuhan from 1964 – "The electric technology is within the gates, and we are numb, deaf, blind and mute about its encounter with the Gutenberg technology, on and through which the American way of life was formed” (2). When we change technologies or tools of any kind, there are gains and losses. It changes the way we work. Nietzsche's style of writing changed noticeably between pen and paper and the new-fangled typewriter: “Our writing equipment takes part in the forming of our thoughts” (19). But we forget about the losses and just notice the gains.
When information is presented to us, how it's presented makes a difference, but we get carried away by the content and don't notice the effect the method of presentation has on us. In class, I've watched students glaze over at PowerPoints the way old-schoolers used to with filmstrips, waiting for the next 'bing' from the record to indicate a changing slide. When students present, they often use technology as a crutch, putting their entire presentation on slides, and they lose the class in the process. But the same kids can be captured by chalk and talk, a much maligned teaching method today, as it allows the presenter to move back and forth through the room while people share thoughts and responses, and student ideas make it onto the board as much as my own. They shift from looking at me, to one another, to the board and their notes to glean the basics for later review, rather than focusing on a stagnant screen at the front. Well, it works better for me anyway.
Our Dwindling Attention Spans
The more we use the web, the more we have to fight to stay focused on longer texts. Skimming and scrolling shorten our attention spans, causing a decay of our faculties for reading and concentrating. I've noticed how students looking at a webpage will immediately scroll down even if vital information is right at the top. They're looking for a heading to jump out at them or a video to click on. They have to be told to stop and actually
read the words on the screen.
One study found that professors of English literature are now struggling to get students to read an entire book. Their students just look at study notes online, so they miss the nuances of the text, and, more importantly, they don't learn how to notice patterns of metaphors and motifs or how to do deep reading; they only learn how to summarize other writers' analyses. Cutting corners is nothing new, but it's surprising to read that lit students won't read books.
Brain Physiology: We Become What We Think
The most interesting part of the book is how our brains work to take in information. There's been a lot written about this lately - that the brain is affected by our environment. It's not as stable as we once thought.
"Though different regions are associated with different mental functions, the cellular components do not form permanent structures or play rigid roles. They’re flexible. They change with experience, circumstance, and need” (29). The brain gets accustomed to our typical activities and changes when they stop or when new activities start: “neurons seem to ‘want’ to receive input….When their usual input disappears, they start responding to the next best thing” (29).
The brain reorganizes itself after an accident or loss of function of any body part, but also after change in lifestyle. William James figured this out in
Principles of Psychology: “nervous tissues…seems endowed with a very extraordinary degree of plasticity…either outward forces or inward tensions can, from one hour to another, turn that structure into something different from what it was” (21). Leon Dumont used an analogy to explain: “Flowing water hollows out a channel for itself which grows broader and deeper; and when it later flows again, it follows the path traced by itself before. Just so, the impressions of outer objects fashion for themselves more and more appropriate paths in the nervous system, and these vital paths recur under similar external stimulation, even if they have been interrupted for some time” (21).
An experiment was conducted on London cab drivers long before GPS, back when they had to have the entire city memorized. They developed an enlargement of the posterior hippocampus and a shrinking of the anterior hippocampus from the constant spatial processing required to navigate the intricate road system. Their brains adapted to suit how they were being used.
Something really fascinating to me is that imagining has the same effect. Researchers taught a short piano melody to people without any piano knowledge. Half the group practiced the piece for two hours a day, and the other half only
imagined practicing without actually touching the keys. There were identical changes to the brain. It reminded me of what I do when I’m about to do something new, like build roof rafters or a waterfall. I say I have to stare at it a few days before I can start, but really I'm walking myself through the process in my head repeatedly, apparently until my brain’s learned how to do it.
“As particular circuits in our brain strengthen through the repetition of a physical or mental activity, they begin to transform that activity into a habit…the chemically triggered synapses that link our neurons program us, in effect, to want to keep exercising the circuits they’ve formed. Once we’ve wired new circuitry in our brain…’we long to keep it activated.’ That’s the way the brain fine-tunes its operations. Routine activities are carried out ever more quickly and efficiently, while unused circuits are pruned away” (34).
This explains why I can do dishes so much faster than my kids – and why they should be practicing dishes regularly.
This can also explain one aspect of mental afflictions like depression and OCD – “The more a sufferer concentrates on his symptoms, the deeper those symptoms are etched into his neural circuits” (35), with implication for addictions as well.
But our brain circuits can weaken or dissolve with neglect:
“If we stop exercising our mental skills…we do not just forget them: the brain map space for those skills is turned over to the skills we practice instead….the possibility of intellectual decay is inherent in the malleability of our brains. That doesn’t mean that we can’t, with concerted effort, once again redirect our neural signals and rebuild the skills we’ve lost. What it does mean is that the vital paths in our brains become…the paths of least resistance” (35). “What we’re not doing when we’re online also has neurological consequences. Just as neurons that fire together wire together, neurons that don’t fire together don’t wire together....The brain recycles the disused neurons and synapses for other, more pressing work. We gain new skills and perspectives but lose old ones” (120).
Carr adds a fascinating history of the written word. Socrates wasn't a fan of writing: “Far better than a word written in the ‘water’ of ink is ‘an intelligent word graven in the soul of the learner’ through spoken discourse” (55). Socrates recognized that a “dependence on the technology of the alphabet will alter a person’s mind….writing threatens to make us shallower thinkers…preventing us from achieving the intellectual depth that leads to wisdom and true happiness" (55).
McLuhan counters, “The achievements of the Western world, it is obvious, are testimony to the tremendous values of literacy....the written word liberated knowledge from the bounds of individual memory and freed language from the rhythmical and formulaic structures required to support memorization and recitation" (57). But as great an achievement as writing is, as useful as it is, there is something lost when we no longer have our brains remember and hold ideas within to debate them. The underlying question of this entire book is, Is it worth the loss?
The invention of the book altered how we think: “To read a book was to practice an unnatural process of thought, one that demanded sustained, unbroken attention to a single, static object…They had to train their brains to ignore everything else going on around them to resist the urge to let their focus skip from one sensory cue to another…applying greater ‘top-down control’ over their attention” (64). This ability represents a “strange anomaly in the history of our psychological development” (64).
As with imagining activities, one study found that brain activity while
reading a story is similar to brain activity while doing the actions being described: “brain regions that are activated often ‘mirror those involved when people perform, imagine, or observe similar real-world activities….The reader becomes the book'” (74). This makes me wonder what happens to the brain when people read a lot of violent books. That's not to suggest that reading about something necessarily makes us want to do it, but will it make us better at fighting just because we've read about it...or, perhaps, better at sex if that's our reading preference?
The Shift to Screens
In the U.S., adults aged 25-34 average 35 hours of TV a week and less than an hour a week of reading (87). And there's a difference between reading on-line and reading print material as "we are plugged into an ‘ecosystem of interruption technologies’" (91):
“A page of online text viewed through a computer screen may seem similar to a page of printed text. But scrolling or clicking through a Web document involves physical actions and sensory stimuli very different from those involved in holding and turning the pages of a book or a magazine....It also influences the degree of attention we devote to it and the depth of our immersion in it" (92).
This has already influenced how magazines write articles to accommodate shorter attention spans: “Rolling Stone, once known for publishing sprawling, adventurous features by writers like Hunter S. Thompson, now eschews such works, offering readers a jumble of short articles and reviews....Most popular magazines have come to be ‘filled with color, oversized headlines, graphics, photos, and pull quotes’" (94).
He warns that technology encourages and rewards shallow reading. Some see technology as only bringing benefits, but we have to be wary of the costs:
“No doubt the connectivity and other features of e-books will bring new delights and diversions…But the cost will be a further weakening, if not a final severing, of the intimate intellectual attachment between the lone writer and the lone reader. The practice of deep reading that became popular in the wake of Gutenberg’s invention, in which ‘the quiet was part of the meaning, part of the mind,’ will continue to fade, in all likelihood becoming the province of a small and dwindling elite” (108).
“Some thinkers welcome the eclipse of the book and the literary mind it fostered. In a recent address to a group of teachers, Mark Federman, an education researcher at the University of Toronto, argued that literacy, as we’ve traditionally understood it, ‘is now nothing but a quaint notion, an aesthetic form that is as irrelevant to the real questions and issues of pedagogy today as is recited poetry – clearly not devoid of value, but equally no longer the structuring force of society.’ The time has come, he said, for teachers and students alike to abandon the ‘linear, hierarchical’ world of the book and enter the Web’s ‘world of ubiquitous connectivity and pervasive proximity’ – a world in which ‘the greatest skill’ involves ‘discovering emergent meaning among contexts that are continually in flux’” (111).
“In the choices we have made, consciously or not, about how we use our computers, we have rejected the intellectual tradition of solitary, single-minded concentration, the ethic that the book bestows on us. We have cast our lot with the juggler” (114).
The distractions offered on-line add to the shallow-reading effect. This makes me reconsider all the links and images I make the effort to include in each post:
“But the extensive activity in the brains of surfers also points to why deep reading and other acts of sustained concentration become so difficult online. The need to evaluate links and make related navigational choices, while also processing a multiplicity of fleeting sensory stimuli, requires constant mental coordination and decision-making, distracting the brain from the work of interpreting text or other information. Whenever we, as readers, come upon a link, we have to pause, for at least a split second, to allow our prefrontal cortex to evaluate whether or not we should click on it. The redirection of our mental resources, from reading words to making judgments, may be imperceptible to us – our brains are quick – but it’s been shown to impede comprehension and retention, particularly when it’s repeated frequently” (122).
“Difficulties in developing an understanding of a subject or a concept appear to be ‘heavily determined by working memory load,’…and the more complex the material we’re trying to learn, the greater the penalty exacted by an overloaded mind…two of the most important [sources of cognitive overload] are ‘extraneous problem solving’ and ‘divided attention.’ Those also happen to be two of the central features of the Net as an informational medium" (125).
“Just as the pioneers of hypertext once believed that links would provide a richer learning experience for readers, many educators also assumed that multimedia, or ‘rich media,’ as it’s sometimes called, would deepen comprehension and strengthen learning. The more inputs, the better. But this assumption, long accepted without much evidence, has also been contradicted by research. The division of attention demanded by multimedia further strains our cognitive abilities, diminishing our learning and weakening our understanding. When it comes to supplying the mind with the stuff of thought, more can be less” (129).
In one study, half of the participants had a text-only passage to read, and the other half had the text passage with relevant audiovisual material. When they were tested on the information, not only did the text-only group do better on the test, they also found the material more interesting, educational, understandable, and enjoyable. Multimedia “would seem to limit, rather than enhance, information acquisition” (130).
In another study, they had students listen to a lecture. One half could surf the web during the lecture to look up relevant information, and the other half had to keep their laptops shut. Surfers performed “poorer on immediate measures of memory for the to-be-learned content. It didn't matter, moreover, whether they surfed information related to the lecture or completely unrelated content – they all performed poorly” (131).
A final study had students watch CNN. One group watched an anchor with info-graphics on the screen and textual news crawling along the bottom, while the other group watched the anchor without graphics and a news crawl. The multimedia group remembered significantly fewer facts, as “this multimessage format exceeded viewers’ attentional capacity” (131).
We're encouraged in schools to be cutting edge with our tech use. Teachers are praised for using any new program. Even a switch from PowerPoint to Prezi is lauded as revolutionary. New is celebrated as better with little exploration of studies showing otherwise. We're so worried about being the best – about getting the most kids to achieve on standardized tests, really – that we're jumping at anything shiny that comes our way in hopes it will be the magic bullet that finally motivates the more challenging students. Carr further cautions,
“The Internet, however, wasn’t built by educators to optimize learning. It presents information not in a carefully balanced way but as a concentration-fragmenting mishmash. The Net is, by design, an interruption system, a machine geared for dividing attention” (131). “In addition to flooding our working memory with information, the juggling imposes what brain scientists call ‘switching costs’ on our cognition. Every time we shift our attention, our brain has to reorient itself, further taxing our mental resources…Switching between two tasks short-circuited their understanding: they got the job done, but they lost its meaning” (133). “The near continuous stream of new information pumped out by the Web also plays to our natural tendency to ‘vastly overvalue what happens to us right now’….We crave the new even when we know that ‘the new is more often trivial than essential’” (134). "There are signs that new forms of ‘reading’ are emerging as users ‘power browse’ horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense” (137).
“The more you multitask, the less deliberative you become; the less able to think and reason out a problem.’ You become…more likely to rely on conventional ideas and solutions rather than challenging them with original lines of thought….As we gain more experience in rapidly shifting our attention, we may ‘overcome some of the inefficiencies’ inherent in multitasking…but except in rare circumstances, you can train until you’re blue in the face and you’d never be as good as if you just focused on one thing at a time. What we’re doing when we multitask is learning to be skillful at a superficial level. The Roman philosopher Seneca may have put it best two thousand years ago: ‘To be everywhere is to be nowhere.’....The Net is making us smarter…only if we define intelligence by the Net’s own standards…if we think about the depth of our thought rather than just its speed – we have to come to a different and considerably darker conclusion” (141).
Reading scores fell between 1992 and 2005: “Literary reading aptitude suffered the largest decline, dropping twelve percent” (146).
On Memorization: The brain is a muscle, not a filing cabinet.
When I was a kid, I could tell you any of my friends' phone numbers by heart. Now I can barely remember my own. I don't
need to know phone numbers anymore because they're all programmed into my phone, but is the work computers are doing making our brains lazier? Should we try to remember things just for the sake of working out our brains?
Erasmus thought that “memorizing was far more than a means of storage. It was the first step in a process of synthesis, a process that led to a deeper and more personal understanding of one’s reading” (179). Tech writer Don Tapscott disagrees. “Now that we can look up anything ‘with a click on Google…memorizing long passages or historical facts’ is obsolete. Memorization is ‘a waste of time’” (181).
To the Ancient Greeks, “memory was a goddess: Mnemosyne, mother of the Muses" (181). “The shift in our view of memory is yet another manifestation of our acceptance of the metaphor that portrays the brain as a computer…storing bits of data in fixed locations and serving them up as inputs to the brain’s calculations, then offloading that storage capacity to the Web is not just possible but…liberating….The analogy has a simplicity that makes it compelling….But…it’s wrong" (182).
The brain isn’t a filing cabinet; it’s a muscle. Using it over and over doesn’t fill it until it can take no more, but quite the opposite – it strengthens it to take in more information.
So every year, when I throw up my hands at the idea of remembering 90 students’ names in a few days and hope the students will be forgiving of my aging brain, I’ve simply gotten sucked into a vicious cycle that prompts me to give up on myself far too soon. I have trouble remembering names because I typically don’t remember them, so I don’t try, so I never
do. Kind of sounds like a Winnie the Pooh poem or an admonishment from Yoda. The implication here is that if I actually work on remembering people’s names instead of assuming that’s just something I can’t do, then I’ll actually develop the
ability to remember them better for the next set of classes. It’s why, every year when I rent a mini-van for a trip to a cottage with my family, I start the journey a bit of a nervous wreck, but over the week the van seems to grow smaller and more manageable until I’m parallel parking the sucker by the end. (Just kidding – at the end of the week I still search for pull-through parking spots.) It's nothing revelatory to say that practicing improves ability, yet we don't tend to think this way about using our memory.
Getting information from short-term to long-term memory requires “an hour or so for memories to become hard, or 'consolidated,' in the brain. Short-term memories don’t become long-term memories immediately, and the process of their consolidation is delicate. Any disruption, whether a jab to the head or a simple distraction, can sweep the nascent memories from the mind” (184). "The more times an experience is repeated, the longer the memory of the experience lasts…Not only did the concentration of neurotransmitters in synapses change, altering the strength of the existing connections between neurons, but the neurons grew entirely new synaptic terminals" (185). These terminals increase as memories are formed, then decrease again when memories are allowed to fade, but they don’t decrease all the way back to their former numbers. “The fact that, even after a memory is forgotten, the number of synapses remains a bit higher than it had been originally helps explain why it’s easier to learn something a second time” (185). This is why many teachers tell students to go over their notes regularly. It actually helps.
Computers vs Brains: Some benefits of being alive.
“While an artificial brain absorbs information and immediately saves it in its memory, the human brain continues to process information long after it is received, and the quality of memories depends on how the information is processed. Biological memory is alive. Computer memory is not....Those who celebrate the ‘outsourcing’ of memory to the Web have been misled by a metaphor. They overlook the fundamentally organic nature of biological memory…Once we bring an explicit long-term memory back into working memory, it becomes a short-term memory again. When we reconsolidate it, it gains a new set of connections – a new context….Biological memory is in a perpetual state of renewal....In contrast to working memory, with its constrained capacity, long-term memory expands and contracts with almost unlimited elasticity, thanks to the brain’s ability to grow and prune synaptic terminals and continually adjust the strength of synaptic connections” (192).
Web advocates think, “In freeing us from the work of remembering, it’s said, the Web allows us to devote more time to creative thought. But the parallel is flawed….The Web…places
more pressure on our working memory, not only diverting resources from our higher reasoning faculties but obstructing the consolidation of long-term memories and the development of schemas….The Web is a technology of forgetfulness" (193).
One ramification of the brain's plasticity is that reading on-line, in a distracted way, can have an effect on our ability to read and think:
"The influx of competing messages that we receive whenever we go online not only overloads our working memory; it makes it much harder for our frontal lobes to concentrate our attention on any one thing. The process of memory consolidation can’t even get started. And, thanks once again to the plasticity of our neuronal pathways, the more we use the Web, the more we train our brain to be distracted – to process information very quickly and very efficiently but without sustained attention. That helps explain why many of us find it hard to concentrate even when we’re away from our computers" (194).
Marshall McLuhan “elucidated the ways our technologies at once strengthen and sap us…. our tools end up ‘numbing’ whatever part of our body they ‘amplify.' When we extend some part of ourselves artificially, we also distance ourselves from the amplified part and its natural functions" (210).
In a study they had two groups trying to solve a puzzle. One group had helpful software, the other group didn't. In the early stages, the helped group made correct moves more quickly, but as they proceeded, the other group increased their skill with the puzzle more rapidly. Learning a task with little help wires our brain to know how to
learn that type of task, so it becomes easier to later improve on our initial learning. The group with software help didn’t do the initial learning, so they couldn’t advance as easily. “As we ‘externalize’ problem solving and other cognitive chores to our computers, we reduce our brain’s ability ‘to build stable knowledge structures’...that can later ‘be applied in new situations’” (216).
The Need for Nature
“A series of psychological studies over the past twenty years has revealed that after spending time in a quiet rural setting, close to nature, people exhibit greater attentiveness, stronger memory, and generally improved cognition. Their brains become both calmer and sharper…They no longer have to tax their working memories by processing a stream of bottom-up distractions. The resulting state of contemplativeness strengthens their ability to control their mind” (219).
In yet another study: one group walked through a park, the other walked on city streets, and then both took a cognitive test. The park group did significantly better. Even
cooler, it works just by looking at pictures of nature or even
imagining nature scenes! “The people who looked at pictures of nature scenes were able to exert substantially stronger control over their attention, while those who looked at city scenes showed no improvement in their attentiveness” (220). This makes a case for designing our classrooms with bits of nature all around or taking the kids outside to learn.
The Emotional Effect
The brain doesn't just handle our intellectual requirements; it also determines our emotional reactions. “It’s not only deep thinking that requires a calm, attentive mind. It’s also empathy and compassion” (220).
“…the more distracted we become, the less able we are to experience the subtlest, most distinctively human forms of empathy, compassion, and other emotions. ‘For some kinds of thoughts, especially moral decision-making about other people’s social and psychological situations, we need to allow for adequate time and reflection” (221).
Carr's final caution: “…as we grow more accustomed to and dependent on our computers we will be tempted to entrust to them ‘tasks that demand wisdom.’ And once we do that, there will be no turning back” (224).