
Wednesday, November 20, 2024

A Tale of Two Studies

I was confronted yesterday with the ubiquitous claim, "Lockdowns destroyed kids' ability to socialize. Now they're committing suicide because of it!!" Let's have another close look:

I posted this mini-thread a couple weeks ago that helps to understand the role media plays in propping up this claim:

"A tale of two studies: One study (October 2024 in School Psychology), picked up by the Toronto Star, gauged classroom incivility from anecdotal reports by teachers pre- and post-covid (handy they had data form Fall 2019) to conclude that lockdowns for three months in 2020 destroyed kids' socialization skills in 2022. The other study (October 2024 in an international medical journal), NOT picked up by the Star, assessed kids pre- and post-INFECTION, and against a control to find, 'more severe symptoms of inattention, hyperactivity-impulsivity, opposition, a wide range of emotional and behavioural problems, and poor school function.' Many studies have shown that SARS-CoV-2 affects the prefrontal cortex, which affects behaviour. Until mainstream media starts reporting on better studies, our children will suffer. A timeline of some studies on Covid's effects on the brain are here." 

Saturday, November 16, 2024

New and Improved Propaganda Machines

We carry propaganda machines in our pockets. Propaganda isn't just to misinform, but to distract us and exhaust the capacity for critical thinking. When you're struggling to decide between 25 types of cereal or what colour to paint the kitchen, you can miss the bigger picture. Chomsky's been saying that for years. Propaganda destroys the quest for truth, and it's worse than ever.

Pat Loller has a quick explainer about how we're ignoring the huge shift in how propaganda operates now:

"Go make a new account or reset your algorithm on any app and see how many swipes it takes to get right-wing propaganda. . . . There are all these studies coming out saying Americans are functionally illiterate . . . you don't read, you don't get critical thinking skills, and then the propaganda that you're consuming, you don't think about. You just go, 'Oh, okay, I guess that's true,' especially if you've been consuming it since you were 15 years old. . . . These kids congregate around these figures and they play video games together. Go and look at any popular video game, and Control F search for 'woke' or 'DEI', and you'll see that the gaming sphere has been a cesspool for decades. . . .  There's all these angry young men with no critical thinking skills who are being fed a constant diet of propaganda that is literally dished up to them on their phones the moment they open an account. Is it any wonder that they're going to fall Pied Piper behind this guy who's just like, 'Hey, all of those complex challenges in your life? It's this guy's fault. Stop centering you as the protagonist in every single video game and every single movie and TV show ever made?? Girls say they'd rather meet a bear in the woods than you?? Get mad and vote for the guy who is going to hurt those people.' 

Thursday, August 24, 2023

On Student Absences

I've been thinking about the concern with kids not going to school for reasons beyond the rampant illnesses caused by letting a highly-infectious virus run wild. 


The Fortune article suggests that schools are less welcoming now. "Everyone seemed less tolerant, more angry." They mention a host of reasons for absences including poverty, housing instability, transportation issues, and school staff shortages that mean a rotation of supply teachers. But they don't mention the Long Covid that has left many chronically disabled. From this article, a 13-year-old explained, "I don't remember a day without pain." H/t Laura Miers, who points out:
"We're disabling everyone. It will NEVER improve with no mitigation. Kids and adults are at the same risk for Long Covid."
Meanwhile, the Fortune article blames online learning:
"The effects of online learning linger: School relationships have frayed, and after months at home, many parents and students don't see the point of regular attendance."

Tuesday, December 31, 2019

Age of Oblivion: Another End of Decade Rant

Of course  calendars are a construct and don't mean anything, but the end of the year and, even more so, the end of the decade are useful times to take stock.

In pop culture, we have the Ecce Homo moment as a cultural foreboding - the chutzpah to insist on a fix while pretending to be completely oblivious to the destruction of former beauty. We've done that with our whole planet. But more than that is the fame it brought to the amateur restoration worker, driving up tourism dramatically. We are positioned to celebrate the destruction of beauty more than its creation. This could be bookended with the acknowledgment and then immediate justification of "billionaires in wine caves" having more power than the rest of the populace; that a politician will be attacked for refusing to be bribed is a sign of our times.

The New York Times got a random smattering of people to answer: What Will the World Look Like in 2030?, twelve years after we were told we have twelve years to fix everything. It's a terrifying read. I've smushed some pertinent bits together here:

Saturday, November 23, 2019

The Greatest Propaganda Machine in History

Sacha Baron Cohen (aka Ali G. and Borat, among others) won an award from the Anti-Defamation League. Here's his 25-minute acceptance speech. It's in writing, abridged a bit, below the video if you'd rather skim than watch. (Emphasis is mine.)



"Today, around the world, demagogues appeal to our worst instincts. Conspiracy theories, once confined to the fringe, are going mainstream. It's as if the age of reason, the era of evidential argument is ending and now knowledge is increasing delegitimized, and scientific consensus is dismissed. Democracy, which depends on shared truths, is in retreat, and autocracy, which depends on shared lies, is on the march. Hate crimes are surging . . .  What do these dangerous trends have in common? . . . All this hate and violence is being facilitated by a handful of internet companies that amount to the greatest propaganda machine in history. . . .

Saturday, March 16, 2019

More Ford Cuts: Ban Cell Phones, but Mandate Online Courses

From iPolitics:
"On top of the change to class sizes, the government is also mandating that all high school students take four of their 30 credits online. This requirement will take effect in the 2020-21 school year. These e-learning classes will average 35 students per class, according to the government." 
This is huge!! I don't understand why this wasn't part of the original statement in the CBC's articles yesterday or in OSSTF's statement! Online courses have notoriously high failure rates (50% according to one study, but 90% including all the people who drop the course) except for the ones that grossly lower curricular standards by, for instance, having students read just a few pages of a book instead of an entire book in a university-level senior course!!

A New York Times article, from just over a year ago, outlined how online courses harm students:
"In high schools and colleges, there is mounting evidence that the growth of online education is hurting a critical group: the less proficient students who are precisely those most in need of skilled classroom teachers. . . . After all, taking a class without a teacher requires high levels of self-motivation, self-regulation and organization. Yet in high schools across the country, students who are struggling in traditional classrooms are increasingly steered into online courses. . . . In reality, students who complete these courses tend to do quite poorly on subsequent tests of academic knowledge. This suggests that these online recovery courses often give students an easy passing grade without teaching them very much. Consider a study conducted in the Chicago high schools. Students who had failed algebra were randomly assigned either to online or to face-to-face recovery courses. The results were clear: Students in the online algebra courses learned much less than those who worked with a teacher in a classroom. . . . Even though the courses are seemingly identical, the students who enroll online do substantially worse. The effects are lasting, with online students more likely to drop out of college altogether."

Saturday, January 19, 2019

10 Year Challenge


I posted this challenge on social media recently. This is what we do to be sociable: play online games and forward memes. Discussing the world and screaming into the void to try to shift this tragic path is such a loser thing to do. It's a balance to stay just this side of the line where we might be heard just a little.

Wired's Kate O'Neill guessed that, like all those social media games, this one is about data mining, specifically,
"I knew the facial recognition scenario was broadly plausible and indicative of a trend that people should be aware of. . . . Imagine that you wanted to train a facial recognition algorithm on age-related characteristics and, more specifically, on age progression (e.g., how people are likely to look as they get older). Ideally, you'd want a broad and rigorous dataset with lots of people's pictures. It would help if you knew they were taken a fixed number of years apart—say, 10 years. Sure, you could mine Facebook for profile pictures and look at posting dates or EXIF data. But that whole set of profile pictures could end up generating a lot of useless noise. People don’t reliably upload pictures in chronological order . . . it would help if you had a clean, simple, helpfully labeled set of then-and-now photos. . . . As with hashtags that go viral, you can generally place more trust in the validity of data earlier on in the trend or campaign—before people begin to participate ironically or attempt to hijack the hashtag for irrelevant purposes. . . . Is it bad that someone could use your Facebook photos to train a facial recognition algorithm? Not necessarily; in a way, it’s inevitable. . . . [It] could help with finding missing kids . . . [but] could someday factor into insurance assessment and health care."
But that's not even the thing I'm interested in. That all goes without saying now. I'm interested in how people hijacked the trend (see this too):

Polar ice formation - from NASA

Sunday, January 6, 2019

Gertz's Nihilism and Technology

I really love this book. First of all, the chapter headings and sub-headings are all clever little in-jokes, like "Beyond Google and Evil," that make anyone with a cursory knowledge of Nietzsche feel like part of the gang. But it's not just looking at tech through the lens of Nietzsche in a cut-and-paste way. This is an analysis of our relationship with technology that, while immersed in Nietzsche and able to help a novice solidify their understanding of some major works, is really an analysis of human nature that would benefit the a-philosophical as well. This is a brief summary as a memory aid for myself, but the book deserves a close read in full.

He uses Nietzsche's Genealogy of Morals to explain how technology is used "to soothe rather than cure" our nihilistic attitudes by applying five tactics the ascetic priest uses "to make nihilism palatable" (21): self-hypnosis, mechanical activity, petty pleasures, herd instinct, and orgies of feeling.

Monday, August 27, 2018

On Culture Wars

I just finally got around to Angela Nagle's Kill All Normies. It's a comprehensive book outlining the history and categorization of various groups online that have seeped into real life, but, although she mentions numerous scholars in her analysis, with zero endnotes and nary a reference section, it didn't surprise me to find that she's been accused of plagiarizing (see here, here, and here for some undeniable examples of lifted sentences and paragraphs). Some speculate that the book was rushed in order to be first out with this kind of content. The cribbing seems to be primarily explanations of terms or descriptions of events, but the analysis and compilation of these ideas into a whole appears to be her own work. I wouldn't let it slide in a classroom, and her editor/publisher should have caught it, but, as a reader, it's still compelling to see the various ideas assembled so succinctly.

There are so many terms being used to describe various views, so here's a brief and incomplete table of people, media affiliations, and basic characteristics I compiled as I read Nagle's book. It's all a little slippery and contentious, but it's a starting point. She's weeded out the racist alt-right from the more playful, yet shockingly offensive and sometimes harmful alt-light. I'm not convinced there's any clear consensus on any of this, though. We're all using the terms in slightly different ways, further muddying up the waters of the whole mess.

Thursday, August 10, 2017

The Plight of the Millennials

Further explanation here. 
First, a bit about statistical norms and the normal distribution. In a normal distribution, about 68% of a population falls within one standard deviation of the mean, so when researchers describe a characteristic as typical of a group, it generally holds for most of that group but far from all of it. There's tons of variation in the remaining 32%, so all the generalizations below might not apply to the people in your life. But, according to researchers, they apply to most people in each group, so we can still look at trends. I remember studies in my day showing a clear correlation between violent movie viewing and violent teens, yet I loved slasher flicks and still lean towards more gruesome films despite the stats. And, more to the point, nobody stopped making those movies. This recent article is unlikely to change a thing, but we're still wise to consider it.
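If you want to check that 68% figure yourself, it's just the empirical rule for a normal distribution. Here's a minimal sketch in Python (my own illustration, nothing from the article), using only the standard library:

```python
# My own illustration, standard library only: the "68%" is the empirical rule,
# the share of a normal distribution within one standard deviation of the mean.
import math

def fraction_within(k_sd: float) -> float:
    """Fraction of a normal distribution lying within k standard deviations of the mean."""
    # For a standard normal, P(|Z| <= k) = erf(k / sqrt(2)).
    return math.erf(k_sd / math.sqrt(2))

print(f"within 1 sd: {fraction_within(1):.1%}")  # ~68.3%
print(f"within 2 sd: {fraction_within(2):.1%}")  # ~95.4%
```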

The article in question is The Atlantic's "Have Smartphones Destroyed a Generation?" which adds to a running list of problems with kids today caused by technology. It hits home with some of the trends I've noticed anecdotally in my classroom over the past 26 years: that phones are distracting, lead to unrealistic idealization and familial alienation, and affect sleep habits. But the writer misses any discussion of how phones also drive constant change, consumerism, and cognizance of tragedies, and of the significance of other factors affecting trends in this demographic. Here's a chart I sometimes use in class for an overview of demographics by year of birth. We've moved way beyond the boom, bust, and echo labels.

Saturday, February 13, 2016

On Command F: a Bittersweet Function

I tell my students of the magic of "command F" on Macs and "control F" on PCs. This find function, which can locate a word or phrase anywhere in a text, is a game-changer when hunting for the best quotation or for that juicy bit of information or when searching through lines of code to add a fancy new feature to a blog. But it has a dark side.

Now when I try to skim through a hard-copy document, like a book, I grow impatient with the task. If I suddenly realize the brilliant acuity of a passage a few pages back, it's painful to have to skim over what I've already read to find it. Sure I can try to laugh it off, ignore that irritating feeling, and persist. But that niggling feeling continues to torment my brain, hampering my focus, and making it even more difficult to skim with any skill. This is a skill I was once lauded for (perhaps owing to my scarcity of abilities). Now I get grumbly after a few pages, and, after a few more, I desperately want to relinquish the pursuit. It takes a steely resolve now to do something I used to do effortlessly. It could be as time-consuming as always, but the effort is marred with the knowledge that there is an easier way.

More than two millennia ago, Socrates warned against putting words in writing because those new-fangled texts would destroy the memories - and hence the minds - of the populace. Without exercising our minds by demanding more and more of their talents, they'll atrophy. Current brain science concurs, and my anecdote adds further evidence to the pile. If we stop doing a task as often, the pathways in our brain get sluggish, and we can no longer do the task as well.
“As particular circuits in our brain strengthen through the repetition of a physical or mental activity, they begin to transform that activity into a habit…the chemically triggered synapses that link our neurons program us, in effect, to want to keep exercising the circuits they’ve formed. Once we’ve wired new circuitry in our brain…’we long to keep it activated.’ That’s the way the brain fine-tunes its operations. Routine activities are carried out ever more quickly and efficiently, while unused circuits are pruned away....” “If we stop exercising our mental skills…we do not just forget them: the brain map space for those skills is turned over to the skills we practice instead….the possibility of intellectual decay is inherent in the malleability of our brains. That doesn’t mean that we can’t, with concerted effort, once again redirect our neural signals and rebuild the skills we’ve lost. What it does mean is that the vital paths in our brains become…the paths of least resistance” (34-35).
It sometimes feels like we're left with two options: either we improve the technology so we never need our brains to do such a menial task again (a command F-bot for print), or we avoid the technology, or at least avoid reliance on the technology, in order to strengthen our brain's ability to attack that text and prevent skills from being 'pruned away.' It seems like a no-brainer to take up the technology for all it's worth but for the loss of our sense of industry and usefulness.
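That "command F-bot for print" is easy enough to imagine once a book has been digitized. Here's a toy sketch (my own; the page size and sample text are made up) that reports which pages contain a phrase:

```python
# A toy "command F for print": once the book is digitized, report which pages
# contain a phrase. The page length and sample text here are made up.
def find_in_pages(text: str, phrase: str, chars_per_page: int = 1800) -> list[int]:
    """Return 1-based page numbers whose text contains the phrase (case-insensitive)."""
    pages = [text[i:i + chars_per_page] for i in range(0, len(text), chars_per_page)]
    needle = phrase.lower()
    return [n for n, page in enumerate(pages, start=1) if needle in page.lower()]

book_text = "..."  # imagine the scanned text of the whole book here
print(find_in_pages(book_text, "brilliant acuity"))  # -> the pages to flip back to
```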

It makes me worry a bit about self-driving cars. Will we all become incompetent drivers - or perhaps even more incompetent than we already are? But then again, will it even matter if that skill becomes obsolete? Maybe it's better if we don't have to learn to pay attention to the road. Our car can take us to work and home without commanding our attention, so we can pay for all the stuff we bought online during our daily commute!

But then, what about sex robots? Will we become a culture annoyed with the incompetence of a human partner when we could have the accuracy and effortlessness of a sex toy that never demands a turn? Are we already impatient with imperfection?

I'm glad that books grew to be commonplace, and I spend no time lamenting my inability to remember epic tales in detail. Maybe in 2,000 years, if we're still here and still have enough resources for advances in technology, we'll think it funny that people ever used birth control or fertility drugs or prostitutes when they could have just had sex with programmable machines without all the hassles. And why be concerned with our ability to find information in a book when we can likely find the book online (or put it online ourselves) and then let the computer skim for us?

Self-efficacy.

Our ability to hone skills is tied to our feelings of self-worth. If we have fewer skills that matter, then we won't matter. Students will stop coming to me to find a specific quote in a lengthy essay and show delight with how quickly I can do it. They won't need my help. We might laud independence to our detriment. I have a friend who can rhyme off the birthdays of every person she knows, but facebook notifications have already rendered her skill redundant. And car-chases will no longer draw a crowd to a film like these:



(They always leave out a favourite:)




The loss of an ability to give pleasure to one another could be the most profound disruption to our culture. Honing personal skills that are exclusively developed to suit the particular taste of one other person enhances a recognition and knowledge of the other in a depth that conversation merely skims. A robot could be programmed to hit the mark perfectly every time, but this is a classic case of the perfect being the enemy of the good. And it's never as simple and universal as it's presented in media:




Obviously, skill-destroying technology is not always an all-or-nothing situation, nor is it always a problem. I can't remember any friends' phone numbers or e-mail addresses any more because I don't have to, but I'm happy I no longer have any use for a washboard besides possibly in a rhythm section of a band. How soon before we all forget skills and later deeply regret their loss? It might be useful to consider how necessary each piece of technology is to our lives relative to how useful the eroding skills may be to our sense of self-worth.


Sunday, December 21, 2014

The Newsroom: On Journalism, the Environment, and Sexism

I just finished watching the final season of The Newsroom as it appears catching up on shows is becoming a personal tradition on the first day of any holiday. It was a cringe-worthy six hours with a few redeeming story-lines. Here be a ton o' SPOILERS, including the fact that it ends with a wedding, a funeral, and a baby - the holy trinity of lazy plot lines.

Journalism

The themes of the show were timely in that we're discussing media integrity in my class. But we watched Peace, Propaganda, and the Promised Land instead. The Newsroom is a fantasy news show the way The West Wing was a fantasy political show, so we can have higher hopes than is typical. Unfortunately, we end up disappointed. They harped on the reality that facts must be checked carefully, and they had one bit of moral soul-searching as Maggie, all scrunched down in a chair, decided against using a scoop she got in an unethical manner.

But the big message of the season is that social media is full of dangerous lies. In several episodes they contrast a populist and speedy bit of intel with a more accurate but slow and steady bit to show us how real news must work. They came across more as flummoxed luddites baffled by the existence of blogs than righteous journalists. The only social media that seems to exist is the likes of Buzzfeed and Gawker. There aren't journalist bloggers informing the world, nor are there people live-tweeting events as they happen, and the ACN's twitter feed gets used by a sleepy girl who says stupid things that make the company look bad. It's boringly two-dimensional.

Media is changing, but it's not wise to tar all social media with such a broad brush in the hopes that it will send us scurrying to network news channels. What has to happen now is that we each have to follow well - choose carefully where we get information in the first place. Then we have to fact check our own sources through multiple credible sources. AND we have to THINK about what we're reading or watching. Always with the thinking. Most people won't do any of that, and they'll stay in the dark, ignorant of world news. Nothing new. But, as the show suggests, it can be dangerous if masses of people believe rumours. Still nothing new. Even their own newsroom was wrong in the past. They don't forget that, but they also don't really remember it when it comes to this argument.

Environment

Maggie struggles to make the environment interesting, and Jim mocks her efforts in the douchiest way possible and makes her beg for his help. Cute. She's a top reporter now, but can't find an angle for a major story because we all know the environment is SOOO booorrrring!

The EPA top dog is interviewed, and tells Will of an apocalypse coming within 80 years or so and that he thinks we're doomed no matter what we do now because we've missed our chance to save the day.  Mother Jones fact checked the speech and found the numbers pretty accurate, but the numbers are publicly available and not really the big secret the episode made them out to be.

Grist has this to say,
"There is no line you cross where bad weather becomes a "failure of the planet," such that we'll be able to identify the first person to die from such a failure. It's not going to be that dramatic.  Making it sound like there's going to be some sudden break only makes people blind to the incremental changes already underway. It makes them think climate change is something that might happen, something we might or might no avoid, rather than something that's already underway and has to be managed." 
And both say his level of resignation is not yet necessary.  We still have a fighting chance.  The real EPA still suggests we can save the world by changing the kinds of lightbulbs we use!  It would have been easier done sooner, yes, but it's still possible to slow things down.

But just because it's not as dramatic as a meteor strike doesn't mean it's not newsworthy or interesting.

And then they all moved on and never spoke of it again.

Sexism

Emily Nussbaum's article in the New Yorker points out that the show is "consistently worried about scurrilous sexual gossip directed at prominent men."  It might be something men fear because it seems to be one of few ways of actually taking them down.  Inside Job painted a portrait of financiers who bring prostitutes on the road with them, but then only the disliked in the group - the ones not playing ball - are actively destroyed by their libido being outed.  I got the sense from that film that it's a normal part of the culture that's dragged out into public forums only if necessary.  It's like the law for open carry - which, in my parts, means having an open case of beer in your trunk.  It's technically illegal, but everyone does it.  But if cops want to arrest you for something else, they can bring you in for that opened case.  If the analogy is remotely accurate, then it's very clever of men to get everyone involved in something they can use against them later!

The show tries with a variety of men and women, but they all still fall into pat and dull stereotypes. There are many annoyingly dumb men who still have more options and control than their clever female counterparts. Only the one guy in the group doesn't clue in to the fact that Mac is pregnant. The male twin is baffled by anything going on during a billion dollar acquisition. And all the men are stereotypically fearful of relationships. I've never actually met an adult man like that in real life, but there are scores of them on TV.

And it seems like most of the women get or keep jobs because of their sexual relationships with the men in the office.

Some superficial attempts at being pro-woman actually make things worse:  Like when Maggie tries to convince Jim to be supportive of his girlfriend even when Hallie just wrote an exposé of their relationship that barely concealed his identity.  Men should be supportive of women no matter what nasty stuff they do.  Or when Will admonishes his cell-mate for hitting a girl.  Of course domestic violence is a horrible crime, and they pointed out this must be his third strike to end up in jail, but Will's speech has something about it that doesn't sit well.  It promotes a chivalry that still allows for more subtle sexism.

Jim got Maggie a job - implying she'd be lost without him - but then he tried to save her from leaving by offering her a promotion.  At least she chose to leave anyway, and he supported her.  There's that; so she didn't waste her time training him to support women for nothing (a necessary move because he's so dumb). Then a male subordinate is told of his female boss' promotion before the boss - who only hears about it as the subordinate announces it!  When would anything like that ever happen?

And then there's the weird chat Don and a rape victim have in her dorm room.  But that's been talked about all over the interwebs.  The moral is to never judge anyone until after s/he's been to court.  Reporters shouldn't interview anybody whose words could damage someone who hasn't officially been charged with a crime, even if they're unlikely to ever be charged.  Well, unless they're rich and powerful.  But if a girl has been assaulted, and went to the police, and no arrests were made, then she should just be quiet about it.  Only a judge can determine if a crime was committed. Once again, things can't be left to the court of public opinion.  The right people have to tell us what to think, not teach us how to think for ourselves.


And then there was...

- a Human Resources officer following around a dating couple to prove they're dating, threatening to separate them, because, it turns out, he thought it would be funny. He was actually a fan of their awesome love (or something like that). And he apparently had nothing else to do with his days. And this was after doing nothing about an employee who openly admitted to sleeping with many women on staff, which was clearly causing problems in the office.

- an ethics professor who is totally clueless about personal discretion - but was sensitive enough to somehow recognize that Maggie is really in love with Jim even though there's zero chemistry between them.

- a brilliant lawyer turned journalist who can't, for the life of him, remember anybody's name, but doesn't think it would be a problem to refer to people by racist names as a fill-in.

- an executive producer who ties her hair back whenever there's work to be done, but always leaves the front bit in her face and often right in front of her eyes - the bit that would typically be the whole point of tying it back in the first place.

- a newsroom dedicated to integrity, but quick to hide a friend who commits a felony.


After all this, it was entertaining.  It just wasn't excellent entertainment.    

Sunday, November 30, 2014

Inequality for All

I watched Robert Reich's film last summer on a camping trip.  I woke up in the middle of a pitch-dark night and couldn't fall back asleep.  I tried a movie on my phone to lull me into a coma, but this was the wrong one to choose.

Reich's film clearly explains how we've gotten into this economic pickle, and he offers solutions to get ourselves out. Here's a synopsis of the 90-minute film. It's about the U.S., but much of it applies to Canada as well, so I use "we" throughout.

The (corporate-controlled) media has created the illusion that the U.S. is poor and doesn't have the tax revenue to pay for anything, but that's a myth. The U.S. is very wealthy, richer than it's ever been; it's just no longer sharing the wealth in a way that can support itself.

The media also contributes to the problem by spinning any attempt to discuss income inequality into a conversation about class warfare - which, apparently, is a topic to avoid. John Oliver gets at this issue very well and in only 14 minutes (with jokes!):



We have the same disparity now as we did in the 1920s - just before the great depression.  Policies that benefit a few at the expense of the many, according to Oliver, get passed because we've been brainwashed to believe erroneously that we, too, will end up in the upper echelons with the very wealthy:
"60% believe our system unfairly favours the wealthy, but, and here's the key, 60% also believe that those who work hard enough can make it.  Or, in other words, 'I can clearly see this game is rigged, which is what's going to make it so sweet when I win this thing!'"
Or, as Steinbeck said,


The federal estate tax was created to tax anything over $5 million, and it's on the verge of being abolished because people think it might apply to them one day. The problem with inheritance is that it keeps the wealth circulating in few hands, and the poor have less chance of ever getting out of poverty. Marx was onto the problems of inheritance, but from the other end. He warned about the error of dismantling inheritance first while leaving the economic system intact:
"The disappearance of the right of inheritance will be the natural results of a social change superseding private property in the means of production; but the abolition of the right of inheritance can never be the starting point of such a social transformation."  
And we're back to Reich.

The Class Struggle

Reich and Oliver agree that, like cinnamon, a little inequality is a good thing, but too much is dangerous.  Reich uses a graph that looks like a suspension bridge, with the peaks - the danger points - in the 1920s and now.  In 1928 as in 2007, the top 1% took home more than 23% of all income, and the middle class stagnated. That's what too much inequality looks like.

The middle class is imperative to a healthy economy. The rich buy very little proportionate to their wealth ("a person making a thousand times as much money doesn't buy a thousand times as much stuff"), so we count on the masses to keep shopping. A good economy will support the middle class and the poor, who will create jobs by spending money. But they're struggling too much just to survive to shop on any day except, of course, Black Friday.

Policy Changes in the Late 1970s and Early 1980s

There's no such thing as a truly free market. There are always governmental rules necessary to run things. The real question is who the rules benefit and who they hurt. Middle-class wages rose from the late 40s until the late 70s, and then flattened out.

The best book to understand this period of history, for my money, is The Shock Doctrine. Naomi Klein outlines in detail exactly how the US, UK, and Canada (under Reagan, Thatcher, and Mulroney) changed the economic system with worldwide repercussions. From the film: The tax rate on top earners dropped from 70% to 28% under Reagan. In the late 50s it peaked at 91% under Eisenhower for top earners, which was set at incomes over $400,000 (about $3.5 million in today's money). That was dismantled in the name of equality: Why should some people be taxed at a higher rate than others? But that confuses equality with equity.
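That "$400,000 is about $3.5 million in today's money" conversion, by the way, is just a ratio of price indexes. A quick sketch of the arithmetic (my own; the index values are rough approximations for illustration, not official CPI figures):

```python
# My own sketch of the arithmetic: converting a historical dollar amount into
# today's dollars is a ratio of price indexes. The index values below are
# rough approximations for illustration, not official CPI figures.
def adjust_for_inflation(amount: float, index_then: float, index_now: float) -> float:
    """Scale a historical amount by the growth in the price index."""
    return amount * (index_now / index_then)

# Roughly: a late-1950s threshold of $400,000 scaled by ~8-9x lands near $3.5 million.
print(round(adjust_for_inflation(400_000, index_then=29, index_now=251)))
```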

The financial markets were granted more power as governments moved to deregulate the market, so they engaged in more excessive behaviour.  Labour unions declined (often by force) which mirrored a decline in the middle class share of income.  And globalization and technology added to the destruction of the middle class.

Where a company's headquartered means less and less.  We can outsource jobs which undercuts wages of workers in the US.  And automation has reduced the need for as many employees.

On the positive side, we have more cool stuff that's cheaper, and CEOs and financiers are much richer.  But CEOs raised their own salaries as they fired workers.  When companies depend on shareholders, there's a growing pressure to increase profits, which plays out by pushing down wages and benefits to the bare minimum.  But then they have fewer consumers available to buy their products, so they have to widen their market worldwide, make products that self-destruct, and encourage people to buy crap they really don't need.

What Worked in the Past

Policies that were in place from 1947-77 (Keynesian economics) worked for the general prosperity of all.

  • Higher education was a priority, and universities expanded. 
  • Labour unions were strong, and more than a third of workers belonged to one.
  • The middle class bought more, so companies could hire more, providing a stronger tax base for governments to invest more in people.

Today tuition is rising as the government is taking money out of education.  Infrastructure is crumbling and becoming dangerous.  There isn't a tax base to use to fix the problems because we've lowered taxes on the very wealthy, shipped jobs overseas, and flattened middle class wages so they no longer keep up with inflation.  If the wealthy don't pay their fair share, and the middle class doesn't have enough to pay tax on, then there's less revenue for social services like public education and health care.  Then tuitions go up, and the population becomes less educated, and, globally, less competitive.  In the 1960s, tuition at Berkeley was free.  In the 1970s, it was $700 in today's dollars. Now it's $15,000/year.

In the 1980s, we coped with declining incomes for a time by introducing double wage families with more women in the workforce.  Families worked longer hours, taking on second and third jobs.  And we borrowed money with fewer restrictions on loans - that, to some, seemed like a good idea at the time, but later blew up.  Now the coping mechanisms the middle class used are exhausted.

The Effect on Democracy

Inequality is a problem for democracy too.  When so many resources accumulate at the top, there comes the capacity to control politics through wealthy lobby groups who give the maximum amount allowed to election campaigns.  All politics have shifted to the right, so that Reich maintained the same views, but shifted from a Republican to a Democrat over a few decades.  (And some of us NDP supporters are left without a truly leftist party to back.)  High inequality brings with it a high degree of political polarization with politicians disagreeing for the sake of disagreeing instead of working together for the good of the country (like Howarth rejecting a very left-leaning budget).  For $300 million, you can buy a president.  We can't have government on an auction block.

There's a polarization of citizens too.  Losers of a rigged game can get very angry.  These trends of society pulling apart are very dangerous.  Reich sees fights on the Berkeley campus.

Solutions

The economy does better when everyone does better, and history is on the side of positive social change. There's no "single magic bullet," but we need to mobilize, organize, and energize other people, from what I gleaned from the film, to...
  • shop locally - avoid automated check-out lines, on-line shopping, or anything that reduces jobs
  • decrease technological use in manufacturing to increase jobs for the working class at home which will increase wages, increase shopping, and increase our tax base
  • put tax money into infrastructure to decrease risk of collapse and create jobs
  • support union creation and maintenance
  • convince the government to invest in education, skills, and infrastructure
  • regulate corporations so that companies can no longer deduct executive pay
  • raise the tax rate for the very wealthy to increase the tax base which will allow for more money in education and health care
But the question, as always, is... how do we get from here to there?  I can do the shop locally thing, and support unions, but everything else seems horrifically out of reach.

But then...  There's always art:


Monday, September 1, 2014

The Shallows: What the Internet is Doing to Our Brains

I love Nicholas Carr's book. There are lots of studies and science mixed with many stories and asides and discussions of philosophers and other great thinkers. It reminded me of reading a Bill Bryson book. You get the facts painlessly. And it presents a strong argument for keeping kids (and everyone) off-line when they work, but I'm still unlikely to  convince them to actually turn off facebook.  Reading the bare bones here doesn’t do it justice, but here’s what I don’t want to forget about my memory:


The Medium is the Message

He quotes McLuhan from 1964 – "The electric technology is within the gates, and we are numb, deaf, blind and mute about its encounter with the Gutenberg technology, on and through which the American way of life was formed” (2).  When we change technologies or tools of any kind, there are gains and losses.  It changes the way we work.  Nietzsche's style of writing changed noticeably between pen and paper and the new-fangled typewriter: “Our writing equipment takes part in the forming of our thoughts” (19).  But we forget about the losses and just notice the gains.

When information is presented to us, how it's presented makes a difference, but we get carried away by the content and don't notice the effect the method of presentation has on us. In class, I've watched students glaze over at power points like old-schoolers used to with film strips, waiting for the next 'bing' from the record to indicate a changing slide. When they present, they often use technology as a crutch - putting their entire presentation on slides - and they lose the class in the process. But the same kids can be captured by chalk and talk - a much maligned teaching method today - as it allows greater movement of the presenter back and forth through the room as people share thoughts and responses, and student ideas make it onto the board as much as my own. They shift from looking at me to one another to the board and their notes to glean the basics for later review rather than focusing on a stagnant screen at the front. Well, it works better for me anyway.


Our Dwindling Attention Spans

The more we use the web, the more we have to fight to stay focused on longer texts. It's shortening our attention spans as we skim and scroll, causing a decay of our faculties for reading and concentrating. I've noticed how students looking at a webpage will immediately scroll down even if vital information is right at the top. They're looking for a heading to jump out at them or a video to click on. They have to be told to stop and actually read the words on the screen.

One study found that professors of English literature are now struggling to get students to read an entire book. Their students just look at study notes online and miss the nuances of the text; more importantly, they don't learn how to notice patterns of metaphors and motifs, how to do deep reading, but only how to summarize other writers' analysis. Cutting corners is nothing new, but it's surprising to read that lit students won't read books.


Brain Physiology:  We Become What We Think

The most interesting part of the book is how our brains work to take in information. There's been a lot written about this lately - that the brain is affected by our environment. It's not as stable as we once thought.

"Though different regions are associated with different mental functions, the cellular components do not form permanent structures or play rigid roles. They’re flexible. They change with experience, circumstance, and need” (29). The brain gets accustomed to our typical activities and changes when they stop or when new activities start: “neurons seem to ‘want’ to receive input….When their usual input disappears, they start responding to the next best thing” (29).

The brain reorganizes itself after an accident or loss of function of any body part, but also after change in lifestyle. William James figured this out in Principles of Psychology:  “nervous tissues…seems endowed with a very extraordinary degree of plasticity…either outward forces or inward tensions can, from one hour to another, turn that structure into something different from what it was” (21).  Leon Dumont used an analogy to explain: “Flowing water hollows out a channel for itself which grows broader and deeper; and when it later flows again, it follows the path traced by itself before. Just so, the impressions of outer objects fashion for themselves more and more appropriate paths in the nervous system, and these vital paths recur under similar external stimulation, even if they have been interrupted for some time” (21).

An experiment was conducted on London cab drivers long before GPS, back when they had to have the entire city memorized. They developed an enlargement of the posterior hippocampus and a shrinking of the anterior hippocampus from the constant spatial processing required to navigate the city's intricate road system. Their brains adapted to suit how they were being used.

Something really fascinating to me is that imagining has the same effect. Researchers taught a short piano melody to people without any piano knowledge. Half the group practiced the piece for two hours a day, and the other half only imagined practicing without actually touching the keys. There were identical changes to the brain. It reminded me of what I do when I’m about to do something new, like build roof rafters or a waterfall. I say I have to stare at it a few days before I can start, but really I'm walking myself through the process in my head repeatedly, apparently until my brain’s learned how to do it.
“As particular circuits in our brain strengthen through the repetition of a physical or mental activity, they begin to transform that activity into a habit…the chemically triggered synapses that link our neurons program us, in effect, to want to keep exercising the circuits they’ve formed. Once we’ve wired new circuitry in our brain…’we long to keep it activated.’ That’s the way the brain fine-tunes its operations. Routine activities are carried out ever more quickly and efficiently, while unused circuits are pruned away” (34).
This explains why I can do dishes so much faster than my kids – and why they should be practicing dishes regularly.

This can also explain one aspect of mental afflictions like depression and OCD – “The more a sufferer concentrates on his symptoms, the deeper those symptoms are etched into his neural circuits” (35), with implication for addictions as well.

But our brain circuits can weaken or dissolve with neglect:
“If we stop exercising our mental skills…we do not just forget them: the brain map space for those skills is turned over to the skills we practice instead….the possibility of intellectual decay is inherent in the malleability of our brains. That doesn’t mean that we can’t, with concerted effort, once again redirect our neural signals and rebuild the skills we’ve lost. What it does mean is that the vital paths in our brains become…the paths of least resistance” (35). “What we’re not doing when we’re online also has neurological consequences. Just as neurons that fire together wire together, neurons that don’t fire together don’t wire together....The brain recycles the disused neurons and synapses for other, more pressing work. We gain new skills and perspectives but lose old ones” (120).
Carr adds a fascinating history of the written word. Socrates wasn't a fan of writing: “Far better than a word written in the ‘water’ of ink is ‘an intelligent word graven in the soul of the learner’ through spoken discourse” (55). Socrates recognized that a “dependence on the technology of the alphabet will alter a person’s mind….writing threatens to make us shallower thinkers…preventing us from achieving the intellectual depth that leads to wisdom and true happiness" (55).

McLuhan counters, “The achievements of the Western world, it is obvious, are testimony to the tremendous values of literacy....the written word liberated knowledge from the bounds of individual memory and freed language from the rhythmical and formulaic structures required to support memorization and recitation" (57).  But as great an achievement as writing is, as useful as it is, there is something lost when we no longer have our brains remember and hold ideas within to debate them. The underlying question of this entire book is, Is it worth the loss?

The invention of the book altered how we think: “To read a book was to practice an unnatural process of thought, one that demanded sustained, unbroken attention to a single, static object…They had to train their brains to ignore everything else going on around them to resist the urge to let their focus skip from one sensory cue to another…applying greater ‘top-down control’ over their attention” (64). This ability represents a “strange anomaly in the history of our psychological development” (64).

As with imagining activities, one study found that brain activity while reading a story is similar to brain activity while doing the actions being described: “brain regions that are activated often ‘mirror those involved when people perform, imagine, or observe similar real-world activities….The reader becomes the book'” (74).  This makes me wonder what happens to the brain when people read a lot of violent books. It's not to suggest that reading about it necessarily makes us want to do it, but will it make us better at fighting just because we've read about it...or, perhaps, better at sex if that's our reading preference?


The Shift to Screens

In the U.S., adults aged 25-34 average 35 hours of TV a week and less than an hour a week of reading (87). And there's a difference between reading on-line and reading print material as "we are plugged into an ‘ecosystem of interruption technologies’" (91):
“A page of online text viewed through a computer screen may seem similar to a page of printed text. But scrolling or clicking through a Web document involves physical actions and sensory stimuli very different from those involved in holding and turning the pages of a book or a magazine....It also influences the degree of attention we devote to it and the depth of our immersion in it" (92).  
This has already influenced how magazines are writing articles to accommodate shorter attention spans: “Rolling Stone, once known for publishing sprawling, adventurous features by writers like Hunter S. Thompson, now eschews such works, offering readers a jumble of short articles and reviews....Most popular magazines have come to be ‘filled with color, oversized headlines, graphics, photos, and pull quotes’" (94).

He warns that technology encourages and rewards shallow reading. Some see technology as only bringing benefits, but we have to be wary of the costs:
“No doubt the connectivity and other features of e-books will bring new delights and diversions…But the cost will be a further weakening, if not a final severing, of the intimate intellectual attachment between the lone writer and the lone reader. The practice of deep reading that became popular in the wake of Gutenberg’s invention, in which ‘the quiet was part of the meaning, part of the mind,’ will continue to fade, in all likelihood becoming the province of a small and dwindling elite” (108).

“Some thinkers welcome the eclipse of the book and the literary mind it fostered. In a recent address to a group of teachers, Mark Federman, an education researcher at the University of Toronto, argued that literacy, as we’ve traditionally understood it, “is now nothing but a quaint notion, an aesthetic form that is as irrelevant to the real questions and issues of pedagogy today as is recited poetry – clearly not devoid of value, but equally no longer the structuring force of society.’ The time has come, he said, for teachers and students alike to abandon the ‘linear, hierarchical’ world of the book and enter the Web’s ‘world of ubiquitous connectivity and pervasive proximity’ – a world in which ’the greatest skill’ involves ‘discovering emergent meaning among contexts that are continually in flux” (111).

“In the choices we have made, consciously or not, about how we use our computers, we have rejected the intellectual tradition of solitary, single-minded concentration, the ethic that the book bestows on us. We have cast our lot with the juggler” (114).
The distractions offered on-line add to the shallow-reading effect. This makes me reconsider all the links and images I make the effort to include in each post:
“But the extensive activity in the brains of surfers also points to why deep reading and other acts of sustained concentration become so difficult online. The need to evaluate links and make related navigational choices, while also processing a multiplicity of fleeting sensory stimuli, requires constant mental coordination and decision-making, distracting the brain from the work of interpreting text or other information. Whenever we, as readers, come upon a link, we have to pause, for at least a split second, to allow our prefrontal cortex to evaluate whether or not we should click on it. The redirection of our mental resources, from reading words to making judgments, may be imperceptible to us – our brains are quick – but it’s been shown to impede comprehension and retention, particularly when it’s repeated frequently” (122). 
“Difficulties in developing an understanding of a subject or a concept appear to be ‘heavily determined by working memory load,’…and the more complex the material we’re trying to learn, the greater the penalty exacted by an overloaded mind…two of the most important [sources of cognitive overload] are ‘extraneous problem solving’ and ‘divided attention.’ Those also happen to be two of the central features of the Net as an informational medium" (125). 
“Just as the pioneers of hypertext once believed that links would provide a richer learning experience for readers, many educators also assumed that multimedia, or ‘rich media,’ as it’s sometimes called, would deepen comprehension and strengthen learning. The more inputs, the better. But this assumption, long accepted without much evidence, has also been contradicted by research. The division of attention demanded by multimedia further strains our cognitive abilities, diminishing our learning and weakening our understanding. When it comes to supplying the mind with the stuff of thought, more can be less” (129).
In one study, half of the participants had a text-only passage to read, and the other half had the same passage with relevant audiovisual material added. When they were tested on the information, not only did the text-only group do better on the test, they also found the material to be more interesting, educational, understandable, and enjoyable. Multimedia “would seem to limit, rather than enhance, information acquisition” (130).

In another study, they had students listen to a lecture. One half could surf the web during the lecture to look up relevant information, and the other half had to keep their laptops shut. Surfers performed “poorer on immediate measures of memory for the to-be-learned content. It didn't matter, moreover, whether they surfed information related to the lecture or completely unrelated content – they all performed poorly” (131).

A final study had students watch CNN. One group watched an anchor with info-graphics on the screen and a textual news crawl along the bottom, while the other group watched the anchor without graphics and a news crawl. The group watching the multimedia version remembered significantly fewer facts, as “this multimessage format exceeded viewers’ attentional capacity” (131).

We're encouraged in schools to be cutting edge with our tech use. Teachers are praised for using any new program. Even if it's just a switch from Powerpoint to Prezi, it's lauded as revolutionary. New is celebrated as better with little exploration of studies showing otherwise. We're so worried about being the best, about getting the most kids to achieve on standardized tests, really, that we're jumping at anything shiny that comes our way in hopes it will be the magic bullet that finally motivates the more challenging students. Carr further cautions,
“The Internet, however, wasn’t built by educators to optimize learning. It presents information not in a carefully balanced way but as a concentration-fragmenting mishmash. The Net is, by design, an interruption system, a machine geared for dividing attention” (131). “In addition to flooding our working memory with information, the juggling imposes what brain scientists call ‘switching costs’ on our cognition. Every time we shift our attention, our brain has to reorient itself, further taxing our mental resources…Switching between two tasks short-circuited their understanding: they got the job done, but they lost its meaning” (133). “The near continuous stream of new information pumped out by the Web also plays to our natural tendency to ‘vastly overvalue what happens to us right now’….We crave the new even when we know that ‘the new is more often trivial than essential’” (134).  "There are signs that new forms of ‘reading’ are emerging as users ‘power browse’ horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense” (137). 
“The more you multitask, the less deliberative you become; the less able to think and reason out a problem.’ You become…more likely to rely on conventional ideas and solutions rather than challenging them with original lines of thought….As we gain more experience in rapidly shifting our attention, we may ‘overcome some of the inefficiencies’ inherent in multitasking…but except in rare circumstances, you can train until you’re blue in the face and you’d never be as good as if you just focused on one thing at a time. What we’re doing when we multitask is learning to be skillful at a superficial level. The Roman philosopher Seneca may have put it best two thousand years ago: “To be everywhere is to be nowhere....The Net is making us smarter…only if we define intelligence by the Net’s own standards…if we think about the depth of our thought rather than just its speed – we have to come to a different and considerably darker conclusion” (141).
Reading scores fell between 1992 and 2005: “Literary reading aptitude suffered the largest decline, dropping twelve percent” (146).


On Memorization:  The brain is a muscle, not a filing cabinet.

When I was a kid, I could tell you any of my friends' phone numbers by heart.  Now I can barely remember my own.  I don't need to know phone numbers anymore because they're all programmed into my phone, but is the work computers are doing making our brains lazier?  Should we try to remember things just for the sake of working out our brains?

Erasmus thought that “memorizing was far more than a means of storage. It was the first step in a process of synthesis, a process that led to a deeper and more personal understanding of one’s reading” (179). Tech writer Don Tapscott disagrees: “Now that we can look up anything ‘with a click on Google…memorizing long passages or historical facts’ is obsolete. Memorization is ‘a waste of time’” (181).

To the Ancient Greeks, “memory was a goddess: Mnemosyne, mother of the Muses” (181). “The shift in our view of memory is yet another manifestation of our acceptance of the metaphor that portrays the brain as a computer…storing bits of data in fixed locations and serving them up as inputs to the brain’s calculations, then offloading that storage capacity to the Web is not just possible but…liberating….The analogy has a simplicity that makes it compelling….But…it’s wrong” (182). The brain isn’t a filing cabinet; it’s a muscle. Using it over and over doesn’t fill it up until it can take no more; quite the opposite, exercise strengthens it to take in more.

So every year when I just throw up my hands at the idea of remembering 90 students’ names in a few days, and hope the students will be forgiving of my aging brain, I’ve simply gotten sucked into a vicious cycle that prompted me to give up on myself far too soon. I have problems remembering names because I typically don’t remember them, so I don’t try, so I never do. Kind of sounds like a Winnie the Pooh poem or an admonishment from Yoda. The implication here is that if I actually work on remembering people’s names instead of assuming that’s just something I can’t do, then I’ll actually develop the ability to remember them better for the next set of classes. It’s why, every year when I rent a mini-van for a trip to a cottage with my family, I start the journey a bit of a nervous wreck, but over the week the van seems to grow smaller and more manageable until I’m parallel parking the sucker by the end. (Just kidding – at the end of the week I still search for pull-through parking spots.) It's nothing revelatory to say that practicing improves ability, yet we don't tend to think this way about using our memory.

Getting information from short-term to long-term memory requires “an hour or so for memories to become hard, or 'consolidated,' in the brain. Short-term memories don’t become long-term memories immediately, and the process of their consolidation is delicate. Any disruption, whether a jab to the head or a simple distraction, can sweep the nascent memories from the mind” (184). “The more times an experience is repeated, the longer the memory of the experience lasts…Not only did the concentration of neurotransmitters in synapses change, altering the strength of the existing connections between neurons, but the neurons grew entirely new synaptic terminals” (185). These terminals multiply as memories are formed and dwindle when memories are allowed to fade, but they never drop all the way back to their original numbers. “The fact that, even after a memory is forgotten, the number of synapses remains a bit higher than it had been originally helps explain why it’s easier to learn something a second time” (185). This is why many teachers tell students to go over their notes regularly. It actually helps.


Computers vs Brains: Some benefits of being alive.
“While an artificial brain absorbs information and immediately saves it in its memory, the human brain continues to process information long after it is received, and the quality of memories depends on how the information is processed. Biological memory is alive. Computer memory is not....Those who celebrate the ‘outsourcing’ of memory to the Web have been misled by a metaphor. They overlook the fundamentally organic nature of biological memory…Once we bring an explicit long-term memory back into working memory, it becomes a short-term memory again. When we reconsolidate it, it gains a new set of connections – a new context….Biological memory is in a perpetual state of renewal....In contrast to working memory, with its constrained capacity, long-term memory expands and contracts with almost unlimited elasticity, thanks to the brain’s ability to grow and prune synaptic terminals and continually adjust the strength of synaptic connections” (192).
Carr addresses the web advocates directly: “In freeing us from the work of remembering, it’s said, the Web allows us to devote more time to creative thought. But the parallel is flawed….The Web…places more pressure on our working memory, not only diverting resources from our higher reasoning faculties but obstructing the consolidation of long-term memories and the development of schemas….The Web is a technology of forgetfulness” (193).

One ramification of the brain's plasticity is that reading online, in a distracted way, can affect our ability to read and think:
"The influx of competing messages that we receive whenever we go online not only overloads our working memory; it makes it much harder for our frontal lobes to concentrate our attention on any one thing. The process of memory consolidation can’t even get started. And, thanks once again to the plasticity of our neuronal pathways, the more we use the Web, the more we train our brain to be distracted – to process information very quickly and very efficiently but without sustained attention. That helps explain why many of us find it hard to concentrate even when we’re away from our computers" (194).
Marshall McLuhan “elucidated the ways our technologies at once strengthen and sap us…. our tools end up ‘numbing’ whatever part of our body they ‘amplify.' When we extend some part of ourselves artificially, we also distance ourselves from the amplified part and its natural functions" (210).

In one study, two groups tried to solve a puzzle. One group had helpful software; the other group didn't. In the early stages, the helped group made correct moves more quickly, but as they proceeded, the unaided group improved at the puzzle more rapidly. Learning a task with little help wires our brain to know how to learn that type of task, so it becomes easier to later improve on our initial learning. The group with software help didn’t do the initial learning, so they couldn’t advance as easily. “As we ‘externalize’ problem solving and other cognitive chores to our computers, we reduce our brain’s ability ‘to build stable knowledge structures’...that can later ‘be applied in new situations’” (216).


The Need for Nature
“A series of psychological studies over the past twenty years has revealed that after spending time in a quiet rural setting, close to nature, people exhibit greater attentiveness, stronger memory, and generally improved cognition. Their brains become both calmer and sharper…They no longer have to tax their working memories by processing a stream of bottom-up distractions. The resulting state of contemplativeness strengthens their ability to control their mind” (219).  
In yet another study, one group walked through a park, the other walked along city streets, and then both took a cognitive test. The park group did significantly better. Even cooler, it works just by looking at pictures of nature or even imagining nature scenes! “The people who looked at pictures of nature scenes were able to exert substantially stronger control over their attention, while those who looked at city scenes showed no improvement in their attentiveness” (220). This makes a case for designing our classrooms with bits of nature all around or taking the kids outside to learn.


The Emotional Effect

The brain doesn't just handle our intellectual work; it also determines our emotional reactions. “It’s not only deep thinking that requires a calm, attentive mind. It’s also empathy and compassion” (220).
“…the more distracted we become, the less able we are to experience the subtlest, most distinctively human forms of empathy, compassion, and other emotions. ‘For some kinds of thoughts, especially moral decision-making about other people’s social and psychological situations, we need to allow for adequate time and reflection’” (221).
Carr's final caution: “…as we grow more accustomed to and dependent on our computers we will be tempted to entrust to them ‘tasks that demand wisdom.’ And once we do that, there will be no turning back” (224).