Showing posts with label culture. Show all posts

Monday, August 14, 2023

Cultural Differences of ASD

I was once introduced to a new colleague who made very direct, sustained eye contact, and I thought to myself as I spoke with him: he's on track to be in admin. He just seemed the type to make connections and get ahead and would likely end up at the board office. But then, after talking to him a few more times and seeing him in moments of awkwardness, I thought, "Oh, he's autistic." That didn't change my reaction to him at all, of course, but it did help me figure something out.

That interaction was a lightbulb moment that helped me understand the many, many times someone gloms on to me as if I'm somehow the most interesting person in the room, then, after a few conversations in which nothing appears to go wrong, completely ghosts me. They thought, because of my sustained eye contact boring into them, that I must be someone important, an alpha even. Then they figured out their error and shunned me, embarrassed by their own mistake. I'm fine with someone changing their mind about hanging out because that happens to the best of us, but the absolute worst part of it, worse than literally being pointed and laughed at by other adults, is when they suddenly understand that I'm on the spectrum and start talking to me like I'm a flippin' space cadet.

So, just last month you wanted to take me to lunch at some fancy place, and now you've concluded that I'm practically brain dead. Curious. It's always startling how abrupt that transition is for people. And how reliable.

Then mix in the "you can only do that because you're autistic" response to things like writing every day, which is simply a thing that writers do. But if you can't motivate yourself to do it, then the reason I can do it, despite clearly being inferior to you, must be my hidden superpower of being autistic. So the things I can't easily do are laughed at openly, and the things I can do are discounted. Lovely.

It's not tragic, but it is trying. These little things can eat away at people. Surely we know it's not right to behave this way with others, though, right?? 

A few studies show that people notice something different about people with autism in the first few seconds of an encounter (h/t Callum Stephen). In one study (Sasson et al., 2017), researchers filmed a variety of people, some with ASD (level 1) and some neurotypical (NT), in a 60-second mock audition. The study participants were assigned to one of five groups to watch the videos and assess each candidate on likability, intelligence, attractiveness, trustworthiness, etc.: audio only, visual only, audio-visual, static image, and transcript. Only in the transcript condition were people with ASD rated as highly as NT interviewees. Even the still image set them apart. (Is that why there are no good photos of me: internalized ableism??) The study concluded,

Thursday, June 29, 2023

Lost Counterculture

Henry Madison, just some random dude on Twitter, wrote an interesting bit on concerts and the enmeshment of generations. I disagree with several of his claims below:

Blondie
"Imagine a 77-year old favourite of the boomers’ parents, playing at Woodstock in the 1960s. The oldest performer at Woodstock was Ravi Shankar, who was 49. (He didn’t like hippies and never did it again.) There’s a serious point here. A 77-year old artist playing at Woodstock would have been born in 1892. That would be the equivalent of Louis Armstrong or Jelly Roll Morton playing there. We have artists at Glastonbury this week approaching 80 years of age. None of this is ageism, I should add. If people want to keep playing into their 80s, good luck to them. But we’re talking about headlining music festivals, festivals predominantly (like Woodstock) designed for the young. To me this means unhealthy things.

What’s said about the counterculture movement of the 1960s was that it was profoundly anti-establishment. Today the closest beliefs are labelled neoliberalism. It’s strange the bedfellows beliefs keep. And who invented neoliberalism? The same boomers. That was the same movement, as it morphed into a highly profitable middle and old age. The richest generation in all of history, by a mile. Anti-establishment beliefs were great business: the boomers dismantled many of society’s institutions, and then privatised them. Most of the corporate behemoths that now dominate our lives in our ‘neoliberal’ societies were set up by boomers, who also profited the most from them. They monetised the wreckage of their earlier anti-establishment assault. 

I don’t think people see this clearly at all. 

Saturday, November 5, 2022

The Cult of Covid

I've been posting threads from health care professionals on here to save them as we watch whether or not Twitter will self-destruct. This one is from an anonymous account, @1goodtern, who posts lots of fantastic info, but I have no idea what they do for a living. (ETA: they're a priest.) The stats they share aren't too dissimilar from Canada's, and the post resonates crucially with my own experiences as someone screaming into the void about the very simple steps we could be taking to prevent the spread of a fatal virus. This could have been something I wrote if I were able to write this well!

It's been a strange day today and a hard one. Time for a re-evaluation and update on where I stand with covid mitigations and why. I'm not up-to-date on international data and differing levels of protections anymore, so I'll be talking about the UK, but the principles should apply worldwide.

First, today, and a few observations:

Today I was at an event with 200 people, as part of my job, something I couldn't avoid. I was the only one in a mask FFP3, and today I wore my stoggles [sic] too. There were zero mitigations in place. Also zero sympathy, zero understanding, lots of strange looks, some aggression, lots of coughing, lots of people looking unwell. 

Wednesday, January 2, 2019

On Arguing Facts

It never ceases to amaze me how often I'll be writing or thinking about something, and then the perfect articles drop in my lap. It might help that I've been scrolling through social media endlessly on my days off!

In my prior post, I discussed the need for teachers to step up and actively dismantle arguments based on a mistaken premise or altogether unfounded assumption, rather than heed concerns about the self-esteem of our charges or other potential ramifications borne of speaking our minds, and then I hit this Aeon article from October.

Monday, August 27, 2018

On Culture Wars

I just finally got around to Angela Nagle's Kill All Normies. It's a comprehensive book outlining the history and categorization of various groups online that have seeped into real life, but, although she mentions numerous scholars in her analysis, with zero endnotes and nary a reference section, it didn't surprise me to find that she's been accused of plagiarism (see here, here, and here for some undeniable examples of lifted sentences and paragraphs). Some speculate that the book was rushed in order to be first out with this kind of content. The cribbing seems to be primarily explanations of terms or descriptions of events, but the analysis and compilation of these ideas into a whole appears to be her own work. I wouldn't let it slide in a classroom, and her editor/publisher should have caught it, but, as a reader, it's still compelling to see the various ideas assembled so succinctly.

There are so many terms being used to describe various views, so here's a brief and incomplete table of people, media affiliations, and basic characteristics I compiled as I read Nagle's book. It's all a little slippery and contentious, but it's a starting point. She's weeded out the racist alt-right from the more playful, yet shockingly offensive and sometimes harmful alt-light. I'm not convinced there's any clear consensus on any of this, though. We're all using the terms in slightly different ways, further muddying up the waters of the whole mess.

Wednesday, April 11, 2018

On Extra Time and IEP Designations

We administered the literacy test yesterday with one new twist that most teachers weren't privy to until the previous evening: there would be no specific accommodations for students with IEPs (Individual Education Plans) that call for extra time. Instead, we would allow extra time for anyone who needs it.

This is a significant shift in accommodating special needs. It's something I've done for years in my classroom, so I'm already on board. My rationale is about access to the IEP designation. To get an IEP, many students get a professional assessment. This can be done for free through publicly available psychometrists, but there's a wait list that's years long. Or, if you've got the kind of job with benefits that cover it, it can be done privately at a cost of two to three thousand dollars. So, right off the bat, there's a bit of a class issue around the designation. Adding to that, there's the fact that many parents aren't aware that this is a thing. Or, if they've heard about IEPs, they don't quite understand what they are or what they're for, and they're not sure they're necessarily a good thing. This all boils down to the reality that in any class, I'll have some kids with noticeable barriers to their ability to do the work who don't have an IEP in place. So I make my tests a bit shorter, then let them do other work once they finish, but let everyone have the full period if they need it.

The downside of this is that some of them slow right down. They take their sweet time and might not learn to work efficiently, to train their brains to read and think and write all at once in a timely fashion. Thinking quickly is a skill that's useful in most jobs and definitely necessary in college and university. I can only hope that the offer of time to finish other work is enough to make them get their test off the table. My oldest progeny, with IEP in hand, ran into difficulties at university after many years of teachers giving them all the time in the world instead of their specifically allotted time and a half. Their first term was a disaster because they had never learned to write quickly.

For the lit test, one effect was immediate. In previous years, we might have up to a third of our students needing an extra time accommodation. This year, offering an extra 15 minutes per booklet for those who wanted to take it instead of leaving for the full break, plus allowing them to go to a "late room" (I prefer "extra time room") if they wanted more time than that, meant only a handful of students actually took extra time to finish.

What I wonder is, had students known ahead of time (like they will next year), would they have slowed down to take the full double time available? And, this year, sitting in a room where everyone was scrambling to finish in the usual allotted time, with most people getting up to leave at the earliest dismissal for break and at the end, how many rushed to finish? In a room where everyone was designated extra time, they might have taken the time to craft a better final sentence and check over all their work more thoroughly. That's a concern, for sure.

Despite these potential issues, however, there's something else I really like about the shift. I'm not a fan of labels. I hate when I see people's behaviour mocked, then a bit of backstory about their childhood or their condition, maybe ADHD or ASD or whatever, and suddenly they're treated with more kindness and compassion. But we've all got some issues. We're all a little bit something. Imagine if we could treat one another with kindness and compassion without knowing any backstory! With the IEP designation, some students reveal the framework of their abilities, but others don't. Ignoring the designation, but accommodating everyone as needed, takes away the expectation of always needing the extra time, takes away the 'specialness' of certain kids, and provokes us to see the unique needs of each of our students. And each other.

From my own experience as a mother of two children with IEPs, I've found that the process of discovering specific barriers can be enlightening and incredibly useful for the child and parent, about half the time. I don't want to throw that away. But how we attend to all these types of designations could use a reworking. It will be interesting to see how this change plays out in the coming years.

Saturday, January 13, 2018

Monbiot's Out of the Wreckage

The book cover says the book "provides the hope and clarity required to change the world." Well, he certainly tries. He's got a plan of action that's possible, but I didn't get the hope necessary to be spurred to action. It's a bit of an overview of many ideas from different places, many of which are already in action somewhere in the world, and it left me with a solid book list to peruse, but it also left me with a sinking feeling that this will never work. We're never going to get our shit together enough to do any of this. But I've been wrong before.

The first part is a mix of Charles Taylor's notion of social imaginaries, Naomi Klein's Shock Doctrine, Robert Reich's Inequality for All, and Noam Chomsky's talks on solidarity. Then he gets into specifics about our ideas around our communities, environment, economics, and democracy.

Sunday, December 31, 2017

On Shame, Honour, and Vulnerability

I was forwarded this 47-minute podcast with Brené Brown on 1A, and some of her ideas are remarkably similar to Timothy Snyder's views in On Tyranny (e.g., connect with others in real life, speak truth to bullshit), so I bought her newest book, Braving the Wilderness. I was sorely disappointed. She has done a bit of useful research, but it's written in such a self-helpy way that it all seems dubious: anecdotes from childhood, some forced acronyms, lots of repetition of ideas, a slightly bigger font than most books, the sort of thing that feels questionable but likeable. She's very popular. She's a TED Talker, which can boost popularity but detract from credibility in equal measure (see here, here, and here). Luckily, I found her original research (but just that one journal article), which is a much better starting point.

I'm interested in her findings but also concerned about some ideas left out of her analysis. Granted, I haven't read all her books, but I think I get the gist of her ideas.


Thursday, August 10, 2017

The Plight of the Millennials

Further explanation here. 
First, a bit about statistical norms and the normal distribution. In a normal distribution, about 68% of a population falls within one standard deviation of the mean, so when social scientists describe a characteristic of a group, they're typically describing that central majority, not everyone. There's tons of variation in the remaining 32%, so all the generalizations below might not apply to the people in your life. But, according to researchers, they apply to most people in each group, so we can still look at trends. I remember studies in my day showing a clear correlation between violent movie viewing and violent teens, yet I loved slasher flicks and still lean towards more gruesome films despite the stats. And, more to the point, nobody stopped making those movies. This recent article is unlikely to change a thing, but we're still wise to consider it.
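That 68% figure is the first step of the standard 68-95-99.7 rule, and you can check it yourself with the error function. (This little snippet is just my illustration of the textbook identity, nothing from the article itself.)

```python
import math

def within_k_sd(k: float) -> float:
    """Fraction of a normal distribution lying within k standard deviations
    of the mean, via the standard identity P(|Z| < k) = erf(k / sqrt(2))."""
    return math.erf(k / math.sqrt(2))

# The 68-95-99.7 rule, computed rather than memorized:
for k in (1, 2, 3):
    print(f"within {k} SD: {within_k_sd(k):.1%}")
# → within 1 SD: 68.3%, within 2 SD: 95.4%, within 3 SD: 99.7%
```

So "most people in the group" really means roughly two out of three, which is exactly why the other third, the slasher-flick fans among us, keep showing up in real life.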

The article in question is The Atlantic's "Have Smartphones Destroyed a Generation?", which adds to a running list of problems with kids today caused by technology. It hits home with some of the trends I've noticed anecdotally in my classroom over the past 26 years: that phones are distracting, lead to unrealistic idealization and familial alienation, and affect sleep habits. But the writer misses any discussion of how phones also drive constant change, consumerism, and cognizance of tragedies, or of the other factors affecting trends in this demographic. Here's a chart I sometimes use in class for an overview of demographics by year of birth. We've moved way beyond the boom, bust, and echo labels.

Saturday, November 15, 2014

It'll Be Fast: On Yes Means Yes

I was struck by the report of an intimate exchange between a man and woman in today's Globe & Mail; the woman later questioned how consensual the act really was.  She said, "Please stop," and he responded, "It'll be fast."  Later she says "yes," then later again "no."

But that "fast" line struck me because of when else it's typically said.  We don't offer the cushion that an event will be over quickly unless we're well aware that it's not a desirable event.  I might say it when my child's about to get a needle, or when I'm enticing her to clean her room.  It implies that an event has little to redeem it except that it will all be over before you know it, and you can get back to more enjoyable pursuits.  So it's curious that he couldn't tell the woman wasn't interested when speed was the best persuasion he could muster.

This is a very complex issue, and I applaud how many of the bits and pieces are at least given a mention in the article.

It's a Huge Issue, and It has Barely Budged
"At least one in five women say they have experienced sexual assault that includes penetration by the time they graduate...Roughly one-third of the students surveyed agreed that rape happens 'because men can get carried away in sexual situations once they've started,'....believe that men 'can't help it,' and that drunk women who cross their paths have themselves to blame."
This is no different from attitudes in my high school in the early 80s.  But it felt like it all shifted for a time; it felt like people were gaining an awareness of these myths through an openness towards sexual discussion.  Now it feels like it's all come full circle, back to the crappy place before the rape shield law.  Actually, it's so much worse: we never had to worry about videos of an assault going viral.  The only evidence I have is anecdotal: in 1991, several teens in my school felt the need for a Gender Equality Club to discuss these issues.  Then, after a few years, that went away.  It no longer provoked like it once had.  Now, in 2014, we've got another group of teens feeling the need for these kinds of discussions outside of a classroom setting.

Maybe in the in-between time, too many of us were resting on our laurels, relaxed because we had waged that war and won a couple of legal changes and some attitudinal shifts that might protect us a little more.  How hard is it for people to remember that nobody should be doing anything sexual that they don't feel like doing?  But I think we might have to be vigilant about this one forever - even when times seem good.  It's an easy victory to have slip away.

On Coercion and Culture
"If you include unwanted touching or being 'coerced' into sex...the [sexual assault] rate rises to more than 50 per cent."
I cringe at the word "coerced" for two reasons.  First, I hate the image of adults, women and men, as childlike puppets, easily manipulated into doing something they don't want to do, to the point that if they say 'yes' loud and clear, it doesn't count if they later reveal they were coerced.  They didn't want to, but got talked into it.  It makes us seem so weak.

But, secondly, I hate the reality of that situation.  Saying 'No, thanks' doesn't just deny two people of some carnal pleasure, it can often be punitive to the objector.  If it were just about sex, then choosing a yes or no would still be a complex decision of physical attraction, timing, and feelings.  But in our culture, it's also about reputation.  For girls, being a prude isn't cool, and if a guy rejects a girl, he's seen as gay; both terms are still seen as insults.  What if it gets around?  Furthermore, people may be punished for a 'no' response in subtle ways.

Turn down a colleague, and he could make your days at work very difficult despite your efforts to smooth things over.  Some people are sore losers.  Or just losers.  So a choice to have sex often isn't just a choice between having sex or not having sex.  It can be a choice between having sex you're not into OR being hassled for years by the proposing partner and whoever hears his/her slanted side of things.  This is the realm of the few men who get angry at being "friend zoned," who somehow think a friendship should blossom into more in order to be worth anything.


It's Not Always a Big Misunderstanding
"Human beings can read body language in the bedroom as easily as they can in other social interactions....[Sexual assault] is about someone making a decision to ignore the cues." 
Sometimes our cues get misinterpreted, absolutely.  Look a little too long at someone, and they might think you're into them when you're not.  And we have this strange idea that body language tells truths that our mind might not be aware of, so sometimes no verbal explanation can sway a belief in the depth of feelings you appear to have for someone you barely know.  It does happen, and it can be a frustrating experience for all involved.

But too often a misunderstanding can be an excuse for an act of aggression.  Most people can tell when someone's pulling away, and they stop.  Some people notice the gesture but choose to ignore it.  It seems like such a little transgression, ignoring a gesture, but it's huge.

The Legal Issues

The 'Yes Means Yes' campaign, "frames sex more positively, shifting the focus from what a victim did (or didn't do, or couldn't do) to the steps a perpetrator failed to take to proactively ensure consent."  Instead of someone needing to say "no" to stop it, now they need to say "yes" before beginning AND throughout.  Without a clear "yes," it's assault.  "If it's not loud and clear, it's not consent."

But it will always be difficult to determine what happened behind closed doors.  Nothing short of cameras everywhere will alleviate that problem.  A false accusation that gets thrown out of court can be enough to ruin a life, but so can a real sexual assault.  The worst reality is that it sometimes takes more than one transgression by a perpetrator (of accusations or assaults) to get any action from the courts because of the complexity of the issue.  I do think we need to err on the side of believing the alleged victim when in doubt, but that's a post for another day.  Laura does a good job of explaining that in this post, where she says, in part, "I understand that there are false accusations of rape. They are rare, but they do occur. Sexual assault, however, is not rare."

There's also this Alternet post, which clarifies that rape and false rape accusations are not equivalent problems.

But It's So Awkward! 
"[T]here's a large part of us [that] wants things to be spontaneous and free - and it enhances our experience....asking permission is 'awkward' in that it suggests the guy, still usually expected to initiate sex, 'doesn't have game.'"
Asking for, and giving, consent repeatedly throughout various stages of intimacy doesn't have to ruin the moment.  It's not a matter of taking a break to re-draft a contract to be signed in triplicate.  It's merely a matter of saying, "Is this good?  Does this work?  Do you want me to keep going?" from time to time.  If we're weighing reducing sexual assault against reducing the spontaneity of sex, then I think spontaneity has to take a back seat.


We've come a long way in our acceptance of all manner of sexual relationships and habits, but the one I think is still in the closet is the desire not to have sex.  Abstinence-only education has become such a joke that the choice to abstain has been denigrated right along with it.  If we put up ads to suggest it's okay, it comes across as pushing religious doctrine rather than acceptance.  But it's not the case that all men are always horny, or that sex is all every hormone-laden teen is thinking about.  There are a lot of other things we can do together.  Sex has to remain just one of many choices in order for it to be freely chosen at all.

Monday, September 1, 2014

The Shallows: What the Internet is Doing to Our Brains

I love Nicholas Carr's book. There are lots of studies and science mixed with many stories, asides, and discussions of philosophers and other great thinkers. It reminded me of reading a Bill Bryson book: you get the facts painlessly. And it presents a strong argument for keeping kids (and everyone) offline when they work, but I'm still unlikely to convince them to actually turn off Facebook. Reading the bare bones here doesn't do it justice, but here's what I don't want to forget about my memory:


The Medium is the Message

He quotes McLuhan from 1964: "The electric technology is within the gates, and we are numb, deaf, blind and mute about its encounter with the Gutenberg technology, on and through which the American way of life was formed" (2).  When we change technologies or tools of any kind, there are gains and losses; it changes the way we work.  Nietzsche's style of writing changed noticeably when he moved from pen and paper to the new-fangled typewriter: "Our writing equipment takes part in the forming of our thoughts" (19).  But we forget about the losses and just notice the gains.

When information is presented to us, how it's presented makes a difference, but we get carried away by the content and don't notice the effect the method of presentation has on us. In class, I've watched students glaze over at PowerPoints like old-schoolers used to with filmstrips, waiting for the next 'bing' from the record to indicate a changing slide. When they present, they often use technology as a crutch, putting their entire presentation on slides, and they lose the class in the process. But the same kids can be captured by chalk and talk - a much-maligned teaching method today - as it allows greater movement of the presenter back and forth through the room as people share thoughts and responses, and student ideas make it onto the board as much as my own. They shift from looking at me to one another to the board and their notes to glean the basics for later review rather than focusing on a stagnant screen at the front. Well, it works better for me anyway.


Our Dwindling Attention Spans

The more we use the web, the more we have to fight to stay focused on longer texts. It's shortening our attention spans as we skim and scroll, causing a decay of our faculties for reading and concentrating. I've noticed how students looking at a webpage will immediately scroll down even if vital information is right at the top. They're looking for a heading to jump out at them or a video to click on. They have to be told to stop and actually read the words on the screen.

One study found that professors of English literature are now struggling to get students to read an entire book. Their students just look at study notes online, miss the nuances of the text, and, more importantly, never learn how to notice patterns of metaphors and motifs, how to do deep reading; they only learn how to summarize other writers' analyses. Cutting corners is nothing new, but it's surprising to read that lit students won't read books.


Brain Physiology:  We Become What We Think

The most interesting part of the book is how our brains work to take in information. There's been a lot written about this lately - that the brain is affected by our environment. It's not as stable as we once thought.

"Though different regions are associated with different mental functions, the cellular components do not form permanent structures or play rigid roles. They’re flexible. They change with experience, circumstance, and need” (29). The brain gets accustomed to our typical activities and changes when they stop or when new activities start: “neurons seem to ‘want’ to receive input….When their usual input disappears, they start responding to the next best thing” (29).

The brain reorganizes itself after an accident or loss of function of any body part, but also after change in lifestyle. William James figured this out in Principles of Psychology:  “nervous tissues…seems endowed with a very extraordinary degree of plasticity…either outward forces or inward tensions can, from one hour to another, turn that structure into something different from what it was” (21).  Leon Dumont used an analogy to explain: “Flowing water hollows out a channel for itself which grows broader and deeper; and when it later flows again, it follows the path traced by itself before. Just so, the impressions of outer objects fashion for themselves more and more appropriate paths in the nervous system, and these vital paths recur under similar external stimulation, even if they have been interrupted for some time” (21).

An experiment was conducted on London cab drivers long before GPS, back when they had to have the entire city memorized. They developed an enlargement of the posterior hippocampus and a shrinking of the anterior hippocampus from the constant spatial processing required to navigate the intricate road system. Their brains adapted to suit how they were being used.

Something really fascinating to me is that imagining has the same effect. Researchers taught a short piano melody to people without any piano knowledge. Half the group practiced the piece for two hours a day, and the other half only imagined practicing without actually touching the keys. There were identical changes to the brain. It reminded me of what I do when I’m about to do something new, like build roof rafters or a waterfall. I say I have to stare at it a few days before I can start, but really I'm walking myself through the process in my head repeatedly, apparently until my brain’s learned how to do it.
“As particular circuits in our brain strengthen through the repetition of a physical or mental activity, they begin to transform that activity into a habit…the chemically triggered synapses that link our neurons program us, in effect, to want to keep exercising the circuits they’ve formed. Once we’ve wired new circuitry in our brain…’we long to keep it activated.’ That’s the way the brain fine-tunes its operations. Routine activities are carried out ever more quickly and efficiently, while unused circuits are pruned away” (34).
This explains why I can do dishes so much faster than my kids – and why they should be practicing dishes regularly.

This can also explain one aspect of mental afflictions like depression and OCD – "The more a sufferer concentrates on his symptoms, the deeper those symptoms are etched into his neural circuits" (35), with implications for addictions as well.

But our brain circuits can weaken or dissolve with neglect:
“If we stop exercising our mental skills…we do not just forget them: the brain map space for those skills is turned over to the skills we practice instead….the possibility of intellectual decay is inherent in the malleability of our brains. That doesn’t mean that we can’t, with concerted effort, once again redirect our neural signals and rebuild the skills we’ve lost. What it does mean is that the vital paths in our brains become…the paths of least resistance” (35). “What we’re not doing when we’re online also has neurological consequences. Just as neurons that fire together wire together, neurons that don’t fire together don’t wire together....The brain recycles the disused neurons and synapses for other, more pressing work. We gain new skills and perspectives but lose old ones” (120).
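That strengthen-with-use, fade-with-disuse dynamic can be sketched as a toy Hebbian-style simulation. (This is my own illustrative analogy with made-up rates, not anything from Carr's book; real neural pruning is vastly more complicated.)

```python
# Toy model: a practiced "circuit" strengthens a little each day,
# while neglected ones decay multiplicatively toward pruning.
weights = {"dishes": 1.0, "piano": 1.0}  # starting connection strengths (arbitrary units)
LEARN = 0.2   # strengthening per day of practice (made-up rate)
DECAY = 0.9   # retention per idle day (made-up rate)

def day(practiced: str) -> None:
    """One day: the practiced skill wires in; the rest fade."""
    for skill in weights:
        if skill == practiced:
            weights[skill] += LEARN   # "fire together, wire together"
        else:
            weights[skill] *= DECAY   # disused path turns over

for _ in range(30):   # a month of doing dishes, no piano
    day("dishes")

print(weights)  # dishes ≈ 7.0, piano ≈ 0.04: the well-worn channel wins
```

The arithmetic is trivial, but it captures the book's point: after a month, the practiced path is the path of least resistance, and the neglected one is nearly gone.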
Carr adds a fascinating history of the written word. Socrates wasn't a fan of writing: “Far better than a word written in the ‘water’ of ink is ‘an intelligent word graven in the soul of the learner’ through spoken discourse” (55). Socrates recognized that a “dependence on the technology of the alphabet will alter a person’s mind….writing threatens to make us shallower thinkers…preventing us from achieving the intellectual depth that leads to wisdom and true happiness" (55).

McLuhan counters, “The achievements of the Western world, it is obvious, are testimony to the tremendous values of literacy....the written word liberated knowledge from the bounds of individual memory and freed language from the rhythmical and formulaic structures required to support memorization and recitation" (57).  But as great and useful an achievement as writing is, something is lost when we no longer hold ideas in our own minds to remember and debate them. The underlying question of this entire book is, Is it worth the loss?

The invention of the book altered how we think: “To read a book was to practice an unnatural process of thought, one that demanded sustained, unbroken attention to a single, static object…They had to train their brains to ignore everything else going on around them to resist the urge to let their focus skip from one sensory cue to another…applying greater ‘top-down control’ over their attention” (64). This ability represents a “strange anomaly in the history of our psychological development” (64).

As with imagining activities, one study found that brain activity while reading a story is similar to brain activity while doing the actions being described: “brain regions that are activated often ‘mirror those involved when people perform, imagine, or observe similar real-world activities….The reader becomes the book'” (74).  This makes me wonder what happens to the brain when people read a lot of violent books. That's not to suggest that reading about violence necessarily makes us want to commit it, but will it make us better at fighting just because we've read about it...or, perhaps, better at sex if that's our reading preference?


The Shift to Screens

In the U.S., adults aged 25-34 average 35 hours of TV a week and less than an hour a week of reading (87). And there's a difference between reading on-line and reading print material as "we are plugged into an ‘ecosystem of interruption technologies’" (91):
“A page of online text viewed through a computer screen may seem similar to a page of printed text. But scrolling or clicking through a Web document involves physical actions and sensory stimuli very different from those involved in holding and turning the pages of a book or a magazine....It also influences the degree of attention we devote to it and the depth of our immersion in it" (92).  
This has already influenced how magazines are writing articles to accommodate shorter attention spans: “Rolling Stone, once known for publishing sprawling, adventurous features by writers like Hunter S. Thompson, now eschews such works, offering readers a jumble of short articles and reviews....Most popular magazines have come to be ‘filled with color, oversized headlines, graphics, photos, and pull quotes’" (94).

He warns that technology encourages and rewards shallow reading. Some see technology as only bringing benefits, but we have to be wary of the costs:
“No doubt the connectivity and other features of e-books will bring new delights and diversions…But the cost will be a further weakening, if not a final severing, of the intimate intellectual attachment between the lone writer and the lone reader. The practice of deep reading that became popular in the wake of Gutenberg’s invention, in which ‘the quiet was part of the meaning, part of the mind,’ will continue to fade, in all likelihood becoming the province of a small and dwindling elite” (108).

“Some thinkers welcome the eclipse of the book and the literary mind it fostered. In a recent address to a group of teachers, Mark Federman, an education researcher at the University of Toronto, argued that literacy, as we’ve traditionally understood it, ‘is now nothing but a quaint notion, an aesthetic form that is as irrelevant to the real questions and issues of pedagogy today as is recited poetry – clearly not devoid of value, but equally no longer the structuring force of society.’ The time has come, he said, for teachers and students alike to abandon the ‘linear, hierarchical’ world of the book and enter the Web’s ‘world of ubiquitous connectivity and pervasive proximity’ – a world in which ’the greatest skill’ involves ‘discovering emergent meaning among contexts that are continually in flux” (111).

“In the choices we have made, consciously or not, about how we use our computers, we have rejected the intellectual tradition of solitary, single-minded concentration, the ethic that the book bestows on us. We have cast our lot with the juggler” (114).
The distractions offered on-line add to the shallow-reading effect. This makes me reconsider all the links and images I make the effort to include in each post:
“But the extensive activity in the brains of surfers also points to why deep reading and other acts of sustained concentration become so difficult online. The need to evaluate links and make related navigational choices, while also processing a multiplicity of fleeting sensory stimuli, requires constant mental coordination and decision-making, distracting the brain from the work of interpreting text or other information. Whenever we, as readers, come upon a link, we have to pause, for at least a split second, to allow our prefrontal cortex to evaluate whether or not we should click on it. The redirection of our mental resources, from reading words to making judgments, may be imperceptible to us – our brains are quick – but it’s been shown to impede comprehension and retention, particularly when it’s repeated frequently” (122). 
“Difficulties in developing an understanding of a subject or a concept appear to be ‘heavily determined by working memory load,’…and the more complex the material we’re trying to learn, the greater the penalty exacted by an overloaded mind…two of the most important [sources of cognitive overload] are ‘extraneous problem solving’ and ‘divided attention.’ Those also happen to be two of the central features of the Net as an informational medium" (125). 
“Just as the pioneers of hypertext once believed that links would provide a richer learning experience for readers, many educators also assumed that multimedia, or ‘rich media,’ as it’s sometimes called, would deepen comprehension and strengthen learning. The more inputs, the better. But this assumption, long accepted without much evidence, has also been contradicted by research. The division of attention demanded by multimedia further strains our cognitive abilities, diminishing our learning and weakening our understanding. When it comes to supplying the mind with the stuff of thought, more can be less” (129).
In one study, half of the participants read a text-only passage, and the other half read the same text accompanied by relevant audiovisual material.  When they were tested on the information, the text-only group not only did better on the test, they also found the material more interesting, educational, understandable, and enjoyable. Multimedia: “would seem to limit, rather than enhance, information acquisition” (130).

In another study, students listened to a lecture. Half could surf the web during the lecture to look up relevant information, and the other half had to keep their laptops shut. The surfers performed “poorer on immediate measures of memory for the to-be-learned content. It didn't matter, moreover, whether they surfed information related to the lecture or completely unrelated content – they all performed poorly” (131).

A final study had students watch CNN. One group watched an anchor with infographics on the screen and textual news crawling along the bottom, while the other group watched the anchor without graphics or a news crawl. The multimedia group remembered significantly fewer facts, as “this multimessage format exceeded viewers’ attentional capacity” (131).

We're encouraged in schools to be cutting edge with our tech use. Teachers are praised for using any new program. Even if it's just a switch from PowerPoint to Prezi, it's lauded as revolutionary. New is celebrated as better with little exploration of the studies showing otherwise. We're so worried about being the best, about getting the most kids to achieve on standardized tests, really, that we're jumping at anything shiny that comes our way in hopes it will be the magic bullet that finally motivates the more challenging students. Carr further cautions,
“The Internet, however, wasn’t built by educators to optimize learning. It presents information not in a carefully balanced way but as a concentration-fragmenting mishmash. The Net is, by design, an interruption system, a machine geared for dividing attention” (131). “In addition to flooding our working memory with information, the juggling imposes what brain scientists call ‘switching costs’ on our cognition. Every time we shift our attention, our brain has to reorient itself, further taxing our mental resources…Switching between two tasks short-circuited their understanding: they got the job done, but they lost its meaning” (133). “The near continuous stream of new information pumped out by the Web also plays to our natural tendency to ‘vastly overvalue what happens to us right now’….We crave the new even when we know that ‘the new is more often trivial than essential’” (134).  "There are signs that new forms of ‘reading’ are emerging as users ‘power browse’ horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense” (137). 
“The more you multitask, the less deliberative you become; the less able to think and reason out a problem.’ You become…more likely to rely on conventional ideas and solutions rather than challenging them with original lines of thought….As we gain more experience in rapidly shifting our attention, we may ‘overcome some of the inefficiencies’ inherent in multitasking…but except in rare circumstances, you can train until you’re blue in the face and you’d never be as good as if you just focused on one thing at a time. What we’re doing when we multitask is learning to be skillful at a superficial level. The Roman philosopher Seneca may have put it best two thousand years ago: “To be everywhere is to be nowhere....The Net is making us smarter…only if we define intelligence by the Net’s own standards…if we think about the depth of our thought rather than just its speed – we have to come to a different and considerably darker conclusion” (141).
Reading scores fell between 1992 and 2005: “Literary reading aptitude suffered the largest decline, dropping twelve percent” (146).


On Memorization:  The brain is a muscle, not a filing cabinet.

When I was a kid, I could tell you any of my friends' phone numbers by heart.  Now I can barely remember my own.  I don't need to know phone numbers anymore because they're all programmed into my phone, but is the work computers are doing making our brains lazier?  Should we try to remember things just for the sake of working out our brains?

Erasmus thought that “memorizing was far more than a means of storage. It was the first step in a process of synthesis, a process that led to a deeper and more personal understanding of one’s reading” (179). Tech writer Don Tapscott disagrees. “Now that we can look up anything ‘with a click on Google…memorizing long passages or historical facts’ is obsolete. Memorization is ‘a waste of time’” (181).

To the Ancient Greeks, “memory was a goddess: Mnemosyne, mother of the Muses" (181). “The shift in our view of memory is yet another manifestation of our acceptance of the metaphor that portrays the brain as a computer…storing bits of data in fixed locations and serving them up as inputs to the brain’s calculations, then offloading that storage capacity to the Web is not just possible but…liberating….The analogy has a simplicity that makes it compelling….But…it’s wrong" (182). The brain isn’t a filing cabinet; it’s a muscle. Using it over and over doesn’t fill it until it can take no more, but quite the opposite – it strengthens it to take in more information.

So every year when I just throw up my hands at the idea of remembering 90 students’ names in a few days, and hope the students will be forgiving of my aging brain, I’ve simply gotten sucked into a vicious cycle that prompted me to give up on myself far too soon. I have problems remembering names because I typically don’t remember them, so I don’t try, so I never do. Kind of sounds like a Winnie the Pooh poem or an admonishment from Yoda. The implication here is that if I actually work on remembering people’s names instead of assuming that’s just something I can’t do, then I’ll actually develop the ability to remember them better for the next set of classes. It’s why, every year when I rent a mini-van for a trip to a cottage with my family, I start the journey a bit of a nervous wreck, but over the week the van seems to grow smaller and more manageable until I’m parallel parking the sucker by the end. (Just kidding – at the end of the week I still search for pull-through parking spots.) It's nothing revelatory to say that practicing improves ability, yet we don't tend to think this way about using our memory.

Getting information from short-term to long-term memory requires “an hour or so for memories to become hard, or 'consolidated,' in the brain. Short-term memories don’t become long-term memories immediately, and the process of their consolidation is delicate. Any disruption, whether a jab to the head or a simple distraction, can sweep the nascent memories from the mind” (184). "The more times an experience is repeated, the longer the memory of the experience lasts…Not only did the concentration of neurotransmitters in synapses change, altering the strength of the existing connections between neurons, but the neurons grew entirely new synaptic terminals" (185). These terminals increase the more memories are formed, and then decrease again when they’re allowed to fade, but they don’t completely return to their former numbers. “The fact that, even after a memory is forgotten, the number of synapses remains a bit higher than it had been originally helps explain why it’s easier to learn something a second time” (185). This is why many teachers tell students to go over their notes regularly. It actually helps.


Computers vs Brains: Some benefits of being alive.
“While an artificial brain absorbs information and immediately saves it in its memory, the human brain continues to process information long after it is received, and the quality of memories depends on how the information is processed. Biological memory is alive. Computer memory is not....Those who celebrate the ‘outsourcing’ of memory to the Web have been misled by a metaphor. They overlook the fundamentally organic nature of biological memory…Once we bring an explicit long-term memory back into working memory, it becomes a short-term memory again. When we reconsolidate it, it gains a new set of connections – a new context….Biological memory is in a perpetual state of renewal....In contrast to working memory, with its constrained capacity, long-term memory expands and contracts with almost unlimited elasticity, thanks to the brain’s ability to grow and prune synaptic terminals and continually adjust the strength of synaptic connections” (192).
Web advocates think, “In freeing us from the work of remembering, it’s said, the Web allows us to devote more time to creative thought. But the parallel is flawed….The Web…places more pressure on our working memory, not only diverting resources from our higher reasoning faculties but obstructing the consolidation of long-term memories and the development of schemas….The Web is a technology of forgetfulness" (193).

One ramification of the brain's plasticity is that reading on-line, in a distracted way, can affect our ability to read and think:
"The influx of competing messages that we receive whenever we go online not only overloads our working memory; it makes it much harder for our frontal lobes to concentrate our attention on any one thing. The process of memory consolidation can’t even get started. And, thanks once again to the plasticity of our neuronal pathways, the more we use the Web, the more we train our brain to be distracted – to process information very quickly and very efficiently but without sustained attention. That helps explain why many of us find it hard to concentrate even when we’re away from our computers" (194).
Marshall McLuhan “elucidated the ways our technologies at once strengthen and sap us…. our tools end up ‘numbing’ whatever part of our body they ‘amplify.' When we extend some part of ourselves artificially, we also distance ourselves from the amplified part and its natural functions" (210).

In a study they had two groups trying to solve a puzzle. One group had helpful software, the other group didn't. In the early stages, the helped group made correct moves more quickly, but as they proceeded, the other group increased their skill with the puzzle more rapidly. Learning a task with little help wires our brain to know how to learn that type of task, so it becomes easier to later improve on our initial learning. The group with software help didn’t do the initial learning, so they couldn’t advance as easily. “As we ‘externalize’ problem solving and other cognitive chores to our computers, we reduce our brain’s ability ‘to build stable knowledge structures’...that can later ‘be applied in new situations’” (216).


The Need for Nature
“A series of psychological studies over the past twenty years has revealed that after spending time in a quiet rural setting, close to nature, people exhibit greater attentiveness, stronger memory, and generally improved cognition. Their brains become both calmer and sharper…They no longer have to tax their working memories by processing a stream of bottom-up distractions. The resulting state of contemplativeness strengthens their ability to control their mind” (219).  
In yet another study: one group walked through a park, the other walked on city streets, and then both took a cognitive test. The park group did significantly better. Even cooler, it works just by looking at pictures of nature or even imagining nature scenes!  “The people who looked at pictures of nature scenes were able to exert substantially stronger control over their attention, while those who looked at city scenes showed no improvement in their attentiveness” (220). This makes a case for designing our classrooms with bits of nature all around or taking the kids outside to learn.


The Emotional Effect

The brain doesn't just run our intellectual operations; it also determines our emotional reactions.  “It’s not only deep thinking that requires a calm, attentive mind. It’s also empathy and compassion” (220).
“…the more distracted we become, the less able we are to experience the subtlest, most distinctively human forms of empathy, compassion, and other emotions. ‘For some kinds of thoughts, especially moral decision-making about other people’s social and psychological situations, we need to allow for adequate time and reflection” (221).
Carr's final caution:  “…as we grow more accustomed to and dependent on our computers we will be tempted to entrust to them ‘tasks that demand wisdom.’ And once we do that, there will be no turning back” (224).

Tuesday, August 26, 2014

On Glorifying Psychopaths

Murray Dobbin wrote a very provocative article relating our TV viewing of psychopaths to our politics.  Owen explored the glorification of psychopaths in a post discussing the article, and I commented there on the difficulty of establishing kindness in our self-absorbed culture.  I wrote years ago about the crux of the problem: that it's not cool to be kind.  When Fonzie started wearing glasses and caring about things, he lost his status with the viewers, and then he literally jumped the shark.

But the article has sparked a few other thoughts.

First of all, are two shows enough to show a cultural trend?  More to the point, how can we determine which shows are most influencing our politics - or most influenced by our politics - and how can we ever show more than just a correlation? It's an interesting thesis to posit that TV mirrors politics, but it's a difficult thing to discern.  Are the psychopath shows the most watched, or is our current culture more clearly defined by reality TV?  It's a laborious feat if we want to do justice to the concept; it's tricky business that requires a willingness for tedious analysis that's beyond my motivation level. But it's fun just to consider correlations based on the shows that stand out to us.


Dobbin suggests that the movies of the 50s depicted cold war paranoia, and that current TV shows mirror 21st century psychopathic capitalism.  The shows he focuses on are Breaking Bad and House of Cards, and I don't know how Dexter didn't warrant a mention - nor The Sopranos.  

When I think back to the shows I watched growing up, I can't think of a single one that had a psychopath as the hero.  They were gentler shows.  Happy Days, Star Trek, MTM, Bob Newhart, Barney Miller, Family Ties, M*A*S*H, WKRP in Cincinnati, Cheers, St. Elsewhere, Hill Street Blues, Moonlighting....  There were some dark themes here and there, but for the most part, everyone worked together reasonably happily to a satisfying conclusion.  I'm not sure if they were really gentler times, or if it just feels that way through the soft focus of memory juxtaposed against the stark photos of children in Gaza, missing or murdered Aboriginal women, and so many neighbourhoods closer to home either flooded or on fire.


Beyond my typical viewing, the 80s were marked by soap operas I never watched - many of which idealized wealth while allowing us a vicarious delight in the destruction or humiliation of the very wealthy.  This was during the recession, when people were more likely to despise the wealthy than have any potential to join them.  The 80s ushered in this new form of capitalism: insane growth at any cost.  The new type of show we see today is getting a following, not during the height of "successful" capitalism, but after we've seen the fall of this system.  It is a time of helplessness. 

Today, we're either angry or oblivious. Too many have lost too much to money scams or natural disasters.  During the depression, movies were fantasies of a better place - The Wizard of Oz.  Now, we have fantasies of being able to beat the system.  We want to watch people get away with enacting their anger purposefully - in a way that gets them wealth or status or just offers a release to our collectively repressed frustration and rage.   They're able to stay one step ahead of the law.  Maybe it's a satisfying fantasy at a time when too many aren't at all able to stay ahead of the game.

Dobbin quotes a relevant author who suggests, "People enjoy watching sociopaths on television as a kind of compensation for their own feelings of powerlessness and helplessness."  The source of this powerlessness, according to Dobbin, is capitalist hyper-competitiveness.  He sees competition as the catalyst tearing apart families and communities.  I don't think it's the competitiveness directly causing problems - as if people are competing with their own neighbours, copying Frank Underwood's tactics to cause strife - but the indirect result of the inability to compete, the inability to even reach the bottom stratum of the mythological level playing field. As Dobbin clarifies, "a competitiveness in which almost all but the 1% lose."


Secondly, is this level of violence and ruthlessness new, or just new to TV at a time when TV is a whole new medium?  There is certainly a rash of psychopaths in shows today, characters once relegated to the bad guys in horror films.  But I don't believe it's entirely from helplessness that we watch.  We watch for the clever ways they get out of sticky situations week after week - and not necessarily from a sense of wish-fulfillment, but as an admiration of talent.  It's not dissimilar to old shows in which the bad guy sets the trap and the good guy gets to foil him yet again, except the criminals and cops have switched roles.  Either way, it's entertaining to watch the set-up through to the escape.  That's nothing new or necessarily tied to our politics or power.  

We've always enjoyed a bit of violence too.  As a child, with Little House on the Prairie on in the background, I ate up books on Greek, Roman, and Norse mythology.  One of my favourites was the story of Prometheus' punishment for stealing Zeus' fire:  to be tied to a cliff with his guts eaten by a bird all day, only to have them grow back during the night.  Cool!  Maybe it's the case that our shows were too sanitized and tidy.  Now we're getting a dose of the dark reality of human nature.  Does bringing it out in the open normalize it and foster mimicry, or could it instead - or also - help us acknowledge and understand evil as within each of us?  Could the evil characters be enlightening, or are these notions best left under wraps?  


Finally, does TV viewing create or just reflect cultural behaviour?  I wrote about this before but with a focus on children's programming and the types of comedies I like to watch, questioning the effects on my own behaviour: Does the crass, rude, verbal abuse that entertains me on TV make me less polite and patient with people in real life?  And if it is the case that TV affects our behaviour, do we need to balance the psychopaths with more pro-social TV shows?  When religion was strong, we had lots of pro-social TV shows.  Now that it's waning, when we need moral guidance the most, we're stuck with the Kardashians as the pivotal role models of our times.  If there is a chance that TV creates our attitudes and behaviours, shows us when to feel guilt and shame and pride, shows us whom to respect and admire, then, rather than censoring the violence, I'd opt for adding shows that model virtue, that show us how to be kind.  Dobbin offers that we should just begin to act with kindness.  I think too many of us might not know how without seeing it modelled day after day on YouTube.  But do we want popular media to be our moral guidance?  Do we have a choice?

In my classroom, if I admonish a student by suggesting a behaviour wasn't kind, some just don't care.  They don't feel ashamed that they're being unkind.  It's normal; it's the way people are supposed to be.  And I'm odd for thinking otherwise.  That's a hard, uphill battle that needs to be won.  A shift in popular media could do wonders.      

Some of the shows today are vile.  We get drawn into the intensity of the drama and the shock of evil on display.  I relished every episode of the shows Dobbin despises.  But like Dobbin, I also re-watched The West Wing recently for a bit of hope.  And lately, I'm loving Rectify for its slow pace that forces the viewer to be patient.  It allows tension to build by heartbeats. But I'm also struck by the novelty, in today's world, of the depth of the moral struggling the characters go through and some of the strikingly virtuous choices some of the characters make within grave circumstances.  The good guys take responsibility for their actions and don't even begin to try blaming others or explaining away their actions.  Weird.  And refreshing.  Maybe we'll just tire of the psychopaths soon and the pendulum will shift back to equally complex characters who do the right thing.

And maybe we'll recognize the limits to growth and competition while we're at it.

Saturday, November 30, 2013

On John Stuart Mill, Free Speech, and Climate Change

I got caught up in a few arguments about climate change recently that just reinforced to me that there's still such a strong backlash against the entire idea that we're unlikely to move forward quickly enough to be effective.

Paper is trees!
My school board is fundraising for the Philippines, and I’m totally on board with it. But I commented publicly on the irony of sending each kid home with a piece of paper on the issue. That’s over 60,000 full pieces of paper or about 8 trees for something that will be crumpled at the bottom of a knapsack or tossed before it even makes it home.  We’re cutting down trees to make paper to ask people to help those affected by conditions exacerbated by the cutting down of trees. And there are other ways to get the word out like our websites and automatic phone callers. If we really want to use paper, the notices could at least be sent on half pages or on re-use-it paper (‘goos’ paper in some places).

Pretty straightforward and reasonable, right??

Not so fast. A colleague ridiculed me for quibbling about paper when people are struggling to cope with a “NATURAL” disaster. I responded with a quote from the IPCC linking extreme weather to climate change and a suggestion that we're negligent if we don't take responsibility for our small daily actions having an accumulative and disastrous effect elsewhere.  But I'm pretty sure it's all for nought.  Sigh.

But, as is often the case, a much more interesting conversation happened with my students.

Sunday, October 6, 2013

On Celebrating Talent

Convalescing from a wicked cold that's beating the crap out of me, I watched a trio of movies about amazing musicians: Joe Strummer, Ginger Baker, and Sixto Rodriguez.  In the films, other musical geniuses were highlighted along the way.  What a delight!  But as Ginger, Jack and Eric talked about people with the gift of perfect time, my first reflexive response was, "How many kids are told they can be a great musician if they just put their mind to it?"

In class this week, yet another student insisted that intelligence has minimal genetic basis compared to effort.  Anybody can do anything if they try hard enough.  I suggested there are people her age still struggling with the alphabet and lamented the ivory tower effect of streamed academic courses.  I don't think it was very convincing.  I'm battling a lifetime of programming.  In high school, I struggled with grade 13 physics.  Both my parents were math and physics profs at U of W, yet with their unwavering help, and the help of my teacher, I still couldn't get my head around that whole inclined plane issue.*  It's just not how my brain works.

And that's okay.

Friday, August 23, 2013

On Canada: A Fair Country

I used to be so proud to be Canadian and that's wavered over this difficult period in our history.  I was searching for this book to loan out, and once found, I got totally engrossed in re-reading it.  It made me feel so much better.  It's an important book about who we really are:  A Fair Country: Telling Truths About Canada by John Ralston Saul (2009).  What a delight!

Like Hedges' Empire of Illusion, this book focuses on our cultural stories or myths.  How we understand ourselves affects how we live and act, our beliefs and allegiances.  And we Canadians have lost our way swimming through the miasma of American influence.  As a civics teacher, when I do a pre-test at the beginning of the year, a good half the grade ten class give American answers to questions about Canadian politics.  Once in a while, someone admits that they thought Obama was our president too.  And they're not far off.  We have a long journey ahead of us to correct this indoctrination.  We can't be true to ourselves if we don't know who we are.  

Here are my notes and thoughts along the way, but do read the book - I've just captured the ideas, but the stories are what make it.