What if I told you that the “backfire effect” as depicted in that Oatmeal cartoon everyone’s sharing this week might not be real? (Same goes for all that “mindset” stuff educators have been touting lately too, for what it’s worth.)
Narratives and counter-narratives, weaponized: “Facing Facts: American Identity is Based on Alternate History,” says Sarah Gailey. That’s thanks to textbooks. “Freshly Remember’d: Kirk Drift” by Erin Horáková. “Airbrushing Shittown” – Aaron Bady on the S-Town podcast. “Divided and Platformed” – Susie Cagle on media and democracy.
“Can Technology Rescue Democracy?” The Atlantic asks in a new collection of essays. No. No, it cannot.
Stories and algorithms and (in)justice: “Taser Will Use Police Body Camera Videos ‘to Anticipate Criminal Activity’,” says The Intercept’s Ava Kofman. The New York Times’ Adam Liptak on sentencing algorithms. “Death Undone” – “The demand for new relations between death and technology begins with the acknowledgement of Black life,” says Nehal El-Hadi.
“How the Robot Apocalypse Will Go Down,” according to Cathy O’Neil. “Artificial intelligence is like a weapon,” she says. “Worry about the people wielding it.” (Of course, you’ll need to have the “right” certification to wield it: “Yellow-Light Crusader Fined for Doing Math Without a License.”)
“Here to Help” by xkcd
“Ed-neurotech” is coming, cautions the University of Stirling’s Ben Williamson. Or at least there’ll be plenty of companies promising that cognitive research will lead to new teaching machines and more efficient learning. “DARPA Is Planning to Hack the Human Brain to Let Us ‘Upload’ Skills,” Futurism predicts. Amazon’s Alexa lets you “upload skills” too. Wonder why the same language is being used? Hmm. Williamson writes,
As parts of an educational neurofuture in-the-making, optimistic aspirations towards neuroenhancement and cognitive modification of ‘flawed brains’ through neurotechnologically enhanced education need to be countered not just with technical and scientific scepticism. Greater awareness of the political and commercial interests involved in new and developing neurotechnology markets and interventions are required, as well as theoretically engaged studies of the sociotechnical processes involved in producing neurotechnologies and tracing their uptake and effects in education. Deeply social questions also need to be asked about the use of brain data to exercise neuropower over young people’s mental states, and about how to safeguard their cognitive liberty amid persuasive and coercive promises about neuroenhancement in the direction of personal cognitive improvement.
“I write on the internet. I’m sorry,” Michael Brendan Dougherty apologizes. I’m sorry too, but I have to engage critically and politically in countering this techno-neuro-capitalist dystopia, and that’s why I write online. But sharing online, as Alan Jacobs reminds us, is not caring. Caring is caring.
Yours in struggle,