September 15, 2014

Metafoundry 6: Accident Blackspot

Identity crisis; age of non-consent.

COORDINATES: Bits of this were written all around southern England and somewhere over the North Atlantic, but I’m now back in my kitchen with a cup of tea, David Bowie’s Low on the record player, watching the contrails of planes on descent to Boston Logan track across the darkening sky outside my window. I was in the UK to attend two conferences in Brighton, dConstruct and Improving Reality, and I gave a talk about materials science and 3D printing and co-facilitated a workshop with Georgina Voss at Electromagnetic Field. My EMF talk was recorded and I believe it should be online soon; in the interim, I recommend to you the audio of the dConstruct talks, especially Warren Ellis and Mandy Brown. Cloudy and 14°C outside.
 
IDENTITY CRISIS: Hello, new readers! A bit of orientation: I’m a professor at a small engineering college outside of Boston. Besides my formal technical background (primarily biological/biomedical materials and engineering physics) and my current research areas (which include understanding engineering student experiences as well as gender and STEM), I have a long-standing interest in design, culture and technology. Also, I moved to the United States from Canada, which explains my highly inconsistent orthography. You can get even more of it on Twitter.
 
AGE OF NON-CONSENT: On my way home from the airport last week, I got into a cab that had a TV screen in the passenger area (as is now common in Boston and other cities). As I always do, I immediately turned it off. A few minutes later, it turned itself on again. That got me thinking about this amazing piece by Betsy Haibel at Model View Culture, about ‘when mistreating users becomes competitive advantage’, about technology and consent (seriously, go read it; it’s more important that you read that than this). I had started thinking more about how technology is coercive and how it pushes or crosses the boundaries of users a few weeks ago, when I got a new phone. Setting it up was an exercise in defending my limits against a host of apps. No, you can’t access my Contacts. No, you don’t need access to my Photos. No, why the hell would you need access to my Location? I had to install a new version of Google Maps, which has crippled functionality (no memory of previous places) if you don't sign into Google, and it tries to convince you to sign in on every single screen, because what I obviously really want is for Google to track my phone and connect it to the rest of my online identity (bear in mind that the only objects that have a closer average proximity to me than my phone does are pierced through bits of my body). Per that Haibel article, Google’s nagging feels exactly like the boundary-crossing of an unwanted suitor, continually begging for access to me that it has no right to and that I have no intention of providing.

This week, of course, provided a glorious example of how technology companies have normalized being indifferent to consent: Apple ‘gifting’ each user with a U2 album downloaded into iTunes. At least one of my friends reported that he had wireless syncing of his phone disabled; Apple overrode his express preferences in order to add the album to his music collection. The expected 'surprise and delight' was really more like 'surprise and delete'. I suspect that the strong negative response (in some quarters, at least) had little to do with a dislike of U2 and everything to do with the album as a metonym for this widespread culture of nonconsensual behaviour in technology. I've begun to note examples of these behaviours, and here are a few that have come up just in the last week: Being opted in to promo e-mails on registering for a website. Being forced by Adobe Creative Cloud into a trial of the newest version of Acrobat; after the trial period, it refused to either run Acrobat or ‘remember’ that I had a paid-up institutional license for the previous version. A gas pump wouldn't give me a receipt until after it showed me an ad. A librarian’s presentation to one of my classes was repeatedly interrupted by pop-ups telling her she needed to install more software. I booked a flight online and, after I declined travel insurance, a blinking box appeared to 'remind' me that I could still sign up for it. When cutting-and-pasting the Jony Ive quote below, Business Insider added their own text to what I had selected. The Kindle app on my phone won’t let me copy text at all, except through their highlighting interface. When you start looking for examples of nonconsensual culture in technology, you find them absolutely everywhere.

Once upon a time, Apple was on the same side as its users. The very first iMac, back in 1998, had a handle built into the top of it, where it would be visible when the box was opened. In Ive’s words, ‘if there's this handle on it, it makes a relationship possible…It gives a sense of its deference to you.’ Does anyone feel like their iPhone is deferential to them? What changed? Part of it is what Ethan Zuckerman called ‘the original sin’ of the Internet, the widespread advertising-based model that depends on strip-mining user characteristics for ad targeting, coupled with what Maciej Ceglowski describes as ‘investor storytime’, selling investors on the idea that they’ll get rich when you finally do put ads on your site. The other part is the rise of what Bruce Sterling dubbed “the Stacks”: Google, Apple, Facebook, Amazon, Microsoft. Alexis Madrigal predicted, “Your technology will work perfectly within the silo...But it will be perfectly broken at the interfaces between itself and its competitors”, and that can only be the case if the companies control what you do both inside and outside the silo. And, finally, of course, our willingness to play ball with them (hence my reluctance to sign into Google from my phone) has eroded in direct proportion to our trust that the data gathered by companies will be handled carefully (not abused, shared, leaked, or turned over). Right now, a large fraction of my interactions with tech companies, especially the Stacks, feel coerced.

One of the reasons I care so much about issues of consent, besides all the obvious ones (you know, having my time wasted, my attention abused, and my personal behaviours and characteristics sold for profit), is the imminent rise of connected objects. It’ll be pretty challenging for designers and users to have a shared mental model of the behaviour of connected objects even if they are doing their damnedest to understand each other; bring in a coercive, nonconsensual technology culture and it doesn't take a lot of imagination to consider how terrible those objects could be. The day before Apple’s keynote this week, London-based Internet of Things design firm BERG announced that they were closing their doors (although I prefer to think of them as dispersing, like a blown dandelion clock). The confluence of their demise with Apple’s behaviour made me extra-sad, because BERG were one of the few technology companies that really seemed to think of their users as people. Journalist Quinn Norton recently wrote a fantastic piece on the theory and practice of politeness, "How to Be Polite...for Geeks", which could just as easily be "...for Technology Companies". The Google+ 'real name' fiasco and Facebook's myriad privacy scandals could have been averted if the companies had shown some empathy for their users and listened to what they said, instead of assuming that we are all Mark Zuckerbergs. As well as laying down some Knowledge about Theory of Mind and Umwelt, Quinn notes that politeness is catchy: social norms are created and enforced by what everyone does. I commute by car daily in Boston, but I spent a year on sabbatical in Seattle. The traffic rules in Boston and Seattle are virtually identical, but a significant chunk of driver behaviours (in particular, the ones that earn Boston drivers the epithet of 'Massholes') are the result of social norms, tacitly condoned by most of the community. And driving is regulated a lot more closely than tech companies are.

I don’t know what it’ll take to change technology culture from one that is nonconsensual and borderline-abusive to one that is about enthusiastic consent, and it might not even be possible at this point. All I really know is that it absolutely won’t happen unless we start applying widespread social pressure to make it happen, and that I want tech companies to get their shit together before they make the leap from just being on screens to being everywhere around us.
 

The 'seeds' of a strawberry are actually the fruits ('achenes'); the part you eat is accessory tissue.