My head is slowly imploding and feels like it's undergoing the first stage of the z-machine initiating fusion. I *think* that's a good thing.
1.0 Critical Infrastructural Vulnerability
I am still thinking about Heartbleed and everything that's going on with it. There are a bunch of explainers - the xkcd one is pretty good and the jwz rant about this all, ultimately, being Dennis Ritchie's fault is pretty funny (and by funny I mean not-funny-if-you-have-anything-to-fix funny). But I did get a good note back from Nick Sweeney regarding my throwaway comment about an equivalent vulnerability being found in concrete: unscrupulous builders making substandard concrete and pocketing the difference in material costs[3, 4].
Now, with my lawyer hat on (for my sins I trained as a lawyer, and it's a style of thinking that you can't really extract from people), I'm thinking of other similar large-scale vulnerabilities with costly implications. One might be: hey, so it turns out there's a structural vulnerability or danger in plastic, and plastic happens to be everywhere from kids' toys to industrial applications. But then another example sprang to mind which, if I were in the IT industry, would probably scare me a bit shitless.
See, we didn't know asbestos was dangerous. And then we found out that it was dangerous. And then the lawyers and governments and regulators went *mental*, and the chain of liability was basically a nightmare (but, on the other hand, still not a great resolution for people who, you know, were actually affected by lung disease).
So let me tell you a story about C and unmanaged languages.
C was a high-performance language, relatively easy to learn, and had been around for ages. It turned out that lots of people knew how to write software in it, and it was highly portable. It spread everywhere. It was, like asbestos, a great construction material. A great medium to make things with and in.
The problem with C, it turned out, was the null-terminated string. Because, sure as damnit, a language that handled strings with null termination ended up with a stupendously critical *built-in* weakness: one that, at the time, actually might have made sense as a local optimisation.
But now we have the mess that we're in: a bunch of buffer over- and underruns, and having to bolt on hacks like address space layout randomisation and no-execute bits, and Jesus Christ this is a mess, and no, maybe no one's actually *died*, but you can bet there are lawyers out there who would be salivating at the financial implications and totting up causal loss.
So then you look at who to blame. And you look at the Wikipedia list of defendants in US asbestos cases and it's, well, a bit of a nightmare.
Is this an asbestos moment? Probably not. But hey, in the land of the US you never know when one might actually come along.
But know this: it doesn't look like there've been any recalls. I certainly haven't had any emails from whoever about embedded OpenSSL in any of my consumer electronics devices that now need patching or, because of firmware issues, outright replacing. And with that comes another question: how much liability can you actually disclaim when it comes to such fundamental infrastructure?
Some good notes from people (James Bridle, Matt Biddulph, Dan Hill) from yesterday's bit about me having to change my pronunciation for Xbox's Kinect voice recognition to acknowledge me.
James made a good point that took my analogy further. I was speaking purely from a linguistic point of view: that in speech acts with another speaker, we adjust our acrolect - accent, dialect, slang and so on - to a happy medium where we're both sure we can communicate. James points out that what we're looking at instead, with something like adjusting our behaviour so that an Xbox can understand our speech, or so that a Palm Pilot can understand our writing, or so that Google's services understand what we're searching for, is acculturation: cultural adaptation.
There was a bit in Re/Code about this, put forward by Monica Rogati, VP of data at Jawbone, about (and I apologise in advance) "data natives" as a new, er, tribe, as distinct from "digital natives". Data natives, says Rogati, are people who grow up expecting big-data smartness to inhabit the objects that they interact with. One of her examples is that a data native would expect a Nest thermostat to program itself, whereas a digital native would expect to be able to program a thermostat. Which, fine, but I'm pretty sure you don't have to be a data native to "expect" that, and anyway, I've grown pretty good at having a feeling for what you can sell people, and stuff you don't have to program (hi, VCRs) is always good. You're selling the benefit, not the how.
Anyway. There is perhaps something in the idea of a group of people who intuitively believe in, if you will, an *animating spark* of sentience, or at the very least agency, in the objects that they interact with in daily life. But again, here's a reckon: people are like that anyway. They assign agency to things all the time and we love anthropomorphising stuff. To the extent that my mate has a house that tweets (in, I would like to add, a valuable experiment in terms of tone of voice and accessibility in the age of the internet of things).
But Rogati's vision:
"A Jawbone Up wristband turns on your Philips Hue lights and starts your WeMo-enabled coffeemaker when you wake up. Water heaters and thermostats learn from your usage patterns and save energy. Connected door locks and doorbells make safety more convenient."
is of a world that we would like to exist, but we're pretty sure won't, because things just don't work perfectly. They fail, or miss, or fall short. And, for that matter, most of her examples in that paragraph aren't about big data; they're about connectedness, about being plugged in to the network. The only data-driven stuff is the water heater and the thermostat.
But then the big point is this acculturation point. We already adapt ourselves to our environments - you know this whether you live in a city or out in the countryside. And we already adapt our environments to ourselves. The question of how we adapt to an algorithmic environment is the big one, especially when so far we've been adapting to (more or less) visible forces. Culture, too, is visible, even though some of its effects (misogyny, for example) aren't instantly clear. But algorithmic effects, when we're not, as a species, algorithmically literate?
We're ingenious creatures, us humans. We probe systems and do unexpected things with them: it's part of what's made us the most successful mammal on the planet (if there'll still be a planet we can inhabit in a hundred years' time). So in the same way that people hack airline miles systems (not everyone, but enough people), we'll be hacking and adapting our behaviour to the algorithms that increasingly touch our world.
A quick one this. Over breakfast, talking with someone who has the misfortune (paraphrasing his words) of working with the US banking industry, I realised that perhaps *one* of the reasons why the Valley investor elite are obsessed with bitcoin is not necessarily the mining and the blockchain or anything (though perhaps more the latter than the former), but really, because it's a chance to disrupt the batch-processing fucked-up nature of banking that's existed since the 1960s with computerisation. That stuff is *mental*. And any chance to build something new, and perhaps some sort of bridging currency-abstraction-layer, is super interesting.
And that's before you get into whether technology can even solve a problem like a sizeable percentage of people not even having a bank account in the first place. (But do they have mobile phones, right?)
Ok, so a few things to unpack with Dropbox's Carousel.
One: it's called fucking Carousel. I would like to give the Valley more credit, but let's be honest: I'm pretty sure it's named after that Don Draper moment. Smart? Yes. Original? Not really. A bit depressing as to the lack of originality? Yes. I have this whole thing going on where Oh My Jesus Christ are your stereotypical developers and engineers really, really fucking dismissive of *good* marketing, branding and ultimately communication. On the one hand, there's the worshipping of Apple and their products and their design, and I guess some conversation about their advertising and the strategy behind it, but Jesus Christ. Try harder. And also: in a conversation this morning with my boss, we got on to the Steve Jobs tapes, where Jobs argues that *in an undifferentiated market where goods and services are easily substitutable* (and I would say "a thing that automatically backs up your photos and presents them to people" *totally* fits in that category), the *only* thing that differentiates you, once you hit a certain degree of product sophistication, is your branding and marketing. And there are certainly enough well-funded startups out there that can do this. Sure, there are moats that you can build around you, but this is a Nike/Adidas situation: the products have a <10% utility distinction in use with the audience.
Two: the launch video is already being lauded as an instant classic in advertising, bringing the American Dream back to life, for which my only reasonable and printable response is a big sigh and a *really*? From an inside-baseball perspective, and I say this with full knowledge that my day job is doing the same job for Facebook: the Dropbox Carousel spot is a competently put together video of *meh*. They could've done so much better.