The Pew Research Center, an American non-partisan think tank, recently released the first part of a report covering the changes America has undergone during the first 25 years of the World Wide Web.
For whatever reason, there has been a wave of nostalgia palpably moving through a lot of my friends. There's been Fast Company's Oral History of SXSW Interactive, which celebrates its twentieth birthday. There's been Nick Sweeney's wonderful look back at the tenth anniversary of the O'Reilly Emerging Technology Conference, and, at the same time, Time's story of Flickr's tenth birthday.
This is going to be a bit introspective and may come across as self-indulgent. I apologise in advance.
I started blogging in 1998/9, on my college's student webserver, with a tilde address reflecting my university computing service username. It was, I'm (somewhat) happy to admit (after a bunch of therapy, probably), undoubtedly immature, but there's a long, direct line from 1998/9 on www-stu.cai.cam.ac.uk/~dyh21/ through to where I established my home at danhon.com.
Back then - and it wasn't really *that* long ago - publishing was still hard. I'm not telling you anything you don't already know - we can all play the Four Yorkshiremen sketch and talk about how we hand-rolled our websites with Notepad or BBEdit or vim or emacs and laboriously copied them over via FTP, and had to remember whether we needed active or passive FTP, or slapped our foreheads when we realised we had transferred images as ASC rather than BIN. Back then, of course, it was all fields as far as the eye could see.
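(For anyone who never had to play that game: the ASC-versus-BIN distinction is still fossilised in Python's standard-library ftplib. Here's a rough sketch of the ritual - the extension list and helper names are mine, purely illustrative, not anything anyone actually used:)

```python
# A sketch of the old hand-rolled upload ritual, via Python's stdlib ftplib.
# The text-extension list and function names are illustrative assumptions.
from ftplib import FTP
from pathlib import Path

# Files safe to send in ASCII mode; everything else must go as binary.
TEXT_EXTENSIONS = {".html", ".htm", ".txt", ".css"}

def transfer_mode(filename: str) -> str:
    """Return 'ASC' for text files and 'BIN' for everything else."""
    return "ASC" if Path(filename).suffix.lower() in TEXT_EXTENSIONS else "BIN"

def upload(ftp: FTP, filename: str) -> None:
    # Getting this wrong was the forehead-slapping moment: an image sent
    # in ASCII mode gets its bytes mangled by line-ending translation.
    with open(filename, "rb") as fh:
        if transfer_mode(filename) == "ASC":
            ftp.storlines(f"STOR {filename}", fh)   # ASCII-mode transfer
        else:
            ftp.storbinary(f"STOR {filename}", fh)  # bytes sent untouched
```

(And `FTP.set_pasv(True)` was the other half of the incantation, for the active-versus-passive guessing game.)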
Let's skip that bit.
Let's instead concentrate on how the web felt at that time.
Because it felt small and it felt intimate. It wasn't, of course. Millions of people were on the internet back then. But, for a while, you could count on both hands (even if you weren't using binary) the number of self-identified "webloggers" in the UK at the time. And we would do things like go and meet up and awkwardly talk about how weird it was that we were all publishing things on the web.
Funnily enough, hardly any of that original group write much on the web anymore.
I don't really know how to talk about this other than in an intensely personal way. We were living at least a portion of our lives in public (or, at least, exposing the parts of our lives we were happy exposing, in public) and a number of us remember having to explain to people that the information we chose to publish about ourselves was not, in fact, the totality of our selves.
The web (at least, our bit) felt small because it *was* small. We didn't have to worry, as much, about other people, because, for the most part, other people weren't looking. Similarly, when Twitter first started its service, for the first fifteen thousand users, there wasn't really a need to have a private account. That changed, later.
The web was a room back then, and it felt like we were the only people in it, and we formed intense relationships and camaraderie with the others we found, even if, in fact, the only thing we had in common with them was that we were almost obsessively interested in connecting and what this network could do for us as people. And we felt like we could know them.
The web is different now, of course. It's so much bigger. As of 2014, 87% of American adults use the internet. In households earning more than USD 75k, it's 99%, and among young adults aged 18-29, it's 97%. You can't hide on it anymore.
I can see that this means that the feeling of a wide-open expanse, populated sparsely but still with strong connections, has gone away. That wide-open expanse is incredibly populated now. There are so many groups. Instead of the one knitting blog ring, there are multitudes. What's happened instead is that a singular community has not so much splintered and fractured as been fractally copied and pasted, self-similar versions spammed across the internet in a reflection of every single thing about us that makes us unique, good and bad. By necessity, those groups are potentially harder to find because, well, there are multitudes of them, instead of just the one that some of us grew up with. (And, it has to be said, it turned out that some of us had nothing in common *other* than an attraction to the network, back then, and didn't end up forming long-term friendships.)
I don't personally know if the web feels anything like the way it did when I started participating in it. If the experiences that I was a part of and created can still be reproduced in some way. One of my friends laments the smallness of that old web and, I think, the defined small social space that we implicitly had, rather than explicitly created. None of us really write online anymore, at least, not in the sense that we used to blog regularly. Instead, Twitter takes up time and, honestly, the intervening fifteen years have inevitably changed us as we've grown up.
My initial reaction to my friend's sadness was instead a sort of optimistic "but a thousand, million, million flowers bloomed!". But now their sadness has caught on - and I wouldn't be surprised if it had something to do with the wet Portland weather outside - and the more I think about it, the more I do miss it. The non-corporate, small web. I'm not even necessarily saying that I wish things had stayed the way they were (barring, of course, the wonderful methods of surveillance now perpetrated upon us) - the growth of the web has been a good thing. I don't even want to make some sort of trite gentrification metaphor of interests coming in, buying up and then bulldozing nascent communities, in the way that you could interpret Yahoo! doing to Geocities. Whatever; more people than ever before are online. And on balance, that's a good thing.
I was watching Bruce Sterling's talk at Transmediale 2014 and a number of things collided in my head. He railed, as you would expect him to, against the latest protrusion of West Coast Randian boosterism into our world, whether we liked it or not, and its inevitable metamorphosis into the sort of security and surveillance state protested by its very architects back in the 60s. And he introduced me, embarrassingly, to Barbrook and Cameron's critique of the Californian Ideology[2, 3] - embarrassing not least because I spent close to a year working with Andy Cameron (at an ad agency, of all places), who introduced me to Richard Barbrook.
What does the Californian Ideology look like now?
It looks like Software Eating The World, and by extension, The People Who Live In It. Software Eating The World, of course, refers to Marc Andreessen's 2011 essay in The Wall Street Journal on the opportunity for software and computing power to disrupt (with a lower-case d) existing industries and provide opportunity for massive structural change in businesses like finance, logistics, energy, healthcare and education. Andreessen's essay - notably in the *way* in which it is presented, but not necessarily the content itself - is in a way the non-offensive, good-cop face of the Californian Ideology. The story of inexorable progress and American exceptionalism. Fair enough: the logical end-game of the seeping of information technology into everyday life and its promise of increased productivity in the name of efficiency.
Andy Kessler, on the other hand, is the bad cop face of the Californian Ideology, the "I'd be happy to call him a cunt," kind of Wall Street and West Coast capital veteran, who gleefully exhibits stereotypical traits of corporate sociopathy, so much so that his 2011 book on entrepreneurship is entitled, with corresponding lack of empathy or moral centre, "Eat People".
Kessler's 2011 book is a rulebook, a manifesto for aspirants wanting to share in the wealth of the new American software-backed entrepreneurial gold rush, of which one central tenet is that "the road to wealth passes through the graveyard of today's jobs." Indeed, Kessler maintains, "the best way to leverage abundance and scale, and to create productivity, is to get rid of people."
This is the new Californian Ideology, and it isn't for everyone. Not by a long shot.
To be fair, Kessler doesn't mean *all* people should be eaten. He just means some of them. Alright, most of them. The ones who can stay are those he terms "creators": bluntly, those - and only those - who in his view create productivity. They are those who write "code that automatically lays out magazines, getting rid of expensive graphic designers. Or design a robot to pick and place a product into an Amazon box for shipping, getting rid of workers in a warehouse. Or writing algorithms for trading stocks in milliseconds, something humans couldn't do even if you threw a thousand of them at the problem."
The more astute of you will see purely financially oriented creation here, but Kessler will throw some of the others a bone. Creators may also "come up with a drug that lowers the risk of some debilitating disease, or a test to identify cancer five years early." A creator, in Kessler's view, can be a creator by "increasing output by saving lives and keeping people healthy, too".
Kessler places the ultimate value of a human in their ability to increase output per worker hour. To do otherwise is to impede inexorable human progress and the raising of standards of living for everyone, everywhere.
Everyone else, he says, is a Server. In Kessler's world, the Creators are the Eloi and the Servers their Morlocks.
Why am I bothering with what uncharitably feels like a hit piece on such a sociopath? Because Kessler's updated Californian Ideology is an especially pernicious view upon the world, growing increasingly shrill as it feels increasingly threatened. One populated by Tom Perkinses who only half-jokingly (but not really?) suggest that the rich should have more votes than the poor (because they are, naturally, more valuable) and that the attacks on the 1% by movements like Occupy are some sort of Kristallnacht (pro-tip: if you are suggesting something is like Kristallnacht and it is not, in fact, Kristallnacht, it probably isn't Kristallnacht). One also that shows up in advertising, with Cadillac's recent Poolside ad catching flak for apparently going after (variously) the 1%, or the 1.2%.
Kessler and Perkins are the sharp, pointy, massively wealthy (and wealth-accumulating) stick at the logical end-point of the bright-eyed optimism of this formerly counter-culture Californian Ideology.
You see, signs are increasingly apparent that the recovery from the 2001 era isn't a bounce-back recovery (not that it was ever a bounce-back). Jobs that vanished stayed vanished, and have been replaced by zero-hours contracts. Kessler talks of Creators/Makers and Servers/Takers, and exhorts as many people as possible to become Makers, lest they become parasitic drags upon society and humankind.
Kessler doesn't go far enough. He is implying some sort of productivity Moore's law where ever increasing human ingenuity, applied in the right ways to root out inefficiency, will increase the standard of living for humanity as a whole.
Bluntly: fuck that shit.
The principles espoused by Kessler are nothing but a hollowing out of society, the pursuit of nothing but the almighty profit motive, dressed up in some vague handwaving about increasing quality of life for the citizens of the world. Bullshit. While in broad strokes capitalism has indeed been a means by which to lift a sizeable proportion of the world from subsisting on cents to dollars per day, it has fallen to individuals like Bill Gates, in the guise of his Foundation, to finish the job.
Walmart and Amazon certainly are succeeding in widening access to low-cost goods that enhance your quality of life. But the way Walmart and Amazon treat their workers is counter-productive if you take Kessler literally. In general: more work, done by fewer people, for less money, because the productivity increases are accruing at the top. The Amazon employee who cannot afford what it is that they're selling, who does not have a safety net.
Ah, but Kessler says we have a safety net. Well, we don't, not really. Not in America, at least. A few days ago, an email went out to all staff in our office, appealing for donations to the order of $75,000 for a friend of the agency with a recent diagnosis of brain cancer. There's no safety net. Kessler is not advocating a universal basic income.
I don't know what it is, exactly, that Kessler hopes to achieve, other than that his fixation on the profit motive makes me gag. I am tempted to recall part of Joseph N. Welch's outburst during the McCarthy hearings:
"You’ve done enough. Have you no sense of decency, sir? At long last, have you left no sense of decency?"
This sociopathic, shrug-shouldered "don't look at me" attitude of increasing productivity at any cost is dangerous because it inspires people who don't know any better to write Medium thinkpieces like On Eating People.
Teachers are important. Doctors are important. Poets, Artists, Musicians, Builders, Lawyers, Parents - I don't care who you are, you're all people and you don't deserve to be ground up and spat out if you're not efficient or productive enough. Sure, there are other systemic problems and no one likes regulatory capture or the fact that Comcast has a government mandated monopoly and is able to raise your cable service fee every single year.
But for the love of god, can we agree that our future is one predicated on empathy with one another, and not on the monomaniacal, sociopathic worship of profit?
Three years ago, AT&T won a Supreme Court decision that affirmed their ability to include a clause in standard terms and conditions blocking class-action lawsuits and instead compelling binding arbitration.
Sony Network Entertainment International (the corporate entity operating the Sony PlayStation network) more-or-less immediately followed with an amendment to the terms and conditions of the PlayStation network, similarly enforcing the practice of waiving class-action rights and instead compelling individual binding arbitration. Sony did provide a way to opt out of that particular waiver (if, of course, you were able to spot it in the updated terms and conditions) - you would have to opt out in writing, to a provided address, within thirty days of acceptance of the EULA/Terms and Conditions. From a user-advocacy point of view, if *acceptance* of terms and conditions is perfectly valid when clicked through on a tens-of-thousands-of-words EULA, it's hard to see why opting *out* should only be effective when made deliberately difficult.
Last week, Dropbox, the VC-backed provider of the online file-sharing and syncing service, announced[3, 4] a similar change to their terms of service on their blog: they would, in their words, now be using arbitration to resolve disputes.
The difference in Dropbox's adoption of binding arbitration was that their TOS change was accompanied by a (dare I say it) accessible, usable and friendly method of opting out.
This is pretty much a perfect illustration of the point I was trying to make in episode nineteen, in Not Trying Is A Signal. In that episode, I said: it isn't hard to do this (in this case, provide an opt-out service).
The fact that Dropbox have done this in this particular instance proves the case, and Dropbox makes a point of it in their blog post: "If you prefer to opt-out... there's no need to fax us or trek to the post office". Sure, it's a shitty thing to do, but the fact that they've provided a usable alternative is a signal that, more than Sony Network Entertainment International, they actually give a shit about their users.
This is what being user-focussed means, and I think the Dropbox example shows a way forward where companies can still do shitty company things, but can *execute* those things in ways that aren't, for the most part, stupendously user-hostile. And I hope that whoever at Dropbox was behind this particular policy - ie the one that said that they didn't have to make the opt-out process user-hostile - gets credit for this, and that Dropbox don't shy away from doing such things in the future.
This one simply collates a bunch of thoughts I had about the skeletal recognition engines in depth cameras like the Kinect for Xbox 360 and Kinect with Xbox One.
The Skeletal Recognition Engine from Dance Central, but:
- used in a body language game where you have to sit across the table from someone and lay them off;
- in a game set in a restaurant where you have to gesture wildly and order food from a menu you can't understand;
- to teach you ever more complicated gang signs (and chain together moves, obviously). DDR but for your fingers;
- used in a Quantic Dream game where you have to match jerky escape movements in order to prevent your rape;
- in a cutscene where you have to attract someone's attention at a gig by shuffling around, because it's too noisy.
My wife asked me why I randomly tweeted these out the other day. I think they were prompted by the news of GCHQ's Project Optic Nerve and their wistful thinking that there was probably a bunch of useful data they could capture via Kinect.