A short one, today. I'm on vacation in Missouri with my family and my wife's parents before I fly out to Perth, Australia, to speak at the xmedialab Video+Social conference. The farm is quiet and relaxing (albeit a bit snowy today), the cows are fed and my 13-month-old son has just figured out how to clap.
1.0 Wearable Reckons
Okay, so there's a lot going on with wearables. Google just released/announced an Android SDK for them, and apparently if Apple doesn't release an iWatch in the next 60 days, the end will be nigh for the company that's been forever teetering on the precipice of disaster and irrelevancy. So here's a bunch of short reckons about wearables:
Obvious reckon number one: if you're thinking about wearables you should distinguish between sensing and interface.
If you're sensing, then you'd better have either novel sensors or outstanding data manipulation. The sensors themselves have become pretty much commoditised: it's trivial to whack a three-axis accelerometer in a thing, and everything has one in it now. So you'd better be able to deliver on the promise of your sensing, which, to be honest, probably doesn't live up to what the public thinks your device does, no matter what you actually tell them. Fitbits and Fuelbands should measure *effort*, in a consumer's eyes, but don't, and the degree to which they're able to fulfil that promise relies upon your smarts with data.
If you're interface, well: that's the one that's a lot more complicated, and where it feels like no one's really done anything good yet. It should be pretty obvious that Google's Android Wear demo/showcase videos[2, 3] are pretty much the best anyone's done so far, but that's damning the field with faint praise when players like Samsung are just taking advantage of Moore's law, scrunching and integrating COTS hardware and whacking a not-really-thought-through interface on top, while Pebble is clearly catering for the early-adopter (and niche) geek market. Designing the interface for a wearable requires that you actually have a pretty good idea of what it's going to be used for, which is why I find examples like the red/green LED interface of Nike's Fuelband and the vibration interface of the various Jawbones useful and novel: they look as if people have given some consideration to the use-cases. And wearables, for reasons of industry myopia, have focussed only on the wrist, when that doesn't have to be the only form factor. Earpieces are technically wearables too, and it's quite easy to spot the fake analysts amongst the ones who have proper, professional reckons: the real ones are looking for opportunities across the entire body, not just releasing a watch because everyone else is.
Wearables are a great opportunity, for example, for coarse, contextual information. Given that most of them require tethering to a master device like a smartphone that's already on or relatively near your person, they have to do a different, better job than you could accomplish by simply pulling out your phone. Or, they have to do so in a qualitatively different way. This doesn't mean stuffing the wearable with an OLED screen, for example - because screens are screens are screens. But if you reduce the information density of the wearable device (for example, something that can only vibrate), that forces you to think about what it can be used for and how it can contribute to (if you want) some sort of contextual situational awareness without having to pull out a phone every single time you get a notification.
One of the promises of wearables is that you end up being surrounded by a personal cloud (ha) of information processing and interface devices: a suite of sensors and a way of interpreting and internalising that information.
We aren't even at the point equivalent to when there were enough crap MP3 players for Apple to know that they could nail the solution, so I'm not holding my breath for anything coming from them soon. And even if it does, we'd all do well to remember that first Slashdot review of the iPod.
2.0 Mirror Neurons
A very quick one this. Mirror neurons spring in and out of the collective pop science/pop cognitive science/pop evolutionary biology consciousness every now and then, but the main gist is this: there are neurons, discovered first in monkeys and then possibly in humans, that fire when an action is performed (e.g. reaching out and grasping an object) but *also* fire when that action is observed. Some scientists, like V. S. Ramachandran, believe that the existence of mirror neurons is a big signpost as to how human beings uplifted themselves into consciousness, because once you have mirror neurons it's easier to see how things like empathy and theory of mind emerged: indeed, studies have linked mirror neurons in monkeys to understanding the goals and intentions of others.
So the quick, terrifying reckon is this: if our western capitalist world's corporations by their nature reward sociopaths with leadership positions, then what does that say about a possible instinctual, neurological-level bias toward mirroring the behaviour of those sociopathic leaders?
Also, something something new science fiction book to do with zombies, mirror neurons and YouTube.
3.0 More Google
It's a surprise that there isn't more fiction about or inspired by Google. At least, it is to me. They're such a stupendous influence upon the world right now (well, they are if you live in certain circles) that you'd think they would figure in literature more. Or perhaps the only literature they *could* figure in is the SFNal kind, which is never going to get any love. I mean, if you think about it in that freshman three-in-the-morning deep philosophical kind of way, writing books that include Google - especially contemporary fiction, never mind that science fiction stuff - is kind of like writing a book that doesn't include a mobile phone or electricity.
That said, here's some great Google-related fiction that's also an excuse for me to point some books your way.
Robin Sloan (hi, Robin!) has written a few wonderful pieces, and it's a strand that runs from his seminal EPIC 2014[1, 2] through his sublime novella Annabel Scheme, which I can't really describe for fear of spoilers, suffice to say it's set in a San Francisco heavily influenced by a Google-like entity, all the way to his most recent novel, Mr. Penumbra's 24-Hour Bookstore, which definitely references Google in a pretty plot-fundamental way.
Alongside that, I've been reading William Hertling's Avogadro Corp, whose name should be a dead giveaway that it's been influenced by a certain search-based corporation named after a mathematical concept. Bonus points for being based in Portland, of course.
And lastly, Dave Eggers' The Circle, which, as Eggers points out, was researched in no way at all and bears no resemblance whatsoever to any Valley-based social network or search company. Eggers hasn't written a realistic book at all, but he's certainly written a skeweringly painful one. And if you work in the Valley and you *don't* think it's skeweringly painful, and don't see the tropes that he's lambasting, then you probably need to move away for a bit, unplug, and there's a chance you should engineer your personality so you're not Robert Scoble.
PS. You'll be excited to know that the Amazon Affiliate links have earned my son's college fund *nearly sixty cents*. So if you've been worrying about clicking on those links, er, don't.
Okay, that's it. Tomorrow I do the whole MCI->LAX->SYD->PER thing which means in theory I should be able to write a newsletter for Tuesday, but given that I don't actually land until *Thursday* local time, I'm not exactly sure what will happen.
As ever, send wonderful notes. Or send ones that are really mean and nasty (I haven't actually received any mean and nasty ones yet) - if you do, I'll just delete them.