November 23, 2014

Jobs for Horses :: US+TECH #7

A few months ago, CGP Grey (one of my favorite YouTube channels) made a video called Humans Need Not Apply. It's all about the idea that increasing automation of the workforce will leave a large swath of humanity permanently unemployed in the near future. The argument goes like this: robots can do many jobs cheaper and better than humans can. In fact, they already are. As more and more robots are developed that can do more and more things, more and more people in the workforce will find themselves less and less employable. And while there are certainly jobs that are unlikely to be replaced by robots any time soon (like computer programmer), there aren't enough of those jobs available for everyone to suddenly transition into them.

The argument against this sort of nightmare scenario usually says that improvements in technology will create new jobs we can't even imagine today, replacing the old jobs they eliminate. So no one actually loses out; they simply transition into the new jobs. But this doesn't work. There are just too many jobs likely to be eliminated, and too few jobs available to replace them. During the Great Depression, US unemployment reached as high as 25%. In "Humans Need Not Apply," CGP Grey predicts a permanent unemployment rate as high as 45%.

This is a scenario that many programmers today are wittingly or unwittingly contributing to. The job of a software developer is to make software that can replace human work. Instead of having a person keep track of all the contractors on the build site, you have an automated system with a database of everything. It doesn't call in sick, it doesn't take vacation, it's cheaper, and it's more consistent. And all of this is (theoretically) good for the people who can now do their work faster and better because of this reliable machine they're using. This is why software developers do what they do, and why the skill is valued so highly in society. Being able to overcome the limitations of human workers through infallible and tireless machines is a win in the short term.

But in the long term, more and more machines will replace more and more people, and the good the programmers have done becomes a societal evil as it leaves large chunks of the population out of work. Essentially, my job as a software developer is to make you unemployed. Not so nice now, huh?

When you think about ethics and computing, you likely think about privacy and security. About respecting other people's information, and making sure to safeguard it appropriately. You don't really think about what is in the end a much more central question: is automation itself ethical?

There's an example Grey uses to combat the idea that the growth of technology will make new jobs for humans. He describes two horses near the turn of the 20th century talking about this brand new "car" thing they're hearing so much buzz about. One horse is worried that it will leave them out of a job. But the other says not to worry, because they'll get to transition away from the hard jobs of pulling tractors around in fields or hauling carts around in cities and move on to cushier work. Everything will be fine, the horse says.

Except that's not what happened. Horse populations peaked in the early twentieth century, then declined as the automobile grew in popularity. The idea that new technology makes better jobs for horses is clearly silly. But swap out "horses" for "humans" and people suddenly find it sensible.

I, and many others like me, am building the cars of the new millennium. They are fast, they are cheap, they don't make mistakes, and they are coming for your job. I am sorry about it, and I wish I could say that the things we make will only make life better, but there is a real possibility that in the near future we'll see Grey's nightmare come true. And even if we try to stop it, the economic forces encouraging its onset are too strong. It is coming, and in its wake there will be far fewer jobs for horses like you.

---

What do you think? Do you believe that this scenario will come true? What should we do if it does?

Just hit reply to share your thoughts. I'd love to read them.

— Andrew Brinker