If You Don't Fear a Robot Takeover, This Futurist Explains Why You Should


Take a cursory glance at the technology transforming the world and you'll likely find the outlook positively sunny. Self-driving cars and factory robots have adorable, non-threatening faces. Apps that keep massive stockpiles of our nudes are represented by chill little ghost emoticons. Buzzing drones hold sprigs of mistletoe over unsuspecting pedestrians for a Christmas surprise.

But on the dark side of that same moon, robots are taking our jobs, social media platforms monitor and manipulate our every move, the government uses our cellphones to administer broad spying programs, and remotely piloted planes kill civilians halfway around the world.

A Dangerous Master, a new book by academic and futurist Wendell Wallach, takes us on a tour of the nefarious possibilities technological innovation can lead to. It's not a light read if you're not already familiar with Predator drones and hacking the human genome. But it's a perfect guidebook to the potential threats mankind faces if we continue along our current trajectory of unchecked innovation.

We spoke to Wallach about the threats we face and what we'll need to do in order to subdue them.


Mic: You argue that when it comes to defending us from the dangers of certain kinds of technology, we're asleep at the wheel. So who's holding the reins right now?

Wendell Wallach: Nobody, not when it comes to comprehensive oversight. You just have a lot of stakeholders pushing for their own goals. Even when something comes to the attention of a regulatory agency, the responsibility gets divvied up among so many of them that it doesn't get dealt with.

Like what?

WW: Well, take small drones in U.S. airspace. The Federal Aviation Administration was challenged on privacy from the get-go by the ACLU, so they used that as an excuse to not do anything. Now Congress is making them introduce rules, but they've been really slow. Then, when their terms angered companies that wanted to make deliveries, like Amazon, they were immediately challenged by a consortium.

Meanwhile, we have drones dropping onto the White House lawn. There's a certain incoherence, a failure to address the issues.


And you say that all of the luminaries and academics in universities and organizations just pay "lip service" to solving these problems?

WW: It's a troublesome thing, but the reality is we're not training people to think about these issues, and we're not creating institutions to address them. But once in a while you see someone actually promoted for having one or two of these interdisciplinary skills, sure.

You talk in the book about the dopamine release we get from consumer technology and phone addiction — that this is how "the computer trains us to be its servant." Is that what you mean by a "dangerous master"?

WW: Not so much in the sense of becoming actual servants of technology. Look at self-driving cars: The technology there is moving into the actual driver's seat as the primary determinant of human destiny. When technology dictates our possibilities, we become the servant.

Politicians, startup CEOs, academics, novelists, tech journalists — everyone has an idea of what the future looks like. Your book is full of possibilities. Do you see us on one particular trajectory?

WW: I don't see that we're on a course for any clear trajectory — there are various trajectories. Clearly we're on a trajectory to create new biologies and materials. But if they're not regulated, there will be editing of the human genome. In robotics, we're on a trajectory for self-driving cars, but the privilege of driving could be slowly taken away, or at least we'll pay higher insurance premiums.


We're on a trajectory for technological unemployment, and some analysts believe that as much as 47% of jobs could be computerized. That will radically alter our political economy. We have to take measures to minimize the harm.

So what do we do? Just wait for some major catalyst or shakeup?

WW: That's the question. Europe adheres to the precautionary principle: If you can see that something could go wrong, the onus is on the person introducing the technology. In the U.S., we go the opposite way, which gives us productivity advantages, but then we wait for some cataclysm to occur before we address anything.

Many of us don't like the precautionary principle — we don't have to go that far — but we can engage in more participatory ethics.


So I'm not an academic, or a CEO, or a politician. I'm just a lazy 20-something with a laptop and a feeling that something needs to be done. What can I do to take action today?

WW: For many people it might just be to stay informed so that as opportunities arise, you can make your opinion known — if not become a part of the social issue. My greatest concern is that the accelerating pace of tech and lack of oversight make us vulnerable to more challenges than we can react to, so we miss inflection points and opportunities.

But look at lethal autonomous weapons. Before Human Rights Watch began calling for the banning of killer robots, a conference in Berlin brought together arms control negotiators and people who created successful campaigns to deal with land mines and cluster bombs. That conference was put together by four people. It didn't start as a mass movement.

We're responsible for so many things in this world, but there are points at which you have to take initiative. We're not all going to, but informed citizens create the pressure to take action before a real crisis.

So I took my responsibility. I wrote the book.