Amazon's Echo Look could offer hackers another means of snatching your private photos

While tech companies struggle to get a handle on leaked nude photos, Amazon wants to put a camera in the room where you get undressed and send photos of you to the cloud.

On Wednesday, Amazon announced its newest product — Echo Look — which lets you ask Alexa to take full-length photos as well as videos of you so that it can judge whether or not you look like garbage.

While a high-quality (The background is blurred! There's built-in LED lighting!) full-length photo of you might trump taking a mirror selfie, it doesn't come without a sacrifice: nude photos, or even fully clothed but personal photos, could be vulnerable to hackers.

"It's another one of those things that you're going to get with new technology, but further expose yourself to some greater risks," Barrett Lyon, head of research and development at Neustar, said in an email. 

When you take a photo or video with Echo Look, it is uploaded to the Echo Look app and stored in Amazon's cloud, an Amazon spokesperson confirmed in an email.

Is the Echo Look secure?

Should users be worried that images are stored in the cloud? "No more than they should be worried about Facebook chat messages," Lyon said.

A simple bug discovered by a security researcher in 2016 allowed hackers to read all of your private chats on Facebook Messenger. Another vulnerability spotted in 2017 let hackers listen to your voice messages on the platform.

The Amazon spokesperson said that the company "takes customer privacy seriously" and that it has taken measures to make the product secure.

"These include hardware control via the mic/camera off button, disallowing third party application installation on the device, rigorous security reviews, and encryption of images and communication between Echo Look, the Echo Look app, and Amazon servers," the spokesperson said.

The spokesperson added that the Echo Look camera is "always off" unless you activate it using the wake word, Alexa, or by using the Echo Look app "to take a photo, video, or use live preview."

When I asked Lyon how susceptible a device like Echo Look is to hackers, he said that "nothing is impossible and just like any internet-connected device it's a computer running an operating system so there will be some level of exposure," but added that Amazon "is much more responsible with code compared to a company that makes a fly-by-night DVR device that is then leveraged for a giant botnet." 

In short, Lyon believes the chance of a security breach in the Echo Look is "pretty low" but that "with new technology comes new security vulnerabilities."

Consumers can turn off the mic by pushing a button on the side of the Echo Look device, which activates a red light ring, meaning "the device will not respond to the wake word and you won't be able to take photos or video using the Echo Look App until you push the button on the side of your device again," according to the spokesperson.

But this precautionary measure doesn't protect the images and videos that have already been uploaded to the cloud. The massive revenge porn campaign that resulted in the leak of nude and private celebrity photos was attributed to a security breach in several iCloud accounts.

And as technology, privacy and surveillance expert Zeynep Tufekci pointed out in a Twitter thread, it's not only personal photos you should be worried about falling into the wrong hands. Amazon's algorithms will also be able to collect a wealth of data about you beyond just "hot or not": they may be able to detect whether you're depressed or pregnant, or even infer your sexual orientation.