Servant or spy? Law enforcement, privacy advocates grapple with brave new world of AI assistants


Alexa, the voice assistant built into the Amazon Echo, is one of many artificially intelligent (AI) personal assistants being deployed by technology companies to help consumers manage their homes and schedules. Amazon's assistant, quickly emerging as a strong rival to Apple's Siri and Google's Assistant, was a big hit at this year's Consumer Electronics Show (CES) in Las Vegas.

Yet as a recent murder case illustrates, AI assistants raise thorny legal and privacy questions that legal and cybersecurity experts are scrambling to understand. Because virtual assistants rely on microphones that may, in some cases, continuously record and transmit audio, the resulting trove of data forces a delicate balance among law enforcement requests, corporate strategy and individual privacy rights.

In 2015, an Arkansas man was found dead in a hot tub, and investigators served Amazon with a warrant requesting that the company turn over audio recordings and other data captured by an Echo smart speaker owned by the suspect. Although the internet retailer declined to give authorities the requested information, at least a few experts say the case may foreshadow things to come.

That is because the convenience of voice-activated devices, which passively listen for a "hot word" or "wake word" before activating, may come at a cost to individual privacy. To function, the device must continuously capture and process ambient audio, listening for the wake word, even though in most designs that audio is held only briefly on the device until the wake word is heard.
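The gating described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation: the ring buffer, frame size, and the `heard_word` argument (standing in for a real on-device keyword detector) are all assumptions made for the example.

```python
from collections import deque

WAKE_WORD = "alexa"
BUFFER_FRAMES = 50  # a few seconds of audio, kept only in device memory


class WakeWordGate:
    """Hold recent audio in a small local ring buffer; release audio
    off-device only after the wake word is detected."""

    def __init__(self):
        self.ring = deque(maxlen=BUFFER_FRAMES)  # oldest frames fall off
        self.awake = False

    def on_frame(self, frame, heard_word=None):
        """Process one audio frame. `heard_word` is a hypothetical
        stand-in for an on-device keyword detector's output."""
        self.ring.append(frame)
        if heard_word == WAKE_WORD:
            self.awake = True
        if self.awake:
            return frame  # only now would audio be sent to the cloud
        return None       # otherwise it stays local and is overwritten
```

Under this sketch, every frame is processed, but nothing leaves the device until the detector fires; before that point, old audio is simply overwritten as the buffer fills.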

A big part of the onus lies on the companies manufacturing the technology, explained Andrew Crocker, a staff attorney with the digital rights group Electronic Frontier Foundation, in an interview. "We can still insist that these companies protect our privacy when the government comes for that data," he said.

There will be a real privacy scare when the real 'always-on' devices, such as the remote video cams and Wi-Fi connected baby monitors and the like, become more prevalent.
Jules Polonetsky
CEO, Future of Privacy Forum

The trade-off between privacy and convenience, including the range of information captured, is generally acceptable to users, as long as they clearly understand what those gadgets record and what they do with the information. "All those things are important to know if you're going to make an informed decision to use a piece of technology like this," Crocker said.

Central to privacy fears is that many voice-recognition devices are "always listening," as a recent research paper jointly authored by Ernst & Young and the Future of Privacy Forum, an advocacy group, pointed out.

Among the various types of microphone-enabled devices, each has different privacy implications that consumers should understand, the paper argued, "influenced in part by whether data is stored locally…or whether it is transmitted from the device to a third party or external cloud storage."

It's a scenario that created a stir for Samsung in 2015, when a little-noticed provision in its Smart TV's privacy policy suggested spoken words could be recorded and transmitted to a third party.

The LG Hub Robot (L) and Mini are displayed at an LG press event for CES 2017 at the Mandalay Bay Convention Center on January 4, 2017, in Las Vegas.

Jules Polonetsky, CEO of the Future of Privacy Forum, said in an interview with CNBC that the distinctions between devices create misunderstandings among users. Companies, he said, should make it obvious when a device is recording, typically via a flashing light or other visual indicator, and should be clear about when and how smart assistants handle voice recordings.

That transparency creates a bond of trust between the user and the device, with significant potential for backlash if the trust is broken. "These devices are designed to not be useful to law enforcement," Polonetsky told CNBC. The lawyer and advocate said it was "incredibly and highly unlikely" that an automated assistant would capture a crime in progress.

Nevertheless, the Arkansas investigation "is an important wake-up call, because it shows that people are really ready to get very upset if they think they are being spied on," he added. Polonetsky said the real danger lies in devices that constantly record.

"It's a good reminder here to companies that they need to be mindful of only collecting what most of us want collected, which is our actual commands and directions," Polonetsky said, adding that "there will be a real privacy scare when the real 'always-on' devices, such as the remote video cams and Wi-Fi connected baby monitors and the like, become more prevalent."

That day may be closer than some think, as smart devices explode in popularity and heighten cybersecurity risks. A 2015 Gartner study estimated that 5.5 million new connected devices come online around the world every day, and companies are responding by creating even more sophisticated products.

At the 2017 CES expo, General Electric announced a futuristic lamp that integrates Alexa, while LG unveiled a robot that relies on the voice assistant. Even home appliances manufacturer Whirlpool is getting into the act, with Alexa-enabled washers and dryers.

The increased ubiquity of these devices, however, means there is an ever-present danger of their being used for nefarious purposes, such as the sophisticated distributed denial-of-service attack, powered in part by hijacked smart devices, that briefly knocked major websites offline in October.

All of which means connected technology may make our homes smarter, but it comes at a cost to privacy and could even set consumers on a collision course with law enforcement. Only time will tell whether gadget owners fully appreciate the trade-offs. "Our home is just too private of a place," Polonetsky said.