Is your phone or home assistant listening to you?
Smart assistants may sometimes accidentally record you
Technology companies are constantly collecting, storing, and sharing data from various devices, such as wearable technology like an Apple Watch, home assistants like Amazon's Echo, and of course smartphones like iPhones.
As our digital footprint grows, the debate intensifies over how much privacy we should expect and when our civil liberties are being infringed. Recently, that debate has centered on voice-controlled smart assistants like Apple’s Siri and Amazon’s Alexa.
Smart assistants are generally always listening for their “wake word.”
For instance, Alexa’s wake word is simply “Alexa.” Once a human says the wake word, Alexa records what you say or ask, then sends that information to Amazon’s cloud, where it is stored.
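The always-listening flow described above can be sketched in a few lines. This is a toy illustration, not Amazon's implementation: real devices match audio against an on-device acoustic model, while here plain text stands in for audio, and the function and variable names are invented for the example.

```python
# Toy sketch of the wake-word flow: stay dormant until the wake word
# is heard, then record the request and "send" it to the cloud.
WAKE_WORD = "alexa"

def handle_utterance(utterance: str, cloud_log: list) -> bool:
    """Store the request in cloud_log only if it begins with the wake word."""
    words = utterance.lower().split()
    if words and words[0] == WAKE_WORD:
        request = " ".join(words[1:])
        cloud_log.append(request)  # stands in for uploading audio to the cloud
        return True
    return False  # no wake word: nothing is recorded or stored

cloud = []
handle_utterance("Alexa what's the weather", cloud)
handle_utterance("Just chatting over dinner", cloud)
print(cloud)  # only the wake-word request was stored
```

The point of the sketch is the gate: speech that does not start with the wake word is never stored, which is why false wake-word matches (discussed below) are the main privacy worry.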
This is where the privacy concern arises: Amazon uses your stored requests to train its speech recognition software through machine learning, and humans analyze a small sample of those recordings.
“This training relies in part on supervised machine learning, an industry-standard practice where humans review an extremely small sample of requests to help Alexa understand the correct interpretation of a request and provide the appropriate response in the future,” Amazon explains.
Other smart assistants work in a similar fashion. For instance, The Guardian revealed last July that a fraction of recordings from Apple’s smart assistant, Siri, are analyzed by contractors around the world.
“A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements,” Apple told The Guardian.
The biggest problem occurs when a smart assistant confuses another word with its wake word and starts listening unbeknownst to the user.
For instance, Apple’s Siri may mistake “sneering” or “leery” for the user saying “Siri.” This can lead to some uncomfortable situations.
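A false wake happens when a word merely resembles the wake word. The toy check below uses spelling similarity as a stand-in; real devices compare acoustic features, not text, and the 0.4 threshold is an invented value for illustration only.

```python
from difflib import SequenceMatcher

WAKE_WORD = "siri"
THRESHOLD = 0.4  # invented cutoff, purely for illustration

def might_wake(word: str) -> bool:
    """Toy check: does a word resemble the wake word closely enough?"""
    return SequenceMatcher(None, WAKE_WORD, word.lower()).ratio() >= THRESHOLD

print(might_wake("sneering"))  # True: similar enough to cause a false wake
print(might_wake("weather"))   # False: too dissimilar to trigger
```

Because any similarity measure needs a cutoff, vendors face a trade-off: a looser threshold misses fewer real commands but produces more accidental recordings like those the article describes.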
“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on,” a whistleblower told The Guardian. “These recordings are accompanied by user data showing the location, contact details, and app data.”
Amazon acknowledged that Alexa does sometimes mishear its wake word, but said these instances are exceedingly rare.
"We care deeply about customer privacy, and designed Echo devices to wake up only after detecting the wake word," an Amazon spokesperson told FOX Business. "Customers talk to Alexa billions of times a month and in very rare cases devices may wake up after hearing a word that sounds like 'Alexa' or one of the other available wake words. Our wake word detection and speech recognition get better every day – as customers use their devices, we optimize performance and improve accuracy. In fact, wake word performance improved by 50% in the last year alone."
Companies do allow you to delete your recordings and change your settings if you are concerned about your privacy. For instance, Amazon allows you to delete your voice recordings online as well as opt out of having any of your voice recordings reviewed by a person.
There are other smart assistants on the market, such as Facebook's Portal and Google Home, all of which sit at the center of the evolving relationship between technology and privacy. According to a Microsoft survey last year, smart assistants are here to stay: 72 percent of respondents reported using one in the last six months.