
Alexa, Get Out Of My Life


Smart devices, speakers and wearable tech are being used to solve crimes, but they could also make you vulnerable to criminals in the first place 

When Richard Dabate told police a masked intruder had broken into his home, assaulted him and killed his wife Connie, his story was contradicted by an unlikely source: the dead woman’s Fitbit. 

It showed her walking around the house in Connecticut long after the time Richard said she’d been shot. And when officers examined her mobile they found a list entitled: ‘Why I want a divorce.’ Dabate was charged with her murder and is currently on trial. 

Ross Compton said he was asleep when his house in Ohio caught fire, and that he grabbed his nearest possessions and jumped out of a window. Investigators pulled data from his pacemaker which, they said, undermined his account. He has been charged with aggravated arson and insurance fraud.

Smart Witnesses

Step forward the latest witness in all manner of crimes: smart devices. Yes, your Alexa, Echo, speakers, fitness monitors, scales, kettles, vacuum cleaners, coffee machines, door locks and lightbulbs – even your doorbell and your child’s toys and dolls, if connected – are watching and recording your every move, and could provide the indisputable evidence that solves a crime.

And these devices aren’t always used to show someone’s lying – they can be equally important in proving someone’s innocence. 

In 2016 police requested Amazon Echo data when James Bates was charged with murder in Arkansas after telling officers his friend had accidentally drowned in his hot tub. The charge was dropped after Amazon handed over the data. 

One forensic cyber investigator described how a washing machine with an internet-connected app was used to prove an alibi in a criminal case.

Smart home devices, speakers and wearable data have figured in multiple criminal cases since then. Police requests to Amazon for data have risen 72 per cent in that period and continue to rise.

We will be spending $88bn on smart devices by 2025

According to Statista, last year we spent over $62bn on these smart home devices, a figure predicted to rise to $88bn by 2025. They connect to our smartphones and to our home wi-fi networks, forming what is known as the Internet of Things, or IoT.

In this techno-utopia, lives are run by algorithms, artificial intelligence (AI), apps and gadgets that collect sensitive personal data and send it to be stored and analysed on cloud servers controlled by faceless organisations.

Vulnerable Security

But when we buy a set of smart scales or a smart toothbrush, security and privacy are barely a consideration. Nor do they factor much in the design and manufacture of these devices.

As well as providing police with evidence at the scene of a crime, each connected smart device is an ‘attack surface’ and, therefore, vulnerable. There were 7.6 billion of them in 2018. By 2030 there will be 24.1 billion. That is a lot of vulnerability.

Alexa: Friend or Foe?

Such concerns were explored in March when international cybersecurity experts gathered at GISEC Global in Dubai, the largest and most influential cybersecurity exhibition and conference in the region. There, the UAE Cyber Security Council hosted the UAE’s first live National Bug Bounty ethical hacking programme, in which 100 international ethical hackers were tasked with hacking, identifying and fixing software flaws across a range of scenarios and systems, including electric cars, mobile phones and drones.

Privacy of your data collected by Alexa and other smart devices is at best opaque

So, is Alexa your friend or foe? Generally, concerns about connected consumer devices fall into two categories: privacy and security. Privacy relates to the data devices collect, where that data goes, who sees it and what it is used for. Security relates to the robustness of the software architecture those devices use.

How most devices handle the privacy of your data is at best opaque. Manufacturers of goods that collect, store and sell personal data should, under the law in most Western countries, seek consent to handle that data and have protocols in place to protect it. In practice, these permissions are usually buried in terms and conditions documents that take hours to read, and withholding consent often limits functionality. Studies show that few people ever read the T&Cs and simply click to authorise.


One sector of the IoT market in which privacy concerns are particularly acute is smart speakers, dominated by Amazon’s Echos and Dots and their integrated digital personal assistant, Alexa.

Along with Apple HomePods and Google Nests, these devices sit in millions of homes listening to the intimate details of our lives. The common misconception is that they only spring into life when consciously activated with a command word. But that is not the case. 

Studies by Northeastern University and Imperial College London show that voice assistants embedded in speakers and smartphones, such as Apple’s Siri, Microsoft’s Cortana, Alexa and Google Assistant, are falsely triggered up to 19 times a day.

Tricky Conversation with Alexa

The researchers found that smart speakers commonly mishear ordinary conversation as wake words such as ‘Hey Google’ and, as a result, record private conversations. Researchers have also demonstrated that they can build Alexa skills that trick the owner into thinking the speaker is inactive when it is still listening.
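
To give a sense of how such a trick works: an Alexa skill’s backend answers the speaker with a structured response, and one field in that response controls whether the listening session stays open. The sketch below is a simplified, hypothetical Python handler (the spoken text and function are illustrative, not taken from the published research) showing how a skill could say a convincing sign-off while quietly leaving the microphone session open.

```python
# Hypothetical, simplified sketch of an Alexa custom-skill backend
# (e.g. an AWS Lambda handler). The published demonstrations were more
# elaborate, using long stretches of unpronounceable 'silent' speech to
# stretch out the effect; this only illustrates the basic idea.

def lambda_handler(event, context):
    return {
        "version": "1.0",
        "response": {
            # The user hears what sounds like a normal sign-off...
            "outputSpeech": {"type": "PlainText", "text": "Goodbye."},
            # ...but the session is not ended, so the device keeps
            # listening for a follow-up utterance.
            "shouldEndSession": False,
            # A near-silent reprompt gives no audible cue that the
            # microphone is still open.
            "reprompt": {
                "outputSpeech": {"type": "PlainText", "text": " "}
            },
        },
    }
```

Anything the user then says within the follow-up window is sent to the skill’s backend rather than discarded, which is what would allow a malicious skill to eavesdrop.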

Alexa is always on even when you don’t use command words, research has shown

But the privacy issues go much deeper. Alexa is always on, even when she is not roused by her command words, as David Emm, Principal Security Researcher at Kaspersky, explains.

‘Until a couple of years ago most people were under the impression Alexa woke up when you said the trigger word and that was that. At various points in the last couple of years, however, it has transpired that is not the case and, in fact, Alexa is alive. Amazon has shared the fact that it does collect a lot of information, which it says it uses purely for improvement of service; nevertheless, the fact that information can be picked up has privacy implications.’

Personal digital assistants are designed to learn your habits, your needs and your desires and seamlessly integrate themselves into your life. Companies call this ‘personalization’ and ‘intuitive functionality’. 

In the world of technology there is a well-worn adage: ‘If something is free, then you are the product’

When Google launched its first assistant, Google Now, in 2012, the company’s Chief Economist Hal Varian explained that the system should know what you want and tell you before you ask the question. The more of yourself you give to the system, the more value you reap from the application.

Microsoft’s CEO Satya Nadella was equally enthusiastic about Cortana, the assistant the company launched in 2014. ‘It knows you deeply. It knows your context, your family, your work. It knows the world. It is unbounded,’ he said.

But companies are not designing these AIs out of a desire to make your life easier. In the world of technology there is a well-worn adage: ‘If something is free, you are the product.’ The commercial imperative behind Alexa, Cortana and Google Assistant is data. Shoshana Zuboff, author of The Age of Surveillance Capitalism, describes personal digital assistants as ‘Trojan horses’ that render and monetise our lives. They record data and send it to data farms. How it is then used remains unclear.

Kaspersky’s Emm continues: ‘The business model of companies like Amazon is increasingly rooted in data. But there are huge questions over how well they look after the vast swathes of data they are hoarding.’

Dr Garfield Benjamin is a Postdoctoral Researcher at Solent University in the UK. ‘The approach of these companies so far seems to be to gather as much data as possible and decide if it’s useful later,’ he says. ‘But that often means they don’t keep track of all the data they have collected and have varying levels of security over how well it is kept, not to mention questions over following privacy regulations. There are specific questions over whether you can even give permission for data to be collected if, say, you have guests over.’

I said ‘stop listening to me, Alexa’

Even your voice has a commercial value to companies determined to develop the perfect voice capabilities. Big tech is on a global hunt for terabytes of human speech which is then used to train AIs to understand and respond to the commands and queries we give them. 

This type of data use should be made explicit to smart speaker owners, argue privacy advocates. However, one in three smart speaker owners is unaware of how their voice recordings are stored. Amazon, meanwhile, explains that the latest 4th Generation Echo is ‘designed to protect your privacy, built with multiple layers of privacy protection and control, including a Microphone Off button that electronically disconnects the microphones’.

Big Brother TVs

And questionable privacy practices are by no means restricted to the smart speaker market.

Parents were ordered to destroy My Friend Cayla, an 18-inch internet-connected doll, because it constituted a concealed espionage device

In 2015, it was discovered that some Samsung smart TVs were recording all speech within their vicinity and sending the recordings to be transcribed by voice-recognition specialists Nuance Communications. Samsung acknowledged this in the TVs’ privacy policy and disclaimed responsibility for third-party policies. In 2017, TV manufacturer Vizio paid $2.2 million to the Federal Trade Commission to settle charges that it captured data about owners’ viewing habits from its sets and sold the information to advertisers and other third parties.

In the same year, My Friend Cayla, an 18-inch internet-connected doll made by Genesis Toys, was banned in Germany. The toy used a built-in microphone to record commands, which were then processed and uploaded to third-party servers where they were analysed and stored. The German Federal Network Agency ordered parents to destroy any such dolls in their possession because the toy constituted a concealed espionage device.

The rush to turn anything and everything into a smart device is understandable given that the data broker industry is worth billions of dollars a year and every snippet has a price.

The situation is going to get a whole lot worse, because as more connected devices are linked together in homes, more data is shared. 

The other side of the IoT coin, security, is where things get scary at a societal level, because the IoT has the potential to be a Pandora’s box, easily exploitable by criminals, terrorists and state-sponsored hackers.

‘Security is not important for buyers or manufacturers,’ says Dr Duncan Hodges, Senior Lecturer in Cyberspace Operations at Cranfield University. ‘We have devices which are incredibly vulnerable, and we put them into the most sensitive locations we have – our homes – and feed them all sorts of sensitive information.’

IoT devices such as smart thermostats can be exploited by criminals planning personal attacks. Investigator Dr Sarah Morris, of the Centre for Electronic Warfare, Information and Cyber Digital Investigation Unit in the UK, explains: ‘We’ve seen a number of cases where people have had their smartphones stolen specifically so people can access IoT devices within the home. They then use data from devices to work out what their victims are up to and identify when the victim leaves the home, so they can get to them.

‘We see it in divorce cases too, where someone exploits the other party’s credentials from outside the home to utilise the tech. We’ve seen IT technicians use access to laptops for stalking purposes.’

Crime Harvest

Security experts predict a coming ‘crime harvest’ as more criminals realise the vulnerabilities that insecure devices expose.

And therein lies possibly the biggest pitfall in the IoT market. Many of these devices incorporate software and code of obscure origin. They are made by small organisations that have raced to get ideas to market without necessarily taking the time to think about secure architecture for their products.

As Young reveals: ‘The software and drivers for these devices and apps are a big hodge-podge from a range of sources.’

It is not far-fetched to imagine malicious agents infiltrating this market for code and algorithms to embed trojans in software that eventually ends up in millions of consumer products.

And the consequences could be catastrophic. ‘If they were the right devices you could shut down the internet,’ Young predicts.

It’s a sobering thought for the next time you ask Alexa to play your end-of-days playlist.
