Smart speakers and voice assistants such as Siri and Alexa can be very useful, but what are the ethical dangers?
Many businesses rely on these tools without considering the privacy and security issues that could arise. The questions to ask are 'Can they hear me?' and 'How do they use the data?'
What have consumers agreed to?
Being able to listen to you is a basic requirement for any voice-activated system. It has to be able to hear your request in order to do what it is told. As well as using what you say to perform tasks for you, the company may store recordings or allow workers to listen to them for training or research purposes, for example to improve its voice recognition.
- Amazon Alexa and Echo: all interactions after the activation phrase are recorded and stored, but you can delete them and opt out of research if you don’t want your data used.
- Google Assistant and Home: anything you say after the activation phrase will be recorded and linked to your Google account, but you can delete your data. Third-party extensions can also use the Google Home speaker and will share your data with other organisations.
- Microsoft Cortana: can be used without signing in to your Microsoft account. Data will be stored but can be managed.
- Apple Siri and HomePod: offers a more anonymous service as interactions won’t be linked to your Apple ID. However, the data can still be stored and used by the company.
Due to increasing concern about data use, these companies do offer some privacy settings and tools to manage your data. You should be able to delete your data or have some control over how it is used. It is often possible to mute the microphone so that you have to turn it on manually rather than with the activation phrase. If you are leaving the microphone on, it's a good idea to turn on audible alerts so that you know when the device starts listening, as this can happen unexpectedly if other speech is misinterpreted as the activation phrase. Any accounts (such as your Google or Amazon account) linked to the smart assistants should also be kept secure.
According to YouGov, 1 in 3 people with smart speakers don’t know their recordings are being stored in the cloud.
How are companies listening to consumers?
There are several ways in which a company could listen to you after you agree to allow it to use a microphone:
- The microphone will always be on so that it can hear the activation phrase (e.g. OK Google or Hey Siri). The company should only listen or record microphone data after the activation phrase is used. However, a disreputable company or hacker could use the microphone on your phone or smart speaker to listen in at other times.
- When you use the activation phrase, the company will use what you say to perform the task you have requested. This is an essential part of the smart speaker or voice assistant’s role.
- The company may then be able to make use of your microphone data in other ways. You may have to agree for it to be recorded, stored or used for training and research purposes when you sign up for the service. Companies may also share or sell the data to third parties. The details will be set out in the terms and conditions.
One issue that has recently come to attention is that members of staff at these companies can actually listen to the recordings. Amazon, Apple and Google have all admitted to the practice, which is used to improve their services, even though their terms and conditions did not always make clear that the data would be heard by real people. As the YouGov figure suggests, many smart speaker owners were unaware their data was sent to the companies at all.
Does it matter if your smart speaker is listening to you?
Although the terms and conditions for voice-activated services can seem suspicious, they aren't usually very different from those of other services. Companies often need to use and store data (at least temporarily) in order to perform tasks. It is also common for companies to use this data to improve their services. However, there are some potential issues with microphone data that businesses need to be aware of.
There are three main questions that you should ask when deciding whether to agree to the terms and conditions for a voice-activated tool:
- Can you trust the company with your data?
- How will your data be used?
- How sensitive is the data?
If you’re using voice assistants for your business then you need to be sure that your data will be secure, especially if it includes any sensitive information. You should only agree to give microphone access to reputable companies and avoid using these tools for data that you aren’t willing to share with them. Even though the data is usually kept private, there are always risks when you’re sharing information that could be stored, sold on, stolen, or used in other ways.
In case you're interested, Consumer Reports has written about how to delete recordings on Amazon, Apple, and Google devices.