Smart speakers let us use voice control to play music, make schedules and look up information online. But if the smart speaker is listening, who else is? Few users seem to care.
One might think microphones tapping our private conversations at home are a blast from the Cold War past. Think again.
Nowadays, surveillance technology is not secretly planted in our homes; it is a visible part of our everyday lives. In fact, many people willingly pay for the surveillance device and bring it into their homes themselves.
Smart speakers are voice-controlled devices that use artificial intelligence and natural language processing to help with everyday tasks. Sales of these speakers are increasing worldwide. Countless devices are currently available on the market, with big corporations like Apple, Amazon and Google all manufacturing their own speaker systems.
Convenience at the expense of privacy?
There is no doubt these devices make our lives easier, so why make a fuss about privacy?
When smart speakers are switched on, they remain in listening mode: the microphone continuously monitors what is being said, and recorded audio is sent to the speaker company's servers for processing and storage. This "always-on" listening poses privacy risks for users.
Christoph Lutz and Gemma Newlands, both researchers at BI's Nordic Centre for Internet and Society, investigate topics related to digitization and new internet technologies. In a recent study, they seek to understand what types of privacy concerns smart speaker users have, and how these concerns affect privacy-protective behaviors.
"On the one hand, these technologies can make our lives more convenient, efficient, and entertaining. On the other hand, they may challenge our privacy, lock us into commercial eco-systems, and can lead to divides and inequalities", says Christoph Lutz.
Low privacy concerns
Based on an online survey of 367 smart speaker users from the United Kingdom, Lutz and Newlands found that the most common concern was contractors and third-party developers accessing user data. The lowest-ranking concern related to social privacy, such as the monitoring of family members.
Overall, however, the respondents reported only low to moderate levels of concern about their privacy.
Protective actions are rare
Lutz and Newlands' study also shows that the users did little to protect their privacy. The device had become an integral part of their everyday lives, and few took direct action to limit the speaker's surveillance potential.
More than half of the respondents never turned off their smart speaker when it was not in use, and as many as 72% never switched it off during sensitive conversations.
Also, very few engaged in activities such as reviewing and deleting the information collected by their speaker or aggregated in their Alexa, Google or Apple user profile.
Privacy cynicism?
Despite knowing that certain technologies put our private data at risk, few of us seem to do much about it. Lutz and Newlands describe this conflicting behavior as 'privacy cynicism': we set our concerns aside in order to keep using new technologies and engaging in online transactions.
"A combination of developing more privacy-friendly technologies, implementing more effective regulation, and increasing user empowerment is needed to tackle this issue", Lutz concludes.
Reference:
Christoph Lutz & Gemma Newlands (2021) Privacy and smart speakers: A multi-dimensional approach, The Information Society, 37:3, 147-162, DOI: 10.1080/01972243.2021.1897914