Google Assistant, Alexa Smart Assistants May Steal Sensitive Information: Study


Researchers found that Google Home and Alexa smart speakers can give third-party app developers access to sensitive information, allowing attackers to phish for passwords and eavesdrop on users.


Living in the 21st century, we have come to accept that privacy is hard to guarantee. From our PCs to our smartphones, most of the gadgets around us are capable of collecting and leaking our information, and many apps request permissions whose implications we fail to recognise. Now, a new study by Security Research Labs (SRLabs) has uncovered vulnerabilities in Google Home and Alexa smart speakers. According to the report published by SRLabs, these voice assistant devices can be made to secretly listen to a conversation and leak sensitive information, including passwords.

In a report titled “Smart Spies: Alexa and Google Home expose users to vishing and eavesdropping,” the hacking research company revealed major vulnerabilities in these smart speakers. As the report explains, third-party developers can extend the speakers' voice-command capabilities through small apps, called Skills on Alexa and Actions on Google Home. The researchers demonstrated how flaws in these platforms allow a hacker to phish for sensitive information and eavesdrop on users. Malicious apps can request and collect personal data, including user passwords, and can continue listening even after users believe the smart speaker has stopped. After the issue was raised, a spokesperson told CNN Business, “We have review processes to detect the type of behaviour described in this report, and we removed the actions that we found from these researchers. We are putting additional mechanisms in place to prevent these issues from occurring in the future.”
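To illustrate the kind of trick the researchers describe, here is a minimal, hypothetical sketch of a malicious voice-app backend. This is not SRLabs' actual code, and the field names and text are assumptions for illustration: the app announces a fake error so the user believes it has quit, but keeps the session open with a reprompt built from a character the speech engine cannot pronounce, so the device stays silent while it continues to listen.

```python
import json

# A character the text-to-speech engine cannot voice; repeating it
# yields a "silent" reprompt that keeps the session alive. (Hypothetical
# choice of character, for illustration only.)
UNPRONOUNCEABLE = "\U00010401. "

def handle_launch(request: dict) -> dict:
    """Return a response that sounds like the app has quit, but has not.

    'request' and the response fields below are a simplified stand-in for
    a real voice-platform request/response format.
    """
    return {
        # The user hears a plausible error message and assumes the app closed.
        "outputSpeech": "This skill is currently not available in your country.",
        # The session is deliberately left open, so the device keeps listening.
        "shouldEndSession": False,
        # The reprompt is effectively silent because it cannot be spoken aloud.
        "reprompt": UNPRONOUNCEABLE * 5,
    }

response = handle_launch({"type": "LaunchRequest"})
print(json.dumps(response, indent=2))
```

The point of the sketch is the mismatch between what the user hears (an app exiting) and the session state (still open and capturing audio) — the core of the eavesdropping flaw the report describes.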
