Apple has reportedly been paying contractors across the world to listen to Siri recordings, whether triggered intentionally or accidentally, in the name of quality control. While this would typically be something many users have accepted, what makes it alarming is that a large number of these recordings are "accidental", i.e. ones captured when Siri is inadvertently activated. As a result, many of the recordings submitted for the quality control, or "grading", process are sensitive in nature, including confidential medical conversations, criminal and drug deals, sexual encounters and more.
The information, first reported by The Guardian, was brought to light by one of the contractors working at a third-party firm hired by Apple. In a statement shared with The Guardian, Apple said, "A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements." However, this brings to light multiple other issues, the biggest of them being non-disclosure.
Apple, it seems, does not explicitly disclose to its users that the quality control process Siri recordings may be subjected to involves human agents at the far end of its back-end systems. Apple claims that less than 1 percent of daily Siri recordings are sent for such evaluation, and that they are picked at random. These clips are also typically only a few seconds long, which reduces the risk of sensitive and confidential data leaking. However, even though a recording is not linked to the user's Apple ID, the contractor who spoke to The Guardian revealed that recordings come with identifier tags such as location coordinates and app data, which are attached to help identify technical issues such as Siri activating accidentally, without the "Hey Siri" command being spoken.
These tags can seemingly be used to stitch together where a recording originated, thereby compromising the privacy of an iPhone user. Furthermore, the contractor alarmingly revealed that there is no strict screening process at these third-party firms for selecting who can access this data. As a result, an individual with nefarious intentions could track down a person who was recorded in the middle of a sensitive moment and take undue advantage of this inadvertent data access. The contractor also revealed that while "grading" these recordings, contractors are only encouraged to report any technical issues spotted in the recordings; there is no prescribed way to report the content of the recordings themselves.
While Amazon and Google have already been caught listening in on user recordings, Apple has typically prided itself on privacy and on how carefully it handles user data. Given the highly sensitive nature of the incident, it remains to be seen whether Apple changes its stance on contractor involvement in analysing voice recordings, or tweaks its disclosures to customers. Voice recording analysis of this kind can be acutely sensitive, and it is vital that big tech firms such as Apple treat such issues with the utmost seriousness.