“We realize we haven’t been fully living up to our high ideals, and for that we apologize.”
Apple has issued an official apology after recent reports revealed that the company had been outsourcing Siri recordings to third-party contractors for grading purposes. The resulting outcry was to be expected: after all, Apple never explicitly stated that these Siri recordings were listened to by human beings, let alone by non-Apple staff.
With former contractors alleging that some of these recordings were highly sensitive in nature, including recordings of illicit drug deals, doctor's appointments, and even couples being intimate with one another, Apple has reviewed its grading process.
As a result, Apple has changed its policies on how Siri recordings are evaluated, introducing an opt-in program that gives users the choice of whether to let Apple use their Siri recordings. The previous practice of saving these recordings (including accidental ones) and having third-party contractors grade them has been suspended for good.
Here’s a breakdown of the changes:
- First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
- Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
- Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
In case you missed it, the controversy surrounding Apple and Siri recordings was exacerbated by former contractors who revealed details of the operation. This includes a source who claimed that contractors in Cork, Ireland, were listening to around 1,000 Siri recordings per day. The contractor also named the Apple Watch and the HomePod smart speaker as the most common sources of accidental Siri recordings.
The issue here isn’t really that Apple used Siri recordings to help improve Siri. It’s that the Cupertino-based company was deliberately vague about how it “improves” Siri: humans listening to those recordings, and non-Apple staff at that.