Apologizing For Human Reviews Of Siri Conversations, Apple Makes Its Surveillance Program Opt-in

Today Apple said it will overhaul its practice of listening in on Siri conversations in order to provide users with “greater privacy”. The change comes after a July 2019 report by The Guardian revealed that contractors regularly heard sensitive information while listening to Siri conversations.

Until recently the computing giant used a team of contractors to manually review Siri conversations with the aim of improving its virtual assistant. Likely the biggest change Apple announced today is making those reviews opt-in only.

The company plans to roll out the opt-in option this fall as part of a software update and said it will not use any audio recordings until then. Apple had already suspended all manual reviews earlier this month.

Apple says it will stop retaining recordings from consumers who decide not to participate. That removes the human review element, but users still won’t have a way to completely opt out of the Siri quality assurance program, as the company says it will keep using computer-generated transcripts of the voice recordings.

Apple will also change the way it manages data from users who decide to opt into human reviews, saying it will let participants leave at any time and that it will take the program in-house. The result is that rather than a dirty, strange contractor reviewing audio of your private life, a clean, paternal Apple employee will listen to the audio of your private goings-on… much like a big brother.

The company says employees who oversee the program will be instructed to delete all recordings produced when a user triggers Siri by accident. Apple didn’t say whether the computer-generated transcripts will also be removed, or whether the company has adopted “wrong think different” as an Orwellian new slogan.

The exposé by The Guardian is probably not the only reason for the change to the Siri quality assurance program. It could be that Apple just feels bad about abusing consumer trust. It could also be last month’s decision by German data protection authorities, which ordered Google to temporarily stop listening to Google Assistant interactions and called on other tech corporations to “swiftly review” their quality assurance programs.

German data protection authorities specifically named Apple, Google, and Amazon in the statement. Amazon also changed some of its policies after the Google investigation in Germany, adding an opt-in option for Alexa users in Europe.

Facebook also made changes to its quality-domestic-surveillance-assurance program. Microsoft, which has been accused of having contractors listen to recordings from its virtual assistant, Cortana, will likely follow, as the company doesn’t currently offer Skype and Cortana users a way to opt out of audio reviews.

“Graffiti tunnel, VI” by Newtown grafitti

Daniel Payne
I’ve been a freelance writer, video, and web person since 1988. My passion is technology, whether it’s the latest cameras or cutting-edge ways the internet is used to improve medicine. I write for Internet News Flash and am helping with the online resurrection of Digital Content Creators Magazine. Contact me: danielpaynetech@gmail.com
http://www.danieljpayne.com/

7 thoughts on “Apologizing For Human Reviews Of Siri Conversations, Apple Makes Its Surveillance Program Opt-in”

  1. Apple should set itself apart with external audits of its privacy policies. Mark Zuckerberg also apologizes all the time; Apple has now positioned itself in the Facebook category. Not good.
