
Apple halts Siri 'grading' program, promises opt-out in upcoming software update

In the wake of backlash over a Guardian report revealing that contractors were tasked with analyzing Siri recordings for accuracy and quality, Apple has announced it is temporarily suspending the program while it decides how to proceed.

In a statement to TechCrunch, an Apple spokesperson said the company is “committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally.”

Apple added that users will have the ability to choose whether they want to participate in the program as part of an upcoming software update.

Apple’s Siri grading process was exposed last month when one of the contractors contacted the Guardian claiming that they “regularly hear confidential medical information, drug deals, and recordings of couples having sex” as part of their job. Apple explained to the Guardian that the data collected “is used to help Siri and dictation … understand you better and recognize what you say.”

Apple also said the recordings are anonymized and represent less than 1 percent of daily Siri activations. It also said recordings were “not associated with the user’s Apple ID,” though the employee said they “are accompanied by user data showing location, contact details, and app data.”

According to the whistleblower, recordings routinely contain snippets of conversations captured by accidental triggers of the “Hey Siri” wake word. It’s unclear whether these recordings are supposed to be deleted before they reach a grader’s ears. It’s also unknown how long Apple has been running the grading program.

But while the practice might be necessary, the apparent secrecy around it is alarming. Nowhere in Apple’s privacy policy or Siri setup is it mentioned that recordings may be used for quality control, nor is there a toggle that lets you opt out of data collection. Based on its statement, Apple will presumably rectify both of these issues once it reinstates the program.

That’s the right response. Customers should be aware that their Siri recordings may be listened to, and part of Apple’s privacy push should be the ability to keep your data to yourself. We’d also like to see an easier way to see and delete your Siri history, as well as a better way to filter out accidental recordings, but for now, a toggle is a good start.

IDG News Service

The IDG News Service is the world's leading daily source of global IT news, commentary and editorial resources. The News Service distributes content to IDG's more than 300 IT publications in more than 60 countries.

