ECommerceTimes.com

Apple Says Sorry for Listening In on Siri Talks

By Richard Adhikari
Aug 29, 2019 9:16 AM PT
Apple has announced changes to its practices regarding Siri auditing.

Apple on Wednesday said it has suspended audits of consumer interactions with Siri, and undertaken a review of practices and policies related to the voice assistant.

Before the suspension, Apple's grading process involved reviewing a small sample -- less than 0.2 percent -- of audio and computer-generated transcripts from Siri requests to measure how well Siri responded, Apple said. The goal was to improve the assistant's reliability. Reviewers listened to determine whether the user intended to wake Siri, whether Siri heard the request accurately, and whether Siri responded appropriately.

This fall, the company will release a software update that makes not retaining audio of consumers' Siri requests the default setting.

However, it may continue to use computer-generated transcripts of consumers' audio requests to improve Siri. Those transcripts are associated with a random identifier and retained for up to six months.

The company said it wants to do as much processing on the device as possible, minimizing the amount of data Siri collects.

Siri data stored on Apple's servers is not sold or used to build a marketing profile. It is used only to improve Siri, Apple said. Siri is designed to use as little data as possible to deliver an accurate result.

Consumers who don't want Apple to retain transcripts of their Siri audio recordings can disable "Siri and Dictation" in settings, Apple said.

"Believing that absolute privacy is even possible in the Internet age is delusional," cautioned Rob Enderle, principal analyst at the Enderle Group.

"Everything we do is largely tracked and recorded, and I think it's better to learn to behave accordingly," he told the E-Commerce Times.

Changes Ahead

Apple's fall software update will incorporate these changes:

  • By default, audio recordings of interactions with Siri will not be retained;
  • Users will be able to opt in to let Apple audit the audio samples of their requests. They will be able to opt out at any time; and
  • Only Apple employees will be allowed to listen to audio files of Siri interactions with customers who have opted in. Any recordings resulting from inadvertent triggering of Siri will be deleted.

Apple previously used contractors to audit Siri audio clips but terminated those services earlier this month, following The Guardian's report that auditors frequently heard discussions involving confidential medical information, as well as sounds of couples having sex.

"Bringing all review in-house allows the platform to assert more control over the processes, and uses employees that are directly impacted by any harm that comes to the brand from improper handling of sensitive data," said Bret Kinsella, CEO of Voicebot.ai.

Other Voice Assistants

Amazon has thousands of people worldwide reviewing Alexa sound clips captured by its Echo devices.

Facebook paid contractors to transcribe users' audio chats.

Microsoft also has used contractors to listen to consumers' voice commands to Cortana on Xbox consoles.

Google has assigned people to audit audio files recorded by its Assistant via Google Home smart speakers and smartphone apps.

"This is how machine learning works," Enderle said.

Companies offering voice assistants "could move to deep learning, where the system effectively trains itself," he remarked, "but this is relatively new technology, and none of the firms has made this pivot yet."

Google and Facebook reportedly have ended the practice of having contractors audit audio files.

Amazon has given consumers the option to disable human review of their interactions with Alexa.

Apple "went a step further by making opt-out the default," Voicebot.ai's Kinsella told the E-Commerce Times. "That's aligned with its frequent comments about a commitment to privacy."

Microsoft has updated its privacy policy and other Web pages to state that human employees or contractors may listen to recordings captured by Skype Translator and Cortana.

The Impact of Restricting Voice Assistant Auditing

"Siri already lags the other active AIs, Google and Amazon, significantly. By effectively turning off training, Apple has ensured their AI will drop further behind unless they can shift to deep learning as a training method," Enderle said. "But Apple lags Google and Amazon significantly with deep learning technology as well, so this probably won't end well for Siri."

Generation Y and younger don't care about privacy "as long as the service is adequate and the information captured isn't used against them, which, so far, it hasn't been," Enderle remarked.

Thirty-four percent of marketers expect to have a voice app by 2020, with Alexa well in the lead, according to Voicebot.ai.

Privacy concerns over the use of voice assistants in business are overblown, Enderle maintained. "If people were talking to a real human, they'd have even more privacy issues to worry about. People don't keep secrets well."


Richard Adhikari has been an ECT News Network reporter since 2008. His areas of focus include cybersecurity, mobile technologies, CRM, databases, software development, mainframe and mid-range computing, and application development. He has written and edited for numerous publications, including Information Week and Computerworld. He is the author of two books on client/server technology.

