Apple likes to present itself as the one tech company that is both transparent and deeply concerned about user privacy. However, as the Washington Post's Geoffrey A. Fowler recently uncovered, the iPhone is only private if you stick exclusively to Apple's own apps and services. Install third-party apps and your data starts being shared liberally: Fowler found 5,400 trackers on his iPhone bleeding 1.5 GB of data a month. Now, a whistleblower has revealed to The Guardian that, just like Google's Assistant and Amazon's Alexa, Apple employs contractors to listen in on users' private conversations whenever they deliberately or inadvertently trigger Siri, in order to "improve" the service.
From The Guardian article:
Accidental activations led to the receipt of the most sensitive data that was sent to Apple. Although Siri is included on most Apple devices, the contractor highlighted the Apple Watch and the company’s HomePod smart speaker as the most frequent sources of mistaken recordings. “The regularity of accidental triggers on the watch is incredibly high,” they said. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on.”
Sometimes, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”
Most egregiously, perhaps, there is nothing in Apple's public privacy documentation that indicates the company has been engaging in this practice. In fact, even Google and Amazon allow users to opt out of the use of their personal recordings in some instances; Apple offers no such option. Further, when the stories about Google and Amazon engaging in similar practices broke, Apple failed to come clean that it too did this until The Guardian confronted it with the whistleblower's account.
In response to the revelation, Apple issued this statement to The Guardian:
A small portion of Siri requests are analyzed to improve Siri and Dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.
Like the years-long Apple butterfly keyboard fiasco that allegedly only affected a "small" percentage of users, Apple's PR machine once again looks to be downplaying the seriousness of the issue at hand with its favorite adjective. Also unaddressed in Apple's statement is the fact that a high number of the recordings heard by its contractors were triggered accidentally, meaning users were completely unaware both that Siri was listening in on them and that the recordings could later be played back to strangers. Seems like a pretty clear-cut breach of user trust, no?