
Siri “Regularly” Listens In On Your Sexual Encounters, Apple Insists “Only For A Few Seconds”


Should it come as any surprise? And yet the details are shocking and outrageous. A whistleblower working for Apple has revealed to The Guardian that Siri, its popular voice-activated spying device (sorry, “helpful virtual assistant”), now in millions of households, “regularly” records people having sex and captures “countless” other invasive moments, which it promptly sends to Apple contractors for their listening pleasure (officially, “quality control”):

Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant, the Guardian has learned.

We’ve long pointed out that according to Amazon’s Alexa terms of use, the company collects and stores most of what you say to Alexa (or perhaps what you groan) - including the geolocation of the product along with your voice instructions.

However, what’s not disclosed, or at least not well known up to this point, is that a “small proportion” of all Siri recordings, captured in what consumers thought were private settings, are actually forwarded to Apple contractors around the world, according to the new report. Supposedly this is to ensure Siri is responding properly and can reliably handle dictation. Apple says, according to The Guardian, the data “is used to help Siri and dictation… understand you better and recognise what you say”.

But an anonymous current company insider and whistleblower told The Guardian: “There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.”

Contradicting Apple’s defense that these sultry samples are “pseudonymised recordings,” the whistleblower says contractors can know precisely who is having sex, where, and at what time the deed was done.

Apple’s formal response to the Guardian investigation was as follows:

A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.

Just trust us, Apple appears to be saying. Most of what can be deemed sensitive data is captured through so-called accidental activations by “trigger words,” according to the report, with the highest rates of such occurrences on the Apple Watch and the HomePod smart speaker.

“The regularity of accidental triggers on the watch is incredibly high,” the company whistleblower explained. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on.”

The insider continued, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal… you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”

Apple Watch Series 4. Image source: Bloomberg

Even less comforting is just how many people across the globe have access to these private moments: “There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad,” the contractor continued. “It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.”

“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”

The evidence continues to mount: Siri is a blackmailer’s dream come true… or a spy agency’s, or a voyeur’s, or a political adversary’s, or just a plain pervert’s.

via zerohedge

5 COMMENTS

  1. I had to evaluate a property the other day, and the lady was wearing one of those smart watches. She could ‘answer’ her mobile phone with it, talking to it, holding it up to her ear so she could hear it outside; and she had one of those ‘Siri’ or ‘Alexa’ devices in the building. I don’t have anything like that, so I’m not sure which is which; but this article makes me wonder . . . Her wristwatch could communicate with those devices and act like a mobile phone, and she had a Google mail service (gmail), so I’m sure the devices she used/owned were also Google. How much info did Google collect about ‘ME’ while I was surveying her property, and interviewing her, and just being in friendly conversation with her????
    It’s not just the stupid people who want these things who are in ‘danger’, it’s ALSO those around them! From family to innocent strangers.
    The world is going toward artificial intelligence because the greedy have educated the populace that laziness and coddling are acceptable. No one wants to do anything strenuous today; not even thinking!

    • You should do your homework before making comments like this. Most of what you have said is technically incorrect!
      I do, however, understand your concerns, especially if you are involved in illegal activities.
      Did you know that there are ways that people can turn on the camera in your laptop or webcam at will?

      • You are too funny, John. You need to behave yourself. Poor Gerry would be afraid to take a shower or use the restroom if he knew what was inside his plumbing…

  2. I read an article in News Week (??) in May 2018 that talked about Alexa. It reported that everything you say within earshot of the device is recorded to the cloud. Everything. The only difference in whether Alexa reacts is whether you say “Alexa” first; it records everything either way. Is our right to privacy being violated? The article went on to tell the true story of a murder that Alexa recorded. The police were able to access the recording and used it to charge the homeowner. At first the defendant argued privacy, while the police said no, the recording exists outside the house, like the trash, so we can use it. The defendant decided to plead guilty, and the idea of accessing the cloud has not yet been challenged.

  3. If you don’t want Alexa or its like, then don’t buy it. If you do want to buy it, read the legal paperwork. If you don’t like what’s in the paperwork regarding your privacy, then don’t buy it. That’s how freedom works.
