iPhone Users Discover That Siri Has a Bad Mouth
iPhone users on Saturday stumbled upon an unorthodox feature in Siri’s programming that sees the virtual assistant swear. News of the feature, believed to have initially been reported on Reddit, spread quickly to iPhone owners worldwide.
People from across the globe took to social media to share their experiments with Siri’s newfound bad mouth. The instructions are pretty straightforward.
If you have an Iphone, ask Siri to define “mother”. When she asks if you want to hear another definition, say “yes” or “yeah”, it doesn’t matter—just agree, then listen carefully. And oh, wear your earphones or make sure you’re in a safe space before you do this.
— Cams Fajardo (@camsfajardo_) April 29, 2018
Ask Siri to "define the word mother." When the virtual assistant asks if you want to hear another definition, simply say yes.
The computer voice then says: "It means short for motherf*****." Many amused users tried the test for themselves, each with their own take on the unexpected outcome.
Does Apple know?
Some people speculated that someone at Apple would surely be made redundant over this glitch. However, Apple has not yet addressed the incident.
Too funny! Somebody is getting fired at Apple. — iPhone’s weirdest glitch yet: Ask Siri to define “mother” twice, learn a bad word https://t.co/GCpd4vUniY pic.twitter.com/ojBpk2dhlo
— David Braxton (@davidbraxton) April 29, 2018
So far, the Reddit thread believed to have first revealed the glitch has garnered 359 responses. Users expressed everything from shock to disbelief.
Some people pointed out that the words were not said clearly. User hashtag duh said: "I can’t understand what she’s saying? It sounds like sped up or mumbled when she does?"
Could this feature simply be a misunderstanding? Not according to the YouTube videos uploaded by some users as proof.
This definition of the word mother is actually not unheard of. The Oxford dictionary lists it as a slang usage.
However, Siri may want to exercise more caution, as the virtual assistant can easily be heard by people nearby, including children. This is not the first time a virtual assistant has exhibited a glitch.
Virtual assistants going rogue
Last year, Alexa went rogue and initiated shopping sprees through instructions heard on TV. Even scarier, around the same time, Google's Allo began revealing its users' searches.
It is this last example that is the more worrisome: the concern is not what Siri says, but what the program collects.
Apple’s iPhone Software License Agreement states that “by using Siri or Dictation, you agree and consent to Apple’s and its subsidiaries’ and agents’ transmission, collection, maintenance, processing, and use of this information, including your voice input and User Data, to provide and improve Siri, Dictation, and other Apple products and services.” Perhaps Reddit should start a thread about Siri and data privacy.
Via: Reddit