iPhone Users Discover That Siri Has a Bad Mouth

iPhone users stumbled on a potential glitch that sees Siri swear. The feature may be a mistake or an additional definition.
Loukia Papadopoulos

iPhone users on Saturday stumbled upon an unorthodox feature in Siri’s programming that sees the virtual assistant swear. News of the feature, believed to have initially been reported on Reddit, spread quickly to iPhone owners worldwide.

People from across the globe took to social media to share their experiments with Siri’s newfound bad mouth. The instructions are pretty straightforward.

Ask Siri to "define the word mother." When the virtual assistant asks if you want to hear another definition, simply say yes.

The computer voice then says: "It means short for motherf*****." Many amused users performed the test, each with their own take on the unexpected outcome.

Does Apple know?

Some people speculated that someone at Apple would surely be made redundant because of this glitch. However, Apple has not yet addressed the incident.

So far, the Reddit thread believed to have first revealed the glitch has garnered 359 responses. Users expressed everything from shock to disbelief.

Some people pointed out that the words were not said clearly. User hashtag duh said: "I can’t understand what she’s saying? It sounds like sped up or mumbled when she does?"

Could this feature simply be a misunderstanding? Not according to the YouTube videos some users uploaded as proof.

This definition of the word mother is actually not unheard of. The Oxford dictionary lists it as an informal, vulgar usage.

However, Siri may want to exercise more caution, as the virtual assistant can easily be heard by people nearby, including children. This is not the first time a virtual assistant has exhibited a glitch.

Virtual assistants going rogue

Last year, Alexa went rogue and initiated shopping sprees through instructions heard on TV. Even scarier, around the same time, Google's Allo began revealing its users' searches.

It is this last example that is the most worrisome: not what Siri says, but what the program collects.

Apple’s iPhone Software License Agreement states that “by using Siri or Dictation, you agree and consent to Apple’s and its subsidiaries’ and agents’ transmission, collection, maintenance, processing, and use of this information, including your voice input and User Data, to provide and improve Siri, Dictation, and other Apple products and services.” Perhaps Reddit should start a thread about Siri and data privacy.

Via: Reddit
