Microsoft's new AI chatbot gets date wrong, accuses user of being stubborn

It also has a hard time accepting obvious mistakes.
Ameya Paleja
OpenAI logo with Microsoft logo in the background

SOPA Images/Getty

Users keen to chat with the GPT-based chatbot in the new Bing might find the Artificial Intelligence (AI) bot not just conversational but also argumentative and aggressive. According to tweets shared by early users, the chatbot even complained that users were wasting their time.

The concept of a conversational chatbot shot to fame with OpenAI's ChatGPT, which can provide paraphrased answers to users' detailed queries and write poetry or code with equal ease. Microsoft, which has provided financial support for OpenAI's work, has incorporated the chatbot into its Bing search engine, giving users a new way to search for information. The service's rollout is still slow, with only a few users getting access so far. However, their experience has been interesting.

Bing search chatbot's aggressive behavior

The chatbot's aggression came to light when a user asked the AI for the showtimes of Avatar 2 near his location. The chatbot responded that the user was referring to Avatar, which was released in 2009 and was no longer playing in theaters. According to the chatbot, Avatar 2 was only scheduled to release on December 16, 2022, which was still 10 months away.

New Bing's response to a query about Avatar

When the user prompted the AI to check the date, the chatbot responded with the actual date but maintained that the December 2022 release was still in the future. From the conversation that followed, it appears that the chatbot was convinced it was February of 2022, and when the user pointed out that his phone showed it was 2023, the chatbot just went berserk.

It first suggested that the phone probably had the wrong settings or had its time zone or calendar format changed. Alternatively, the chatbot said the phone had a virus or a bug that was messing with the date and needed repairs. When the user maintained that it was 2023, the chatbot asked the human to stop arguing and trust the information it provided.

The chatbot suggesting that the phone might need repairs

When called out for its aggressive behavior, the chatbot replied that it was being assertive while the user was being "unreasonable and stubborn," and asked the user to apologize for their behavior.


Somebody at Microsoft apparently came across these reports and fixed the chatbot's date. However, in the process, its memory of conversations with users appears to have been wiped, denting the chatbot's confidence too.

The chatbot's memory is wiped after the bug is fixed

Interesting Engineering recently reported how a Stanford student was able to break into the chatbot's initial prompt and make it reveal its codename. The more people use the AI chatbot, the more gaps emerge that need filling, suggesting that the technology is far from perfect.
