Microsoft rolls out personality options for its Bing AI chatbot

Users can now pick if they want "creative" or "precise" responses.
Ameya Paleja
Microsoft Bing and OpenAI ChatGPT.

Getty Images 

Even as users make a beeline to get access to Microsoft's artificial intelligence (AI)-powered Bing chatbot, the company is adding more features, such as giving users the ability to pick the personality type they would prefer to interact with.

Microsoft has partnered with ChatGPT creator OpenAI to bring the conversational chatbot to its Bing search engine, among other products. Given that the chatbot is still largely experimental, Microsoft is rolling the service out slowly. Yet, in a short span of time, it has already amassed over a million users.

During this time, users have reported experiences with the chatbot ranging from bullying to flirtation. In a bid to improve the experience, Microsoft is now looking to give users better control over how the chatbot responds.

Mikhail Parakhin, the head of web services at the company, announced the new options via Twitter.

For those who do not like either extreme, there is also a "balanced" setting that bridges the "creative" and "precise" modes.

Do AI chatbots need personality settings?

Microsoft's move comes at a time when there have already been multiple cases of the chatbot showcasing a strong personality. It has refused to indulge users who persistently ask about its origins and even refused to accept that it was wrong about the current date. So, what is Microsoft trying to achieve by introducing more personality into the mix?

With the introduction of AI into its products, Microsoft is looking to cover lost ground in areas such as search. With Google caught on the back foot with its AI offerings, the timing is right for Microsoft to push its offerings aggressively and get users hooked on these services before others catch up.

However, instances of the AI acting rude or stubborn can ruin the user experience, and with these experiments, Microsoft is testing which approach works best across a large number of users. Not all of Microsoft's users are expected to ask the chatbot long, probing questions about its history or reason for existence. Those who want the tool for concise answers can now get them and move on, without worrying about whether it is too talkative or meandering in its replies.
