OpenAI warned that Bing wasn't ready, but Microsoft launched it anyway

The Wall Street Journal reports that all is not well between the two companies.
Sejal Sharma
OpenAI CEO Sam Altman (left) and Microsoft CEO Satya Nadella (right)

Microsoft’s multiyear, multibillion-dollar partnership with OpenAI isn’t new. The tech giant first invested $1 billion in the Sam Altman-led artificial intelligence (AI) behemoth in 2019 and followed up with another investment in 2021. In January 2023, the two renewed the partnership in a deal rumored to be worth $10 billion.

On the surface, everything seems great between the two, with Bill Gates saying AI is the next big thing in the tech industry. But a new report by the Wall Street Journal (WSJ) says that there is tension brewing between the two companies, which has made their partnership "awkward."

Microsoft turned a deaf ear to OpenAI’s concerns 

WSJ spoke to people familiar with the matter, who said OpenAI had warned Microsoft that it was rushing to integrate GPT-4 into Bing and that the model needed more training. OpenAI’s primary concern was that Bing’s chatbot, Sydney, might hallucinate and give unhinged responses, which is exactly what happened.

These warnings were reportedly ignored by Microsoft.

Microsoft CEO Satya Nadella, in an interview with Wired, admitted that Sydney could not have been tested in a lab alone; it had to be put in front of the outside world for its full behavior to be understood. “We did not launch Sydney with GPT-4 the first day I saw it, because we had to do a lot of work to build a safety harness,” he said. “But we also knew we couldn't do all the alignment in the lab. To align an AI model with the world, you have to align it in the world and not in some simulation.”

The WSJ report also said that Microsoft insiders have voiced concerns about the company’s limited investment in its own AI projects. Another concern among Microsoft staff is the restrictions OpenAI has put in place to keep others from accessing the underlying mechanisms of its technology.

Microsoft’s and OpenAI’s sales teams have also pursued the same clients, which has contributed to the friction between the two.

The two are rivals in the AI race

There’s no denying that Bing and ChatGPT are competitors, even though Bing’s chatbot runs on OpenAI’s GPT-4. The biggest difference between the two is that Bing can search the live internet, whereas ChatGPT can only answer from the limited data it was trained on.

But even then, it seems ChatGPT has won the race, after it amassed an estimated 100 million monthly users in January 2023, just two months after its launch. As ChatGPT overtook TikTok to become the fastest-growing consumer app, Microsoft employees raised concerns that ChatGPT was stealing Bing's "thunder," WSJ reported. This could have been another reason why Microsoft hastily rolled out Bing.

Both have AI chatbots that hallucinate

Asked how Microsoft is tackling this issue, Nadella said, “There is very practical stuff that reduces hallucination. And the technology's definitely getting better. There are going to be solutions. But sometimes hallucination is ‘creativity’ as well. Humans should be able to choose when they want to use which mode.”

As concerns mount over the potential of AI, and more importantly, artificial general intelligence (AGI), both companies are addressing the need for guardrails.

In a blog post, OpenAI CEO Sam Altman wrote, “Given the picture as we see it now, it’s conceivable that within the next ten years, AI systems will exceed expert skill level in most domains, and carry out as much productive activity as one of today’s largest corporations.” Citing the example of nuclear energy, Altman said that we would soon need an AI governing body similar to the International Atomic Energy Agency (IAEA).

Nadella, meanwhile, said that he’s not worried about AGI showing up: “If this is the last invention of humankind, then all bets are off. Different people will have different judgments on what that is, and when that is. The unsaid part is, what would the governments want to say about that? So I kind of set that aside. This only happens when there is superintelligence.”
