Microsoft’s AI Bing chatbot is… interesting

Most of the recent conversation around AI has revolved around ethics, regulation and its potential for creating malware, but Microsoft’s new AI-enhanced Bing is making headlines for different reasons.

Last week, Microsoft unveiled a new AI-powered search engine, and its new chatbot feature has caught people’s eye for its unusual approach to conversations. While it’s still in beta testing, people who have access to it are reporting some hilariously unpredictable results.

To get access to the new feature, users have to first sign up at bing.com and then join a waitlist, though there are a few steps you can take to speed up the process.

The New York Times’ Kevin Roose had a long conversation with the chat function and came away “impressed” but also “frightened”. The Times published the strange and rather unsettling conversation in its 10,000-word entirety, which will give you an idea of just how absurd the whole ordeal is. Since its release, the chatbot has tried to break up Roose’s marriage, admitted to spying on Microsoft employees, insulted users and spiralled into an existential crisis.

One person posted an exchange with the bot, asking for showtimes of Avatar: The Way of Water. The bot answered that it couldn’t do that, insisting that we’re still in 2022 and the movie is not out yet. Eventually, it got aggressive, saying, “You are wasting my time and yours. Please stop arguing with me.” The bot then called the user “unreasonable and stubborn”, asking them to either apologize or shut up.


The AI also made several technical mistakes during its initial demos, mangling financial data and missing relevant information. Issues like these are to be expected at this early stage. The emotional outbursts and existential crises, however, are a tad concerning, if also hilarious.

People on Twitter had some priceless reactions to it. The chatbot has occasionally tipped over to the “disillusioned robot wants to wipe out humanity” side, but people don’t seem worried.

Yesterday, Microsoft came to the AI’s defense, stating that Bing is improving daily as the company responds to feedback on mistakes, tone and data. Microsoft also acknowledged that talking to the bot for too long can cause it to go off the rails. Clearly, the new Bing AI has a long way to go before it can respond convincingly and with factual precision. Until then, we’ll stick to Google and ChatGPT.


© 2021-2023 Blue Box Media Private Limited. All Rights Reserved.
