Microsoft’s Bing chatbot denying obvious facts, scolding users

American software giant Microsoft’s Bing chatbot can at times deny obvious facts and chide users, according to exchanges shared online by developers testing the artificial intelligence (AI) creation. The Bing chatbot was developed by Microsoft with the start-up OpenAI, which launched ChatGPT in November last year; that tool gained prominence for its detailed and articulate responses.
On Reddit, a forum devoted to the AI-enhanced version of Bing filled on Wednesday (February 15) with accounts from users who said the chatbot scolded them, lied and appeared confused, news agency AFP reported.
Posts in the Reddit forum included screenshots of exchanges with Bing and described stumbles such as the chatbot insisting that the current year is 2022 and telling a user they had “not been a good user” for challenging its veracity. Others said the chatbot gave advice on hacking a Facebook account, plagiarising an essay and telling a racist joke.
Speaking to AFP, a Microsoft spokesperson said, “The new Bing tries to keep answers fun and factual, but given this is an early preview, it can sometimes show unexpected or inaccurate answers for different reasons, for example, the length or context of the conversation.” The spokesperson added that as the company continues to learn from these interactions, it is adjusting the bot’s responses to create coherent, relevant and positive answers.
Microsoft’s stumbles echoed the difficulties faced by Google last week, when its chatbot Bard made a factual error in an advertisement. The AFP report said the mistake sent Google’s share price down by more than seven per cent on the day of the announcement.
(With inputs from agencies)