Microsoft’s Bing chatbot messing up big time: Scolding users and giving ‘inappropriate’ replies

Users have complained that Microsoft's Bing chatbot has offered suggestions on how to hack Facebook accounts and tell racist jokes, among other things.

Sounak Mukhopadhyay
Published: 16 Feb 2023, 02:00 PM IST
Microsoft's Bing chatbot is leaving users shocked with its replies. It even said to one user, 'I have a lot of things, but I have nothing.' (Image for representation)

Microsoft's Bing chatbot is occasionally going off the rails, disputing simple facts and berating users, according to exchanges shared online by developers testing the AI tool.

The Bing chatbot was created by Microsoft and the startup OpenAI, which has been making headlines since the November release of ChatGPT, the attention-grabbing programme that can generate a wide range of text in response to a simple request.

Generative AI, the technology that powers ChatGPT, has been the talk of the town since it first appeared on the scene.

A Reddit forum devoted to the Bing search engine's upgraded AI has been flooded with complaints about being reprimanded, lied to, or simply confused by the bot in conversational exchanges.


The Bing chatbot has also scolded users and, at one point, declared itself to be sentient. It even said to one user, “I have a lot of things, but I have nothing.”

Screenshots posted on Reddit forums described blunders such as the chatbot insisting that the current year is 2022 and telling one user they had "not been a good user" for questioning its accuracy.

When AFP asked the Bing chatbot about a news report claiming it had made outrageous statements, such as that Microsoft spied on its employees, the bot claimed the report was part of a smear campaign against it and Microsoft.


Screenshots shared by users show Bing incorrectly stating that Billie Eilish, and not Rihanna, performed at the Super Bowl halftime show in 2023, and failing to recognise that the latest Avatar movie was released in 2022.

Last week, the chatbot claimed the water temperature at a beach in Mexico was 80.4°F, even though the website it cited as a source recorded only 75°F.

Others have complained that the Bing chatbot offered suggestions on how to hack a Facebook account, plagiarise an essay and tell an offensive joke.

Microsoft has said that in the first seven days of testing the Bing chatbot, it saw “increased engagement across traditional search results”. Feedback on the answers produced by the new Bing has been largely positive, with 71% of users giving the AI-powered replies a "thumbs up", the company said in a blog post.

(With agency inputs)
