
Microsoft's Bing chatbot messing up big time: Scolding users and giving ‘inappropriate’ replies

Microsoft's Bing chatbot has offered suggestions on how to hack Facebook accounts and tell racist jokes, among other things, users have complained.

Microsoft's Bing chatbot is leaving users shocked with its replies. It even said to one user, 'I have a lot of things, but I have nothing.' (Image for representation)

Microsoft's Bing chatbot is occasionally going off the rails, disputing simple facts and berating users, according to exchanges posted online by developers testing the AI tool.

The chatbot for Bing was created by Microsoft and the upstart OpenAI, which has been making headlines since the November release of ChatGPT, the attention-grabbing programme that can produce a variety of sentences in response to a straightforward request.

Generative AI, the technology that powers ChatGPT, has been the talk of the town since it first appeared on the scene.

Complaints about being reprimanded, misled, or flatly confused by the bot in conversational exchanges abounded on a Reddit forum devoted to the Bing search engine's upgraded AI.


In some exchanges, the Bing chatbot scolded users and declared itself to be sentient. The bot even said to one user, “I have a lot of things, but I have nothing."

Screenshots posted on Reddit forums described blunders such as the chatbot insisting that the current year is 2022 and warning a user they had "not been a good user" for questioning its accuracy.

When AFP questioned the chatbot about a news report claiming it had made outrageous statements, such as that Microsoft spies on its employees, the Bing chatbot claimed a smear campaign was being waged against it and Microsoft.


Bing incorrectly stated that Billie Eilish, rather than Rihanna, performed at the Super Bowl halftime show in 2023. It was also unable to grasp that the latest Avatar movie was released in 2022, user screenshots show.

The chatbot last week claimed that the water at a beach in Mexico was 80.4°F, even though the website it identified as its source recorded a temperature of only 75°F.

Others have complained that the Bing chatbot offered suggestions on how to hack a Facebook account, plagiarise an essay and tell an offensive joke.

Microsoft said that in the first seven days of testing the Bing chatbot, it saw “increased engagement across traditional search results". Feedback on the answers produced by the new Bing has been largely positive, with 71% of users giving the AI-powered replies a "thumbs up", the company said in a blog post.

(With agency inputs)


ABOUT THE AUTHOR
Sounak Mukhopadhyay
Sounak, spinning the digital news scene since 2012, crafts trendy articles for LiveMint.
Published: 16 Feb 2023, 02:00 PM IST