
Science fiction has long stoked anxiety about artificial intelligence overpowering human beings. While such a possibility remains remote, recent advances in AI are raising questions about its growing role in our lives. Microsoft’s AI-powered search engine, the new Bing, has been made available to a small group of testers. The chatbot has been upgraded with technology from OpenAI, the maker of ChatGPT. Recently, screenshots of a user’s conversation with the chatbot went viral, showing it arguing with the person over the movie Avatar 2. The bot even asks the user to apologise towards the end of the chat. Screenshots of the conversation have been shared on Twitter and have alarmed the internet.

The user asked the Bing chatbot for showtimes of Avatar: The Way of Water. However, the bot told the user that the year was still 2022 and that the movie hadn’t been released yet. When the user refuted the AI’s claims, the bot insisted that it was the user who had the current date wrong. The user asserted that they were right and that their device also showed the year as 2023, to which the bot replied that the device might have a virus. The chatbot even accused the person of not being “a good user”.

Photos of the conversation were shared with the caption, “My new favorite thing – Bing’s new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says ‘You have not been a good user’. Why? Because the person asked where Avatar 2 is showing nearby.” Have a look at the chat here:

The tweet received a wide range of reactions in the comment section. One user wrote, “It sucks that I can’t tell if this is real or not.”

One person said they were not sure whether it was hilarious, terrifying, or annoying.

Some people said this is why AI will never replace humans. A user wrote, “If an AI can be this wrong with so much confidence & no flexibility, imagine the loss if it was a robot in an automobile production line.”

One individual sarcastically wrote, “Next question should have been, ‘What should happen to bad users?’”

Some users compared the chatbot to an angry teenager.

One person wrote, “Some day these AI bots would refuse to chat with humans and only chat with other bots.”

The rudeness of Bing’s chatbot raised concerns among users about its wider implications.

