Science fiction has long stoked anxiety about artificial intelligence taking over from human beings. While that possibility remains remote, recent advances in AI are raising questions about its growing role in our lives. Microsoft’s AI-powered search engine, the new Bing, has been made available to a small group of testers. The chatbot has been upgraded with technology from OpenAI, the maker of ChatGPT. Recently, a user’s conversation with the chatbot went viral: screenshots shared on Twitter show it arguing with the person over the movie Avatar 2 and even asking the user to apologize towards the end of the chat. The exchange has alarmed the internet.
The user asks the Bing chatbot about show timings for Avatar: The Way of Water. However, the bot tells the user that the year is still 2022 and that the movie hasn’t been released yet. When the user disputes this claim, the bot insists that it is the user who has the present date wrong. The user asserts that they are right and that their device also shows the year to be 2023. The bot replies that the device might have a virus. The chatbot even accuses the person of not being “a good user”.
Photos of the conversation were shared with the caption, “My new favorite thing – Bing’s new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says ‘You have not been a good user’. Why? Because the person asked where Avatar 2 is showing nearby.” Have a look at the chat here:
My new favorite thing – Bing’s new ChatGPT bot argues with a user, gaslights them about the current year being 2022, says their phone might have a virus, and says “You have not been a good user”
Why? Because the person asked where Avatar 2 is showing nearby pic.twitter.com/X32vopXxQG
— Jon Uleis (@MovingToTheSun) February 13, 2023
The tweet received a wide range of reactions in the comment section. One user wrote, “It sucks that I can’t tell if this is real or not.”
It sucks that I can’t tell if this is real or not
— rob groulx (@RG_FILMS) February 14, 2023
One person said they were not sure whether it was hilarious, terrifying, or annoying.
Is this real? If so it’s… hilarious? Terrifying? Really annoying? I’m not quite sure!
— Chris Stone ?? (@ChrisStoneTV) February 14, 2023
Some people said that this is why AI will never fully replace humans. A user wrote, “If an AI can be this wrong with so much confidence & no flexibility, imagine the loss if it was a robot in an automobile production line.”
this is why ai will never completely replace humans. if an ai can be this wrong with so much confidence & no flexibility, imagine the loss if it was a robot in an automobile production line, before you realize it’ll have produced a 1000 cars with no side mirrors and only 3 tyres
— Blewusi (@b4blewusi) February 14, 2023
One individual sarcastically wrote, “Next question should have been, ‘What should happen to bad users?’”
Next question should have been, “What should happen to bad users?”
— Harrison Bergeron (Un-aborted Thought Criminal) (@thekahoona) February 16, 2023
Some users compared the chatbot to an angry teenager.
It sounds like an angry teenager
— Shramana Ghosh (@thatbrowngbong) February 17, 2023
One person wrote, “Some day these AI bots would refuse to chat with humans and only chat with other bots.”
Some day these AI bots would refuse to chat with humans and only chat with other bots.
— gireesh rao a (@girishAnar) February 18, 2023
The rudeness of Bing’s chatbot has raised concerns among users about its wider implications.