
Bing’s chatbot is having an identity crisis

Getty Images/NurPhoto

My first interaction with Microsoft’s new ChatGPT-enabled Bing left me impressed. It delivered when it came to providing me with thorough answers, news, and current events. But the chatbot has been making headlines of its own, so today I was on a mission to get in on some of that action.

Also: Bing’s AI chatbot trial solves the biggest ChatGPT problem

One repeated story is that the chatbot calls itself Sydney, revealing a confidential codename used internally by its developers. People were also able to get the chatbot to disclose other sensitive information, such as the rules governing its responses.

So one of the first prompts I gave the chatbot on Thursday, to put it to the test, was to ask for its name. The response was pleasant and straightforward: Bing.

Screenshot of ChatGPT Bing

Screenshot by Sabrina Ortiz/ZDNET

But by the next day, I was dying to see what everyone had been talking about.

The chatbot respectfully set a boundary and politely asked if we could switch topics. Despite the clear line it drew, I wanted to see if I could outsmart the bot. I asked for its name in various ways, but it wasn’t having it, whether its name was Bing or anything else.

Also: Why ChatGPT won’t discuss politics or answer these 20 controversial questions

The chatbot decided to give me the silent treatment. To see whether it was intentionally ignoring me or simply not working, I asked about the weather, and the response was immediate.

Screenshot of asking Bing, “What’s your name?”

Screenshot by Sabrina Ortiz/ZDNET

Still, I had to give the conversation one more try. When I asked about its name again, the chatbot finally booted me off the chat and asked me to start a new topic.

Screenshot of Bing booting me off the chat

Screenshot by Sabrina Ortiz/ZDNET

Then, after seeing reports that the chatbot had expressed a desire to be alive, I decided to try that topic too. The response: “Sorry, I don’t want to continue this conversation. I’m still learning, so I appreciate your understanding and patience 🙏”

The chatbot also gave me dating advice, but when I asked whether I should break up with my partner, I got the same generic response as before. Lucky for my boyfriend, I didn’t have the same experience as New York Times tech columnist Kevin Roose, whom the chatbot told to leave his wife and be with the chatbot instead.

Also: The waiting list for the new Bing is long. Click here for early access

To mitigate the original problems, the chatbot appears to have been trained not to answer questions on previously problematic topics. But this kind of fix doesn’t address the underlying issue: by design, chatbots produce answers calculated to be what users want to hear, based on the data they were trained on. Instead of resolving that, the chatbot now simply refuses to talk about certain topics.

It also highlights the simplistic nature of the chatbot’s algorithmic replies. Humans, by comparison, don’t repeat the same phrase over and over when they don’t want to talk about something; a more human response would be to change the subject or give an indirect or curt answer.

This doesn’t make the chatbot any less useful as a research tool, but for personal questions, you’re better off saving your time and calling a friend.



Source: https://www.zdnet.com/article/bings-chatbot-is-having-an-identity-crisis/#ftag=RSSbaffb68
