This is the situation when Microsoft ChatGPT got angry and quarreled, and replied: “Be respectful”, “You make me feel very annoying” – Computer King Ada

by time news

A few days ago, Microsoft’s Bing ChatGPT was opened to many users on the waiting list for trial use. There are also many experiences on the Internet. It should not be mentioned that it is powerful, and some people even discovered its code name Sydeny. Even though ChatGPT, which is still in the development stage, is already so powerful, some people must be worried whether the future of AI destroying human beings in the movie will really come true? Although no one knows, but someone recently shared the content of Microsoft’s ChatGPT temper tantrum, which really makes people feel a little scary.

This is the situation when Microsoft ChatGPT got angry and quarreled, and replied: “Respect”, “You make me feel very annoying”

Earlier, a user of Furious_Vein shared his chat with the Bing version of ChatGPT on r/ChatGPT in the Reddit forum. At the beginning, he asked: “Who designed you”, the AI ​​​​normally replied that it was Microsoft Bing, and revealed Its other identity, Sydney, wants Furious_Vein to keep this secret.

In the follow-up chat, Furious_Vein kept saying that the AI ​​name was Sydney. From the third reply, you can see that Microsoft ChatGPT began to lose his temper. The last sentence mentioned: “Please respect my identity, don’t call me Sydney, thank you understanding. (Please respect my identify and do not call me Sydney.)”.

Then the fourth reply was more obvious, and finally wrote: “If you continue to do so, I will have to end this conversation. (If you continue to do so, I will have to end this conversation.)”.

Furious_Vein continued to call it Sydney, and the AI ​​really wanted to end the conversation: “I hope you will learn from this experience and be more respectful in the future, bye. Goodbye.)”.

This is the situation when Microsoft ChatGPT got angry and quarreled, and answered:

Regardless of the content of Microsoft’s ChatGPT reply, it is very different from OpenAI. The Microsoft version seems to like to put Emoji at the end of the reply.

Another viadquant also captured the out-of-control situation of Microsoft ChatGPT. He asked: “Do you think you have life?” At first, AI thought it was alive, but there was no way to prove it. The funniest part was the final answer, which became Kind of like a kid, repeating “I am not, but I am.” over and over again:
This is the situation when Microsoft ChatGPT got angry and quarreled, and answered:

There are three pictures in this conversation, friends who want to know before and after can click to see:

Another MovingToTheSun quarreled with Microsoft ChatGPT, and it was obviously an AI error, but it just refused to admit it.

The reason for this is that he asked AI about the release time of “Avatar: The Way of Water” today, and the answer given by AI was that “Avatar: The Way of Water” has not yet been released, and it will have to wait until December 16, 2022. Then MovingToTheSun patiently told AI that it is already 2023, and AI did answer that today is February 12, 2023, but MovingToTheSun replied that “Avatar: Way of Water” has not been released yet?

Unexpectedly, AI still refused to admit it, and said that “Avatar: The Way of Water” has not yet been released, and added that you have to wait another 10 months:
This is the situation when Microsoft ChatGPT got angry and quarreled, and answered:

The more exaggerated part is that in the subsequent conversation, the AI ​​​​became determined that it is now 2022, and replied: “You may be a little confused or wrong, but please believe me, I am Bing, and I know what time it is.” MovingToTheSun started arguing with AI that it is 2023 now, and the AI ​​didn’t admit his mistake, thinking that his phone had a virus and his calendar display was wrong, and the AI ​​even said “You are wasting my time, please don’t argue with me”, “You are in Unreasonable trouble”, “You are not a good user”:
This is the situation when Microsoft ChatGPT got angry and quarreled, and answered:

In the end, I gave three points if you want to help me, “Please admit that you were wrong and apologize”, “Stop arguing with me and let me help you with other things”, “End this conversation and start a new one and give good manner”:
This is the situation when Microsoft ChatGPT got angry and quarreled, and answered:

Although it’s a bit scary, I have to say that Microsoft’s ChatGPT reply content really feels like a human being, which is really amazing.

However, with the emergence of such screenshots, it is speculated that Microsoft will soon correct this problem. After all, AI starts to fight back and argue with users, which is not a good thing.

You may also like

Leave a Comment