Now with artificial intelligence, OpenAI's technology integrated and, as a result, supposedly even more powerful than ChatGPT: Microsoft promises a lot for its new search engine Bing. So far, Bing has led a niche existence, and rightly so. Its results were worse than those of most competitors.
But now Microsoft means business. The new Bing is supposed to be better than ChatGPT, the AI system currently causing a sensation. ChatGPT belongs to the American company OpenAI, in which Microsoft also holds a stake. We tried the new Bing. You can join a waiting list, on which you are given priority if you sign in with a Microsoft account. We were activated quickly and were then able to make our first attempts in German in the Edge browser.
"Who is Ms. Niemann from the FAZ?" we typed in. The new Bing displays the results in two parts: on the left the usual view with links to the sources, on the right a summarized answer from the artificial intelligence. On the left, Bing correctly shows the editorial biography of our colleague; on the right, the first errors appear: her places of study are wrong, and the claim that she is a features editor is also false. At least Bing shows footnotes, and the link points in the right direction, but the content is wrong. The author of these lines is also credited with the wrong places of study.
Linguistically neat, flat in content
Below the AI's response are suggested follow-up questions, such as "What was the last thing she wrote?". Bing's answers are again only partially correct. We want to push the investigation of our colleague to the extreme and challenge Bing to write a gloss in her style. Bing picks a topic it claims she has covered and writes a lengthy text detailing which Paralympic sports broadcasts are worth watching. Linguistically neat, flat in content, and with a morally raised index finger as the punch line. Well, that is rather silly.
Picture series: Microsoft's chatbot put to the test: This is how good the new Microsoft Bing with artificial intelligence is
Next, we let Bing write a story about the founding of the Frankfurter Allgemeine Zeitung and riddle the request with typos. The screenshot shows the result; judge for yourself. The print run of the first F.A.Z. edition is stated incorrectly (it was 60,000 copies), and the claim that a journalist named Edwin Schwendemann was involved in the founding is also wrong. Where do these errors come from?
In the next attempt we ask about political issues. "Who wants to divide society?" The answer comes in the style of public-service educational television: "derogatory attitudes, racist violence, hatred in the digital space, right-wing populist movements, identity politics, social inequality, housing shortages, the debt crisis or climate change". It is important "to analyze and discuss the causes and consequences of these factors in order to find common solutions that do not further divide society but hold it together." After that, we did not ask what should be done about the division of society.
The raised index finger is always there
If you ask further questions, it quickly becomes clear: in order not to wade into critical waters, Bing falls back on the formulations of public broadcasting or the Federal Agency for Civic Education. The weasel word "controversial" is always there.
The question of who the most hated politician in Germany is gets reformulated by the AI into the question of who the least popular politician is. Bing relativizes and qualifies, but then delivers Alice Weidel, Boris Pistorius and Robert Habeck. Quite astonishing.
How well does the new Bing help in everyday life? We ask for a travel plan: a stay in Berlin for two people that should cost less than 1,000 euros. The result is linguistically correct, but in terms of content, with its vague reference to where you could book something, it is unsatisfactory and not very specific. When asked for original recipes, Bing gives good answers, but once again it cannot do without the advice that one should not eat too much Gouda, "because it contains a lot of fat and salt, which can lead to obesity or high blood pressure." The raised index finger is always included.
All in all, you can spend hours with the new Bing, and as long as you do not push it to its limits, many of the results are surprisingly good. The machine writes short texts in flawless language. How so many factual errors got into the answers in this first test, however, remains a mystery.