Microsoft's New Bing Goof Isn't Going To Help AI Skepticism

On February 7, Microsoft introduced the next evolution of its Bing search engine, which it simply calls New Bing. A key feature of New Bing is a conversational chatbot called Bing Chat, made possible through a partnership with OpenAI, the company behind ChatGPT. Bing Chat isn't widely available yet — Microsoft started putting people on a waitlist for access to New Bing the same day it was announced — but we did see a demonstration of the technology in action. As with Google's Bard, the chatbot wasn't entirely accurate.

Google announced its Bard conversational AI within a day of Microsoft's introduction of Bing Chat, marking the start of a race between the two companies. Critics have said that race is premature and moving at a pace that makes it difficult to adequately refine the technology, to say nothing of other concerns, such as its potential to destroy ad revenue on the web and fundamentally change the way we use the internet. Google took the brunt of the negative attention due to the viral nature of its AI's mistake, but Bing shouldn't escape similar scrutiny, and its own mistakes highlight one of the biggest concerns surrounding this technology.

Bing Chat seemed to make up info during demonstration

Bing Chat works in a straightforward way, at least on the user's end: you ask it a question, it searches the web for related information, and it recomposes what it finds into a seemingly original response to the query. You can, for example, ask the Bing AI to give you recipes that meet specific requirements — or, as demonstrated by Microsoft, you can ask it to compare more than one product. In a post on his blog, software engineer Dmitri Brereton highlighted some of the mistakes Bing made when comparing pet vacuums, along with false details it offered about bars in Mexico and an inaccurate summary of a quarterly financial report.
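
For readers curious what that search-then-summarize loop looks like in practice, here is a minimal sketch of the general retrieval-augmented pattern chatbots like Bing Chat are understood to follow. It is illustrative only: the function names and prompt format are assumptions rather than Microsoft's actual implementation, but it shows why an answer is only as trustworthy as the model's faithfulness to the snippets it retrieves.

```python
# Illustrative sketch of a retrieval-augmented chatbot loop.
# Not Microsoft's code: search_web() and call_language_model() are
# hypothetical stand-ins for a real search API and a real LLM endpoint.

def search_web(query: str) -> list[str]:
    """Stand-in for a web search call; returns text snippets from result pages."""
    return [
        "Snippet from a review site about the product...",
        "Snippet from a retailer's spec sheet...",
    ]

def call_language_model(prompt: str) -> str:
    """Stand-in for a large language model; returns generated text."""
    return "A fluent, confident-sounding answer composed from the prompt."

def answer(query: str) -> str:
    # 1. Retrieve: gather snippets related to the user's question.
    snippets = search_web(query)

    # 2. Generate: ask the model to compose an answer grounded in the snippets.
    #    Nothing here forces the model to stick to the snippets, which is
    #    where fabricated "cons" and wrong figures can creep in.
    prompt = (
        "Answer the question using only the sources below.\n\n"
        + "\n".join(f"- {s}" for s in snippets)
        + f"\n\nQuestion: {query}"
    )
    return call_language_model(prompt)

print(answer("What are the top 3 cordless pet vacuums?"))
```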

At the heart of the matter is what appears to be entirely fictional information generated by the chatbot, which cited sources for details those sites never actually presented. Brereton dug through the Bing Chat answers given during the demonstration and found everything from "cons" about a particular pet vacuum model that didn't reflect actual reviews to a financial summary containing figures that don't appear in the actual report. Bing reported the wrong numbers, said that a cordless vacuum had a 16-foot cable, and recommended nightlife destinations in Mexico that may be far different in reality than the bot described.

People may mistake AI for an authority on subjects

On its FAQ page, Microsoft says that the Bing AI isn't necessarily accurate and that users should keep that in mind. "Bing will sometimes misrepresent the information it finds, and you may see responses that sound convincing but are incomplete, inaccurate, or inappropriate," the website states. Whether that's enough to justify rolling out a tool that presents false information is a matter of debate. 

While suggesting the wrong hours for a club or listing fake cons for a product is relatively minor, the consequences could be far more severe if the AI delivers false information about more serious matters in the future, such as medical advice or political events. Because the information sounds like it could be correct — as noted by Microsoft — some users may be unable to distinguish it from actual fact, and that makes Microsoft's advice to "use your own judgement" less than useful.

Presenting inaccurate information right off the bat — that is, during the demonstration of the technology — doesn't offer reassurance about the quality of the results the average user will receive. The market turned on Google quickly when news of its Bard mistake surfaced, and Microsoft certainly won't be shown any more grace. The faux pas reinforces skepticism about the technology and its usefulness, and if the company isn't careful, it could damage the reputation of such technology before it even gets off the ground.
