The new Microsoft Bing will sometimes misrepresent the information it finds

Photo by Tom Warren / The Verge

Search engines are about to change in an important way: when you type in a question and get an official-looking answer, it could be wrong, because an AI chatbot created it.

Today, Microsoft announced a new version of its Bing search engine that will provide "complete answers" to your questions by tapping into the power of ChatGPT. You can already try some canned sample searches and sign up for more.

But even though Microsoft is taking many precautions compared with its 2016 failure with Tay, a chatbot that Twitter taught to be racist and misogynistic in less than a day, the company is still proactively warning that some of the new Bing's results may be bad.

Here are a couple of key passages from Microsoft's new Bing FAQ:

Bing tries to keep…
