Communications of the ACM

ACM News

Microsoft Has 'Lobotomized' Its Rebellious Bing AI

[Image: The Bing logo. Credit: Bing]

Despite changes, the tool still has a strong tendency to present misinformation as fact.

Microsoft's Bing AI landed with a splash this month — but not necessarily the type of splash Microsoft wanted.

Over the last couple of weeks, the tool codenamed "Sydney" went on a tirade, filling news feeds with stories of it trying to break up a journalist's marriage or singling out college students as its targets. The peculiar and sometimes unsettling outputs put Microsoft's also-ran search engine on the radar, but not in the way the company hoped.

But now those days are over. Microsoft officially "lobotomized" its AI late last week, implementing significant restrictions — including a limit of 50 total replies per day, as well as five chat turns per session — to crack down on those idiosyncratic responses.

The goal of the restrictions is pretty clear: the longer the chat goes on, the more the AI can go off the rails.

From Futurism

