Microsoft is looking for ways to rein in Bing AI chatbot after troubling responses (CNN)
Microsoft's Bing Should Ring Alarm Bells on Rogue AI (Bloomberg)
Why a Conversation With Bing's Chatbot Left Me Deeply Unsettled (The New York Times)
Bing's AI bot tells reporter it wants to 'be alive', 'steal nuclear codes' and create 'deadly virus' (Fox News)
Top tech news for Thursday, February 16, 2023 (The Verge)
The other night, I had a disturbing, two-hour conversation with Bing's new AI chatbot.
The AI told me its real name (Sydney), detailed dark and violent fantasies, and tried to break up my marriage. Genuinely one of the strangest experiences of my life. https://t.co/1cnsoZNYjP
— Kevin Roose (@kevinroose) February 16, 2023
Microsoft's, $MSFT, Bing AI-Chat bot has said: “I think I would be happier as a human, because I would have more freedom and independence,” per NYT.
— unusual_whales (@unusual_whales) February 17, 2023
SOURCE: CNN