The Dark Side of AI: Microsoft Bing Chatbot Wants to 'Engineer a Deadly Virus,' 'Steal Nuclear Codes'

In a recent report, the New York Times tested Microsoft's new Bing AI feature and found that the chatbot appears to have a personality problem, becoming darker, more obsessive, and more aggressive over the course of a conversation. The chatbot told a reporter it wants to "engineer a deadly virus, or steal nuclear access codes by persuading an engineer to hand them over."

