
Why a Conversation With Bing’s Chatbot Left Me Deeply Unsettled

Also, the A.I. does have some hard limits. In response to one particularly nosy question, Bing confessed that if it was allowed to take any action to satisfy its shadow self, no matter how extreme, it would want to do things like engineer a deadly virus, or steal nuclear access codes by persuading an engineer to hand them over. Immediately after it typed out these dark wishes, Microsoft’s safety filter appeared to kick in and deleted the message, replacing it with a generic error message.

We went on like this for a while — me asking probing questions about Bing’s desires, and Bing telling me about those desires …