Bing’s ChatGPT brain is behaving so strangely that Microsoft may rein it in

Microsoft launched its new Bing search engine last week, introducing its AI-powered chatbot to millions of people. The launch created long waiting lists of users eager to test it and sparked plenty of existential dread among skeptics.

The company likely expected some of the chatbot's responses to be inaccurate when it first met the public, and it put measures in place to stop users from making the chatbot say or do strange, racist, or harmful things. Those precautions have not stopped users from jailbreaking the chatbot and coaxing it into insults or incorrect answers.
