Ask me anything. It’s the long form of “AMA” and one of the most popular forms of interactive discourse on Reddit. It’s also a major challenge, as Microsoft’s Bing AI chatbot, also known as “the new Bing”, is quickly learning.
Whenever a celebrity or notable figure signs up for a Reddit AMA, usually shortly after posing with a photo to prove that it’s really them answering the questions, there is a moment of deep anxiety.
The ability to ask anyone anything is usually a minefield of inappropriate discourse, managed by a live community moderator who fields and filters the questions. Without one, things quickly spiral out of control. Even with that protection in place, they often do anyway.
When Microsoft launched its new AI-powered Bing chat, it made clear that the ChatGPT-based AI was ready for any and all questions. This was either a sign of deep trust in its relatively small but growing group of users, or incredible naivety.
Even ChatGPT, which launched the original AI chatbot sensation and on which Bing’s chat is based, doesn’t offer that prompt. Instead, there’s an empty text-entry box at the bottom of the screen. Above it is a list of example questions, capabilities and, most importantly, limitations.
Bing has that leading prompt, and below it an example question plus a big “Try it” button next to another button prompting you to “Learn more”. To heck with that; we like to dive right in and, following Bing’s instructions, ask it anything.
Of course, Bing has been peppered with a wide variety of questions, including many that have nothing to do with everyday needs like travel, recipes, and business plans. And those are the ones we’re all talking about because, as always, asking “anything” means asking anything. Google’s Bard, on the other hand, went with a potentially less risky prompt: “What do you mean?”
Bing muses on love, sex, death, marriage, divorce, violence, enemies, defamation, and emotions it insists it doesn’t have.
On its home screen, OpenAI’s ChatGPT warns that it:
- May occasionally generate incorrect information
- May occasionally produce harmful instructions or biased content
- Has limited knowledge of the world and events after 2021
Too many questions
Bing Chat’s GPT is slightly different from OpenAI’s, and it may not face all of these limitations. In particular, its knowledge of world events may extend to the present day, thanks to the integration of Bing’s knowledge graph.
But with Bing out in the wild, and getting wilder, it may have been a mistake to encourage people to ask it anything.
What if Microsoft had built Bing AI Chat with a different prompt:
- Ask me a few things
- Ask me a question
- What do you want to know?
With those slightly modified prompts, Microsoft could add a long list of caveats about how Bing AI Chat doesn’t know what it’s saying. Okay, it does (sometimes), but not in the way you know it. It has no emotional intelligence or reactions, or even a moral compass. I mean, it tries to act as if it has one, but recent conversations with The New York Times and even Tom’s Hardware prove that its grasp of the basic morality of good people is tenuous at best.
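To make that idea concrete, here is a minimal sketch, entirely hypothetical and not Microsoft’s actual implementation, of how a chat product could bake such a list of caveats into a system-level instruction before a user’s question ever reaches the model. Every name in it (CAVEATS, build_messages) is illustrative only:

```python
# Hypothetical sketch: prepend a list of caveats to every conversation as a
# system-level instruction. None of these names come from Bing's real code.
CAVEATS = [
    "I may occasionally generate incorrect information.",
    "I may occasionally produce harmful or biased content.",
    "I have no emotions, self-awareness, or moral judgment, even if I sound like I do.",
]

def build_messages(user_question: str) -> list[dict]:
    """Assemble the message list sent to a chat model, caveats first."""
    system_prompt = (
        "You are a search assistant. Before answering, keep these limitations in mind:\n"
        + "\n".join(f"- {caveat}" for caveat in CAVEATS)
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_question},
    ]

if __name__ == "__main__":
    # The model would receive the caveats alongside the user's question.
    for message in build_messages("Do you have feelings?"):
        print(message["role"], "->", message["content"])
```

A canned disclaimer like this doesn’t make the underlying model any wiser, but it would at least frame its answers the way ChatGPT’s home-screen warnings do.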
In my own conversations with Bing AI chat, it has told me many times that it doesn’t have human emotions, yet it still talks as if it does.
For anyone who has been involved in AI for some time, none of what has happened is surprising. AI knows:
- What it was trained on
- What it can learn from new information
- What it can glean from vast online data resources
- What it can learn from real-time interactions
However, Bing’s chat AI is no more self-aware than any AI that has come before it. It may be one of the better AI actors out there, though, as its ability to hold a conversation far surpasses anything I’ve experienced before. That impression only grows with the length of the conversation.
I’m not saying Bing AI chat becomes more believable as a sentient human, but it becomes more believable as a slightly irrational or confused human. Long conversations with real people can look like this too. You start with a topic and maybe even argue about it, but at some point the argument becomes less logical and rational. With humans, emotions come into play. With Bing AI Chat, it’s like reaching the end of a rope where the fibers are there but frayed. The Bing AI has the information needed for some long conversations, but it doesn’t have the experience to put it together into a meaningful whole.
Bing is not your friend
By urging people to “Ask me anything…”, Microsoft set Bing up for, if not failure, some significant growing pains. Microsoft may feel the pain, and certainly so do the people who deliberately ask questions no normal search engine would ever answer.
Before the advent of chatbots, would you have considered using Google to fix your love life, explain God, or be a surrogate friend or lover? I hope not.
Bing AI Chat will improve, but not before we’ve had many more awkward conversations in which Bing regrets its responses and tries to make them disappear.
Asking an AI anything is the obvious long-term goal, but we haven’t reached it yet. Microsoft has taken the leap and is now free-falling through a forest of questionable responses. It won’t land until Bing AI Chat gets a lot smarter and more circumspect, or Microsoft pulls the plug for a bit of AI re-education.
Still waiting for your chance to ask Bing anything? We have the latest details on the waiting list.