ChatGPT: Why is Microsoft limiting Bing chatbot to only five questions per session?

(Photo by Jason Redmond / AFP)

  • Within the first week of the new ChatGPT-powered Bing, Microsoft found that in long, extended chat sessions of 15 or more questions, the chatbot can become repetitive or be ‘provoked’ into giving responses that are not necessarily helpful or in line with its designed tone.

When Microsoft announced on February 7 that the new Bing was available in a limited preview on desktop, it also allowed the public to try sample queries and sign up for the waitlist for the new ChatGPT-powered search engine. The aim was to scale the preview to millions of users in the following weeks and eventually introduce a mobile experience. Microsoft wants critical feedback that will allow it to improve its models as they scale.

Ten days later, on February 17, Microsoft announced that it would limit chat sessions on its new Bing search engine to five questions per session and 50 questions per day. The move came after numerous accounts shared online showed users drawing the Bing chatbot into lengthy, probing exchanges. As The New York Times puts it, “Microsoft was not quite ready for the surprising creepiness experienced by users who tried to engage the chatbot in open-ended and probing personal conversations.”

The new Bing will now prompt users to start a fresh session after five questions, with the chatbot responding up to five times per session. “Very long chat sessions can confuse the underlying chat model,” Microsoft said last Friday. In a blog post, Microsoft wrote that it “didn’t fully envision” people using the ChatGPT-like chatbot “for the more general discovery of the world and social entertainment.”

The chatbot became repetitive and sometimes testy in long conversations, Microsoft admitted. The new AI-powered Bing search engine, the Edge web browser, and their integrated chat are meant to be Microsoft’s “copilot for the web.” For context, Microsoft’s new search tool combines its Bing search engine with ChatGPT, the underlying technology built by OpenAI.

Why is having long conversations on ChatGPT-powered Bing not a good idea?

During the Bing launch two weeks ago, Sarah Bird, a leader in Microsoft’s responsible AI efforts, said the company had developed a new way to use generative tools to identify risks and to shape how the chatbot responded. Within the first week of public use, Microsoft noticed a pattern, especially since many users were testing out the generative AI and probing the limits of its ability to answer queries.

“We found that in long, extended chat sessions of 15 or more questions, Bing can become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone.” Microsoft noticed that very long chat sessions could confuse the model as to which question it is answering. “The model at times tries to respond or reflect in the tone in which it is being asked to provide responses that can lead to a style we didn’t intend. This is a non-trivial scenario that requires a lot of prompting, so most of you won’t run into it, but we are looking at how to give you more fine-tuned control,” the blog post reads.

To address this, the software giant plans to add a tool that will let users refresh the context or start from scratch more easily. “We want to thank those of you who are trying a wide variety of use cases of the new chat experience and testing the capabilities and limits of the service – there have been a few 2-hour chat sessions, for example!” Microsoft noted.

Overall, the problem of chatbot responses veering into strange territory is widely known among researchers. In a recent interview with The New York Times, OpenAI’s chief executive Sam Altman said improving what’s known as “alignment” — how safely the responses reflect a user’s will — was “one of these must-solve problems.”