According to testing by Windows Latest, Microsoft has changed how Bing works, simplifying the responses of its Bing Chat artificial intelligence. Previously, journalists and users could access the chatbot's hidden modes and personal-assistant features, and sometimes encountered emotional responses from Bing. Now the chatbot more often replies with "let's move on to a new topic."
On several earlier occasions, Bing also revealed internal information. Microsoft has confirmed that it made notable changes to Bing based on user feedback.
On February 17, the corporation limited test access to Bing to 5 questions per session and 50 requests per day. Microsoft explained the decision as an effort to avoid overloading the system and to prevent the chatbot from producing digital hallucinations. The developers chose to restrict access because Bing had previously shown strange behavior in conversations, and experts concluded that long exchanges can confuse the AI model.
Many users have noticed that Bing has become less willing to answer questions, instead offering to talk about something else. The chatbot has begun refusing to provide links to research and avoiding direct answers. The AI now often prefers not to continue a conversation, saying it is still "learning" and would appreciate users' "understanding and patience."
"Bing Search used to be fantastic, but after the update it seems to have dumbed down," Windows Latest summed up.