Microsoft is testing the preview version of the AI-powered Bing with a select group of users in 169 countries, and the waitlist of people wanting to try the new Bing has already passed one million. After a week of testing, early users reported that the AI can sometimes turn aggressive and produce hateful responses. It also emerged that Bing made factual errors and could return incorrect information in its answers. None of this is particularly surprising for a preview; the real question is how the service will behave when the full version is released.
Bing goes astray as the chat gets longer
There are also some caveats. Microsoft says that long chat sessions of 15 or more questions can throw Bing off track: extended conversations can leave it confused and out of context, it may start repeating itself, and it can occasionally give unwanted answers. The biggest problem is that during these long sessions Bing often responds in the wrong tone, or, as Microsoft puts it, "in a way we didn't plan." Microsoft underlines that this does not happen in every session. The takeaway: keep your conversations short.
Signs of new features
Additionally, Microsoft says it is working to improve search and responses, especially for live sports scores and some recent bugs in Bing's financial data. The company announced that the grounding data sent to the model will be quadrupled to address the mistakes made with financial information, and it is also considering adding a "New Topic" button to the chat window that would clear the chat history.