What you need to know
- Bing Chat received an update that improves its accuracy when answering questions about travel or recipes.
- The chatbot will also do a better job citing the sources of recipes, making sure citations link to the original recipe site rather than Bing.com.
- Microsoft also reduced how frequently Bing Chat will end conversations unnecessarily.
Microsoft Designer, which uses the same technology as Bing Chat, rolled out in public preview this week. While the expansion of Designer's preview was the biggest AI news from Microsoft this week, it wasn't the only news. Microsoft also improved Bing Chat in several ways.
Since last week, Bing Chat has gotten better at answering questions about travel or recipes. The chatbot is also better at citing information accurately when answering those types of questions.
Microsoft also reduced how often Bing Chat ends conversations unnecessarily. This is part of an ongoing effort to fix one of the chatbot's more frustrating behaviors: it often states "I'm sorry but I prefer not to continue this conversation" or "It might be time to move on to a new topic" when there's no need to end the chat.
Here are Microsoft's release notes for Bing Chat from this week:
- Improving Travel and Recipe Grounding: We’ve taken steps to help Bing chat give better answers if you're asking questions about travel or recipes. For both, we improved the accuracy of citations. For recipes, we used improved grounding data from recipe content providers and made sure that citations directed you to the recipe site instead of Bing.com. Expect us to make further grounding improvements based on your feedback.
- Reducing End-of-conversation Triggers: We continue to work to resolve cases where Bing chat unnecessarily ends conversations (e.g. “I’m sorry but I prefer not to continue this conversation.” or “It might be time to move on to a new topic.”). We made some additional changes this week to reduce some of the most egregious scenarios where this occurred.