AI models learn from vast amounts of information, but they sometimes produce hallucinations, also known as “ungrounded content,” by altering the underlying data or adding to it.
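Since the video and the linked article discuss measuring and detecting ungrounded content, here is a minimal, hypothetical sketch of one naive way “groundedness” could be scored: checking how much of a generated claim’s vocabulary actually appears in the source material. This is an illustrative toy, not Microsoft’s actual tooling; the overlap heuristic and the 0.7 threshold are assumptions for demonstration only.

```python
# Hypothetical, simplified illustration of a "groundedness" check.
# NOT Microsoft's actual detection tooling: a toy bag-of-words overlap
# heuristic between a model's output and its source material.
import re


def tokenize(text: str) -> set[str]:
    """Lowercase word tokens; a deliberately crude tokenizer."""
    return set(re.findall(r"[a-z']+", text.lower()))


def groundedness(source: str, claim: str) -> float:
    """Fraction of the claim's words that also appear in the source.
    1.0 = fully overlapping vocabulary; near 0.0 = likely ungrounded."""
    claim_words = tokenize(claim)
    if not claim_words:
        return 1.0
    return len(claim_words & tokenize(source)) / len(claim_words)


source = "The report was published in 2023 and covers cloud revenue."
for claim in [
    "The report covers cloud revenue.",      # stays within the source
    "The report predicts quantum profits.",  # adds unsupported content
]:
    score = groundedness(source, claim)
    flag = "grounded" if score >= 0.7 else "possibly ungrounded"
    print(f"{score:.2f}  {flag}: {claim}")
```

Real groundedness detectors rely on far stronger signals (entailment models, citation checking, and human evaluation) than word overlap, but the sketch shows the basic idea of comparing output back to its source.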
Learn more about the tools we have put in place to measure, detect, and reduce inaccuracies and ungrounded content: https://news.microsoft.com/source/features/company-news/why-ai-sometimes-gets-it-wrong-and-big-strides-to-address-it/
Subscribe to Microsoft on YouTube here: https://aka.ms/SubscribeToYouTube
Follow us on social:
LinkedIn: https://www.linkedin.com/company/microsoft/
Twitter: https://twitter.com/Microsoft
Facebook: https://www.facebook.com/Microsoft/
Instagram: https://www.instagram.com/microsoft/
For more about Microsoft, our technology, and our mission, visit https://aka.ms/microsoftstories