A study from two Europe-based nonprofits has found that Microsoft’s artificial intelligence (AI) Bing chatbot, now rebranded as Copilot, produces misleading results on election information and misquotes its sources, Cointelegraph reported.
According to the study, published by AI Forensics and AlgorithmWatch on December 1, 2023, Bing’s AI chatbot gave wrong answers 30% of the time to basic questions about political elections in Germany and Switzerland. The inaccurate answers concerned candidate information, polls, scandals and voting.
The study also found that the safeguards built into the AI chatbot were “unevenly” applied, causing it to give evasive answers 40% of the time, Cointelegraph concluded.
(With insights from Cointelegraph)