“The idea is not to teach the AI the right answer to every question — an impossible feat. The goal would be to teach it to identify patterns in the petabytes of internet content that are associated with accuracy and inaccuracy.” – Ben Smith
Could AI systems fact-check themselves in real time? The question goes to the heart of uncertainty about AI-generated writing from current large language models (LLMs), whose fabricated output — often called “hallucinations” — cannot be trusted. Microsoft’s upcoming version of Bing appears to incorporate at least some elements to indicate the truthfulness of the information it generates.
Semafor reports on how NewsGuard shows up in some responses from the new Bing, powered by a generative AI model (currently available to approved testers only). NewsGuard is a journalism tool that rates news sources for credibility based on nine criteria.
Writer Ben Smith comments on the still-elusive goal of AI systems that can assess the truthfulness of their own copy faster than humans can.
Can journalists teach AI to tell the truth?
SEMAFOR | March 12, 2023 | by Ben Smith