“. . . while the Associated Press has used AI technology to automate some “rote tasks” — think corporate earnings reports, sporting event recaps, transcribing press conferences, etc. — since 2014, the standards unveiled on Wednesday sound a skeptical note about using generative AI for journalism’s most essential work.”

– Sarah Scire

The Associated Press has updated its editorial policies with ten guidelines for material produced by generative AI tools. The additions were issued by Amanda Barrett, AP's vice president of standards and inclusion.

The new guidelines emphasize the primacy of human judgment in AP’s reporting.

AP says material generated by AI systems “should be treated as unvetted source material.” In an interview with Nieman Lab, Barrett gave an example of how the agency’s polling and investigative teams work.

“They often ask sources where the data they have is coming from, who put it together and to see the full data so they can draw their own conclusions. That kind of diligence is going to be standard for so many other journalists on so many other beats.”

Amanda Barrett, to Nieman Lab

The AP guidelines apply both to material the agency originates and to third-party material it might use.

Separate guidance advises caution when AP journalists report on AI advances: they should avoid relying solely on claims by tech developers or focusing unduly on speculative future scenarios.

The guidelines also caution against humanizing AI systems, for example by describing digital systems with human traits or gendered pronouns.


“Not a replacement of journalists in any way”: AP clarifies standards around generative AI | Nieman Lab | August 16, 2023 | by Sarah Scire