Tools such as ChatGPT threaten transparent science; here are our ground rules for their use | NATURE

First, no LLM tool will be accepted as a credited author on a research paper. That is because any attribution of authorship carries with it accountability for the work, and AI tools cannot take such responsibility.

NATURE Editorial

Leading science journal NATURE is introducing two guidelines for attributing material generated by a large language model (LLM). Treatment of AI-derived content has so far been inconsistent, both in practice and in opinion about practice, for example, on whether it is accurate, helpful, or ethical to credit an AI tool with authorship.

In addition to the first guideline, quoted above, NATURE says the use of AI tools should be documented elsewhere in the research, for example, in the methods, the acknowledgements, or another "appropriate section."

SEE FULL STORY

Tools such as ChatGPT threaten transparent science; here are our ground rules for their use
NATURE | January 24, 2023 | Editorial
