‘…it’s making billions of lightning-fast probability calculations about word patterns’

The New Yorker looks at predictive text. It unpacks Smart Compose, the AI feature recently built into Gmail.

But the real strength of this piece is its examination of GPT-2, the AI text-generating model developed by OpenAI, a California AI lab pursuing artificial general intelligence: computers that display human-level abilities across several domains at once.

GPT-2 ‘wrote’ passages of The New Yorker piece after being fed 12 years of the magazine’s archives. A human reading 24/7 would have taken two weeks to review the training text. The computer absorbed the same text in under an hour.
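For readers curious what that generation step looks like in practice, here is a minimal sketch using the public GPT-2 checkpoint from the Hugging Face transformers library. This is an illustration only: the passages in the article came from a GPT-2 model fine-tuned on The New Yorker's archive, which is not publicly available, and the prompt below is invented.

```python
# Minimal sketch: sampling text from the public GPT-2 checkpoint via the
# Hugging Face transformers library. Illustrative only -- not the
# archive-fine-tuned model used in the New Yorker piece.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "The magazine's offices were quiet that morning,"
inputs = tokenizer(prompt, return_tensors="pt")

# The model repeatedly predicts a probability distribution over the next
# token and samples from it -- the "lightning-fast probability calculations
# about word patterns" the article describes.
output_ids = model.generate(
    **inputs,
    max_length=60,
    do_sample=True,
    top_k=50,
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```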

The generated passages are embedded throughout the piece.

SEE FULL STORY

The Next Word
THE NEW YORKER | October 14, 2019 | by John Seabrook