‘…it’s making billions of lightning-fast probability calculations about word patterns’
The NEW YORKER takes a detailed look at predictive text by unpacking Smart Compose, the AI capability recently built into Gmail.
But the real strength of this piece is its examination of GPT-2, the text-generating AI algorithm developed by OpenAI, a California AI lab pursuing artificial general intelligence, aka computers that display human-level abilities in several domains at once.
- GPT-2 ‘wrote’ passages of the New Yorker piece after being fed 12 years of the magazine’s archives.
- A human reading 24/7 would have needed two weeks to get through the training text. The computer absorbed it in under an hour.
- The results speak for themselves, embedded throughout the piece.
The Next Word
THE NEW YORKER | October 14, 2019 | by John Seabrook