The Next Word | THE NEW YORKER

‘…it’s making billions of lightning-fast probability calculations about word patterns from a year’s worth of e-mails sent from’

The New Yorker begins its detailed look at predictive text by unpacking Smart Compose, the AI feature recently built into Gmail. The feature is reported to save two billion keystrokes across the Gmail user base, which comprises roughly a fifth of the world’s population.
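As a rough illustration of the “probability calculations about word patterns” behind predictive text, here is a minimal bigram-based sketch. The toy corpus and function names are invented for illustration; real systems like Smart Compose use far larger neural models, not simple counts:

```python
from collections import Counter, defaultdict

# Toy corpus standing in for a body of training text (illustrative only).
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count how often each word follows each other word (bigram counts).
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word and its estimated probability."""
    counts = bigrams[word]
    total = sum(counts.values())
    best, n = counts.most_common(1)[0]
    return best, n / total

print(predict_next("the"))
```

The principle scales up: a model that has seen enough text can rank likely continuations of a phrase, which is what lets Gmail finish your sentence before you type it.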

But the real strength of this piece is its examination of GPT-2, the AI text-generating model developed by OpenAI, a California AI lab pursuing artificial general intelligence: computers that display human-level abilities in several domains at once.

The New Yorker used 12 years of its digital archives to ‘train’ GPT-2 so it could write passages of the article. They say the volume of text fed to the machine would have taken a human two weeks of around-the-clock reading to absorb; the computer acquired it in under an hour. The results are embedded throughout the piece.


The Next Word
THE NEW YORKER | October 14, 2019 | by John Seabrook
