‘Completing someone else’s thought is not an easy trick for A.I. But new systems are starting to crack the code of natural language.’

AI systems are learning the relationships between words in the English language by studying how those words are typically used. One lab trains its system on thousands of novels; another uses thousands of books plus ‘the length and breadth of Wikipedia,’ reports The New York Times.

These systems use what’s known as a ‘transformer model.’

Transformers work contextually, weighing the possibilities for a new word against all of the adjacent or preceding words. From that information, the model infers the intent of the sentence.
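The core mechanism behind that contextual weighing is self-attention. The sketch below is a minimal, simplified illustration (not the architecture of any specific system mentioned in the article): each word vector is rebuilt as a weighted blend of every word vector in the sentence, with the weights reflecting how relevant each word is to each other.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: each row becomes a probability distribution.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    """Simplified scaled dot-product self-attention over word vectors X (n, d).

    Each output row is a context-aware mix of all input rows. Real
    transformers add learned query/key/value projections and multiple
    heads; this is only the core idea.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)       # relevance of every word to every other word
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ X                  # blend word vectors by relevance

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))  # 4 toy "words", each an 8-dimensional embedding
out = self_attention(X)
print(out.shape)  # (4, 8): same shape, but each row now reflects its context
```

Because every word attends to every other word in one step, the model can relate distant words without reading the sentence strictly left to right.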

The systems use what they have learned to predict the most appropriate word in a given situation, whether that means filling in a missing word in a sentence or predicting the next word in a thought.

Either way, it gives the impression that the machine is thinking and is capable of expression.


Finally, a Machine That Can Finish Your Sentence
NEW YORK TIMES | November 18, 2018 | By Cade Metz