‘Completing someone else’s thought is not an easy trick for A.I. But new systems are starting to crack the code of natural language.’
AI systems are learning the relationships between words in the English language by studying how those words are typically used. One lab trains its system on thousands of novels; another uses thousands of books plus ‘the length and breadth of Wikipedia,’ reports the NEW YORK TIMES.
The systems use what they’ve learned to predict the most appropriate word in a given situation: filling in a missing word in a sentence, say, or extrapolating the next word in a thought. Either way, the result gives the impression that the machine is thinking and capable of expression.
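The word-prediction idea can be illustrated with a deliberately simple sketch (this is a toy frequency count, not the large models the article describes): count which words most often follow each word in some training text, then predict the most common follower.

```python
from collections import Counter, defaultdict

# Toy illustration of next-word prediction: tally which words
# follow each word in a small training text (hypothetical example data).
training_text = (
    "the cat sat on the mat the cat ate the fish "
    "the dog sat on the rug"
)

follows = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word`, or None if unseen."""
    candidates = follows[word]
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("the"))  # "cat" appears after "the" most often here
print(predict_next("sat"))  # "on"
```

Real systems replace raw counts with learned statistical representations, but the underlying task, guessing the likeliest next word from prior usage, is the same.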
The systems rely on what’s known as a ‘transformer model.’ Transformers work contextually, weighing possibilities for a new word in relation to all the surrounding or preceding words. From that information, the model infers the intent of the sentence.
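The contextual weighing that transformers perform can be sketched in a few lines of numpy. This is a minimal illustration of the attention idea only, with random toy vectors standing in for learned word representations: each word’s new representation becomes a similarity-weighted blend of every word in the sentence.

```python
import numpy as np

# Toy sketch of attention, the mechanism inside transformer models.
# The word vectors here are random stand-ins, not learned embeddings.
np.random.seed(0)
sentence = ["the", "bank", "of", "the", "river"]
dim = 4
vectors = {w: np.random.randn(dim) for w in set(sentence)}
X = np.stack([vectors[w] for w in sentence])  # one row per word

scores = X @ X.T / np.sqrt(dim)                 # pairwise similarity
weights = np.exp(scores)
weights /= weights.sum(axis=1, keepdims=True)   # softmax: each row sums to 1

# Each word's new vector mixes in information from the whole sentence,
# so "bank" here is colored by the presence of "river".
contextual = weights @ X
print(np.round(weights, 2))
```

A full transformer adds learned projections, multiple attention heads, and many stacked layers, but this weighted-blend step is the core of how context shapes each word.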