ChatGPT is everywhere. Here’s where it came from | MIT TECHNOLOGY REVIEW


By tracking this contextual information, transformers can handle longer strings of text and capture the meanings of words more accurately. For example, ‘hot dog’ means very different things in the sentences ‘Hot dogs should be given plenty of water’ and ‘Hot dogs should be eaten with mustard.’
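The mechanism behind that intuition is self-attention: each word’s representation is recomputed as a weighted mix of every other word in the sentence, so the same word comes out differently in different contexts. Here is a minimal sketch in Python/NumPy; the toy 4-dimensional embeddings and the simplified setup (queries, keys, and values all equal to the raw embeddings) are illustrative assumptions, not anything from the article or from a real model.

```python
# Minimal sketch of scaled dot-product self-attention, the core of a
# transformer. Embeddings below are made-up toy numbers, chosen only
# to show that "dogs" ends up with a different representation next to
# "water" than next to "mustard".
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over token embeddings X
    (one row per token). For simplicity, queries = keys = values = X,
    i.e. the learned projection matrices are omitted."""
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                   # pairwise similarity
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ X                              # context-mixed outputs

# Hypothetical embeddings for two sentences sharing "hot dogs".
sent_animal = np.array([[1.0, 0.2, 0.0, 0.1],   # hot
                        [0.9, 0.1, 0.1, 0.0],   # dogs
                        [0.0, 1.0, 0.3, 0.2]])  # water
sent_food   = np.array([[1.0, 0.2, 0.0, 0.1],   # hot
                        [0.9, 0.1, 0.1, 0.0],   # dogs
                        [0.1, 0.0, 1.0, 0.9]])  # mustard

# The input vector for "dogs" is identical in both sentences, but its
# contextual representation differs once attention mixes in the rest.
print(self_attention(sent_animal)[1])
print(self_attention(sent_food)[1])
```

In a real transformer the embeddings are learned, the queries, keys, and values come from separate learned projections, and many attention heads run in parallel, but the context-mixing idea is the same.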

Will Douglas Heaven
[Image: server racks in a data center. Photo by Brett Sayles on Pexels.com]

Current large language models (LLMs) are possible only because of enormous amounts of computing power. ChatGPT’s roots extend back to research from the 1980s and 1990s, and the current release is the latest in a long series of models.

MIT TECHNOLOGY REVIEW traces the evolution of LLMs and highlights how the key breakthrough came in 2017 with the invention of the transformer, a neural-network architecture that provides context for text generation. Transformers are the foundation for ChatGPT and the many LLMs emerging today.

SEE FULL STORY

ChatGPT is everywhere. Here’s where it came from
MIT TECHNOLOGY REVIEW | February 8, 2023 | by Will Douglas Heaven
