Building an AI chip saved Google from building a dozen data centers | WIRED

WIRED reports on the Tensor Processing Unit, or TPU, a chip built by Google that radically alters the infrastructure used for high-end AI applications. Packing more power into each individual chip reduces the total number of chips required, which in turn reduces the number and size of the data centers needed to house them.

The article says the TPUs can process neural network operations 15 to 30 times faster than regular chips built with the same manufacturing process.

Google has used the TPU for a good two years, applying it to everything from image recognition to machine translation to AlphaGo.

SEE FULL STORY

Building an AI chip saved Google from building a dozen data centers
WIRED | April 4, 2017 | by Cade Metz
