Google has used the TPU for a good two years, applying it to everything from image recognition to machine translation to AlphaGo.
Wired reports on the Tensor Processing Unit, or TPU, a chip built by Google that radically alters the infrastructure used for high-end AI applications. More power in each individual chip reduces the number of chips required, which in turn reduces the number and size of the data centres needed.
The article says the TPU can run neural-network operations 15 to 30 times faster than conventional processors built with a similar manufacturing process.
Building its own AI chip saved Google from building a dozen new data centres.