Google is moving beyond developing artificial intelligence chips for its own data centers and is now designing them to work inside products made by other companies.
After unveiling the Tensor Processing Unit two years ago, Google announced on Wednesday the Edge TPU, which will enable sensors and other gadgets to process data more quickly. The chips could be used in a wide variety of scenarios, but one initial use is in industrial manufacturing: Consumer electronics maker LG is testing them in a system that detects manufacturing defects in glass for displays.
Google's jump into custom silicon is one way it's trying to expand its cloud market share against Amazon and Microsoft. Since 2015, Google has been using TPUs to accelerate certain workloads in its own data centers, rather than relying on commercially available hardware from vendors like Nvidia.
Last year, Google said its AI silicon was becoming even more strategically important.
In AI, researchers train models on large amounts of data so that machines can make predictions as new data arrives. The initial version of the TPU could only make those predictions, whereas the second version, released in 2017, could also be used to train models, an update that made the chips competitive with Nvidia's graphics cards.
A third-generation TPU was announced earlier this year.
The new Edge TPU is a tiny chip built specifically for inference: the prediction side of AI, which is far less computationally intensive than training a model. Because Edge TPUs run those calculations locally rather than handing data off to powerful remote servers, applications can respond faster and more reliably. The chips can handle AI workloads alongside a standard processor or microcontroller inside a sensor or gateway device.
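The training-versus-prediction split that makes inference-only chips viable can be sketched in a few lines of plain Python. This toy classifier is purely illustrative, not Google's or any real ML library's: "training" fits a parameter to labeled examples, and "inference" just applies that parameter to new inputs, which is why the latter needs far less compute.

```python
# Illustrative sketch only: a one-parameter threshold classifier showing
# the two phases of an AI workload described in the article.

def train(examples):
    """'Training': learn a threshold midway between the two class means."""
    lo = [x for x, label in examples if label == 0]
    hi = [x for x, label in examples if label == 1]
    return (sum(lo) / len(lo) + sum(hi) / len(hi)) / 2  # learned parameter

def predict(threshold, x):
    """'Inference': apply the learned parameter to a new data point."""
    return 1 if x >= threshold else 0

# Training happens once, on lots of labeled data (in the data center).
threshold = train([(0.1, 0), (0.4, 0), (2.6, 1), (2.9, 1)])

# Inference happens repeatedly, on new data (on the edge device).
print([predict(threshold, x) for x in (0.2, 2.8)])  # → [0, 1]
```

Training touches the whole dataset and searches for good parameters; inference is a single cheap pass over fixed parameters, which is the workload an Edge TPU is designed to accelerate.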