
Google has spent a decade building servers that can handle billions of web searches a day. The company is now developing its own chips to deliver smarter results.
At its annual developer conference on Wednesday, Alphabet introduced the second generation of Google's Tensor Processing Unit, or TPU, a custom chip built to accelerate artificial intelligence workloads.
The upgraded version is the latest indication that Google doesn't want to depend on other companies for core computing infrastructure. It's potentially troubling news for Nvidia, whose graphics processing units (GPUs) have been used by Google for intensive machine learning applications. Nvidia even named Google Cloud as a notable customer in its latest annual report.
Deep learning, a trendy type of AI, typically involves two stages: training artificial neural networks on large amounts of data, and then using the trained networks to make predictions on new data, a stage known as inference.
While the original TPU was only meant for the inference stage of deep learning, the new version can handle training as well.
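The distinction between the two stages can be made concrete with a small, self-contained Python sketch. The example below is illustrative only, using a toy linear model rather than anything resembling Google's production workloads: the training stage adjusts the model's parameters with gradient descent, and the inference stage simply applies the frozen parameters to new inputs. Chips like GPUs and TPUs accelerate the heavy arithmetic in both stages; the second-generation TPU's novelty is that it now covers the training loop as well as the prediction step.

import numpy as np

# Synthetic data for a toy problem: y is roughly 3*x + 0.5 plus noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
y = 3.0 * x + 0.5 + rng.normal(scale=0.1, size=200)

# Training stage: repeatedly nudge the parameters w and b to reduce
# the mean-squared error between predictions and the known answers.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    err = (w * x + b) - y
    w -= lr * 2 * np.mean(err * x)   # gradient of MSE with respect to w
    b -= lr * 2 * np.mean(err)       # gradient of MSE with respect to b

# Inference stage: the learned parameters are frozen and applied to new inputs.
new_x = np.array([0.25, -0.4])
print("predictions:", w * new_x + b)   # approximately 3*x + 0.5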
"I would expect us to use these TPUs more and more for our training needs, making our experiment cycle faster and more rapid," said Jeff Dean, a senior
It takes a day to train a machine translation system on 32 of the best commercially available GPUs, while the same workload finishes in six hours on eight connected TPUs, Dean said.
Unlike Nvidia and Intel, Google operates this equipment inside its own data centers rather than selling it to other device makers. Facebook has done much the same, although it has opted to share the designs publicly through the Open Compute Project it established in 2011.
People outside Google will be able to rent virtual machines (VMs) accelerated by the second-generation TPUs. Google will also introduce VMs that draw on the Volta GPU Nvidia announced earlier this month.
Over time, Nvidia could get less AI business directly from Google. There could also be a more indirect impact: some customers who now buy Nvidia GPUs for AI work might instead run that work on TPUs in Google's data centers.
Nvidia has been a Wall Street darling of late. Its stock price has surged nine-fold over the past four years, and the company is now worth more than $80 billion. In addition to data center customers, Nvidia sells GPUs for professional workstations, gaming PCs and automotive systems.
Last month, Google published a paper comparing TPUs to existing chips, saying its own processors run 15 to 30 times faster and are 30 to 80 times more energy-efficient than the competition. Nvidia CEO Jen-Hsun Huang shot back, saying his company's current chips have "approximately twice the performance of the TPU — the first-generation TPU."
A spokesperson for Nvidia did not respond to a request for comment on Wednesday's announcement.