Google has finally started allowing other companies to rent its chips tailored for artificial intelligence in the cloud.
The company announced in a blog post on Monday that Tensor Processing Units (TPUs) are available in "limited quantities" for a select set of customers looking to run machine learning models on the Google Cloud Platform.
Google unveiled its previously secret TPU project in 2016 and followed up with a second generation of chips at its annual developers' conference last May. The updated hardware allowed Google to move beyond inference into the more computationally demanding training phase of deep learning.
Usage is billed by the second and costs $6.50 per Cloud TPU per hour.
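At that rate, per-second billing works out as a simple proration of the hourly price. A minimal sketch of the arithmetic (the `tpu_cost` helper below is illustrative only, not a Google Cloud API):

```python
# Per-second billing math for the published rate of $6.50 per Cloud TPU per hour.
# `tpu_cost` is a hypothetical helper for illustration, not part of any Google SDK.

HOURLY_RATE_USD = 6.50

def tpu_cost(seconds: int, num_tpus: int = 1) -> float:
    """Estimated charge for running `num_tpus` Cloud TPUs for `seconds` seconds."""
    return round(HOURLY_RATE_USD * num_tpus * seconds / 3600, 2)

# A 30-minute run on a single TPU costs half the hourly rate:
print(tpu_cost(1800))  # → 3.25
```

So a half-hour training job on one TPU would cost about $3.25, and the same job across two TPUs about $6.50.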
The TPUs are designed specifically for AI and machine learning workloads and provide two main benefits to Google.
First, by using its own silicon, Google has a cheaper, more efficient alternative to relying on chipmakers like Nvidia and Intel for its core computing infrastructure. Owning its own hardware enables Google to experiment faster.
The new TPUs also allow parent company Alphabet to add a revenue stream to the Google Cloud Platform. GCP and Google's collection of business apps, called G Suite, now generate more than $1 billion a quarter.
Google is currently allowing companies to rent individual TPU boards but later this year will let them connect multiple boards into supercomputer networks called TPU pods.