Tech Drivers

Amazon announces new AI chip as it deepens Nvidia relationship

Key Points
  • Amazon Web Services announced Trainium2, a chip for training artificial intelligence models, and it will also offer access to Nvidia's next-generation H200 Tensor Core graphics processing units.
  • AWS will host a special computing cluster for customers and Nvidia to use.
  • For now, AWS customers can start testing new general-purpose Graviton4 chips.

Amazon Web Services CEO Adam Selipsky speaks at the Collision conference in Toronto on June 27, 2023. (Photo: Chloe Ellingson | Bloomberg | Getty Images)

Amazon's AWS cloud unit has announced new chips for customers to build and run artificial intelligence applications on, as well as plans to offer access to Nvidia's latest chips.

Amazon Web Services is trying to stand out as a cloud provider with a variety of cost-effective options. It won't just sell cheap Amazon-branded products, though. Just as in its online retail marketplace, Amazon's cloud will feature top-of-the-line products from other vendors, including highly sought-after GPUs from top AI chipmaker Nvidia.

Demand for Nvidia GPUs has skyrocketed since startup OpenAI released its ChatGPT chatbot last year, wowing people with its ability to summarize information and compose human-like text. The surge led to a shortage of Nvidia's chips as companies raced to incorporate similar generative AI technologies into their products.

Amazon's dual-pronged approach of building its own chips and letting customers access Nvidia's latest chips could help it against its top cloud computing competitor, Microsoft. Earlier this month, Microsoft took a similar approach, revealing its inaugural AI chip, the Maia 100, and saying that its Azure cloud will also offer Nvidia H200 GPUs.

The announcements came at AWS's re:Invent conference in Las Vegas on Tuesday. Specifically, AWS said it will offer access to Nvidia's latest H200 AI graphics processing units. It also announced its new Trainium2 artificial intelligence chip and the general-purpose Graviton4 processor.

The new Nvidia GPU is an upgrade from the H100, the chip OpenAI used to train its most advanced large language model, GPT-4. Big companies, startups and government agencies are all vying for a limited supply of the chips, meaning there's also high demand for renting them from cloud providers such as Amazon. Nvidia has said the H200 will generate output nearly twice as fast as the H100.

Amazon's own Trainium2 chips are built for training AI models, including the sort that AI chatbots like OpenAI's ChatGPT and its competitors run on. Startup Databricks and Amazon-backed Anthropic, an OpenAI competitor, plan to build models with the new Trainium2 chips, which Amazon said will deliver four times the performance of the first-generation Trainium.

The Graviton4 processors are based on Arm architecture and consume less energy than comparable chips from Intel or AMD. Graviton4 promises 30% better performance than the existing Graviton3 chips, which AWS said translates into better output for the price. With inflation elevated and central banks raising interest rates, organizations that want to keep using AWS while lowering their cloud bills might consider moving to Graviton.

More than 50,000 AWS customers are already using Graviton chips, Amazon said.
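For readers wondering what "moving to Graviton" looks like in practice, here is a minimal sketch using the boto3 Python SDK. It is an illustrative example, not part of AWS's announcement: the AMI ID is a placeholder, the m7g instance family (Graviton3) is assumed to be available in the chosen region, and Graviton4-based instances were still in preview at the time of the announcement.

```python
# Illustrative sketch (not from the article): launching a Graviton-based EC2
# instance with boto3. The AMI ID below is a placeholder; substitute an
# arm64-compatible AMI for your region and pick a Graviton instance family
# (e.g., "m7g" for Graviton3).
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder: any arm64 AMI
    InstanceType="m7g.large",          # Graviton3-based instance type
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[
        {
            "ResourceType": "instance",
            "Tags": [{"Key": "purpose", "Value": "graviton-evaluation"}],
        }
    ],
)

print(response["Instances"][0]["InstanceId"])
```

In practice, most of the work in such a migration is rebuilding software for the arm64 architecture; the launch call itself changes little beyond the instance type.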

Finally, as part of its deepening relationship with Nvidia, AWS said it will operate more than 16,000 Nvidia GH200 Grace Hopper Superchips, which contain Nvidia GPUs and Nvidia's Arm-based general-purpose processors. Nvidia's own research and development group and AWS customers will both be able to take advantage of this infrastructure.

AWS has launched more than 200 cloud products since 2006, when it released its EC2 and S3 services for computing and storing data. Not all of them have been hits. Some go without updates for a long time and a rare few are discontinued, freeing up Amazon to reallocate resources. However, the company continues to invest in the Graviton and Trainium programs, suggesting that Amazon senses demand.

AWS didn't announce release dates for virtual-machine instances with Nvidia H200 chips, nor for instances relying on its Trainium2 silicon. Customers can start testing Graviton4 virtual-machine instances now, before they become commercially available in the next few months.

WATCH: Analysts are going to have to raise their AWS growth estimates, says Deepwater's Gene Munster
