
Microsoft is luring A.I. developers to its cloud by offering them faster chips

Key Points
  • The Project Brainwave system uses field-programmable gate arrays from Intel.
  • The chips boost the performance of Microsoft's Azure machine learning cloud service, and over time will become available for use in other facilities.
  • Google has taken a different approach, designing its own AI chip.
Microsoft distinguished engineer Doug Burger and a Project Brainwave accelerator.
Source: Scott Eklund/Red Box Pictures

As the cloud battle heats up, Microsoft is taking a unique approach to silicon that it says can help developers more quickly perform artificial intelligence computing tasks.

An initiative called Project Brainwave lets developers use field-programmable gate arrays (FPGAs) in Microsoft's data centers; the chips can be customized even after they've been plugged into servers. Microsoft announced on Monday at its Build developer conference that the chips are accessible to developers for the first time.

According to Microsoft, these chips will enable faster processing of images with AI models than what's available from Amazon Web Services or Google's cloud. Those companies and others like China's Alibaba offer an assortment of chips, ranging from graphics processing units (GPUs) from Nvidia to Google's homegrown tensor processing units (TPUs).

Microsoft says another advantage to using commercially available FPGAs, manufactured by longtime partner Intel, is that companies will be able to set up similar technology in their own locations, and not be limited to working in the cloud. That's part of Microsoft's strategy of providing more flexibility to customers.

"It's not like we're sacrificing performance or a competitive angle by going with the FPGAs," said Doug Burger, leader of Project Brainwave. "I think we are owning our own destiny."

The chips will initially be available within Microsoft's Azure machine learning service for 42 cents an hour. Over time, Microsoft will give developers direct access to the chips, said Burger, who is also a distinguished engineer at the company.
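
At 42 cents an hour, the raw cost of keeping one FPGA-backed instance running is easy to work out. The sketch below is a back-of-the-envelope estimate based only on the quoted hourly rate; it assumes continuous use and ignores any other Azure charges.

```python
# Back-of-the-envelope cost at the quoted rate of $0.42 per hour.
# Assumes the instance runs continuously; ignores any other Azure charges.
HOURLY_RATE_USD = 0.42

hours_per_day = 24
days_per_month = 30  # rough month length for estimation

daily_cost = HOURLY_RATE_USD * hours_per_day    # $10.08 per day
monthly_cost = daily_cost * days_per_month      # $302.40 per month

print(f"Daily:   ${daily_cost:.2f}")
print(f"Monthly: ${monthly_cost:.2f}")
```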

Microsoft has for years been drawn to FPGAs. In 2014, the company said it would use them to speed up Bing web search queries. By the following year, Microsoft was installing an FPGA in every server it bought for Azure, initially to provide better networking performance, Burger said.

Intel jumped into FPGAs in 2015, with its $16.7 billion acquisition of Altera. Diane Bryant, a former Intel executive who is now at Google's cloud, told Wired that Microsoft was the reason Intel bought Altera.

One company exploring the use of the Brainwave technology is electronics manufacturer Jabil. The company, known as a prominent supplier to Apple, is looking to avoid running into problems during periods of poor cloud connectivity, said Ryan Litvak, a Jabil IT manager.

Jabil already kicked off a pilot project in the cloud last month to examine photos coming in from two manufacturing lines in China. Jabil sites have inspection stations where cameras take pictures of partially assembled circuit boards. Operators have set rules that flag photos of boards that appear to have issues so they can be sent for review.
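
The article doesn't spell out how those operator rules are expressed, so the following is only an illustrative sketch of what a threshold-style flagging rule at an inspection station might look like; the Inspection class, field names and scores are hypothetical.

```python
from dataclasses import dataclass

# Illustrative sketch only: Jabil's actual inspection rules aren't described
# in the article. This just shows the shape of a threshold-style rule an
# operator might configure at an inspection station. The Inspection class,
# field names and scores below are hypothetical.

@dataclass
class Inspection:
    board_id: str
    defect_score: float  # hypothetical 0..1 score from the camera system

def should_flag_for_review(result: Inspection, threshold: float = 0.5) -> bool:
    """Send the board's photo to a human reviewer when the score crosses the threshold."""
    return result.defect_score >= threshold

# Example: only the second board would be flagged for review.
for r in [Inspection("A-001", 0.12), Inspection("A-002", 0.81)]:
    print(r.board_id, "review" if should_flag_for_review(r) else "pass")
```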

High performance, low price

Litvak said that based on early results, the AI system powered by Brainwave could cut down on 75 percent of the false positives, or boards that are sent for review despite not having any problems.
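
To see what a 75 percent reduction in false positives means for review workload, here is a quick worked example; the daily volumes and false-positive share are hypothetical placeholders, not figures from Jabil.

```python
# Worked example only: the volumes below are hypothetical placeholders,
# not figures reported by Jabil or Microsoft.
flagged_per_day = 1_000      # hypothetical boards flagged for review each day
false_positive_share = 0.40  # hypothetical share of those flags that are false positives
reduction = 0.75             # the roughly 75 percent cut Litvak cited from early results

false_positives = flagged_per_day * false_positive_share  # 400 needless reviews
avoided_reviews = false_positives * reduction              # 300 reviews avoided
remaining_reviews = flagged_per_day - avoided_reviews      # 700 reviews still performed

print(f"Needless reviews avoided per day: {avoided_reviews:.0f}")
print(f"Reviews remaining per day:        {remaining_reviews:.0f}")
```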

"Project Brainwave looks like a really good option for still maintaining a high level of performance at a lower price point than using GPUs," Litvak said. He said the Azure service is a natural for Jabil because the company has long been a Microsoft customer, from Windows to Office.

While Microsoft is making a public push with FPGAs, Burger wouldn't say if the company is also following Google's lead in building its own chips to accelerate AI workloads in data centers.

"If you want to make an economically rational decision, of course you're going to look at the various options," he said.