As the cloud battle heats up, Microsoft is taking a unique approach to silicon that it says can help developers perform artificial intelligence computing tasks more quickly.
An initiative called Project Brainwave lets developers use field-programmable gate arrays (FPGAs) in Microsoft's data centers; unlike fixed-function chips, FPGAs can be reconfigured even after they've been plugged into servers. Microsoft announced on Monday at its Build developer conference that the chips will be accessible to developers for the first time.
According to Microsoft, these chips will let AI models process images faster than what's available from Amazon Web Services or Google's cloud. Those companies, along with others like China's Alibaba, offer an assortment of chips, ranging from graphics processing units (GPUs) from Nvidia to Google's homegrown tensor processing units (TPUs).
Microsoft says another advantage of using commercially available FPGAs, manufactured by longtime partner Intel, is that companies will be able to deploy similar technology in their own facilities rather than being limited to the cloud. That's part of Microsoft's strategy of providing more flexibility to customers.
"It's not like we're sacrificing performance or a competitive angle by going with the FPGAs," said Doug Burger, leader of Project Brainwave. "I think we are owning our own destiny."
The chips will initially be available within Microsoft's Azure machine learning service for 42 cents an hour. Over time, Microsoft will give developers direct access to the chips, said Burger, who is also a distinguished engineer at the company.