Microsoft on Monday unveiled new hardware, aimed at factories and other business sites, that is designed to run artificial intelligence computations efficiently without requiring customers to install special processors of their own.
Amazon pioneered the idea of blending special-purpose hardware with its cloud services in a product called Snowball. With Snowball, Amazon sends highly durable hardware to customers, who fill the boxes' hard drives with data and then mail them back for long-term storage in cloud data centers.
Amazon introduced Snowball in 2015. The company has since come out with a large-scale version of the Snowball in the form of a truck. It also made it possible for companies to run certain kinds of computations inside Snowballs.
Microsoft came out with its own version of Snowball, called the Azure Data Box, with room for 100 terabytes of data, a year ago.
Now it's coming out with a variant that's tailor-made for AI.
"What we want to be able to do is give people a handful of choices of our custom IP [intellectual property]. We've got things we've proved out in production environments that work really well," Microsoft Chief Technology Officer Kevin Scott told CNBC.
Scott said making this technology available in a whole range of places means customers can comply with regulations, avoid disruptions when networks fail and get better performance than they would by relying exclusively on heavy-duty data crunching in the cloud.
Generally speaking, with a trendy type of AI known as deep learning, people train models using large amounts of data, like photos, and then let the models make inferences about new data based on what the models have learned. The chip in the Azure Data Box Edge can handle the inference stage; training can happen elsewhere.
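The train-in-the-cloud, infer-at-the-edge split described above can be illustrated with a toy sketch. This is not Microsoft's or Azure's API — all function names here are hypothetical, and a trivial logistic-regression model stands in for a real deep-learning model:

```python
# Illustrative sketch of the training/inference split, not any Azure API.
# A tiny logistic-regression "model" stands in for a deep-learning network.
import math

def train(samples, labels, epochs=200, lr=0.5):
    """'Cloud' step: fit model weights on a large set of labeled examples."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))       # sigmoid prediction
            err = p - y                          # gradient of log-loss
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def infer(model, x):
    """'Edge' step: score a new sample using only the frozen weights."""
    w, b = model
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0

# Training happens in a data center on historical examples...
model = train([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]],
              [0, 0, 0, 1])  # learn a simple AND-like rule

# ...and only the learned weights ship to the edge box for inference.
print(infer(model, [1.0, 1.0]))  # prints 1
```

The point of the split is that training is the heavy, data-hungry step, while inference on frozen weights is comparatively cheap — cheap enough to run on a box sitting on a factory floor.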
In July, Google announced an initiative around AI processors for internet-connected devices that could be placed in a wide range of locations, including factories. Those processors are also intended for inference work. But they're not widely available yet.
Google's chips are designed in house. Microsoft's, by contrast, are field-programmable gate arrays, off-the-shelf chips that can be reprogrammed to deliver high performance on specific types of computing jobs. Microsoft has previously installed these FPGAs in its own servers, for Azure as well as its Bing search engine, and has enlisted partners Dell and Hewlett Packard Enterprise to build servers containing the chips as well.
The new Microsoft-branded boxes work with the company's open-source software, which makes it possible for third-party software developers to build customizations, Microsoft corporate vice president Julia White said.
One early customer of the Azure Data Box Edge, Cree, is using it on a factory floor to process quality control photos for its manufacturing process, White said.
During the preview period for the boxes, Microsoft is working with customers to offer boxes with FPGAs, as well as similar devices that carry more traditional computer chips instead, White said.
Microsoft will announce the cost of the new hardware before the end of the year.