The power of Amazon Echo is no longer just for your home.
Developers building on top of Amazon Web Services can now use Alexa, the voice-controlled intelligence system that runs the Echo home assistant, inside their applications, the Seattle-based company said on Wednesday at the annual AWS re:Invent conference in Las Vegas.
It's called Lex, and the idea is to bring natural-language controls to any app, whether for booking a flight or ordering food, and, more important, to let users ask questions and follow-ups in a conversational manner. The preview is available starting Wednesday.
"There are very few natural language understanding and speech recognition applications and platforms that get more everyday usage than what Alexa and the Echo get," said AWS CEO Andy Jassy, in an interview with CNBC. "What a lot of customers want when they build these conversational applications is they want the technology that's actually being used over and over again that they see every day."
As more computing moves from servers in company closets and basements to data centers owned by Amazon, Google, Microsoft and IBM, businesses are able to take advantage of more sophisticated technologies in their own services. Machine learning and artificial intelligence are perhaps the best example of what's coming.
Amazon has two decades of experience recommending items to customers on its retail side, building algorithms that predict what a buyer may like given past purchases. That work eventually evolved into voice and the Echo, which has sold millions of units, and the technology is now available within AWS.
IBM is promoting its Watson AI platform as part of its cloud-computing offering. CEO Ginni Rometty told CNBC last month that Watson will reach more than 1 billion consumers by the end of next year. And Google offers TensorFlow, an open-source machine-learning framework that developers can use to embed machine learning in their apps and services, as part of its cloud platform.
—CNBC's Deirdre Bosa contributed to this report.