Apps will soon leverage new tools from Apple to identify everything from a bowl of almonds to a poison ivy rash.
CNBC met with Apple on Wednesday to take a look at some of the first applications built on this technology. It's pretty incredible, and Apple does all of the processing on the device, which means nothing is sent out to the cloud.
Take a look at what's possible.
To be clear, the Lose It! calorie-tracking app isn't new. But before, you had to scan barcodes or manually enter what you ate.
With iOS 11, Lose It! can identify exactly what's on the plate in front of you. This information isn't sent to the cloud for processing, and developers didn't have to build the machine learning engine themselves. Apple built the underlying framework, which let Lose It! senior data scientist Dr. Edward Lowe develop the deep-learning feature you see at play, all without an internet connection.
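The framework in question is presumably Core ML, which Apple introduced alongside iOS 11, paired with the Vision framework for image input. As a rough sketch of what an app like Lose It! might do (the `FoodClassifier` model and `plateImage` below are hypothetical stand-ins, since the app's actual model isn't public):

```swift
import Vision
import CoreML

// FoodClassifier is a hypothetical Xcode-generated class wrapping a
// bundled .mlmodel file; Lose It!'s real model is not public.
let model = try VNCoreMLModel(for: FoodClassifier().model)

let request = VNCoreMLRequest(model: model) { request, _ in
    guard let results = request.results as? [VNClassificationObservation],
          let top = results.first else { return }
    print("Top guess: \(top.identifier) (confidence \(top.confidence))")
}

// All inference runs on the device; the image never leaves the phone.
// plateImage is assumed to be a CGImage of the user's photo.
let handler = VNImageRequestHandler(cgImage: plateImage, options: [:])
try handler.perform([request])
```

The key point is the last two lines: classification happens entirely in `perform`, on the device, with no network call.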
There are other use cases, like in medicine.
This function wasn't available at all in prior versions of the app: you had to search through symptoms and dig through multiple menus to figure out what the ailment might be.
These are just two examples of updated apps that are coming this fall. Apple showed us another app that's not quite finished that can detect a face so that, when you take pictures, the person in the photo automatically receives a copy of the image.
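The face-detection step in that unfinished app maps naturally onto the Vision framework's built-in face detector, which also runs on-device. A minimal sketch (the `photo` variable and the sharing step are hypothetical, since the app wasn't named):

```swift
import Vision

// Detect faces in a photo using Apple's Vision framework (iOS 11).
let request = VNDetectFaceRectanglesRequest { request, _ in
    let faces = request.results as? [VNFaceObservation] ?? []
    for face in faces {
        // boundingBox is in normalized image coordinates (0...1).
        print("Found a face at \(face.boundingBox)")
        // A real app would then match the face to a contact and
        // send that person a copy of the image.
    }
}

// photo is assumed to be a CGImage of the picture just taken.
let handler = VNImageRequestHandler(cgImage: photo, options: [:])
try handler.perform([request])
```

Matching a detected face to a specific person would take an additional recognition model on top of this, which is the part the demo app would supply itself.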
The possibilities seem pretty endless, and that's just a small taste of what machine learning can do.