- Siri is better in iOS 11, though users might not see everything that's changed.
- Artificial intelligence is now a bigger part of Siri, so it can predict what you want and when.
- Apple, unlike Google or Amazon, stores this data locally.
Apple's iOS 11 is available for iPads and iPhones beginning on Tuesday.
There are a lot of big changes, and Siri is one of them.
Users won't find that Siri is much smarter in iOS 11, at least not in the way Alexa and Google Assistant are smart.
Part of the reason is that Apple still limits the third-party apps Siri can tap into. You can ask it to book an Uber (that isn't new), but you can't play music from Spotify, only Apple Music. You also can't reach into other apps you have installed, like getting a flash briefing from CNBC or checking on the status of an Amazon order. Those would be useful additions.
Among the new features is the ability to translate between languages. You can ask how to say something in Italian, Spanish, French, German or Mandarin and get a response from Siri, which is neat but not exactly groundbreaking.
Thanks to a big artificial intelligence undertaking, Siri has a more human-sounding voice. It sounds a little less robotic, but still ... otherworldly. It's an improvement, for sure, and you can still choose from a variety of voices in case you don't like the default female voice. Plus, Siri seems to understand what you say to it more quickly than in the past.
Apple is starting to think about Siri differently with iOS 11. On an iPad, for example, iOS 11 will smartly recommend apps in the task bar that you might want to use. This is the brain of Siri at work.
You can also ask Siri to play music you'll like, and it'll fire up an Apple Music playlist of tunes you've listened to before or songs similar to what you'd most likely pick yourself.
On the Apple Watch, a new Siri watch face surfaces news articles it thinks you might be interested in, as well as photos of people and places it thinks you might want to see. It can show you what's next on your schedule, the weather where you are and at home, and more. I wish Apple would improve it with a note on when I should leave for work in the morning based on traffic; for now, it's not the watch face I use most.
If you open the News application in iOS 11, Siri will start to learn about the stories and topics you read most and cater to what it thinks you'll want to read about. Others, like Facebook, have done this by analyzing what you're most likely to open in the news feed.
Apple says it's using local A.I. to power Siri. While some data, such as your name, your contacts, the names of your photo albums and the songs in your music library, does get sent to Apple so that you'll get a better Siri experience, most data is stored only on the device. That includes data about your music tastes, your news preferences, the things shown in your photos and more.
This is philosophically different from what Amazon, Google and Microsoft do. Data those companies collect is generally shared with their remote servers and, in some cases, mixed with hundreds of millions of other people's data so that those companies can make their systems smarter with collective knowledge. Apple says it doesn't care about seeing its users' data, so all the inference is done locally on an iPad or iPhone.
That may be Siri's Achilles' heel in some regards. While it can gain skills, it's limited to the data on your device — a pro for folks who are worried about security, but a con for those who want a smarter voice assistant. At least your personal version of Siri is synced across all of your Apple devices with iOS 11.
So Siri isn't just improving; it's evolving. It has become as much about learning what you want as about answering questions on command. That trend should continue when Apple launches its HomePod smart speaker later this year as a competitor to the Amazon Echo and Google Home.