That's surprising, as Siri is ubiquitous in Apple products -- from the iPhone to CarPlay to Apple TV. It also lags behind competitors like Amazon Alexa and Google Assistant in certain areas, which has to be a sore spot for Apple, since Siri predates these other personal assistants by several years.
But Apple is making some notable improvements to Siri, and I got to check a few of them out during WWDC 2019. You'll start to see them this fall, when Apple rolls out new software, including iOS 13 for the iPhone and iPad.
Here's a rundown:
If you've used Google Assistant and then gone back to Siri, you've probably noticed Siri sounds pretty robotic. That's because Siri's voice was really just a bunch of spliced audio clips, so sometimes the cadence of a word didn't quite fit a sentence.
Apple said Siri will soon have a new voice that's entirely generated by software, which will allow it to sound more natural. I heard it and thought it sounded better, but still not quite as good as John Legend's voice on Google Assistant. I'm sure it'll continue to improve in the months before the new software launches.
Right now, Siri will play music only from Apple services like Apple Music and iTunes. It's a bad look, since Amazon Alexa and Google Assistant let you play music from a range of other services.
But Apple is opening Siri to outside services with a new SiriKit Audio tool for app developers. Soon, you'll be able to say "Siri, play rock from Pandora," and Siri will queue that app up. It supports audiobooks and podcasts as well.
Spotify has not announced support for it yet, but the tool is open to any app developer, so it would make sense for Spotify to add it in order to be more integrated with Apple's platforms.
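For developers, plugging into this likely means adopting SiriKit's media intents. Here's a minimal Swift sketch of handling a "play" request; the protocol and response codes are part of SiriKit, but the handler class name is my own, and a real app would also resolve the requested media before responding.

```swift
import Intents

// A minimal sketch of a SiriKit media-intent handler.
// The class name is hypothetical.
class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // .handleInApp tells iOS to launch the app in the background
        // so it can start playback itself.
        completion(INPlayMediaIntentResponse(code: .handleInApp,
                                             userActivity: nil))
    }
}
```

The `.handleInApp` response is the key design point: Siri hands playback off to the app rather than trying to stream the audio itself.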
Siri will also play more than 100,000 live radio stations, with no other app required. All you'll have to do is ask Siri to play the station you want, like "Hey Siri, play 99.9 The Hawk," and Siri will pull up the live station right through Apple Music. I haven't spent much time listening to old favorite stations -- like The Hawk -- but maybe I will now that it'll be easier to call them up.
As is the case with music, Siri can only navigate using Apple services, like Apple Maps. But that changes with the update this fall. Soon, you'll be able to say "Siri, navigate to 30 Rock with Waze" or "Siri, navigate to Yankee Stadium with Google Maps" and Siri will do that.
Siri will soon be able to read a message right into your AirPods while you're walking, and it will use the AirPods' microphones to detect when you're talking and avoid interrupting you -- even if you're talking to somebody in front of you rather than through your phone. You'll also be able to respond right away without having to tap a button or say "Hey Siri" first.
So, if your spouse texts you, Siri will read the message to you through your AirPods and you'll be able to dictate your response right back.
One of my biggest complaints with Apple's home speaker, the HomePod, is that it doesn't have multi-user support. So, while Siri can read out my messages or send them, or play some of my favorite music, it doesn't recognize my wife as a separate person and account when she speaks.
That will change this fall when HomePod will get support for multiple users, which means Siri will respond appropriately when my wife asks what's on her calendar and when I ask what's on mine.
Apple is putting its acquisition of music-identifying app Shazam to better use.
Right now, you can ask Siri on your iPhone to identify a song. Soon, you'll also be able to ask Siri to identify what song is playing right from your wrist. Just raise your Apple Watch and ask Siri to identify the tune. I wish Apple had followed in Google's footsteps, though, by letting you run this feature in the background at all times -- you can set the Google Pixel to identify whatever music is playing around you, without having to give it a command each time.
Siri Shortcuts is useful, but complicated to use.
It works with third-party apps to give you specific information on command -- for instance, you can set it up with TripIt to get flight information every time you say "Hey Siri, flight status." But these shortcuts are usually buried or complicated to set up and use. Plus, you have to download Shortcuts from the App Store to get started.
In iOS 13, Apple will include the Shortcuts app by default and will highlight a bunch of apps and use cases for it, which will make it easier for you to set up custom routines for Siri. So, maybe you want Siri to turn off the lights and close the garage door every time you leave the house. That will be a lot easier to set up in iOS 13.
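Under the hood, apps surface these shortcuts by "donating" activities to the system. A rough Swift sketch, assuming the standard NSUserActivity donation path -- the activity type string, title, and identifier here are made-up examples, not anything Apple ships:

```swift
import Intents
import UIKit

// Hypothetical activity type; a real app declares its own types
// in Info.plist under NSUserActivityTypes.
let activity = NSUserActivity(activityType: "com.example.leave-home")
activity.title = "Leaving home"
activity.isEligibleForSearch = true
activity.isEligibleForPrediction = true  // lets Siri suggest it as a shortcut
activity.persistentIdentifier = NSUserActivityPersistentIdentifier("leave-home")

// Assigning it to a visible view controller donates it to the system,
// so it can show up in the Shortcuts app and in Siri suggestions:
// viewController.userActivity = activity
```

Once donated, the activity is what the Shortcuts app strings together with other actions, like the lights-and-garage-door routine above.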