What happened when I spent a week with an AI voice assistant in my head

"You look creepy, imagine going out with that in," a colleague remarked to me. It was a great start to my week wearing Sony's Xperia Ear device, a tiny headset that sits in your ear and connects wirelessly to a smartphone.

The Xperia Ear has a smart voice assistant built in that can carry out tasks such as texting a person or telling you what's on your schedule for the day. You just tap the earpiece and speak.

In fairness, it was great at picking up my voice the first time and getting things right, which isn't always the case with other voice assistants. But once it had recognized my voice, it took a long time to process the request and give a response. I asked it to call a colleague, and by the time I'd said the name and which number to call, I could have easily got my phone out and dialed.

The call quality was good, however, and the advantage was that my hands were free to do other things, like type on my computer or use my phone. Another useful feature was the Xperia Ear's ability to read out messages so you can decide whether or not to reply. The device also recognized gestures, such as nodding your head to answer "yes".

But that's where the usefulness ended. The Xperia Ear and Sony's intelligent voice assistant have serious limitations versus other offerings on the market. There was very little integration with anything beyond making calls. You couldn't set reminders in Google Calendar, and even navigation required me to take out my phone.

It's understandable why Sony launched the device. Its smartphone division is small, and while it has stabilized, the company is looking to new areas of growth such as connected devices and artificial intelligence (AI). The problem is, compared to the likes of Google, Amazon and Apple, Sony has been behind in this area, particularly in AI voice assistants.

I recently tried Amazon's Echo speaker, which has Alexa – its voice assistant – built in. Alexa was much more integrated with apps such as Uber and Spotify, something the Xperia Ear lacks. Amazon has had the Echo on the market for longer, and Apple's Siri and Google Assistant have been around for a while.

Beyond the functionality, there's the look of the Xperia Ear. It resembles a slightly older Bluetooth headset and is a little geeky. This is a major drawback, especially when walking down the road and trying to talk to it. I always felt self-conscious that people were giving me funny looks, because it looked like I was talking into thin air.

Sony's Xperia Ear is one of the first products of its kind, and Apple's wireless AirPods are set to follow soon. The AirPods have Siri built in, and it will be interesting to see what kind of functionality Apple adds to them and whether users will overcome the image issue of wearing the little earpieces.

I still think there is huge potential in voice assistants, and the Xperia Ear showed me that. I also enjoyed using the Echo for certain things, and am increasingly using Google Assistant on my smartphone. But I'm still not sure which form factor makes the most sense for an AI assistant, and that's something technology firms will no doubt be experimenting with.