Transforming virtual reality into life-saving technology

A firefighter fights a fire in Huntington Park, California.
Mark Boster | Los Angeles Times | Getty Images

When crisis strikes and you're first to the scene, it can take precious seconds to pick up a phone or two-way radio for crucial updates from a police captain or emergency dispatcher. But what if that call could be made in the blink of an eye?

Eyefluence, a California-based technology company, is building a user interface controlled entirely by the eye. That could be key for first responders such as police, firefighters and EMTs, who need fast, hands-free access to information, according to new investor Motorola Solutions.

"We definitely see an evolution in first-responder technology, not only in the command center and vehicles but also on the person; that's what we are working with our clients on," Motorola Solutions' chief strategy and innovation officer, Eduardo Conrado, said. "The design approach is very different than a consumer market; you have to design for extreme situations, when the ability to absorb information decreases. So our design language has to serve up just the right interaction for that moment."

Motorola Solutions invents and invests in enterprise technology for a client base that is 70 percent first responders, according to Conrado. Its products include radios and police body cameras.

Recently, Motorola Solutions began investing in "rugged" eyewear, similar to goggles used in sports, that could be made "smart" to send images and receive messages, like Google Glass. The problem was that for the wearer to respond while wheeling a gurney or apprehending a suspect, the interaction needed to be fast and hands-free: no typing, swiping or hovering.

Enter Eyefluence.

While new augmented and virtual reality headsets have gained ground in the entertainment field, the same technology could allow workers at the scene to interact with data, founder Jim Marggraff said. The company envisions an interface that highlights potential high-risk areas and streamlines the flow of communication, thus minimizing the need to look away as a dangerous scene unfolds.

"If you are wearing a phone, a watch and [Google] Glass, it's not good if they all are going off in a room full of smoke," Marggraff said. "The information coming in should be purposeful and focused."

Here's how it works: A bundle the size of a pinkie finger, carrying a camera, processor and tracker, would sit inside the goggles, picking up common, natural eye-movement patterns to execute commands. The tracker can filter out excess sunlight and use small doses of infrared light to follow the eye.
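The article doesn't detail Eyefluence's actual algorithms, but the basic idea of turning eye movement into a command can be sketched with a simple dwell-based selector: if the estimated gaze point stays inside a target region long enough, that region's command fires. Everything below, including the function name, thresholds and coordinates, is illustrative, not Eyefluence's interface.

```python
# Illustrative dwell-based gaze selection (hypothetical, not Eyefluence's API).
# A tracker supplies a stream of estimated gaze coordinates; each on-screen
# command has a rectangular target region. Holding the gaze inside a region
# for enough consecutive samples triggers that command.

DWELL_FRAMES = 30  # ~0.5 seconds at a 60 Hz sample rate; illustrative value

def detect_command(gaze_points, targets, dwell_frames=DWELL_FRAMES):
    """gaze_points: iterable of (x, y) gaze samples.
    targets: dict mapping command name -> (x0, y0, x1, y1) region.
    Returns the first command whose region holds the gaze for
    dwell_frames consecutive samples, or None."""
    counts = {name: 0 for name in targets}
    for x, y in gaze_points:
        for name, (x0, y0, x1, y1) in targets.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
                if counts[name] >= dwell_frames:
                    return name
            else:
                # Gaze left the region; the dwell must restart.
                counts[name] = 0
    return None
```

Real eye-driven interfaces use far richer signals than dwell time alone (to avoid triggering commands whenever the user merely looks at something), but the sketch shows the core loop: continuous gaze samples in, discrete hands-free commands out.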

Eyefluence can also use biometric readings to identify the individual wearing the goggles, using the eye's unique, fingerprint-like properties. That identification would allow a command center to immediately map not only where personnel are on the scene, but who is where. Conrado expects the technology to be ready to test within the next year.

Marggraff sees the product as a natural evolution of the computing interface. At the dawn of computers, there was the keyboard. Next came the mouse and screen, then the tap and swipe. Next in the evolution could be quick, subtle movements of the eye, Marggraff said.

Eyefluence and Motorola Solutions are not the only companies exploring the possibilities of augmented reality and wearables, of course.

Google Glass was the latest in a long line of consumer products to try wearable head-mounted technology for interacting with information, though its tap-and-head-movement interface failed to gain major traction. After halting the consumer version in January, Alphabet has quietly explored an enterprise application for the technology, just as devices like Facebook's virtual reality Oculus Rift and Microsoft's augmented reality HoloLens gained ground in the consumer sphere.

But Conrado sees this push to innovate as a plus for Motorola Solutions and its competitors, since the companies can learn from each other. Marggraff sees Eyefluence expanding to consumer realms such as gaming and shopping.

"I'm excited about AR and VR, and the massive investments in fundamental human potential," Marggraff said. "When you look at the time it takes for your eye to move, consume and synthesize — as you are exposed to accessing a large amount of information — the brain can solve problems in ways that had never been possible before. When we start to push the limits, we're evolving the way we think and learn ... at the highest level."