Technologists say we're a long way from the kind of sentient robots we see in the science fiction series "Westworld" — but some of the show's thought-provoking scenarios should already give us pause.
In the show "Westworld," people travel to a western theme park filled with humanoid robots called "hosts." The machines think and interact like real people, but since they aren't living beings, people treat them with reckless abandon. That world gets turned upside down when the robots start remembering their past "lives" and those who took advantage of them.
"It's a dangerous moral ground are we walking into, making systems that are reminiscent of humanity and then treating them in a way that is inhumane," said Illah Nourbakhsh, professor of robotics at Carnegie Mellon University.
HBO gave people a glimpse of what a real "Westworld" theme park would be like during an immersive experience at the South by Southwest Festival in Austin, Texas. Show co-creators Jonathan Nolan and Lisa Joy and agency Giant Spoon recreated the "Westworld" town of Sweetwater.
Guests were sent on missions that involved interacting with the "hosts" and searching through the area for clues, similar to what the guests of the park do on the show. The buildings looked like they came straight from the series.
There was one major change: The hosts were played by real people, not artificial intelligence (AI) robots. In total, 60 actors, six stunt people, six horses, and five bands were hired for the event. The final script was more than 440 pages.
While AI becomes "smarter" through learned experience, we'd need a breakthrough for the technology to have "deep learning" like humans, Nourbakhsh said. Robots would have to learn the nuances of human desire to keep people compelled to talk to them, and we're simply not there, he said.
"The most likely scenario [for a robot theme park today] would be having robot controllers from afar, like people in call centers," Nourbakhsh said.
Another problem with an AI robot theme park: Machines don't have very smooth motions, said Mark Riedl, associate professor of computer science at Georgia Institute of Technology.
"They are uncoordinated, and don't have really fine grain dexterity to pull guns," Riedl said. "Even waiting tables is really, really hard."
However, technologies that mimic social cues and dialogue can already be seen today, Riedl said.
"What the AI looks like in 'Westworld' is in some ways what game companies are trying to do in virtual worlds," Riedl said.
In the show "Westworld," the hosts are given pre-installed storylines that are triggered when a person interacts with them. It's similar to how game mechanics work in action role-playing video games like "Borderlands," Riedl said. Still, "Westworld's" missions are much more complex than what our current technology enables, he said.
The central theme of the show revolves around how people should treat human-like robots. And while our AI isn't that advanced, we're already facing that ethical issue, Carnegie Mellon University's Nourbakhsh said.
"Technology is moving much faster than our discussions about what is or isn't acceptable behavior in society against machines," Nourbakhsh said.
Although robots do not have emotions, people can get attached to them, Georgia Tech's Riedl said. Some people name their Roombas and give them backstories, only to be emotionally upset when the devices break down, he said. People have even held funerals for robot dogs.
"We are hardwired to treat living things as human, so when [machines] are designed to act autonomously it triggers feelings," he said.
We're already seeing evidence of people taking advantage of AI without thinking twice, Carnegie Mellon University's Nourbakhsh said. In an ethical test he often gives, Nourbakhsh finds people are willing to take a parking spot from a driverless car if the owner isn't around, because they know the car can keep on circling forever.
"Making things that are designed to be as close to looking and acting human and then saying 'go ahead and abuse these things,' to me it says something about humans," Georgia Tech's Ridel said.
"In some ways that's okay because you're given permission to do that [in "Westworld"], but what that says about the individual human — if they have some issues — you may to have to question," he added.