- Five years ago, a former design ethicist at Google warned of five human psychological vulnerabilities that technology exploits.
- He called on big tech firms to minimize distractions inside their software.
- Here are the five vulnerabilities Tristan Harris discussed.
Google's former design ethicist Tristan Harris developed a presentation five years ago that warned people inside Google of five human psychological vulnerabilities that tech products were too often exploiting.
The presentation is called "A Call to Minimize Distraction & Respect Users' Attention" and it suggests that tech firms are abusing our natural psychological weaknesses to keep us addicted to apps, websites, phones and more. The Verge published all 141 slides written by Harris on Thursday.
In the presentation, Harris suggested that Google, Apple and Facebook should "feel an enormous responsibility" to make sure humanity doesn't spend its days buried in a smartphone.
Five years later, it appears that Google is starting to take his recommendations seriously. On Tuesday, the company announced "digital wellness" improvements coming to the next version of Android that will help people check their phones less -- for instance, there's a way to turn off visual notifications (not just audible sounds) and a mode that gradually turns off color on the phone to make it less tempting to scroll through apps before bedtime.
These are the five human vulnerabilities Harris discussed.
The first vulnerability highlighted by Harris was "bad forecasting," where alert messages don't clearly explain what the user is about to give up in terms of their attention.
For instance, Facebook might show a notification that you were tagged in a photo. The alert suggests you'll quickly "see a photo," but you'll probably end up spending 10 minutes on Facebook instead.
The second vulnerability is "intermittent variable rewards": users keep performing an action in hopes of a possible but unpredictable payoff. Harris called this type of behavior the "most addictive and hardest to stop," comparing it to playing slot machines in casinos.
His example: constantly refreshing an app like Twitter or Facebook to see new content from friends. "We spend lots of time -- are we getting the same value back?" he asks, before suggesting that Silicon Valley should design "to minimize the presence of intermittent variable rewards and reduce addictions."
The third vulnerability is the fear of missing something important. Because we're constantly afraid of missing an alert -- like a major event -- we check our phones continuously, as if running on a treadmill, Harris said. He suggested designers should "design to give users confidence that they can disconnect more often and not miss something important."
The fourth vulnerability is frictionless design: Harris argued that tech with too little friction takes away our human ability to "consider before acting."
Our phones buzz, so we take them out of our pockets without even thinking about it, for example. Harris suggested designers should "leave enough friction for users to pause and consider" their actions.
For the fifth vulnerability, Harris warned that certain actions we take with tech products push us out of a healthy state of mind.
For instance, Harris said, we stop breathing while we check our email: our sympathetic nervous system is activated, our liver dumps glucose and cholesterol into our blood, and our heart rate increases, "preparing us for a fight or flight response." He said designers should instead build apps that minimize stress and calm people.
Here's the scariest bit: Harris wrote that "successful products compete by exploiting these vulnerabilities," so companies don't want to remove them; doing so would "sacrifice their success and growth." As a result, competitive pressure pushes companies to make apps ever more addictive and to claim even more of our time.
Google appointed Harris to be in charge of ethical design, but he left the company in 2016 to co-found the Center for Humane Technology. He now dedicates his time to showing companies and the government how tech products can better serve users without "hijacking" their minds.
Harris has appeared on CNBC where he said tech addiction is an "existential threat" and called for regulation.