The professor who wrote the book on making addictive technology is having second thoughts

Key Points
  • A Stanford professor who wrote a best-seller on how to design addictive products now says that's not always a good thing.
  • Nir Eyal wrote in a recent blog post that "the tech industry needs a new ethical bar."
  • "There's nothing wrong with building products people want to use, but the power to design user behavior ought to come with a standard of ethical limitations," he wrote.
Nir Eyal speaking at the TED Institute.
Source: TED Institute | YouTube

Facebook CEO Mark Zuckerberg isn't the only big name in Silicon Valley reconsidering whether frequent use of software and technology is good for the emotional health of consumers.

The Stanford professor who literally wrote a book on how to design addictive technology has begun to ask whether doing so is always a good thing.

Nir Eyal, whose 2014 book "Hooked: How to Build Habit-Forming Products" is a best-seller in the field of product design, has written a blog post challenging the tech industry to weigh the impact of its product-design choices.

"The techniques used by product managers at the world's largest companies are equal parts psychology and technology. As Sean Parker, founding president of Facebook, recently acknowledged, the company has long been engaged in the business of 'exploiting a vulnerability in human psychology,'" wrote Eyal, who received a degree from the Stanford School of Business and who's taught a course at the school on product design.

A new standard: The 'regret test'

Eyal then proposes what he calls "the regret test" to measure the ethical dimension of a product:

"The tech industry needs a new ethical bar. Google's motto, 'Don't be evil,' is too vague. The Golden Rule, 'Do unto others as you would have them do unto you,' leaves too much room for rationalization.

I'd argue that what we ought to be saying is, 'Don't do unto others what they would not want done to them.' But how can we know what users do and don't want?

I humbly propose the "regret test."

...If users would regret taking the action, the technique fails the regret test and shouldn't be built into the product, because it manipulated people into doing something they didn't want to do. Getting people to do something they didn't want to do is no longer persuasion — it's coercion.

So how do we tell if people regret using a product? Simple! We ask them."

Eyal's post comes as Facebook has begun to make changes to its own software in response to criticism that use of its service may be harmful to the health of its users — and society in general.

The company has begun to change its content-recommendation software to prioritize posts from friends, trusted media companies and local news.

"Our next update on our 2018 focus to make sure Facebook isn't just fun but also good for your well-being and for society," Zuckerberg wrote in a blog post on Monday, referring to the move to boost the amount of local news users see.

When he first announced the changes earlier this month, Zuckerberg wrote, "We feel a responsibility to make sure our services aren't just fun to use, but also good for people's well-being."

That January 11 announcement, which came the same day Eyal tweeted a link to his own blog post, caused Facebook's shares to fall 4 percent.