Eyal then proposes what he calls "the regret test" to measure the ethical dimension of a product:
"The tech industry needs a new ethical bar. Google's motto, 'Don't be evil,' is too vague. The Golden Rule, 'Do unto others as you would have them do unto you,' leaves too much room for rationalization.
I'd argue that what we ought to be saying is, 'Don't do unto others what they would not want done to them.' But how can we know what users do and don't want?
I humbly propose the 'regret test.'
...If users would regret taking the action, the technique fails the regret test and shouldn't be built into the product, because it manipulated people into doing something they didn't want to do. Getting people to do something they didn't want to do is no longer persuasion — it's coercion.
So how do we tell if people regret using a product? Simple! We ask them."
Eyal's post comes as Facebook has begun changing its own software in response to criticism that use of its service may be harmful to the health of its users and to society in general.
The company is adjusting its content-recommendation software to prioritize posts from friends, trusted media companies and local news.
"Our next update on our 2018 focus to make sure Facebook isn't just fun but also good for your well-being and for society," Zuckerberg wrote in his own blog post on Monday, referring to the move to boost the amount of local news users see.
When he first announced the changes earlier this month, Zuckerberg wrote, "We feel a responsibility to make sure our services aren't just fun to use, but also good for people's well-being."
That January 11 announcement, which came the same day Eyal tweeted a link to his own blog post, caused Facebook's shares to fall 4 percent.