Facebook experiment raises privacy concerns

Facebook published a research paper revealing that it manipulated its users' emotions by controlling the number of positive and negative posts shown in their news feeds, a move that has caused a backlash among social networkers and brought to light the lack of safeguards protecting users.

"It is simply a terrible thing to do to intentionally evoke a negative emotional response from someone without their knowledge or consent," said Jeremy Rosenberg, senior vice president and head of digital operations at Allison+Partners, a communications consulting firm.

New Facebook privacy alert (Source: Facebook)

Facebook studied how about 689,000 of its users responded to changes in their news feeds and found that, even though users weren't interacting face-to-face with their friends, positive and negative sentiments were contagious, according to a paper published in the Proceedings of the National Academy of Sciences last month.

When users were shown fewer positive posts from friends in their news feeds, their own posts were more negative, and when they were shown fewer negative posts, their own posts were more positive, according to the study.

Facebook has defended its research. And while its users may have provided "consent" to the use of their information when they agreed to the social network's terms of service, many users of Facebook and other websites and apps don't read disclaimers and privacy policies in their entirety, Rosenberg suggested.

"While there is a degree of expectation of privacy with regard to ... the social networks we participate in, the reality is everyone must be aware that there is not any sort of blanket safeguard with regard to the experience," Rosenberg said.

—By CNBC's Althea Chang.