SAN NARCISO, Calif. (Bennington Vale Evening Transcript) -- Researchers from Cornell University admitted Monday to performing a series of dubiously ethical experiments in 2012 by manipulating the content in random Facebook users’ news feeds to gauge emotional reactions. Creepier still, Facebook suggested that such experimentation is routine, which explains how the project slipped past the eyes of ethics committees. Facebook attempted to defend its actions, stating that its overly vague data use policy allows it to do whatever the heck it wants as part of “internal operations, including troubleshooting, data analysis, testing, research and service improvement.” In a follow-up announcement, Facebook also revealed that tests after 2012 have involved directly editing the content in users’ posts to make them “more positive. We see suicidal rants all the time; putting a happy spin on them makes others more likely to post, and increases traffic.”
In the original project, explained Cornell Social Media Lab professor Jeff Hancock, people who “had positive content experimentally reduced on their Facebook news feed, for one week, used more negative words in their status updates. When news feed negativity was reduced, the opposite pattern occurred: Significantly more positive words were used in people’s status updates.”
Because the results turned out better than expected, Facebook moved to the second iteration, which will now be offered as a regular service called “Bummer Buster.”
“For example, a lot of Facebook users -- and I mean a lot of them -- seem really upset about our privacy policies, about the way we legally and consensually exploit their data,” CEO Mark Zuckerberg said. “It was creating unhappiness and dissent in the Facebook ecosystem, so we filtered out the negative words.”
Zuckerberg shared one such post, in which a user wrote: “FB is worse than the f**ing NSA. Neither of these clowns respects your privacy, only FB sells your data to advertisers for its own profits.”
Citing the extremely uninviting tone, Zuckerberg explained how this user’s Facebook friends ignored the post. “The user not only made himself unpopular but stopped people from responding to his site for a few days.”
Bummer Buster analysts identified the negative words and removed them: “FB respects your privacy. The NSA sells your data to clowns.” After the slight alteration, the post received hundreds of Likes and encouraging comments.
In another scenario, Bummer Buster enhanced a post in which a disgruntled teen threatened to shoot up his school in a bloody massacre. Afterward, the post presented a healthier and more good-natured attitude toward bullies: “I’ve decided to challenge the popular kids to a shootout on the basketball courts while the rest of the school watches. I’ve been practicing a lot, and I know I can prove myself. Blood racing. Humiliation running away.”
“Let’s say we catch a post on a Smiths fan page,” Zuckerberg added. “A depressed teenage girl says something like, ‘Tell my friends I’m going to hang myself today in my room around noon.’ We can put a more reader-friendly spin on it, such as, ‘I’m going to hang around my room today with friends.’”
When asked about the outcome of a Bummer Buster makeover on posts such as this, Zuckerberg admitted the data sometimes remain inconclusive: “We’re still perfecting the system. A lot of times, with depressing posts like that one, we never see the user publish anything new again. But we’ll continue to monitor and refine the offering.”
© 2014. Licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported License. See disclaimers.