"Facebook tinkered with users’ feeds for a massive psychology experiment"

Pete Tar

Senior Member.
Scientists at Facebook have published a paper showing that they manipulated the content seen by more than 600,000 users in an attempt to determine whether this would affect their emotional state. The paper, “Experimental evidence of massive-scale emotional contagion through social networks,” was published in The Proceedings Of The National Academy Of Sciences. It shows how Facebook data scientists tweaked the algorithm that determines which posts appear on users’ news feeds—specifically, researchers skewed the number of positive or negative terms seen by randomly selected users. Facebook then analyzed the future postings of those users over the course of a week to see if people responded with increased positivity or negativity of their own, thus answering the question of whether emotional states can be transmitted across a social network. Result: They can! Which is great news for Facebook data scientists hoping to prove a point about modern psychology. It’s less great for the people having their emotions secretly manipulated.
http://www.avclub.com/article/faceb...ok&utm_medium=ShareTools&utm_campaign=default
Content from External Source
Interesting. Unethical?
 

Attachments

  • PNAS-2014-Kramer-8788-90.pdf
    536.2 KB · Views: 692
I'm just jumping the gun for the CTers.
Now you have 600,000 Manchurian candidates, or citizens rather. Run!

So what next now that they know?
 
Why do I find this absolutely hilarious?
Because Facebook made you find it so by warping your brain?

People don't like being experimented on. Even if it's not a big deal, it messes with their assumptions about the world.

But this is essentially a form of A/B testing. A/B testing of user interface elements and advertisements is very common.
https://www.google.com/search?q=ab+testing+adwords
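As a rough sketch of how that kind of A/B bucketing is typically done (the hashing scheme, experiment name, and 50/50 split here are my assumptions, not Facebook's actual method):

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "feed-test") -> str:
    """Deterministically bucket a user into group A or B by hashing their ID.

    Hypothetical sketch: hashing (experiment, user) means the same user
    always lands in the same bucket without storing any assignment table.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# The same user always gets the same bucket for a given experiment:
assert assign_variant("user42") == assign_variant("user42")
```

Once users are bucketed, each group sees a different variant and the site measures which behaves differently, which is exactly the structure of the emotion study, just with feed content as the variant.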



Effectively that's altering your emotional state as well - and actually making you buy stuff. And advertising in general employs all kinds of tricks to mess with you. Maybe this kerfuffle might make people think about that a little.
 
Because Facebook made you find it so by warping your brain?
No, it's because every little change made to FB is bound to piss some people off. It only alters my emotional state if I let it alter it. It's not controlling me.
 
Is it only unethical because they tried to elicit what is perceived as a negative emotion? Would the same reaction be given if they tried to make everyone happy or improve everyone's FB experience while using the same tactics?

Adding... My generic response to posts on my feed that deal with complaining about FB changes is always, "In a few weeks you won't even remember how it was before." and they soon realize I'm mostly right.
 
Is it only unethical because they tried to elicit what is perceived as a negative emotion? Would the same reaction be given if they tried to make everyone happy or improve everyone's FB experience while using the same tactics?

They did both.
 
But if they did only try to make people happy by improving their experience?

Depends if the goal is "make people happy" (which could be viewed as communist brainwashing) or "improve the experience", which is just good business.
 
Depends if the goal is "make people happy" (which could be viewed as communist brainwashing) or "improve the experience", which is just good business.
If they succeed in improving the experience for the majority of its users, then it stands to reason those users aren't angry over it and it was good business practice. But if they fail, then people get butthurt and it's FB's fault that they allowed themselves to get angry.

I'm just not sure which is worse (if I can use that word with a straight face): that FB tried to elicit an emotional response, or that people give them what they want over something as trivial as changes to their precious FB.
 
If they succeed in improving the experience for the majority of its users, then it stands to reason those users aren't angry over it and it was good business practice. But if they fail, then people get butthurt and it's FB's fault that they allowed themselves to get angry.

I'm just not sure which is worse (if I can use that word with a straight face): that FB tried to elicit an emotional response, or that people give them what they want over something as trivial as changes to their precious FB.

I think the problem is that people don't like the idea of participating unwillingly in a psychological experiment.

They don't realize that they do this all the time with A/B testing - so this published study is getting people annoyed.
 
I think the problem is that people don't like the idea of participating unwillingly in a psychological experiment.

They don't realize that they do this all the time with A/B testing - so this published study is getting people annoyed.
But it's FB for (insert your preferred deity here) sake! It's not like people are actually being messed with here. People gonna get butthurt and people gonna get happy, and they check to see who gets butthurt or happy. Did anyone commit suicide or lose their job or livelihood over this? Did anyone get cancer or AIDS? The simplest way to avoid it would be not to use FB or social media, or the internet if anyone is that paranoid. Guess what, life will go on for those that do.

Always read the TOS.
http://www.southparkstudios.com/full-episodes/s15e01-humancentipad
 
I would say it was unethical but Facebook seem to be using their T&Cs as consent, and while technically it is consent it is hardly informed consent.

The paper itself is an interesting read http://www.pnas.org/content/111/24/8788.full but I do wonder if FB did a follow up with the subjects as there is the potential for harm in their method.
I agree David, so why isn't anyone discussing what's next? From Pete's link above:
Kramer is quoted as saying he joined Facebook because “Facebook data constitutes the largest field study in the history of the world.” It’s a charming reminder that Facebook isn’t just the place you go to see pictures of your friends’ kids or your racist uncle’s latest rant against the government—it’s also an exciting research lab, with all of us as potential test subjects.
Content from External Source
Where does FB go from here? What about our government, or foreign governments, with their ability to abuse social media services to shape the emotions or thoughts of their constituents? Isn't this the "micro" of what's already happening on a much larger scale with how the media reports the news to the public, how the media chooses its stories and its words to shape human emotion and psychology? Fear-based reporting!
 
But it's FB for (insert your preferred deity here) sake! It's not like people are actually being messed with here. People gonna get butthurt and people gonna get happy, and they check to see who gets butthurt or happy. Did anyone commit suicide or lose their job or livelihood over this? Did anyone get cancer or AIDS? The simplest way to avoid it would be not to use FB or social media, or the internet if anyone is that paranoid. Guess what, life will go on for those that do.

I think you are missing the point here. People just don't like being in a psychological experiment. You not minding is not the same thing as other people not minding. People are upset, and you thinking their upset is unwarranted is entirely beside the point.
 
One interesting thing from the paper:
More importantly, given the massive scale of social networks such as Facebook, even small effects can have large aggregated consequences (14, 15): For example, the well-documented connection between emotions and physical well-being suggests the importance of these findings for public health.
Content from External Source
The implied suggestion there is that you can improve public health (and hence save money) by manipulating the emotional state of Facebook users, by suppressing their friends' negative posts.

This seems slightly Orwellian. One can easily see this as the start of a slippery slope. "Public health" is already seen as a communist plot by significant segments of the population. This just adds fuel to the fire.
 
I think you are missing the point here. People just don't like being in a psychological experiment. You not minding is not the same thing as other people not minding. People are upset, and you thinking their upset is unwarranted is entirely beside the point.
Putting words in other people's mouths is usually not your tactic.
 
But it's FB for (insert your preferred deity here) sake! It's not like people are actually being messed with here. People gonna get butthurt and people gonna get happy, and they check to see who gets butthurt or happy. Did anyone commit suicide or lose their job or livelihood over this? Did anyone get cancer or AIDS? The simplest way to avoid it would be not to use FB or social media, or the internet if anyone is that paranoid. Guess what, life will go on for those that do.

Always read the TOS.
http://www.southparkstudios.com/full-episodes/s15e01-humancentipad
You are off the mark. If this were a study with face-to-face subjects there would be screening questions beforehand, or a follow-up and debrief afterwards.

You raise a valid point. Did someone commit suicide? We do not know, because Facebook have absolved themselves of responsibility by claiming it was anonymous via a computer algorithm. Some people are affected deeply by social media, and for many it is an important communication tool. I can well imagine some users feeling intensely depressed or even angry when faced with increased negativity on FB.
 
People can be angry all they want. And I can think it's hilarious because people think FB is going to control the psychological well-being (or lack thereof) of the world. That's all.
 
I can well imagine some users feeling intensely depressed or even angry when faced with increased negativity on FB.
I would suggest to them that they get friends that are less negative and depressing. FB didn't insert anything into anyone's feed; they just controlled what they saw of their friends. So if something they saw made them upset or happy, it's not FB's fault, it's the friend's who posted it.

I've had to delete old friends because of their depression and negativity.
 
I would suggest to them that they get friends that are less negative and depressing. FB didn't insert anything into anyone's feed; they just controlled what they saw of their friends. So if something they saw made them upset or happy, it's not FB's fault, it's the friend's who posted it.

Facebook changed the amount of the negative or positive posts in a feed. So the change in the aggregate effect is something that they caused.
 
I would suggest to them that they get friends that are less negative and depressing. FB didn't insert anything into anyone's feed; they just controlled what they saw of their friends. So if something they saw made them upset or happy, it's not FB's fault, it's the friend's who posted it.

I've had to delete old friends because of their depression and negativity.
So they make potential lifestyle changes because of an experiment they are unaware of, and you find that acceptable? Most people can cope with a little negativity; however, the subjects were faced with mainly negativity. They could lose friends and not be aware they were manipulated.

Don't you see that as bad?
 
Facebook changed the amount of the negative or positive posts in a feed. So the change in the aggregate effect is something that they caused.
Only for 600,000 random people. And no FB didn't make my old friends negative and depressed, drugs and alcohol did that.
 
So they make potential lifestyle changes because of an experiment they are unaware of, and you find that acceptable? Most people can cope with a little negativity; however, the subjects were faced with mainly negativity. They could lose friends and not be aware they were manipulated.

Don't you see that as bad?
What I'm saying is that if they didn't put themselves around people that say negative things, FB wouldn't have a chance to show them only negative things. Why would anyone want friends that are negative, or that post things that upset you? FB didn't make what those friends said negative.

If you suffer from depression it's not a good idea to surround yourself with depressed or negative people, just like recovering drug addicts and alcoholics shouldn't surround themselves with drug users or people drinking.
 
Does anyone know anything about the algorithm? How did they develop it, and how is it continuously being tweaked? The study seems to be ongoing when you read the research at pnas.org. Are they using "key" words in speech patterns to determine if the writer is stressed, upset, depressed, or happy? And then, based on those key words, they decide what to post and what to omit...
 
Significance
We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.
Content from External Source
Does anyone see the big picture in all of this? Do you honestly think a company like Facebook really cares about emotional contagion because it can influence people's happiness or sadness? Or is it about advertising at the end of the day? A two-fold experiment where FB lets researchers do the groundwork, but this method could also be applied to advertising, by siphoning out negative remarks about a product.
 
I haven't used too many social networking sites, but I used to use Myspace and loved it for various reasons, such as personalisation of the page, the ability to add playlists, etc. Then all of a sudden everyone was on Facebook instead, so I reluctantly transitioned. I hated it from day one because it's very sterile, invasive, and has gone down the road of self-promotion, but anyway.
It would be interesting to know how they came to such conclusions. There are too many outside factors for me to consider the data valid. E.g., if friends generally live in the same area, something like rain at school-run time gives people a good chance to moan and fellow 'friends' to 'like' it; how is this filtered out? Or the truly vulgar clickbait that circulates: how is negative defined?
 
how is negative defined?

They just used an automated process based on the frequency of various "positive" (good, love, nice, happy, etc) and "negative" words (bad, hate, crap, sad, angry, etc).

Posts were determined to be positive or negative if they contained at least one positive or negative word, as defined by Linguistic Inquiry and Word Count software (LIWC2007) (9) word counting system, which correlates with self-reported and physiological measures of well-being, and has been used in prior research on emotional expression (7, 8, 10). LIWC was adapted to run on the Hadoop Map/Reduce system (11) and in the News Feed filtering system, such that no text was seen by the researchers.
Content from External Source
I'll see if I can find the actual word lists.
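Here is a minimal sketch of that LIWC-style approach: a post counts as positive or negative if it contains at least one word from the respective list. The word lists below are illustrative samples from the thread's examples, not the actual LIWC2007 dictionaries, and the tokenisation is simplified.

```python
# Illustrative word lists only; the real LIWC2007 dictionaries are much larger.
POSITIVE = {"good", "love", "nice", "happy"}
NEGATIVE = {"bad", "hate", "crap", "sad", "angry"}

def classify(post: str) -> set:
    """Label a post 'positive' and/or 'negative' by simple word matching,
    mirroring the paper's at-least-one-word criterion."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    labels = set()
    if words & POSITIVE:
        labels.add("positive")
    if words & NEGATIVE:
        labels.add("negative")
    return labels

print(classify("I love this, so happy!"))   # {'positive'}
print(classify("What a bad, sad day."))     # {'negative'}
```

Note a post can carry both labels at once, which matches the paper's description of posts being counted independently for each category.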
 
It's ok Mick, it wouldn't make a difference to me because, like society, it is far more complex. I might 'like' someone's overly gloating picture of the latest thing they bought for their child because I see them daily or we work together, when in actual fact I think 'oh come on, we all know how rich you are' ;)
 
It's ok Mick, it wouldn't make a difference to me because, like society, it is far more complex. I might 'like' someone's overly gloating picture of the latest thing they bought for their child because I see them daily or we work together, when in actual fact I think 'oh come on, we all know how rich you are' ;)

There's nothing as complicated as that going on here. It's a purely automated process based on very simple rules.
 
Of course words in images would not count.
True, I was trying to illustrate a point: a word does not always convey the emotion or feeling typically associated with it. If I posted "Don't Hate" then the algorithm might count it as negative and skew the results.
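That limitation is easy to demonstrate: a one-word check can't see negation, so "Don't hate" trips the negative list even though its intent is the opposite. The word list here is illustrative, not the real LIWC dictionary.

```python
# Toy negative-word check in the style of the paper's method.
NEGATIVE = {"bad", "hate", "crap", "sad", "angry"}

def contains_negative(post: str) -> bool:
    """Flag a post as negative if any word matches the list;
    negation ("don't") is invisible to this kind of check."""
    words = {w.strip(".,!?'\"").lower() for w in post.split()}
    return bool(words & NEGATIVE)

print(contains_negative("Don't hate"))  # True - misread as negative
```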
 
There's nothing as complicated as that going on here. It's a purely automated process based on very simple rules.
Yep, I agree, it's just another reason for everyone to hate Facebook even more for doing what some people purely use it for.
It is a web woven by the user, but the user is always left feeling negative because we have no control over anything we receive, yet we come back daily for more. They have us on that one in a way they could never have imagined; otherwise they wouldn't be conducting such a basic experiment ;) all has not been revealed, I would say. I hate it but I love it at the same time. It has changed me, I'm generally angrier with the world now :(
 
Significance
We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.
Content from External Source
Does anyone see the big picture in all of this? Do you honestly think a company like Facebook really cares about emotional contagion because it can influence people's happiness or sadness? Or is it about advertising at the end of the day? A two-fold experiment where FB lets researchers do the groundwork, but this method could also be applied to advertising, by siphoning out negative remarks about a product.
I was thinking they could make someone feel sad and spam them with "Dr Wino's surgical grade alcohol. Takes your worries away"
 
I was thinking they could make someone feel sad and spam them with "Dr Wino's surgical grade alcohol. Takes your worries away"

Probably a better use would be to detect someone's emotions, and use that info to modify the mix of ads fed to them (and maybe their friends)
 