Some folks are complaining that Facebook could affect how you feel by manipulating your News Feed.
Facebook released research on 689,003 users whose levels of positive or negative News Feed content were adjusted. Not surprisingly, their moods and the words they used shifted to match what they were exposed to.
But consider this: If your sports team wins, you’ll be more likely to make a celebratory remark.
If a friend is having a bad day, you’re likely to provide sympathetic encouragement.
Facebook posts skew overwhelmingly toward positive emotion.
Of the 122 million words they analyzed, 4 million were positive and only half as many were negative. Yet we know that the average person has 40,000 thoughts a day, of which 70-80% are negative.
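A quick back-of-the-envelope check on those numbers (a sketch; the 2 million negative figure is inferred from "half as many"):

```python
# Word counts from the Facebook study cited above.
total_words = 122_000_000
positive_words = 4_000_000
negative_words = positive_words // 2  # "half as many" -> inferred 2,000,000

pos_share = positive_words / total_words
neg_share = negative_words / total_words

print(f"positive: {pos_share:.1%}")   # roughly 3.3% of all words
print(f"negative: {neg_share:.1%}")   # roughly 1.6% of all words
print(f"ratio: {positive_words / negative_words:.0f}:1")
```

Only a few percent of the words are emotional at all, but among those, positive outnumbers negative two to one, the inverse of what the 70-80% negative-thoughts figure would predict.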
So why the flip-flop? Positive news travels further.
Social networks are for boasting and celebrating. People are less likely to share and react to negative news. In our research with the NBA, we found that Facebook engagement was far higher on team winning days than losing days.
Positive comments drive further engagement, creating a compounding effect in the newsfeed.
Facebook has admitted why it didn't implement a dislike button: not because it's bad for branding or because people would abuse it, but because positive emotions get shared more and generate more traffic.
Smile and the world smiles with you. Cry, and you cry alone, as the saying goes.
The elephant in the room: who controls the News Feed algo?
There is WAY more content than you can possibly consume in the News Feed. Control of who sees what is ultimately a powerful tool in the hands of Zuck and Co., more powerful than any political weapon, I'd assert.
Fox News has a reputation for leaning right, while MSNBC leans liberal. But what about Facebook? Do you ever hear anyone complain about the editorial slant of the world's largest curated news site? The faceless editor doesn't have a name, because it is a robot.
Stephen Pirrie believes the big revelation is this: for years, we've thought of Facebook's News Feed as a 'robot' algorithm built on more than 100k factors. But all of a sudden, the fact that they exercise qualitative control scares the crap out of us.
Pundits assert that Facebook usage makes people more likely to get jealous, commit suicide, and suffer all manner of maladies. Great talk show host fodder. The people who still think video games cause violence are in this camp.
But the reality is that you’re being bombarded by content of all types, endlessly. There’s not enough time to dwell on any one item, any more than you can focus on billboards on the highway.
1984 and our buddy, George
(I was born in 1993, by the way). While brands complain about their posts not showing up in the News Feed, I believe this is a red herring that distracts from the real issue: what actually is showing up, and why.
If you give people only what they want, then you might as well let schoolchildren skip class to play video games. Let the masses feast upon naked women on wrecking balls instead of [insert whatever issue you think is important, dear reader]. You get fed more of whatever you've been clicking on.
I was in New York recently and asked a TV news anchor why they kept showing murders, car accidents, and escaped zoo animals. The answer: that's what people want, and that's how they fund their operations. If they don't maintain their audience, they lose their advertisers and, with them, their funding.
Might Facebook be in the same situation? Does it have a moral duty to show what is "good", however and by whomever that's defined? Herein lies the power of the Facebook News Feed algorithm, which is even more secret than Google's: Facebook could quietly influence public opinion on whatever issue it chooses, and nobody would ever know. Or it could go all in for the shareholders.
Tron Jordheim, vice president of marketing for StorageMart, doesn't think so. He believes Facebook's bias toward good news serves its strategy:
Bad news and disasters mesmerize people and disturb their emotions so they are susceptible to advertising. Our emotions attach to the product to seek safety. Perfect for TV.

Good news and relevant news surrounding your interests, passions, and friends cause you to share, comment, and become engaged in brands and causes. Our emotions seek to share with others to build our human connection. Perfect for social media, and a more satisfying experience than just getting a moment of safety. The FB slant towards good news is good strategy.
The Invisible Hand at Work
While traditional publications have editorial staff manually deciding what you see, the social networks rely upon your own behavior to determine what shows up.
So the next time someone complains to you about how their newsfeed is full of idiots, you might have a quiet chuckle at the choices they’ve made. Tweet that!
There’s no mass conspiracy or Facebook illuminati at work. Just the combined effect of each of our own actions competing for space in a crowded world. It’s in Facebook’s and our mutual interest for the algorithm to show the most interesting (defined as most clicked on) content. This drives higher engagement, longer session times, more information collection, and ultimately, more monetization.
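The compounding effect can be sketched in a few lines. This is a hypothetical scoring function, not Facebook's actual algorithm (which reportedly weighs 100k+ factors); all field names and weights here are illustrative assumptions:

```python
# Hypothetical engagement-weighted feed ranking: the most clicked-on,
# most-shared content rises to the top, where it earns still more
# clicks and shares, compounding its visibility.

def rank_feed(posts):
    """Sort posts by a simple engagement score, highest first."""
    def score(post):
        # Illustrative weights: shares and comments count for more
        # than passive clicks, since they spread content further.
        return post["clicks"] + 2 * post["comments"] + 3 * post["shares"]
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "team_loss_recap", "clicks": 40, "comments": 5,  "shares": 1},
    {"id": "team_win_photo",  "clicks": 90, "comments": 30, "shares": 25},
    {"id": "vacation_album",  "clicks": 60, "comments": 12, "shares": 8},
]

for post in rank_feed(posts):
    print(post["id"])
```

The winning-day post dominates the feed not because an editor chose it, but because its engagement numbers did, exactly the NBA pattern noted earlier.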
Adam Smith is alive and well.
Top image courtesy of Shutterstock.