Playing on how we feel: new ‘mood’ buttons are the next step for Facebook’s business model
Call me cynical, but emotions are the currency in which all social networks trade. Getting users to feel strong (usually positive) emotions and empathise with one another online is the core weapon that social networks like Facebook, Twitter, Instagram – built on the bedrock of connecting people – have in their arsenal. The ultimate goal: to keep you coming back for more meaningful human contact.
Last week Facebook released “Reactions” – an extension of the “Like” button which expands the range of emotions you can express through the platform. Facebook users (currently only in Spain and Ireland, bizarrely) can hold down the “Like” button of a post to express seven different emotions: angry, sad, wow, yay, haha, love, and the traditional like.
Facebook’s chief product officer Chris Cox wrote in a post on the site, “We studied which comments and reactions are most commonly and universally expressed across Facebook, then worked to design an experience around them that was elegant and fun.”
The social network has also made another announcement this month: starting in October, the ubiquitous blue “Like” or “Share” buttons that you see all over the Internet will be used to track your visit to every web page that displays the buttons – even if you don’t actually click on anything.
Facebook will use the data it collects to build a detailed dossier of your browsing habits, meticulously logging every site you visit.
The new “Reaction” buttons will serve a dual purpose. They will allow you to express yourself more freely on Facebook – you can now “LOVE” your friend’s photo or be “ANGRY” about their status on refugees. But simultaneously, they will also enable Facebook and its advertisers to work out how their campaigns, products and profiles really make you feel, and so target you more precisely.
This isn’t the first time Facebook has been interested in helping us express our feelings more broadly. The touchy-feely, seemingly irrational realm of empathy is something Facebook has devoted an entire team to – appropriately (or creepily), it is named the Compassion Research Team. The team, which consists of engineers and designers working in conjunction with sociologists, neuroscientists and psychologists, is trying to create empathy among Facebook users – replicating online the real emotional expressions, like crying or laughing, that used to happen in real-world settings such as the kindergarten playground.
Because of the size of Facebook’s population – larger than that of the largest country on Earth – this is also the world’s biggest laboratory of natural human behaviour. This effort isn’t a side project acting as a peace offering to the Corporate Social Responsibility division – the success of the entire Facebook platform depends largely on solving this problem.
In October, BullyingUK reported that calls relating to cyberbullying had risen by 77pc over a 12-month period. In an online survey, the charity also found that 43.5pc of respondents aged between 11 and 16 had been bullied via social networks.
Most people who quit social networks like Twitter or Facebook – ranging from celebrities like Lena Dunham, The Great British Bake Off presenter Sue Perkins and singer Nicki Minaj to ordinary schoolchildren – do it because they face spiteful, nasty or downright threatening comments from bullies and trolls online. So it’s safe to say Facebook doesn’t want you to feel bullied, marginalised or discriminated against, because it wants you to stay and chat.
The Compassion team at Facebook uses what they know about human emotions to tweak and refine Facebook itself.
The result of these design changes is a massive increase in people’s engagement with the site’s features – which ties in neatly with Facebook’s goal of getting you to communicate more through their platform.
For example, take the ‘reporting’ tool on Facebook that allows you to flag a photo or comment that you want taken down. In the first iteration of these tools, Facebook gave users a short list of emotions – like “embarrassing” – to communicate why they wanted a post removed. According to Facebook, 50pc of users seeking to delete a post would use the tool, but when Facebook, advised by its Compassion team, changed the message to “It’s embarrassing” – a more human articulation – the interaction rose to 78pc. And of those requests, 85pc of the time, the person who posted the photo took it down or sent a reply.
Success! Facebook had encouraged a meaningful conversation, which would not have otherwise happened.
Teenagers were an entirely separate case: they had to be presented with more options than just “it’s embarrassing” when they wanted to remove a post. Now, they are also asked about their emotional states, to get them to open up: what’s happening in the post, how they feel about it (“annoying”, “threatening me”) and how sad they are. In addition, they are given a text box with a pre-written response that can be sent to the friend who hurt their feelings.
In early versions of this feature, only 20pc of teenagers filled out the form. When Facebook added more descriptive words like “feelings” and “sadness”, that grew to 80pc.
Which brings me back to the new “Reactions” feature. While it is ostensibly a design decision, made to satisfy users who have been calling for a “Dislike” button, it is also a way to get people to engage with content more frequently and more naturally. Our expressions can capture a more nuanced picture of our affinity, or indifference, to people, brands and products.
Facebook acknowledges that mining people’s emotions on Facebook is a golden opportunity for businesses. Your emotions can be used as proxies for your brand loyalty.
“We see this as an opportunity for businesses and publishers to better understand how people are responding to their content on Facebook,” the company said in its announcement.
So when you start clicking the “Wow” button on a life-affirming Facebook ad or add a “Sad” emoticon to a news story about an earthquake, take a second to ponder how your instinctively human reaction is neatly converted into a single data point to feed a hungry Facebook algorithm.
This article was written by Madhumita Murgia Technology Editor from The Daily Telegraph and was legally licensed through the NewsCred publisher network.