This Is How Facebook Is Secretly Trying To Make You A Nicer Person

By Chris Gayomali

October 24, 2014

It’s a modern fact: People are meaner on the Internet than they are in real life. Academics call this the “online disinhibition effect,” a term coined by psychologist John Suler, who once explained it to me like this:

In the real world we react to other people’s body language and facial expressions as a way to modify what we say to them… If we notice a grimace, for example, we ease up on anything we might be saying that is inappropriate. Online we don’t have these non-verbal cues, so without that feedback some people find themselves digressing into an inappropriate rant.

But can that psychology be used to reverse course, and make the Internet nicer? Facebook is trying with the tool it believes in the most: its unfathomably deep well of behavioral data.

Arturo Bejar, the director of engineering for Facebook’s 80-member Protect and Care team, is the man leading this grand experiment, and he’s the subject of a New York Times profile today by Nick Bilton.

Bejar’s goal is a daunting one. He and his team are trying to replicate real-life emotional feedback inside Facebook’s digital confines. Essentially, Bejar is trying to make the social network a friendlier, more empathetic space by answering the question: Would people act any differently online if they knew they were hurting someone? Writes Bilton:

Mr. Bejar is trying to create empathy among Facebook users, in what used to happen in real settings like the playground, through social cues like crying and laughter.

This may seem like a piffling side project to some. But I believe the success of social media largely depends on solving this problem and teaching users to be kinder and more empathetic. Most people I know who have quit services like Twitter and Instagram have done so because commenters were spiteful, insensitive or just plain nasty.

Trolls and cyberbullies certainly seem louder and more petulant than ever. But Facebook is optimistic that online communication could be better. One way the social network is trying to fight back is by giving some of its users the anonymous ability to notify their friends if something they posted is harmful or hurtful:

The company told me that each week eight million Facebook members use tools that allow users to report a harmful post or photo. (The tools can be used by clicking on the little upside-down arrow in the upper right corner of a post or the options button at the bottom of photos.)

Facebook believes that providing that tiny bit of emotional education can go a long way. The system, which is constantly being refined, appears to be working, too: For Facebook users with access, usage grew 80% after the team fine-tuned the prompt, adding better descriptors in the instructions like “sadness” and “feelings.”

Teens in particular, who sometimes lack the emotional capacity to handle bullies or disagreements, are being coached on how to respond to mean jokes, which, at least according to Facebook’s data engineers, usually aren’t posted with the intent to cause harm. In Bejar’s estimation, online communication is poor because our empathetic toolkit is much smaller online than it is in person. He and his team would like to change that.

Read more about Facebook’s plans here.

[h/t: New York Times]
