Facebook employs Arturo Bejar to make sure that the site’s 1.3 billion users are nice to one another.
That means no cyberbullying, and it also means Bejar has his work cut out for him: since when are people nice online?
The New York Times reports on Bejar’s job: “As the director of engineering for the 80-person Facebook Protect and Care team, he believes that most users are not trying to be mean and that they will retract a comment (and even feel bad about it) if they realise it has caused someone harm.”
Teenagers are the demographic of particular focus. Bejar says he is in charge of figuring out how people feel when they are cyberbullied, since the facial and vocal cues of face-to-face conversation disappear in online interactions.
Bejar’s team designed tools to let users tell one another when a post has hurt their feelings, and he said the system actually worked. (This is different from the news-feed experiment revealed in June, when Facebook drew criticism for tinkering with people’s emotions as part of a psychological study of how emotions spread on social media.)
Creating empathy on Facebook has not been easy, and researchers have learned that a few letters can have a profound impact. In the first iteration of the tools, for example, Facebook gave users a short list of vague terms — like “embarrassing” — to explain why they wanted a post removed. At the time, 50 per cent of users seeking to delete a post would use the tool, but when Facebook added the word “it’s” to form a complete sentence (“It’s embarrassing”), usage shot up to 78 per cent.