Twitter is stagnating.
The social network isn’t growing, advertiser demand is falling, and the stock is crashing.
The ugly and endemic abuse on the platform has also been highlighted in recent weeks, with a torrent of racist and sexualised messages targeting “Ghostbusters” actress Leslie Jones. The episode culminated in high-profile conservative writer Milo Yiannopoulos — a ringleader of the dogpile targeting Jones — being booted off Twitter for good.
(Disclosure time: In 2013, I wrote for The Kernel, a tech news site owned by Yiannopoulos.)
CEO Jack Dorsey addressed user safety on Twitter’s Q2 2016 earnings call with analysts on Tuesday, admitting that the company hasn’t done enough to tackle abuse in the past. “We haven’t been good enough at [dealing with abuse], and we must do better,” Dorsey said.
Sounds good, right? Twitter is taking a good, hard look in the mirror and trying to deal with one of its ugliest problems — what’s not to like?
But the problem is that this rhetoric isn’t new.
First, here are more of Jack Dorsey’s remarks on Tuesday (via Seeking Alpha, emphasis ours):
“We haven’t been good enough at [dealing with abuse], and we must do better. That means building new technology solutions, making sure our policies and enforcement are consistent, and educating people about both. We’ve made improvements in the first half of the year, and we’re going to make more. We named safety as one of our top five priorities for this year, and recent events have only confirmed that this is truly one of the most important things for us to improve, and has motivated us to improve even faster.”
Now here’s a part of an internal memo sent by then-CEO Dick Costolo back in February 2015 (via The Verge, emphasis ours):
“We suck at dealing with abuse and trolls on the platform and we’ve sucked at it for years. It’s no secret and the rest of the world talks about it every day. We lose core user after core user by not addressing simple trolling issues that they face every day.”
Well over a year ago, Twitter was admitting that it “sucks” at tackling abuse and trolls, and “we’ve sucked at it for years.” Costolo promised staff: “now we’re going to fix it, and I’m going to take full responsibility for making sure that the people working night and day on this have the resources they need to address the issue, that there are clear lines of responsibility and accountability, and that we don’t equivocate in our decisions and choices.”
And yet earlier this month, black actress Leslie Jones was hit by a staggering wave of harassment that included people sending her pictures of herself doctored with semen, and images that compared her to an ape.
Dorsey alluded to Jones’ abuse on the call, telling analysts that “recent events have only confirmed that this is truly one of the most important things for us to improve, and has motivated us to improve even faster.”
Reforming Twitter’s platform to deal with abuse while staying true to its roots is no easy task. Twitter users do not have to use their real names (unlike Facebook), allowing for anonymous expression while also opening the door to anonymous harassment. Verifying all users (as some have suggested) would be a fundamental shift in what Twitter represents — and would likely reduce the number of active users on the app — but there’s not an obvious alternative path to take either.
Dorsey is undoubtedly sincere in his determination to eliminate harassment. But after years of these comments, without a clear strategy outlined, Twitter users could be forgiven for wondering why this time will be any different.
Here’s a transcript of the remarks, courtesy of Seeking Alpha:
Krista Bessinger – Senior Director-Investor Relations
Great. And we’ll take our next question from Twitter. A number of people have asked about recent developments regarding safety, and whether our approach to freedom of expression on the platform has changed. Can you please comment?
Jack Dorsey – Chief Executive Officer
This is Jack. This is really, really important to me and to everyone at the company. So, I want to address both freedom of expression and safety together here, since the two intertwine.
We are not and never will be a platform that shows people only part of what’s happening or part of what’s being said. We are the place for news and social commentary. And at its best, the nature of our platform empowers people to reach across divides, and to build connections, to share ideas and to challenge accepted norms.
As part of that, we hope — and we also recognise it’s a high hope — to elevate civil discourse. And I emphasise civil discourse there. Abuse is not part of civil discourse. It shuts down conversation. It prevents us from understanding each other. Freedom of expression means little if we allow voices to be silenced because of fear of harassment if they speak up. No one deserves to be the target of abuse online, and it has no place on Twitter.
We haven’t been good enough at ensuring that’s the case, and we must do better. That means building new technology solutions, making sure our policies and enforcement are consistent, and educating people about both. We’ve made improvements in the first half of the year, and we’re going to make more. We named safety as one of our top five priorities for this year, and recent events have only confirmed that this is truly one of the most important things for us to improve, and has motivated us to improve even faster.