CHAKRABARTI: Now, Nina, of course, this has been going on for some time and very frequently, the online abuse coalesces from just individual trolls into concerted abuse campaigns targeted at specific people. And one of the perhaps most famous, or infamous ones I should say, was back in 2014. That was Gamergate, that harassment campaign that targeted women in the video gaming industry. So, folks might remember that one of those women was Brianna Wu, co-founder of the game company Giant Spacekat.
BRIANNA WU [ARCHIVAL]: A friend of mine told me, 4chan found you. This is the text I get on my phone. And they give me the link, and I click it, and I see 4chan starting to go through my entire life history, right? Looking into my college, going through my husband, getting my address, posting phone number, doing this like internet research thing on me and starting to coordinate a serious harassment campaign against me.
CHAKRABARTI: Now, this is from a conversation we just had with Brianna Wu because we wanted to understand how long the impact of this kind of abuse and harassment stays with a woman. And Brianna told us that in 2014, she had two choices: to say nothing, log off for a few days and wait and hope for the problem to go away; or to stand her ground and continue to speak out in defense of women in the gaming world, which she did. And that's when the online abuse got really bad.
WU [ARCHIVAL]: I'll never forget the death threats that went viral that day. [It] said, "Guess what? Beep. I know where you and Frank live." They posted my address. They said, "You're going to die tonight. Your children are going to die, too. You did nothing worthwhile in this life. I'm going to cut your husband's [expletive] and [expletive] rape you with it until you bleed. I've got a KA-BAR knife, and it's coming for your throat." The most vile stuff, and it went hyper viral.
CHAKRABARTI: That was back in 2014, and she remembers it in detail, and Brianna Wu also told us that even though she ultimately survived that online mob all the way to today, she's not the same person she was.
WU [ARCHIVAL]: Gamergate did change me, if I'm being honest with you. The truth is it damaged me in a way that cost me some of my humanity. Today, I can have someone tell me they're going to murder me online, and I read it, and I feel nothing, and I go cook dinner. You lose something about yourself when that's the response that you have. It's like, eventually, this kind of discourse damages you and then the damage keeps you safe. I would love to tell you that it's going to get better, and we're going to grow out of this. But it's gotten worse every single year I've been alive.
CHAKRABARTI: That's Brianna Wu. Today, she's executive director for Rebellion PAC, a political action committee. Now, Nina, the reason why Brianna's story, I think, in fact, resonates even more powerfully right now is to the point she just made. Gamergate was eight years ago. Eight years is basically eight eons in technology, right? So, I understand there are issues of speech and the protections we have vis-à-vis the constitution for speech. But within those bounds? What more can or should the platforms themselves do, do you think, to curb these attacks, if anything?
JANKOWICZ: Well, all of the platforms, Meghna, have terms of service or community standards that we're all supposed to be held to. And in those community standards and terms of service, it says that targeted harassment or harassment based on, you know, sex, based on religion, et cetera, et cetera, is not allowed. And unfortunately, right now, what we see is no consequence for abusers when they're putting this sort of stuff out there. At the very worst, they might be asked to delete a piece of content to get access to their account back. Sometimes, very rarely, their accounts are just shut down unilaterally. And you know, I've seen other people create a second account immediately, or sometimes they even have a second account waiting in the wings to continue to abuse from.
So, it's difficult in that regard. That doesn't mean that the platforms are doing enough right now. They need to be enforcing those terms of service more stringently. They need to change the reporting process, which it should be said, there has been some incremental change since I started working on this issue and started drawing awareness to it and others have been talking about it. There have been changes at Twitter in particular to make the reporting process more human-centered, they call it.
One thing I would love to see is for platforms to introduce incident reporting rather than just one-off reporting. Right now, what you do is you report a piece of content, or you'll report the account. But what you were talking about before this — this kind of dog piling effect or coordinated campaigns against a person — they don't see that 40,000-foot view. They could if they looked, right? They have the resources to do that. But the content moderators are often making decisions very, very quickly, and they're looking very myopically at one piece of content.
So, most of the campaigns that I've spoken with women about have been those coordinated campaigns. Those are the really impactful ones. So, I'd love to see some sort of incident reporting introduced so that women can take a little less time rather than just reporting every individual piece of content and retraumatizing themselves as they do it.
CHAKRABARTI: OK, so let's talk a little bit more about those coalescing campaigns because oftentimes they happen, you know, exclusively online. But there are times where the campaigns coalesce because in offline media, on television or on radio, somebody else, you know, points out or singles out a woman, and then the mob, you know, goes chasing after that person.
And here's an example a year ago, journalist Taylor Lorenz, who was then with the New York Times and is now at The Washington Post, had been under assault already on social media. So, she sent out a tweet calling on women to show each other moral support when that happens. Enter Fox News's Tucker Carlson.
TUCKER CARLSON [ARCHIVAL TAPE]: Taylor Lorenz writes for The New York Times. She's at the very top of journalism's repulsive little food chain. Lorenz is far younger than prominent New York Times reporters used to be. She's also much less talented. You'd think Taylor Lorenz would be grateful for the remarkable good luck that she's had, but no, she's not. Just this morning, she tweeted this: "It's not an exaggeration to say that the harassment and smear campaign I've had to endure over the past year has destroyed my life." Hmm, destroyed her life, really? By most people's standards, Taylor Lorenz would seem to have a pretty good life. One of the best lives in the country, in fact.
CHAKRABARTI: And so, as you can imagine, thereafter, Taylor Lorenz was hit with many more online assaults and attacks. Now, Tucker Carlson is going to — Tucker is going to Tucker, right? But Nina, is there — I don't know. Maybe I'm putting too much faith in technology here — is there something more that the platforms can do when they see this, like, surge in coalescing action? You talked about it a little bit before, but there are triggering events here. Can they be interrupted?
JANKOWICZ: Yeah, I mean, so my six-person team at the Wilson Center was able to visualize abuse against prominent women. It looks like a bee swarm around you, or like Pig-Pen's kind of dust cloud in Charlie Brown, right? It's totally visible if you're on the back end of the platform and you're tracking this sort of thing. Platforms are able to notice a strange uptick in content being directed at an account — from accounts, rather, that don't follow that account or haven't interacted with that account before.
We recommended in our report that if they see that uptick, one of the things that they can do is proactively, you know, put a notification in that person's mentions and say, "Do you want to turn off your notifications for a little bit? We're noticing this surge," and I believe Twitter has actually made that change. We've seen a little bit less of that from other platforms — from Facebook, from TikTok, from YouTube.
There are some precautions that users can take to protect themselves proactively, like muting certain keywords. But when these mobs are coming after you in their coordinated campaigns, there's only so much that you can do proactively to mute that stuff and protect yourself. So, I would love to see the platforms really taking a much more proactive role to protect 50 percent of their userbase, right?
It should be a thing that they want to do to make their platforms safer for half of the world online and potentially to get more users, right? If you look at Reddit, 30 percent of that user base is women. And that's because it's an incredibly toxic place. If you look at Twitter, men's tweets are retweeted twice as often as women's are. And that's because, again, we have this dynamic that these platforms are built for and by cisgender white men. And so, I would love to see women's concerns, especially women of color's concerns being thought of first and foremost in the engineering process, as the platforms are taking safety into consideration. And that's too often an afterthought when we're talking about new platforms or new technologies being developed.
CHAKRABARTI: But when one of your most important metrics is engagement, Nina. I mean, that is the — it's the 10-bajillion-pound gorilla in the room for every platform discussion we will ever have — that the whole model is built to maximize the worst and most, most emotive human behavior, and that's how they make their money. And until that changes, everything else just feels like kind of small-beans attempts in the face of a huge problem.
JANKOWICZ: Yeah, I mean, I would love to say that I didn't think this sort of vitriol fueled engagement online, but I've seen it myself, you know. I've seen it in the way I interact. Even when I'm the target of one of these abusive campaigns, I'm spending a lot more time on the platform. So, you know, again, I think there is an investment to be made. There is a positive gain to be had when the platforms are basically, you know, making their platforms safer. If there were a safe platform for women, you better bet that all of us would be spending a lot of our time on there, where we felt valued, where we felt that we could express ourselves freely. And nobody has created that yet. So that's a challenge to any venture capitalists and entrepreneurs out there.
CHAKRABARTI: Yeah. You know, I mean, there's no such thing as a 100 percent safe environment, right? It just — it doesn't exist, but I don't think it's unreasonable at all to say, well, in online spaces, we have the same expectation for a minimum level of safety, security and respect that, you know, we do in the real world.
And even just saying that we should have that expectation, you become the target of this kind of abuse. Now, I'm running out of time with you, Nina, as I always do — I feel like we only get started. But there are two things I do want to cover before we wrap up here. First is, we're getting some questions like: Is there a typical profile for the kind of person who spews the abuse?
JANKOWICZ: Well, again, that's hard to quantify, mostly because many of the platforms, except for Facebook — and there are ways around that as well — do not require that you use your real name. So people can, you know, have a cartoon profile picture and call themselves Snoopy. I don't know why I'm into Charlie Brown today, but they call themselves Snoopy. And you have no idea who they are.
I will say that most of the abuse I have received does seem to be from male-presenting individuals. But that's not to say I haven't received abuse from women also. And in terms of political inclination — I know everybody wants to know this — I've seen abuse across the spectrum, both in our Wilson Center study, toward Republican women and Democratic women from both sides of the spectrum, and toward myself. During that abusive campaign ahead of the 2020 election, I received violent threats from both folks on the far right and far left.
So, there isn't a typical abuser. I would say that we have to reckon with the fact that there is endemic misogyny in our society and the fact that there is no consequence for abusers means that lots of people feel empowered to levy this abuse online.
CHAKRABARTI: News flash: bad behavior knows no single party. So, you know, you mentioned earlier, Nina, especially about young women, and I really — my mind is focused on them because the future belongs to them, and we need to pave the best way forward for all young people possible. So, in the last two minutes that we have here, you know, first of all, everyone, I would encourage you to read the book. But if you could give us like a CliffsNotes version of what you think people listening to this can do right now to survive and fight back, as you put in your subtitle, what would it be?
JANKOWICZ: Well, we've talked about the physical security steps that everyone should take already. I think another thing that everybody can start to do is build awareness in different ways. So again, expressing solidarity online [is] hugely important. Don't just be a passive bystander. When you see abuse happening online, report it. Learn those reporting procedures. In fact, when I interviewed Brianna Wu for the book, she said, you need to know the terms of service and get the platforms dead to rights, because when they see a bunch of reports coming in about a single tweet or a single account, they're likely to take action against the account, and that will inform their artificial intelligence: OK, something weird is going on here. So don't think that it just goes into the ether, although it can be incredibly discouraging. Please do report.
And then, you know, there are also systemic changes that we can make in our workplaces, for instance. A lot of public-facing workplaces have social media policies about what their employees can post online, but they don't have policies about what will happen if, as a result of their engagement, employees receive abuse. So, I would encourage you to talk to your H.R. professionals, or when you're, you know, negotiating a new job offer — especially if you're involved in, you know, public-facing work — to say: What's going to happen if I get abuse? How are you going to support me if I need to go to law enforcement or something like that? And that can really make systemic change.
And then finally, especially for the young women out there, build community, be each other's support networks. I would not be where I am without folks like Cindy Otis and Brianna Wu and others that I interview in the book and have met on Twitter, who have helped me through so many of these incidents and let me know that I'm not alone. That is incredibly important as well.
So, make sure you do all of that, recognize that, you know, it's hard, but it's possible to get through it, and don't let the abusers win, because our voices matter, and the fact that you're getting that abuse means you have something important to say.
CHAKRABARTI: Well, Nina Jankowicz, her book is How to Be a Woman Online: Surviving Abuse and Harassment, and How to Fight Back. Nina, it's always a great pleasure to have you on the show. Thank you so very much.
JANKOWICZ: Likewise, Meghna, thank you for having me.
CHAKRABARTI: I'm Meghna Chakrabarti. This is On Point.