On Sunday, Joy Reid devoted part of AM Joy to discussing the impact (if any) of Facebook banning individual white nationalists from its platforms. It seems that now that white nationalists have shown up in the Mueller report as a factor in Russia's election interference, Zuckerberg is feeling the heat. Of course, Facebook's official response is to act like they've ALWAYS banned hate speech, and that this is nothing NEW or divergent from the policies they've ALWAYS held to. To hear their spokesperson talk, anyway...
"We've always banned individuals or organizations that promote or engage in violence and hate, regardless of ideology," a Facebook spokesperson told BuzzFeed News. "The process for evaluating potential violators is extensive and it is what led us to our decision to remove these accounts today."
Suuuuure ya have. Just as extensive as the process Facebook uses to ban Black people who talk about white supremacy, riiiight? Please.
Anyhow, Reid had Shireen Mitchell, founder of Stop Online Violence Against Women, and Angelo Carusone, president of Media Matters for America, on her panel to assess the impact of this latest Facebook move. She started off with the Mueller report's findings that the Russia-based Internet Research Agency infiltrated Facebook and Instagram, reaching some 126 million people, in an attempt to "use psychographic tricks to get people to vote for Trump or not vote at all" in 2016. Reid asked Mitchell whether they should worry more about that than about banning individual white nationalists and neo-Nazis.
MITCHELL: It is happening again. I'm sorry. We experienced it in 2018 as well. We are experiencing it now. None of that has changed. Actually, some of Facebook's mechanisms to try to deal with what happened in 2016 actually harmed more brown and Black people on the platforms and made it nearly impossible for them to get their ads out there. If they spoke or said anything about Black people or Latino people, they were banned. But we just saw Trump ads targeting Muslim people get a pass, and they didn't stop them until someone brought it to Facebook's attention that these ads were there. So we've watched both Facebook and Twitter, by the way, say out loud that they won't ban hate speech or use their algorithms to stop hate speech. I mean, let's be clear. We're talking about hate speech. We're not talking about conservative bias, we're talking about hate speech here. They won't ban hate speech because their algorithms may sweep up politicians. That's a problem only because why? Politicians are peddling in hate speech. We should stop that. We should acknowledge that, and not say that this is just conservative bias, because that's not what this is. Both of these platforms have done that, and understand this is part of the process. Facebook had originally -- let's just go back. Facebook had originally said white nationalism on their platform was fine. They're now saying it's not fine, and that little part of this conversation is also being missed, because they allowed it in the first place, they accepted it, and they participated in that part of the conversation that we're still avoiding. Until the platforms accept that -- including myself: I was on YouTube in 2009 and was removed from YouTube for speaking up about the tech companies and their race discrimination and diversity issues, and yet Alex Jones got to say whatever he wanted. It was only around the time they took Alex Jones off that they decided to kind of give me my account back.
We have a problem here, in the fundamental framing of the tech companies in general.
Reid then asked Carusone what he thought tech companies should do about this, and his answer showed just how much damage is done when tech companies bow to the wishes of "conservatives" (white supremacists) who complain about censorship. Furthermore, he illustrated that while it seemed super easy for "conservatives" to get what they wanted just by complaining, winning even a modicum of fairness for the people harmed by the actions of these white supremacists took a great deal of mobilization by activists and civil rights organizations.
We should go back to 2016 and think about what some of the lessons are, and whether they have learned them. In May of 2016, when Mark Zuckerberg met with all those conservatives complaining about censorship, and then went in and made a massive overhaul of the Trending Topics section based on no data whatsoever, there was a threefold increase in the reach of fake news -- the REAL fake news that's on Facebook -- immediately after that. It finally exceeded the reach of actual news outlets. They got worked. And so really, what we're talking about here today with the bannings is not necessarily the bannings of these individuals; I'd like to take a step back and look at the rules. What are the rules? And to Shireen's point, they did. We needed to actually GET the rules about what even counted as white nationalism changed. That took advocacy. That took civil rights organizations and activists speaking up to get Facebook to actually change the rules and say, "Oh, yeah, we're going to include white nationalism and white supremacy and white separatism." Then separately, and I think this is where it comes full circle, is looking at the enforcement side of it. The one thing about this action that makes me feel a little bit better is that instead of using the basic keyword analysis they used in the past, they took a little bit more of a holistic view, which is to say: what are their activities and actions, and are they actually contributing to harm to individuals? Taking that broader view actually allows for more robust conversation, and it eliminates some of the free speech censorship concerns while it enhances enforcement and gets rid of people who are trying to game the system with activities that are both abusive and designed to suppress the engagement and participation of all kinds of communities, in particular women and people of color.
Of course, in order to adequately monitor their sites, these companies will have to hire and rely upon actual human beings, not just algorithms. But that would cost them money, and we all know that's the bottom line for the richest men in the world.