I think this is the study we've been discussing: https://seclab.bu.edu/people/gianluca/papers/deplatforming-w...
This is my understanding of their analysis, based on a fairly shallow read:
> Are accounts being created on an alternative platform after being suspended?
A: Yes, 59% of Twitter users and 76% of Reddit users moved to Gab.
> Do suspended users become more toxic if they move to another platform?
A: Reddit users became more toxic on Gab. 60% of Twitter users became less toxic and 20% became much more toxic, although the most toxic posts contained hatred against Twitter and complaints that their free speech and rights had been denied. (Toxicity was determined using Google's Perspective API; a sketch of that scoring call appears after this summary.)
> Do suspended users become more active if they move to another platform?
A: Yes. A manual inspection found that at least some of the increased activity consists of complaints about being suspended.
> Do suspended users gain more followers on the other platform?
A: No. Although users tend to become more toxic and more active after they move to the alternative platform, their audience decreases.
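As an aside on that parenthetical: Perspective returns a toxicity probability between 0 and 1 per snippet. Here's a minimal sketch of the scoring call in Python, assuming you have an API key. The endpoint and request shape follow Google's public docs, but the helper name and the before/after comparison are my own illustration, not the paper's pipeline:

    import requests

    # Real endpoint per Google's Perspective docs; the key and helper
    # name here are placeholders for illustration.
    URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"

    def score_toxicity(text, api_key):
        """Return Perspective's TOXICITY probability (0.0-1.0) for text."""
        body = {
            "comment": {"text": text},
            "requestedAttributes": {"TOXICITY": {}},
        }
        resp = requests.post(URL, params={"key": api_key}, json=body)
        resp.raise_for_status()
        return resp.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

    # e.g., compare mean scores for a user's posts before and after migration:
    # before = sum(score_toxicity(t, KEY) for t in tweets) / len(tweets)
    # after = sum(score_toxicity(t, KEY) for t in gab_posts) / len(gab_posts)

Averaging scores like this over a user's posts before and after migration is roughly how a "became more/less toxic" claim gets operationalized.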
I think you could read this either way. Deplatforming is ineffective because it "radicalizes" those who have been deplatformed. Or: deplatforming is effective because it reduces the spread of toxicity. Your post above focuses mainly on the former; mine focused mainly on the latter.
The jury's still out, as you said. Personally, I'll continue to lean in favor of moderation, if only for the selfish reason that unmoderated communities are nasty places, and I want to participate in communities that "bring me joy," to indulge in a Kondo-ism. I think we've shown pretty conclusively, though, that your argument "The point is, countering negative ideas with suppression does not work" is premature at best.
I'll let you have the last word. Best wishes.
> You didn't actually answer my request for evidence, though
I agree with the NYT that censorship is rooted in fear [1].
Evidence of the benefits of open discourse is plentiful. In real-world places where open discourse is encouraged, people and ideas thrive. Also, saying "you need to be protected from other people's words" is not a winning argument in the public sphere. People want to be trusted to make their own decisions about how to feel, not to have the importance of speech dictated to them.
It's really only a small minority who seek protection against certain viewpoints, and they too want to be able to express themselves. Unfortunately, censorship is also used against them, often with prejudice and without their knowledge [2]. History has shown how this has happened over and over, for example in "Don't Be a Sucker" (1947) [3]. If you choose to ignore it, that is your prerogative. History is evidence.
> I think this is the study we've been discussing
Thanks for linking it; that's not the one I had in mind. To expand on my previous comment about how we should accept "ugly" forums, I think measuring toxicity is problematic. For one thing, it's a subjective measure; one man's trash is another's treasure. For example, here's an article from someone making a case in favor of Kiwi Farms [4]. But also, censorship can chill what people state publicly. I already shared Axios's write-up of a recent study showing that, these days, what people say publicly does not align with what they say privately [5].
> The jury's still out, as you said. Personally, I'll continue to lean in favor of moderation, if only for the selfish reason that unmoderated communities are nasty places, and I want to participate in communities that "bring me joy," to indulge in a Kondo-ism. I think we've shown pretty conclusively, though, that your argument "The point is, countering negative ideas with suppression does not work" is premature at best.
FWIW, I think moderation is fine if the author is informed of actions taken against their content. That is not happening consistently on any of the platforms, though, and hundreds of millions, perhaps billions, of users are affected. Load 10 tabs of randomly selected active Reddit users on Reveddit [6]; five or more will have had a comment removed within their first page of recent history. Almost none of these users will have been notified, and all of their removed comments are shown to them as if they were not removed. I just tried it and got 7 (a rough sketch for automating this spot check follows the quote below). Reddit last reported 450 million monthly active users. And Facebook moderators have a "Hide comment" button that does the same thing:
> "Hiding the Facebook comment will keep it hidden from everyone except that person and their friends. They won’t know that the comment is hidden, so you can avoid potential fallout." [7]
It's hard for me to believe that this has had no negative impact on discourse, particularly when our recent difficulties communicating across ideologies seem to align quite well with the introduction of social media. Things like this 1998 Firing Line episode [8] simply are not happening today; conversations now are shallow and combative.
> I'll let you have the last word. Best wishes.
I will decline (graciously, I hope) your offer. I think continued discussion is the way forward.