I'm not sure what your point is. Facebook and Google were operating at a loss initially as well. It seems like you are arguing that deplatforming Parler is more ok, because they never got to the point of being entrenched tech hegemonies? Or are you arguing that Parler is somehow more guilty because they weren't making money yet, so their intentions are bad?
My point is that Facebook, Google and plenty of tech companies created this insurrection (damaging countless lives and relationships in the process), earned billions off it and are now claiming the moral high ground? Screw them.
In this case, Parler doesn't sound that bad. At least Parler doesn't have billions of literal blood money and doesn't attempt to get into my life like Facebook does.
If Parler should be held accountable for hosting and encouraging this type of content, then so should all the other ones. And if the other ones are allowed to stay, then so should Parler.
That is the equivalent of saying that someone with HIV created HIV -- after all, it was their body that kept producing more virus particles! QAnon and other BS is a virus that exploits the way platforms determine recommendations and whatnot. The insurrection was not created by Facebook; at most you can only say that Facebook should have been more aggressively policing extremists on their platform (and in fact they have been improving their approach to moderation for years -- unfortunately the problem has been getting worse faster than Facebook has improved their handling of it). Parler and Gab, on the other hand, were created as a protest against the moderation that happens on other platforms (even though that moderation is itself insufficient), and that is what they were held accountable for: explicitly and deliberately not conducting moderation.
This "both sides" argument is getting tiresome. Parler was created to be a safe haven for the very extremists Facebook is being criticized for not aggressively banning. One side made the effort and came up short, the other side attacked the effort itself. There is really not much of a comparison here.
What about the fact that Facebook has an algorithm they made that decides which posts are presented to each user, apparently tailored to drive engagement? They can feature all the controversial posts that stir people up for the clicks. YouTube is similar. One could make a case that these algorithms cause these problems, promoting conspiracies/etc for the clicks.
Unfortunately, the whole concept of "growth and engagement" (and its biggest implementations - Facebook, YouTube, etc) supports so much of our society today that I expect neither mainstream media nor politicians to attack it.
The reason we're attacking Parler and not the underlying evil is because Parler is an easy target while the other big implementation of said evil (Facebook) underpins the careers and livelihoods of many of the people who are in a position to ban it or reform our laws.
Like I said, the conspiracy theories are a virus that exploits the algorithm, which is otherwise harmless and serves a very different purpose. Youtube recommends children's videos to me because sometimes I let my son watch children's videos, which is a pretty reasonable proposition. The problem is that the very same system can become harmful when it starts recommending more and more misinformation after a person watches one conspiracy theory video; Google has been trying to address this by displaying truthful information when certain topics are detected, but obviously there is work left to do.
The real problem here is that we are focusing on the way that these algorithms can send people into rabbit holes of misinformation, without stopping to consider what the same algorithms do in general or the fact that people actually like recommendations (which are in most cases harmless to society). Again, the response to "HIV propagates via the immune system" should not be "we should get rid of the immune system to prevent the spread of HIV."
I grant that it's nice to have relevant content presented. But I'm not in favor of profit-driven companies controlling social discourse with their secret algorithms.
Couldn't there be a way where users have more control over this? Perhaps recommendations to friends, user ratings, stuff like web-rings, etc.
Even the fact that these algorithms are secret gives me the creeps. The political problems we've seen have been accidental; what happens when someone uses these to manipulate everyone on purpose?
One side is BS, that's why. That is the side which is whining about how an app that was created in order to provide a safe haven for people who are too extreme for other platforms is being treated differently from those other platforms.
To clarify, I'm not defending Parler nor wishing for it to stay online. I'm just calling for the root cause of this incident to be eliminated which is the unhealthy business model of pitting people against each other. This would include Parler but also Facebook and all these social media platforms.
At the moment, Parler is used as a scapegoat to deflect the liability off the catalyst (if not the instigator itself) of the Capitol storming.
Whether you still think my position is BS after this is up to you.
> That is the equivalent of saying that someone with HIV created HIV -- after all, it was their body that kept producing more virus particles! QAnon and other BS is a virus that exploits the way platforms determine recommendations and whatnot
And just like with HIV, we now understand its method of propagation, know how to curtail it, and actually hold people criminally liable if they knowingly spread it.
Why are mainstream social media platforms given a pass here, considering that they not only knowingly operate a system where such content thrives and spreads, but also profit off its spread?
> One side made the effort and came up short
One side did not make the effort. They profited off not making the effort despite having ample warning of the upcoming crisis. This doesn't make the other side any better, but neither does it mean that the first side should somehow be treated more leniently than the other one.
I'm not defending Parler, but if we're letting Facebook and others get away with this then so should Parler, so that it serves as a reminder to rethink our approach and eventually ban both of them or force them both to reform (as in actually reform, unlike Facebook which merely claims to moderate but only does so when they've been exposed).
The problem with Parler was not that terrorists were using its platform. The problem is that Parler refused to even try to ban terrorists from its platform, which should surprise nobody given that Parler was created for the benefit of such groups. Facebook has never gotten a pass on this, in fact they have been widely criticized for failing to be aggressive enough in their efforts to moderate extremist content, conspiracy theories, and misinformation.
In a nutshell, the difference is this: Parler was created as a safe haven for people and content that had been banned from mainstream platforms (and from the majority of small, non-mainstream platforms).
> Facebook has never gotten a pass on this, in fact they have been widely criticized for failing to be aggressive enough in their efforts to moderate extremist content, conspiracy theories, and misinformation.
Well, now we have an issue where Facebook's unwillingness to moderate has blown up into large-scale domestic terrorism, so big in fact that it created a market for Parler to cater to.
So why are we still discussing Parler's ban (which I don't disagree with) but completely ignoring the core issue that Facebook initially caused this and should be banned too?