People always blame Facebook, yet Internet forums have always led to the radicalization of individuals. Facebook's crime is making forums accessible to all.
These are just your fellow people. This is how they are in the situation that they're in. So be it. Let them speak to others like them.
The cost of that is many angry people. The benefit of that is that folks like me can find my people. That benefit outweighs the cost.
> People always blame Facebook, yet Internet forums have always led to the radicalization of individuals. Facebook's crime is making forums accessible to all.
If it were only that, I would have a hard time assigning blame to Facebook. However, it is not only that. Facebook exercises editorial control through its recommendation engine. Users don't see all posts in chronological order. They see posts ranked by Facebook based on invisible and inscrutable algorithms that are optimized for engagement.
It just so happens that making people angry is an effective way to keep them engaged in your platform. Thus it's not fair to call Facebook a neutral party if they're actively foregrounding divisive content in order to increase engagement.
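To make the distinction concrete, here is a toy sketch of the two orderings being contrasted: a neutral, chronological forum versus a feed ranked by engagement. This is purely illustrative; the `Post` fields and scoring are made up and bear no relation to Facebook's actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: int      # seconds since epoch (illustrative)
    engagement: float   # hypothetical clicks/reactions per view

def chronological_feed(posts):
    """A neutral forum: newest first, no editorial weighting."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts):
    """An engagement-optimized feed: whatever keeps users on the
    platform gets foregrounded, regardless of when it was posted."""
    return sorted(posts, key=lambda p: p.engagement, reverse=True)

posts = [
    Post("a", "calm update", timestamp=300, engagement=0.1),
    Post("b", "divisive rant", timestamp=100, engagement=0.9),
    Post("c", "family photo", timestamp=200, engagement=0.3),
]

print([p.text for p in chronological_feed(posts)])  # newest first
print([p.text for p in engagement_feed(posts)])     # divisive rant rises to the top
```

The same three posts produce two very different front pages; that gap between "what arrived" and "what is shown" is the editorial control being discussed.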
I'm sympathetic to this position. I've heard people say the same about YouTube, and I don't have a settled position on it.
On one hand, if someone were to tell me "The Mexicans are ruining America" and I were to say "Damned right! Who else do you know who says these great and grand truths about America?", I would expect that person to introduce me to more people like them, and my radicalization and engagement would increase out of my own desire for more of this thing. That aspect of Facebook's recommendation engine just seems like a simulation of a request for more of what I want, fulfilled very obediently. That is, the tool is actually delivering what I am expressing I desire.
On the other hand, the inputs are inscrutable and not clearly editable. For instance, suppose I look at myself and say, "God damn it, some of these things I'm saying are really bigoted. I don't want to be like this." I cannot actually self-modify, because there is no mechanism on Facebook for editing the inputs. It will keep selecting the content I have these auto-preferences for, not the content I have higher-order preferences for.
Essentially it's a fridge that always has cake even though I want to lose weight.
So, yeah, I'm sympathetic: I cannot alter the weights on my recommendations and say, "Clear your picture of the person I am now. Start reinforcing the person I want to be instead."
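The gap between auto-preferences and higher-order preferences can be sketched as a weighting problem. Assume, hypothetically, that the recommender keeps a dict of inferred topic weights the user cannot touch; the complaint above is that there is no `overrides` parameter. All names here are invented for illustration, not drawn from any real API.

```python
def recommend(posts, weights):
    """Pick the post whose topic has the highest inferred weight."""
    return max(posts, key=lambda p: weights.get(p["topic"], 0.0))

def recommend_with_overrides(posts, inferred, overrides):
    """A hypothetical fix: let the user's stated (higher-order)
    preferences override the weights inferred from past behavior."""
    weights = {**inferred, **overrides}
    return max(posts, key=lambda p: weights.get(p["topic"], 0.0))

posts = [
    {"topic": "outrage", "text": "divisive rant"},
    {"topic": "music", "text": "new album drop"},
]

# Weights inferred from clicks -- the "auto-preferences" the user cannot edit.
inferred = {"outrage": 0.9, "music": 0.4}

recommend(posts, inferred)                                    # serves the outrage post
recommend_with_overrides(posts, inferred, {"outrage": 0.0})   # user steers toward music
```

The second function is exactly the knob being asked for: a way to tell the engine "stop reinforcing the person I am now."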
Certainly the recommendation engine is a flaw. I do like recommendations, though; that's my favourite way of browsing YouTube in the background, and it's pretty good at music discovery. So perhaps it needs to be opt-in only: imposed by choice rather than by default. It still has to be possible to turn it off.
Even then, I'm not sure. This is an ethical question I've been thinking about for ages: is it ethical to allow someone to make a choice that could be detrimental and that they cannot recover from? What are the parameters around when it is ethical? Opting in to recommendations could be a one-way trap.
The difference is that Facebook is unlike a forum. It's not actively moderated, and content is bumped according to engagement/marketing potential rather than chronologically or by genuine user interest alone.
I don't think an open society can be built on top of an advertising platform. Facebook is not a neutral party here: they control who sees what content at what time, with little accountability or transparency.
> These are just your fellow people. This is how they are in the situation that they're in. So be it. Let them speak to others like them.
> The cost of that is many angry people. The benefit of that is that folks like me can find my people. That benefit outweighs the cost.
This is just the price of the open society.