> The company itself is a prime source of division in the United States of America
Devil's advocate, but why don't you see Facebook as a blank canvas where people can post whatever they want, and it's the people posting the divisive content who are responsible?
Facebook is just the medium, no? Could be Reddit, could be Twitter. Why are you holding Facebook responsible in your mind?
Because Facebook’s presentation isn’t content neutral. It optimizes for content with high engagement, and users quickly learn to tune their posts towards that so that their posts are seen.
It becomes hard to get the photo of your pleasant day in front of your friends and family, but easy to bait Uncle Roger into a debate about politics or derisively rag on some easy target.
Engaging posts are a specific kind of content and not all platforms optimize for them. Facebook isn’t the only one that does, but it’s looking like it was a pretty socially toxic experiment and Facebook is one of several companies now stuck on the wrong side of the story.
> It optimizes for content with high engagement, and users quickly learn to tune their posts towards that so that their posts are seen.
Why is that Facebook's fault if they are just the middle man for their users (both on the content creation side and the content consuming/engagement side)?
Facebook isn't a neutral arbiter, they make active choices in what their users see. And they frequently choose things that maximize engagement in order to maximize profit. Things that are inflammatory and divisive drive engagement, so that tends to bubble to the top.
This isn't just a problem with Facebook and Twitter either, it's an issue in modern journalism as well. Basically anything that is funded by advertisement will fall into this trap.
Why is it Facebook’s fault that they chose to operate their platform a certain way?
Probably because we generally associate responsibility with choice.
Are users also responsible for their own choice to create and respond to engagement-optimized posts? Sure, but that’s an orthogonal concern to Facebook’s own choice and the responsibility they bear for it.
"I don't like the content that ends up on + promoted through algorithms on Facebook, therefore, they should be responsible"
It's not their content, and they aren't the ones engaging with it.
They are pushing it because users post it and then engage with it.
Say we got rid of Facebook tomorrow. This exact same problem would happen at another company/platform unless there was heavy intervention/moderation (which, to my understanding, Facebook absolutely already does at some level). What's that equivalent to? I'm not a "free speech advocate" or anything like that, but what you're asking for is censorship instead of asking users to bear the responsibility for the content they consume.
I wouldn’t consider only the content/feed selection algorithm - I would take a holistic look at how people interact with Facebook (initially a Like button only, later expanded to a few more choices; real identities; not many comments shown at a time), then compare it to other sites and their outcomes, like Reddit (up- and downvotes, anonymity if you wish, lots of comments shown at a time).
Regarding users bearing responsibility for what they consume - I have seen others report running into this and I have as well - no amount of feed grooming keeps the controversial content off my feed for a meaningful amount of time. I actively unfollow / snooze content I don’t want to see, but Facebook has countered this by adding “suggested for you” items to my feed. It’s impossible to only see my friends’ updates. I must be exposed to content selected by Facebook as a cost of using Facebook.
Most people use the word censorship to refer to banning some specific kind of content: banning what. Meanwhile, the discussion around Facebook is usually around how they measure and reward content, not what the content is.
If you're actually being sincere in this discussion, I think you may not see how engagement is just one particular kind of how and it's not a given that social media sites optimize for it. It was an innovation that some sites introduced because it benefited their business model and it caught on for a while. But it wasn't universally adopted, needn't be used at all, and is becoming less fashionable among both sites and users. Facebook's a bit behind the game on restructuring away from it though, possibly because of over-investment and momentum, or possibly because of being a closely-held company that can be philosophically stubborn.
FYI, "gets promoted" is the key phrase there, at least to the many people disagreeing with and downvoting you here.
Promoting that content is the "how" that Facebook has chosen and continues to choose, and something that isn't a given for operating a social media company. That's what they're being held responsible for.
Their content delivery/ranking algorithms prioritize user engagement above all else (from my understanding as just a general member of the public who has never seen what are probably millions of lines of code across many different internal private systems at Meta).
If the algorithm is spitting out content that we're all critical of (because it's divisive, bad for mental health, etc.), why are we not acknowledging that it's just a side effect of
1. user input + 2. user engagement with that input = 3. ranked content delivery
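In code terms, that loop might look something like this minimal sketch (pure conjecture on my part; the weights, field names, and structure are invented, not anything from Meta's actual systems):

```python
from dataclasses import dataclass

@dataclass
class Post:
    likes: int            # 1. user input receives...
    comments: int         # 2. ...user engagement signals
    shares: int
    dwell_seconds: float

def engagement_score(post: Post) -> float:
    # Invented weights: comments and shares outweigh likes because
    # they keep people on the site longer.
    return (1.0 * post.likes
            + 3.0 * post.comments
            + 5.0 * post.shares
            + 0.1 * post.dwell_seconds)

def rank_feed(candidates: list[Post]) -> list[Post]:
    # 3. ranked content delivery: highest predicted engagement first.
    return sorted(candidates, key=engagement_score, reverse=True)
```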
If Facebook went away tomorrow, another platform would fill its place. Humans have shown that they like to doom-scroll on their phone and are ok being fed ads in the meantime because social media scrolling is basically the equivalent of an electronic cigarette for the brain in terms of dopamine delivery.
Because humans have chosen to prioritize the shittier, meaner, grosser, nastier content, Facebook's "models" have been trained on that. Why are we criticizing Facebook's models that we basically played our own individual role in building?
> Why are we criticizing Facebook's models that we basically played our own individual role in building?
Practicality and precedent. We can effectively regulate the algorithms a countable number of companies use in their products, but can't effectively make people not susceptible to doom-scrolling (nicotine, heroin, gambling, whatever).
> If Facebook went away tomorrow, another platform would fill its place.
Not if the troubling practices were illegal or too expensive to operate. It would be a pretty dumb investment to bother. Instead, new platforms would pursue different techniques for content presentation and profitability, just like existing competitors already do.
> We can effectively regulate the algorithms a countable number of companies use in their products, but can't effectively make people not susceptible to doom-scrolling (nicotine, heroin, gambling, whatever).
What does regulation of Facebook's algorithm look like to you? Promote which specific types of content less, or not at all? Obviously "politically divisive/disinformation", but who gets to be in charge of classifying what is and isn't disinformation or divisive?
You keep missing that the problem is engagement optimization, not politically divisive content or disinformation per se.
A site that optimizes for content that people heavily interact with (engagement) is different from a site that prioritizes the relationship type between users (friends' original content > public figure OC > shares > etc.) or positive response (upvotes), etc.
None of these many other systems rely on discerning the specific nature of content or making sure it’s of some certain arbitrary moral character. But they each lead the network as a whole to favor different kinds of content and so the experience for users ends up different. And it’s already known that social media sites can be successful using these other techniques.
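To make that concrete, here's a toy sketch of the same feed sort driven by three different objectives (all scoring rules and field names are invented for illustration; no real site's code is being quoted):

```python
# Three interchangeable objectives for the same feed sort.
# Field names and tier values are made up for this example.

def score_by_engagement(post: dict) -> float:
    # Rewards anything people interact with, outrage included.
    return post["likes"] + post["comments"] + post["shares"]

def score_by_relationship(post: dict) -> float:
    # Rewards closeness: friend OC > public-figure OC > share.
    tiers = {"friend_original": 3.0, "public_figure_original": 2.0, "share": 1.0}
    return tiers.get(post["relation"], 0.0)

def score_by_positive_response(post: dict) -> float:
    # Rewards net approval rather than raw interaction volume.
    return post["upvotes"] - post["downvotes"]

def rank(feed: list[dict], objective) -> list[dict]:
    # Same posts, same users; only the objective changes the outcome.
    return sorted(feed, key=objective, reverse=True)
```

Same rank() call, same posts; only the plugged-in objective changes what the network as a whole ends up rewarding.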
> A site that optimizes for content that people heavily interact with (engagement) is different than a site that prioritizes the relationship type between users
I think that's hit the nail on the head.
In a neutral environment, a user would be shown a feed of friends' posts with ads mixed in for their demographic.
In an optimised for engagement environment, a user is shown posts that are known to increase engagement and quite often those posts are inflammatory.
This engagement environment is much more likely to expose people to extreme views.
Because that middle man is pushing outrage. They purposely changed how it works, from your friends' pics of their kids to "child molester loose in your community, just what your opposite political party wanted." That change is well documented, because it drove engagement.
That middle man is a blank canvas. Users submit friend pictures and child molester links. Other users vote the child molester links to the top. Why aren't you placing any of the accountability for consumption/promotion on the users?
But it’s not just voting. They bias what gets put in front of people, and what doesn’t.
It’s a bit like saying a drug dealer has no accountability because it’s the user who buys the drugs. In this case FB is well aware of the consequences of their actions, and even worse, their total inability to do anything when it’s not in a dominant language. This has had profound consequences in places like Myanmar and India where disinformation spreads like wildfire and leads to the killing of people. You’d think they’d step back and go “whoa, this isn’t what we intended for FB to be” and do something about it. The opposite has happened.
> They bias what gets put in front of people, and what doesn’t.
What came first though, the chicken or the egg?
Did the content come first in an unbiased fashion, with the algorithms then trained on which content generated the best engagement, so that content got promoted,
or
did Facebook just start pushing bad content behind the scenes on a whim, with no backing data? I don't think that happened.
They aren’t a simple middle man though - they are actively choosing the rules for what content gets amplified on the platform.
They’ve done a search over the space of things that drive engagement with the site and have written a feed/content sort algorithm that optimizes for those things. They could just as easily choose to optimize for something else - like number of heart reactions only - but they choose not to because that would presumably result in lower engagement and thus less eyeball time and thus less profit.
This is like saying a journalist is "just a middle man for information". But we all know how propaganda works: it is a careful curation and presentation of certain kinds of information at the behest of others, which ends up creating a narrative that may or may not reflect reality. It is the obligation of the journalist to represent the facts in a way that reflects a common notion of "truth" which includes avoiding deception and misdirection.
Not calling Meta a news company, just making a comparison re: curation and how it can transform the inputs into something else.
How on earth are they just the “middle man” where they are the ones prioritizing certain content through the feed and their algorithms? Do you feel they should bear no responsibility for anything?
Why are they prioritizing the content? They run a platform they created with algorithms they created that react to user engagement metrics. It's the users engaging. Are we saying the user needs to be protected from themselves? It sounds like it.
They didn't post/create the content, and they prioritize what users engage with the most.
They just incentivized it. Your argument is akin to "Yes your honor, I paid the assassin, but I didn't murder that man. The assassin did!"
> Why do they deserve blame?
Because the algorithms aren't neutral.
If Facebook let you follow who you wanted and showed only that content in chronological order, you'd have a leg to stand on. But they are tampering with the flow of information in ways that increase engagement artificially using psychological manipulation.
It's fine if you're content to let them hide behind their algorithms, but you're wrong both in a moral sense and likely soon in a legal sense.
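For what it's worth, the "neutral" baseline described above is trivial to build, which makes the choice not to offer it look deliberate. A minimal sketch (field names invented):

```python
def chronological_feed(posts: list[dict], following: set[str]) -> list[dict]:
    # Only accounts the user chose to follow, newest first,
    # no engagement weighting anywhere.
    return sorted((p for p in posts if p["author"] in following),
                  key=lambda p: p["timestamp"], reverse=True)
```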
Because Facebook does internal studies to figure out that this is harmful for society, and it continues to do these things regardless. It's a form of tragedy of the commons.
Mind you this is not a question of "just doing fiduciary duty". It could very well be argued that not fomenting fear and extremism in its userbase is good for business and the brand. Yet it chooses to be evil.
And it's obvious why. Facebook is dying. There are almost no "regular" posts and people on there anymore. They're desperate to maintain some sort of engagement even if it means making the platform a gathering spot for the KKK.
And you really believe that at face value? What drives the algorithm is profits. If it profits from pushing you into a bubble at the cost of shaping your world view, it will.
Are you making some kind of Hobbesian moral argument about what people are willing to promote via their own, idealized-as-individualistic habits?
Notice how your responses continue the conversation in a specific direction, and not in another direction. This is something you have choice over. Now imagine you automate this process with an algorithm that adaptively promotes certain replies to your comments and demotes certain other replies in a systematic fashion. Do you not then think that that algorithm has some bearing on how the conversation unfolds?
Sure, but if optimizing algorithms for engagement has negative societal effects, I still think these platforms can be partly held responsible for it. It's their decision what to optimize for right?
At the end of the day they're a business. In America, I can't think of many things that have priority over capitalism (profit over all).
Does Facebook have a right to exist and try their best to earn profit? Yes.
In doing so, they become a middle man for trying to push ads. Does this have measurable negative societal effects? Arguably. I would say "depends on who you ask", but it's pretty hard to call it subjective rather than objective when there is so much data showing it's true.
But what's the alternative? Asking them to stop entirely? Unrealistic. Asking them to make less money not pushing certain content because it's divisive? That's the same content that got optimized to the top based on engagement statistics and is netting them "the most money" from advertisement views, right?
I get what you are saying about the negative effects, I just don't know how realistic it is to hold Facebook accountable when their primary motive is profit. If society tears itself apart in the process because Facebook gave society the platform to do that, at what point does the government intervene and ask them "hey, help us police moral/ethics to control the effects"?
This kind of glosses over the fact that the platform we're criticizing (the one that uses engagement algorithms to display ads on divisive content, or however we've summarized it) makes them billions a month/quarter/year/whatever.
"allowed to exist by society" is putting it lightly. It's in demand. High demand.
> At the end of the day they're a business. In America, I can't think of many things that have priority over capitalism (profit over all).
Huh? They could be better, but we have anti-corruption laws because fairness has priority over some degree of profit, we have worker protection laws because worker rights take priority over some degree of profit, we have occupational safety laws and health benefit laws because health takes priority over some degree of profit, we have a zillion laws that plainly demonstrate that social priorities often take some degree of priority over profit.
Social media is a new industry and we're working out the rules. Engagement optimization turns out to be detrimental to society's well-being, so yeah... we're probably going to figure out a way to assert that through law and regulation, and some participants might see their business model impacted along the way.
It's hard not to believe that you're just trolling at this point.
> In America, I can't think of many things that have priority over capitalism (profit over all).
You're right. The profit mechanism should be abolished/replaced. But in the meantime Facebook is still morally culpable for their shenanigans, even if market forces agree with them.
> The profit mechanism should be abolished/replaced.
I just can't help but feel this view is wildly unpopular/radical almost anywhere other than an online social media forum for people who like to spend their free time in the comment section (like a Reddit or a HackerNews)
I won't say that the profit mechanism is responsible for all of our problems, but in modern society it can surely account for at least 80% of them.
I believe it's possible to retain autonomous production (i.e., not to have a command economy) while using mechanisms other than profit to organize production.
That said, if you're saying "Facebook does evil things because the system incentivizes them to" then maybe the conclusion should be that we change the system, no? And whether the idea is popular or not seems irrelevant, given your admission that it's the root cause. Popularity also might be subject to change given the overall circumstances.
I haven't been on Facebook for some years now, but it is not "just" a medium. It is optimized to keep you on the site as long as possible, to show you as many ads as possible. This comes with the hard price of showing you what is divisive, since that keeps you engaged. So they are indeed responsible for prioritizing divisive posts and allowing ads so well targeted that they can make you tilt.
No, as usual: don't hate the player, hate the game. Facebook benefits from a huge network effect that locks people in if they want to keep in touch with their close ones. There is no alternative, as you can't build a competitor when Facebook doesn't allow any interaction from the outside world.
So people come to facebook to keep in touch, and stay because the content is addictive.
Note that most people are not posting, so they are not even contributing to the content.
Note also that social networks exploit our brains' biases to keep us addicted.
Personally, I think their optimization of engagement will by default bring out the most extreme viewpoints, and the worst in people. They'd rather show you 10 extreme things in a row to make you outraged than show you 2 extreme and 8 normal.
Facebook wasn’t always like this — it used to just be a place where you could see what your friends are up to.
My texting app is neutral. WhatsApp is basically neutral (and, hey, Facebook owns it).
Facebook (and Youtube, and Twitter, and...) chooses which content to promote, for their own interests, which often correlate with generating outrage based on made-up bullshit. They're far from a "blank canvas".
> which often correlate with generating outrage based on made-up bullshit
That's not what my YouTube feed is like whatsoever. I've "curated" it that way though. A lot of "not interested" or "don't recommend this channel" clicks.
Facebook applies some magic to decide which posts show up in your feed and which comments show up at the top of those posts.
When a metric like “engagement” is optimized for, this seems to result in controversial/divisive content making its way to the top. So yes, Facebook is a medium, but it is a medium that encourages certain content by rewarding it with more eyeballs and interaction than other content.
As a contrast, Reddit's default sort is "best", which tends to put higher-scored comments towards the top. Users have the choice of sorting by controversial, but then it's transparent to the user that that's what they're getting. Reddit's downvote lets users effectively shun content in a way that Facebook doesn't; this has its trade-offs, as it seems to encourage groupthink and echo chambers.
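For the curious: Reddit has publicly described "best" as ranking comments by the lower bound of a Wilson score confidence interval on the upvote ratio. A sketch of that formula (my own implementation, not Reddit's code):

```python
from math import sqrt

def wilson_lower_bound(ups: int, downs: int, z: float = 1.96) -> float:
    # Lower bound of the Wilson score interval on the true upvote
    # ratio at ~95% confidence. Low-vote comments get pulled toward
    # zero, so 1 up / 0 down doesn't outrank 90 up / 10 down.
    n = ups + downs
    if n == 0:
        return 0.0
    p = ups / n
    return (p + z * z / (2 * n)
            - z * sqrt((p * (1 - p) + z * z / (4 * n)) / n)) / (1 + z * z / n)
```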
I was under the impression the content sorting algorithms pay attention to metrics like engagement (likes/reactions/comments) and how long people spend reading, and then, since their motive is to keep you scrolling for as long as possible (so that you view as many ads as possible), they lean on their user data to promote hot stories that generate good engagement.
Therefore, if the users consume lots of "divisive" content and it generates good engagement, isn't it the users' fault for being such "fans" of the divisive content that Facebook's algorithms promote it?
If a dealer repeatedly puts a needle in front of an addict, and encourages them to take it, and they end up giving in, who's responsible?
The addict is of course not without blame, but at the very least I hope we can find a consensus that the dealer is behaving extremely immorally. This is essentially how I view Facebook. It's abusing the human condition for profit, with destructive effects.
You’re also forgetting the ability for ads to explicitly target people. So it’s more like finding the person who just lost their family and job and putting a needle in front of them. They weren’t an addict to begin with.
I think a better analogy would be with the tobacco industry. Both are controlled by huge multinationals, heavily connected with the advertising industry, and have a lot of influence and power in developing countries.
Facebook isn’t coming over to your house and forcing you to navigate to their website. This would be more like the addict going to meet the dealer and then being offered what they want.
Would you say it is a heroin addict's fault for getting addicted to opioids after a doctor pushed opioids on them, when the doctor was far more aware of the risks than the patient?
We all have the ability (and some might argue responsibility) to educate ourselves on what our actions cause, and that’s why a lot of people, myself included, minimize or avoid Facebook use. However, that doesn’t remove culpability from Facebook for pushing addicting, polarizing, or divisive content onto users.
They are just pursuing profit, yes. But we as a society should really be pushing for their profit to be diminished by the extent it negatively impacts society. For instance, imagine Facebook were in part liable for terrorist attacks because it pushed radicalizing content, or partially liable for suicides, because it pushed content that destabilized someone’s mental health. We shouldn’t be tolerating them pushing content because the user is “such a fan” of the content any more than we tolerate doctors pushing addictive drugs, gambling or cigarette companies pushing their products, etc.
No. Hijacking emotional/biochemical triggers in our brains is not the same as giving us what we "want". It is the exact same problem as with clickbait/ragebait titles on content (which... often get shared to Facebook).
People ruin their lives doing things that feel "good" in the moment all the time. Facebook is a machine for that.
Not all engagement is equal to Facebook. At one point they considered the angry reaction to be worth twice as much as the happy reaction when deciding if a story should be ranked up.
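If that reporting is accurate, the mechanism could be as simple as a weight table. A hypothetical sketch (the weights and reaction names here just echo the claim above; the real values are unknown):

```python
# Hypothetical reaction weights reflecting the claimed 2x value
# of "angry" over happier reactions.
REACTION_WEIGHTS = {"like": 1.0, "love": 1.0, "haha": 1.0,
                    "wow": 1.0, "sad": 1.0, "angry": 2.0}

def reaction_score(reactions: dict[str, int]) -> float:
    # A post that angers people outscores one that pleases the same
    # number of people, so it ranks higher in the feed.
    return sum(REACTION_WEIGHTS.get(kind, 1.0) * count
               for kind, count in reactions.items())
```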
Facebook optimizes for engagement, and maximizing engagement means making people scared, sad, or mad. Facebook emotionally manipulates its users to stay on Facebook as long as possible so they can show the user as many ads as possible. It was not like this 15 years ago.