
"We want people to have choice and agency"

I think this fundamentally misdiagnoses the root of the issue, post-2016-election. People have plenty of control over what content they see (via follows, blocks, hides, downvotes, etc.), and existing social media companies are exceptionally good at figuring out how to show them more of the content they want. The problem is when a platform shows users content that they want but that others don't want them to see, and in our current political environment there is no consensus on where the line between being a responsible steward and censorship lies.

Put it another way: if Truth Social had taken off, users would have had "more choice" of social networks. Would the Mozilla leadership have considered that a good outcome?



> People have plenty of control over what content they see

I think this is naive, and the overall comment is just as ideological as you're accusing Mozilla of being.

Advertiser-funded social networks will always have incentives that are misaligned with users. They want to optimise for impressions so they can drive more ad views. It's been demonstrated time and again that social networks optimise for engagement bait - putting content that drives hate-clicks in timelines - to juice engagement numbers and get more ad views.

Decentralised and ad-free are distinct qualities of a social network, but they do work hand in hand to produce a pro-user network. It is harder for a decentralised model to make an anti-user change, because everyone else can work against them - we saw this to an extent with third-party Twitter clients not adopting the algorithmic timeline.


> Advertiser-funded social networks will always have incentives that are misaligned with users.

That's fair, and to clarify, I am not saying that users have complete control. My point is just that, looking back on the last 7(!) years of discourse about the ills of social media (misinformation, radicalization, polarization, the mental health toll), I don't really see how control or choice would help?

> comment is equally ideological as you're accusing mozilla as being

I'm not sure I completely followed this part, but to be clear, I'm not advocating for or against an ideology re: Truth Social. I'm just saying that it feels like a counterexample to their position that choice is a solution.


> > > People have plenty of control over what content they see (via follows, blocks, hides, downvotes, etc.),

then

> I don't really see how control or choice would help?

It's possible that someone has some X but could still use more of X.

While users have a lot of "control over what content they see", there's also far more hidden; IOW, users could have additional control and choice.

As an example, "social media" services often promote two-way blocking over one-way blocking (e.g. "muting"), or may not even offer one-way blocking. This means that if I tire of John Smith and "block" him, he's likely to be unable to read anything I've written, past or future. That's poor control for users. (Especially if I continue to smear John Smith after I've blocked him.)

Services often censor discovery, even absent a block: a post may be only accessible via direct link and cannot be found via search or user re-posting.
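The mute-versus-block distinction above can be sketched as a simple visibility check. This is a hypothetical model for illustration, not any real platform's API; all names and rules are made up:

```python
# Hypothetical visibility model contrasting one-way "mute" with
# two-way "block". Illustrative only; real platforms implement
# this differently.

def can_see(viewer: str, author: str,
            mutes: set[tuple[str, str]],
            blocks: set[tuple[str, str]]) -> bool:
    """Return True if `viewer` may read content written by `author`."""
    # A mute only hides the muted person from the muter.
    if (viewer, author) in mutes:
        return False
    # A block cuts visibility in BOTH directions: if either party
    # blocked the other, neither sees the other's posts.
    if (viewer, author) in blocks or (author, viewer) in blocks:
        return False
    return True

# Alice mutes John: Alice stops seeing John, but John still sees Alice.
mutes = {("alice", "john")}
assert can_see("alice", "john", mutes, set()) is False
assert can_see("john", "alice", mutes, set()) is True

# Alice blocks Bob instead: visibility is cut off both ways, so Bob
# can no longer read what Alice writes about him after the block.
blocks = {("alice", "bob")}
assert can_see("alice", "bob", set(), blocks) is False
assert can_see("bob", "alice", set(), blocks) is False
```

The asymmetry is the point of the comment above: a mute leaves the muted party's access intact, while a two-way block removes it, which is what makes post-block smearing possible.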


Social networks boost misinformation, radicalization, and polarization because they actively promote content that drives engagement, which gets them more ad impressions.


That's true, but misinformation, radicalization, and polarization may also occur for ideological reasons. Mozilla would not allow everybody with every point of view to use their fediverse instance; they'll block people they find reprehensible. Those people will then go to another instance, and the instances will mutually block each other. Now you've reinvented radicalization and polarization without a profit motive.


There's a large difference between people choosing more or less private clubs and FB tuning their feeds to promote conflict.


Two very different kinds of systems can produce very similar results. "People choosing more or less private clubs" is an apt description of many fringe religions / cults, which eject people who question the narrative and tell their adherents to cut off friends and family who aren't in the group. The Jehovah's Witnesses come to mind in particular: they're very different from Facebook, but they nonetheless polarize and radicalize people.

Private clubs are not necessarily bad, but ideally they should moderate for basic decorum. If they moderate for ideological alignment, they frequently become radicalization machines; echo chambers where people get their biases reinforced without being challenged.


What happens on Facebook is much less legible compared to the effects of organized religion.

But perhaps only because it's so new? (Parallels with the printing press => Protestantism and other propaganda come to mind again...)


While I agree that ideological motives will exist in any medium, I think you're proposing an equivalence between the two that doesn't hold up. Only people with a 100% ideological bent will take the actions you suggest; the rest are likely to have many different interests that happen to span purely ideological boundaries, and they will want to remain engaged with those interests, such that they avoid finding themselves roped into a place where they only associate with the extreme ideological outliers.

I think you see this in the various ideologically-driven twitterlikes, which have succeeded in only really capturing the extremes and have not made a dent in the mainstream.


I will note that this has already happened with Mastodon, when Gab decided to "join":

https://www.theverge.com/2019/7/12/20691957/mastodon-decentr...

Notice the different reaction of various instance admins and client developers.

From what I have heard, Gab has since broken compatibility with other Mastodon instances?

So the crisis was dealt with reasonably well for all involved?

(Truth Social is also based on Mastodon.)


Yeah, instance blocking is an inevitability for these systems. It creates the conditions for those nearer to or already at the extremes to self-select out of the mainstream by joining already-radicalized networks.

I think this instance-level blocking helps contain overall radicalization, though, versus the more pernicious radicalization via mainstream social networks. Basically, I don't see them as equivalent in effect, which was the point I was making.
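The instance-level blocking discussed above (Mastodon calls it defederation) amounts to a domain-wide filter rather than a per-user one. A rough sketch of the idea, with a hypothetical data model (real servers filter at the ActivityPub delivery layer, not like this):

```python
# Illustrative sketch of instance-level blocking ("defederation").
# The dict-based data model is invented for this example.

def federated_timeline(posts, local_domain, domain_blocks):
    """Keep only posts whose origin instance is not blocked by ours."""
    blocked = domain_blocks.get(local_domain, set())
    visible = []
    for post in posts:
        # A fediverse handle looks like user@instance.domain.
        author_domain = post["author"].split("@")[-1]
        if author_domain in blocked:
            continue  # the whole instance is suppressed, not one user
        visible.append(post)
    return visible

posts = [
    {"author": "alice@mastodon.example", "text": "hello"},
    {"author": "bob@gab.example", "text": "hi"},
]
# mastodon.example admins have blocked gab.example wholesale
domain_blocks = {"mastodon.example": {"gab.example"}}
timeline = federated_timeline(posts, "mastodon.example", domain_blocks)
assert [p["author"] for p in timeline] == ["alice@mastodon.example"]
```

Because the block is a per-instance admin decision rather than a platform-wide feed-ranking choice, the "contained extremes" outcome described above falls out of the mechanism itself.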


Non-ad-funded social networks will still have the issue that "influencers" want followers, and clickbait hate topics will get them more.


> People have plenty of control over what content they see (via follows, blocks, hides, downvotes, etc.)

Ehh… I'd argue those tools provide an illusory, superficial level of control. Follows and the like don't really work these days from what I hear, and blocks, etc. are specific to individual pieces of content. The algorithmic feeds are still working away, and they tilt the entire range of content that users are exposed to.

https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-po...

E.g., do most (2018) YouTube users even know that the platform is constantly and actively (if inadvertently) trying to radicalize them? Or do they just assume they see the same things as most other people, not think about it too much, let it fade into the background, and accept it as normal? It's hard to have choice and agency when you're not even aware of the ways that a much larger and more powerful entity is trying to manipulate you.

We evolved to generally think that what we see is a representative sample of what actually is. When instead what we see is a constantly shifting funhouse designed to maximize clicks and ads by profiling us individually, the information asymmetry between the platforms pushing the "content" and the users consuming it makes it hard for me to believe that users even meaningfully consented to the situation, much less "have plenty control over" it.


Which makes the way that threads are shown, and transparency about how those programs decide to show them, even more important: they work very differently on an RSS feed, Hacker News, Xitter, Mastodon, YouTube, and Facebook!


> existing social media companies are exceptionally good at figuring out how to show them more of the content they want

I don't want to see what I want. I want to see what I choose. These are not always the same thing.

I want control over my own life. It doesn't matter how benevolent the dictator or how well they know me. It's not their business to choose what I see.


> People have plenty control over what content they see (via follows, blocks, hides, downvotes etc), and existing social media companies are exceptionally good at figuring out how to show them more of the content they want.

Edward Teach (i.e. The Last Psychiatrist) has this concept that it's not what we want, but how we want, that really matters.

We have the illusion that we have agency because there's so much variation in what we want, but when you look at how we want, it's all served up in the same way. He uses porn as an example, where you can have any fetish you want, but how you fulfill that want is by scrolling through "content" on one of a handful of near-identical websites.

It applies equally to news and social media. The "what" is diverse; the "how" is hegemonic.


> social media companies are exceptionally good at figuring out how to show them more of the content they want

We have no way of knowing whether that’s true. Social media companies show users what keeps them most profitably engaged with the site, and rely on the causal fallacy that engagement represents preference.

Decentralized, open networks do promise choice in a way that centralized corporate networks cannot. That was as true for the future of Truth Social as it was for all the existing corporate networks.

Whether today’s most promising protocols for decentralized, open networks will prove adequate, and whether network effects will cohere on them, is another matter. Mozilla seems more hopeful than I am.


Mozilla also put out a blog post titled "We need more than deplatforming".

Seems they only care about choices when they are choices they agree with.

https://blog.mozilla.org/en/mozilla/we-need-more-than-deplat...


The 'more' in that post was:

- Reveal who is paying for advertisements, how much they are paying and who is being targeted.

- Commit to meaningful transparency of platform algorithms so we know how and what content is being amplified, to whom, and the associated impact.

- Turn on by default the tools to amplify factual voices over disinformation.

- Work with independent researchers to facilitate in-depth studies of the platforms’ impact on people and our societies, and what we can do to improve things.


> Turn on by default the tools to amplify factual voices over disinformation.

Ah, yes, reinforce the status quo by diminishing or silencing the dissidents. That's never gone poorly.


It's content moderation; of course there's a wrong way to do it. There is also such a thing as objective fact and falsehood.


The problem is that signs and that which they signify have become so far removed from one another that Pepe the Frog, the Pope, the OK hand signal, and the purple Teletubby can be in a state of superposition with regards to meaning. This is a direct result of our ever more mediated existence.

The true meaning of a thing is dependent on the actions taken in response, cribbing Wittgenstein to bits.

Take a look at the actions people take while interacting with social media. Clicking, scrolling, sitting and staring at a screen while synchronized pixels entertain or enrage. That’s the entirety of the true meaning. Fictions are of course applied on top as the existential dread of being reduced to a servomechanism is too hard to bear.

The only solution is a conscious effort to engage in object-level reality and to deny the power of these increasingly amplified messages to form what you would consider truths about the world around you.

Is this a broad-scale trend, or is it just the only 5 videos of X outrage on the planet?


> People have plenty control over what content they see (via follows, blocks, hides, downvotes etc)

Only if you factor in content-blocking extensions. Otherwise, how do I get rid of inorganic "recommended" (promoted) content? How do I get rid of the "Trending", "Shorts", and "Top News" crap without an extension? And controls like YouTube's "Don't recommend this channel" and downvoting don't actually prevent YouTube from recommending those things again; the official controls given to users for managing their recommendations are unreliable at best, if not outright placebos.

YouTube could implement these features properly but doesn't care to. Extensions like BlockTube are the answer, but it shouldn't have to be this way.


> Put it another way: if Truth Social had taken off, users would have had "more choice" of social networks. Would the Mozilla leadership have considered that a good outcome?

Mitchell Baker - Mozilla's activist CEO - already answered your somewhat rhetorical question when she insisted "we need more than deplatforming" [1].

I'm all for decentralisation, and I include decentralisation from an organisation like Mozilla in that drive. Why does Firefox Sync insist on centralising authentication and authorisation? Why was the possibility of running your own Firefox Sync server without any external dependencies on Mozilla-hosted services removed? Given that Mozilla has become more and more politicised, it is not unlikely that Baker will insist on doing "more than deplatforming" to those accounts which do not fit her ideology.

[1] https://blog.mozilla.org/en/mozilla/we-need-more-than-deplat...


> Given that Mozilla has become more and more politicised it is not unlikely that Baker will insist on doing "more than deplatforming" to those accounts which do not fit her ideology.

Mozilla has been quite clear that they are censorious. From their Mozilla.Social "content policies":

"To ensure an inclusive and safe environment, hate speech and derogatory language is not permitted on Mozilla.Social."

How is "hate speech" defined? Broadly. Moreover, Mozilla recommends people self-censor:

"Simply put, if you’re unsure if a word is derogatory, don’t use it."

They have similar language in their "Harassment" section:

"If you are not sure whether a behavior would constitute harassment, it’s best to avoid that behavior."

And if that was unclear:

"If you are looking for a social network that permits all speech, without limitation, Mozilla.Social may not be the right place for you."

https://www.mozilla.org/en-US/about/governance/policies/soci...


Yeah, I wish they'd fix the sync server or release the new one they're still working on. They really don't want us to host our own data :(


FYI, both Truth Social and Gab are based on Mastodon:

https://news.ycombinator.com/item?id=28959468

https://www.theverge.com/2019/7/12/20691957/mastodon-decentr...

(No idea how much they are still technically Fediverse-compatible?)


That about sums it up, honestly; this is the hard problem of the information age, and there are no easy answers. Which is why it's so frustrating to see people arguing for totally unrestricted speech. I completely understand the desire; if it worked, that would be ideal. But taking this stance requires plugging your ears to the harms, and to the fact that the lizard brain is not prepared for this. You are not immune to propaganda, to advertising, to emotional manipulation, to disinformation -- no one is. And the certainty that you are is exactly what makes you vulnerable to it. You become your own conman and convince yourself. Being clever and persuasive is a detriment, because you deploy all the logic and reason at your disposal against yourself, and your guard is down because the call is coming from inside the house.

This shit is only gonna get worse: the internet isn't getting smaller, and the attacks are getting larger and more sophisticated. I genuinely don't know what the right course is, because even education isn't it. A defense-against-the-dark-arts class won't do it, because knowing how it works only helps a little in identifying when it's happening to you. People who join cults know they're in cults, and it doesn't stop them; you still need the deprogramming after. This whole situation is awful and one giant mess.



