And allow actors to spread disinformation anytime they want, even though the megaphone was given to them, and then amplified (via engagement metrics), by these megacorporations in the first place?
Indeed. It's impossible to stop the spread of disinformation completely, no matter what we do. Instead, what we need is better education on how to parse and digest information we receive from others. To stop taking everything we hear as automatic truth and instead consider all opinions, even if they sound wrong at first. We need another Age of Reason instead of what we're executing on now.
I think that "responsible news" and "responsible government" are two things that are achieved by societies by pure luck and cannot simply be willed into existence. Given what's happened in the US with the emergence of the alt right I've been wracking my brain over ways we could safely accomplish free expression of ideas and I don't think it's possible.
Either you allow hate speech to be empowered and gain a megaphone, or someone needs to be the decider on what is false. You don't actually need to kill off misinformed or slanted reporting; you just need to censor reporting that is knowingly false and presents non-facts as if they were facts. If you can kill that sort of information dead, then I think society is able to sort itself out and prevent bubbles from forming around publications that choose to focus on certain stories, or that consistently frame stories to suit their agenda.
I think the stars aligned from about 1995 to 2010, when all the news outlets were still afraid enough of the government that they mostly kept in line, but the government didn't really have the resources to actually police things.
Oh, also: the Age of Reason was pretty darn irrational compared to today. Don't let a fancy name and rose-tinted glasses blind you to the faults of yesterday.
> Instead, what we need is better education on how to parse and digest information we receive from others
The way it works is that you and I reject information based on reflex.
If someone tells you to drink bleach to cure covid-19, would you do it? No, of course not. Neither of us would even consider it for a second. No room for critical thinking there.
What you want people to do is based on a slow thought process and careful consideration, but nobody has that kind of time.
Disinformation agents can just keep spamming us with ever more facts to consider, since they can make up anything they want: a Gish gallop, as it were. Verification is slow, even a surface-level check to make sure the facts match up.
So how do we avoid being fooled? Because we trust certain authorities and reject others out of hand.
> It's impossible to stop the spread of disinformation completely, no matter what we do
So we shouldn't try at all? It's similarly impossible, in the near term, to reduce cancer, Covid, or drunk-driving deaths to zero. Should we stop fighting them?
We have evidence, from de-platforming ISIL, that this method works in reducing radicalization. (A marginalization strategy [1] was found to be more effective.)
I want to embrace arguments for giving as many people as possible access to the megaphone. But we need to start from a position of facts, not feelings.
> Indeed. It's impossible to stop the spread of disinformation completely, no matter what we do. Instead, what we need is better education on how to parse and digest information we receive from others.
True, but you're being too black and white. While it may not be possible to stop disinformation completely, it is possible to reduce it significantly. What you're advocating is akin to admitting that lead will never be completely eliminated from drinking water, so instead of doing anything to reduce it from toxic levels, we should (unrealistically) work on making people immune to lead poisoning.
Your solution is a chimera. It is not possible (or at least wildly impractical given current constraints) to teach most people to be so knowledgeable and wise that they can find the truth in a haystack of seductive lies. And even if that wasn't true, the people who've bought into the lies would label your education program as biased and fight it.
I'm all for beating the "more education" drum, but at some point we're going to need to reckon with the fact that there are some very intelligent people out there who should know better and don't, or who, worse, exploit it to take advantage of others.
Seconded. Establishing what is disinformation is hard; establishing truth is even harder. Restricting speech is fraught with trouble, but there is much precedent in the USA for compelling speech: see drug warnings and nutrition facts.
Censoring and banning should be reserved for actual hazards such as provoking violence. But I think the automatic tagging of social media with "Here is another perspective", while being seen by some as chilling, is the best way to go.
Yes, because the trade-off is worth it, and we should treat people like grownups who can think for themselves. Why not encourage critical thinking and wider reading, instead of being the police of literally all the information on your giant social media platform?
Critical thinking? The vast majority of useful information in the world is based on trust and authority, not on critical verification and people examining the evidence directly.
Do you have any idea just how much work it is to verify that even the facts cited actually match up? And that doesn't include verifying whether the facts themselves are accurate.
I did a surface-level verification of a lecture by a virology professor, and I found a few details that were wrong, and some where I couldn't figure out which fact was likely correct. It took me forever, and that was just one part of a very long lecture series.
Consider the fact that whenever someone offers me a cure for covid-19 in the form of injecting bleach, I dismiss it out of hand. Is that because I thought long and hard about whether injecting bleach into my body is a good idea? No. There was certainly no critical thinking happening. It's a reflexive rejection.
> instead of being the police of literally all the information on your giant social media platform?
These platforms are not in any way unbiased, free-for-all discussion forums. They are already making editorial choices by promoting viral posts to drive up engagement metrics and pushing down less inflammatory posts.
The very least they can do is slow down anger-promoting posts.
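To make that "editorial choice" concrete, here is a rough sketch of what engagement-weighted ranking looks like in principle. This is Python, and the Post fields, the weights, and the angry_reactions signal are all made up for illustration; it is not any real platform's algorithm.

    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        likes: int
        shares: int
        angry_reactions: int  # hypothetical "anger" signal

    def engagement_score(p: Post) -> float:
        # Hypothetical weights: shares and angry reactions count more,
        # since they drive the most re-exposure.
        return p.likes + 3 * p.shares + 5 * p.angry_reactions

    def rank_feed(posts: list[Post]) -> list[Post]:
        # The "editorial choice": the most engaging (often most
        # inflammatory) content floats to the top.
        return sorted(posts, key=engagement_score, reverse=True)

    def rank_feed_dampened(posts: list[Post]) -> list[Post]:
        # The "slow down anger-promoting posts" variant: penalize the
        # anger signal before sorting.
        return sorted(
            posts,
            key=lambda p: engagement_score(p) - 10 * p.angry_reactions,
            reverse=True,
        )

The point is that some ranking function is always there; "not moderating" just means leaving the weights wherever they happen to sit.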
I understand (and agree) that the majority of the knowledge we have is based on authority. I'm not suggesting everyone should do their own climate research in order to believe in climate change. I'm just suggesting they don't have to believe every headline they read; they can read multiple news sources with different political views, etc. Basically, you want people to decentralize their trust in authority. I think that will quell fake news more than getting Facebook to try to moderate it all like a giant, impossible game of whack-a-mole. It doesn't have to be that complicated. We (try to) teach schoolchildren this in history class. I think it's a lot more complicated, and risky, to heavily moderate your tech platform for "misleading" news, particularly related to politics.
Please define disinformation. Who gets the right to decide what's correct? And what if they are wrong? I still remember when experts at the CDC and WHO claimed wearing a face mask was useless. I still remember there was once a time when claiming the earth is not flat was considered disinformation, and you could be burned alive for saying it.
If the megaphone were the main problem, then simply removing the engagement-metric aspect should solve it. Reddit tried that when it stopped listing controversial subreddits on the main page and stopped recommending that new members go there.