It’s depressing that criticisms of Facebook now seem to focus entirely on their failure to shut down speech.
At some point, the root issue with Facebook (as identified best by Jaron Lanier) has been lost in the political noise: their business model is one that incentivizes and enables the creation of a global scale surveillance and behavior modification empire. Getting angry that they are not removing content you feel is “dangerous” is the opposite of fixing this: it’s smuggling in the idea they ought to be presumed an arbiter of speech - that arbitration being the key lever of their toxic business model.
If we agree Facebook should be in a position to decide who gets to talk to whom in the modern public square, we are forced to agree they get to monetize that capability, which is their current business model.
Facebook is not the public square. It’s not even a public square. It’s not a bulletin board or a newspaper or a telephone. There’s no analog analogy. It’s a site run by a private company that you send content to, they analyze it, then they decide what to publish and in what order. The closest analog thing I can think of is Letters To The Editor in a newspaper. They aren't “deciding who gets to talk” or “shutting down speech.” There are plenty of non-Facebook ways to talk that don’t involve sending your message to them and hoping they decide to post it.
> It’s a site run by a private company that you send content to, they analyze it, then they decide what to publish and in what order.
In other words, they are a syndicator.
In the physical world, syndicators remain accountable for what they publish through their channels. They are not responsible for the material (given they didn't create or commission it), but they sure as hell carry the responsibility for letting the material propagate.
Incidentally, several years back I was talking to FB after they reached out. I visited their office for an informal chat and got to talk to a couple of their senior(ish) engineering managers. I asked if they were doing, or planning to do, anything like actually educating the users on their platform once they've identified someone as having been subjected to a propaganda or misinformation campaign. I said that I could imagine working on that type of project.
It's not often that you see a person physically recoil from an idea. "We don't do that!"
As far as I'm concerned, until FB actively fights not only the propaganda being funnelled through their machine, but also its effects on groups and individuals to undo the damage, the company is beyond redemption.
Mobile-app social media like FB is more like a micro-reality-TV channel in the form of text/image/short-video posts which are co-"produced" (like a TV show producer) by AI algorithms (in ranking), by yourself (in who you friend/follow), and by others in your "network" (in what everyone comments on and likes more vs less).
Just like tv channels optimize their content for TRP ratings and for ad sales, FB does the same – except its engagement optimization cycle is super-micro and super-fast and super-scalable and its ads are super-cheap and super-granular and super-micro.
It's the same business model followed by newspapers/magazines and other forms of content-based attention-grabbing-and-holding mechanisms which make money through ads. The older business models are more coarse-grain in everything (cost, price, target size, targeting precision, time-cycles, etc.) and the newer tech-enabled business models are fine-grain in everything.
Obviously, the same old social/behavioral/moral rules that worked (or didn't work, but didn't matter) at coarse-grain/slow-cycle/less-massive/more-local won't work (or needs to work better because it matters at scale!) at fine-grain/fast-cycle/huge/global levels. And the answer isn't obvious.
It is obvious the same notions/rules/mechanisms won't scale (like content editorship/moderation, etc.).
I agree that there is no good analogy for what Facebook is, because it is different. But I think what people really mean when they say this is that Facebook has largely supplanted those things. This might not be true where you are, or for your specific circle, but it is the case in many places.
Facebook penetration in my country is so complete, I would wager there is absolutely no way to, say, successfully run for public office without at least maintaining an active Facebook page. This is why I say that Facebook must be destroyed. It is not acceptable that some company half the world away, based on American morals and American interests, gets to decide who can and cannot realistically get elected here.
Network effects say it would be at least an oligopoly, just as it is now. Unless the whole system were rebuilt to explicitly disallow this and counter such effects - but who would build such a system, and why would they invest in it, knowing in advance they wouldn't profit from it or even be able to control it? It would surely face huge pressure from all the "we can't allow bad people to speak" crowd, and that crowd owns the government, the academia, the banks and the internet infrastructure now. So what exactly is the plan for this to happen?
This is technically true but de facto not. It's like arguing that Ukraine controls Crimea because legally (according to themselves and others) they do.
In my experience, when this argument is made, I typically follow it up with "well, if it's so unimportant, why don't you delete your account?". 99/100 times I've asked this in real life, the person responds by saying, "well, I don't have anywhere else to say things" (this was during covid).
For the past two years, social media is the only public square allowed by government fiat. Thus it must be regulated as such.
I think you're putting the cart before the horse with your conclusion. The reason that Facebook has been one of the only freely available discussion zones for the majority of the world isn't because of government, but in spite of it. The government didn't "allow" Facebook as though there is some permission system involved for setting up social media sites. The government is unable to disallow it because of the constitutional limits set by the 1st Amendment.
Somehow I don't think the 1st Amendment protects a corporation's right to track and monitor its users for profit, even when they're not using the corporation's site/app. Or to fine-tune its algorithms to support behaviour modification.
Without the promises made on behalf of FB's tracking tech and the behavioural feedback loops they farm, the social features are basically worthless.
FB's problem isn't a 1st Amendment issue. It's the fact that it lies about the effectiveness of its ad tech to its advertisers, while also attempting to hide the toxicity of its behaviour mod techniques.
And there are too many parts of the world where it has monopoly status on both.
> The government didn't "allow" Facebook as though there is some permission system involved for setting up social media sites
And that's completely irrelevant to my argument which is that, since Facebook is the only public square by government fiat, it ought to be treated as such regardless of how it got there.
When government exercises eminent domain for the public interest, it does not worry about why the house it is seizing was placed there. It just notes that the house is there and then takes proper action to secure its future aims for public benefit.
Facebook is not the 'only' public square. Please, that's hyperbole, it's not even close. Twitter, TikTok, Reddit, Snap and this very site are all public squares, with tremendous reach. Most of the video content on FB/Instagram comes from TikTok, and in case you forgot, the previous admin tried to shut them down because users on it organized against them.
I wasn’t making an analogy. Facebook and similarly scaled-up communication networks ought to be treated in ways similar to public squares for the purposes of understanding the nature of their effect on speech freedoms, now that we have seen them grow to cover most of the world.
No, FB users talk to each other not to FB (the editor).
Quantity has a quality of its own - when size is huge it should be reclassified as public interest, not private. There should be mechanisms for people to ensure their voices matter similar to elections and parliament.
They’re the modern public square, as are all social media platforms. You could have made the same argument about the telephone or ISPs; both of those are neutral for a reason.
Mentioning hate speech is a pretty good marker though, I'd give 80% probability the author thinks FB should censor more whoever the author considers haters.
Everyone wants to break up Facebook back into Facebook and Instagram and Whatsapp. I don't think that would really solve any of these issues.
I think we're thinking about this the wrong way - we should break up Facebook the same way we broke up Bell into baby Bells. Make "Facebook" a utility. Split Facebook into five different companies, maybe some not even in the US. Each "Facebook" company would have to provide Facebook service to a subset of current users. Allow new people to run Facebook providers.
Unhappy with the customer service from your current Facebook provider, or with the quantity of ads? Don't like the algorithm that's running your timeline? Switch to one of the other providers who may do things differently. Create competition in this market where there simply isn't any.
Some people will point out this forces all these providers to interoperate somehow, so users on one Facebook provider can talk to users on another. This is a really great part of this plan! Our social networks become based on open standards and interoperability instead of being walled gardens.
Didn’t the baby bells that came out of AT&T all merge back into the new at&t? Notice the capitalizations there representative of their corporate logotype at the respective times in the narrative arc. How could we avoid that happening in 10-20 years?
[edit] Further, I'm not sure how you’d slice it up horizontally instead of vertically - that is to say, Facebook is Facebook for “millennials,” Instagram is Facebook for Gen Z. It feels like each of these vertical slices retains the same incentive structure they have today but targeting different demographics.
What could slicing it horizontally look like? An independent infra AWS? A social graph? Curious your thoughts.
I think the problem is the incentive structure - a company goaled on engagement will focus on the most engaging content. That’s hate, division, fear and anger. Without a fundamental reimagination of incentives I think the same beast will emerge. The new T-1000 of corporations - as Colbert referred to AT&T at the time.
AT&T and the baby bells are not analogous to FB. The analogy is too generous to FB and overstates its importance.
FB is not a medium. It may be a de facto platform for social organisations and advertising, but it's still just an app.
Whilst FB made more money than MySpace and lasted for a longer time, its demise is inevitable for the same reason, and we're seeing the slide occur now.
FB, like AOL and MySpace before it, was fashionable when its features provided new reach for participants. Sooner or later, these apps reach maximum cachet and after that, they are for "old people".
FB is for old people. And that's not even the worst of its problems.
What's really needed is a legal requirement for open APIs and federation. The only reason why Facebook is so hard to drop once you've been on it for a while, is because they have your social graph locked in - either everybody switches together, or people get "left behind". If they are forced to interoperate with others, that's no longer an issue - and I suspect that this alone would be enough for healthy competition to shrink them down, without any forcible splitting.
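To make that concrete, here is a minimal sketch of what a mandated graph-export API could return and how a competing service might consume it. It is purely illustrative: the endpoint shape, account URLs, and field names are hypothetical, not anything Facebook actually exposes; the only real convention assumed is the ActivityPub style of one dereferenceable URL per account.

```python
import json

# Hypothetical export payload a mandated "open graph API" might return.
# The URIs follow the ActivityPub convention of one URL per account, so
# any federated service could re-create the same follow relationships.
graph_export = {
    "account": "https://facebook.example/users/alice",
    "follows": [
        "https://facebook.example/users/bob",
        "https://some-other-network.example/users/carol",
    ],
}

def refollow_requests(export):
    """List the follow requests a competing service would send on import."""
    return ["%s -> follow -> %s" % (export["account"], uri)
            for uri in export["follows"]]

if __name__ == "__main__":
    print(json.dumps(graph_export, indent=2))
    for req in refollow_requests(graph_export):
        print(req)
```

The point is just that once the graph can leave in a standard shape, "everybody switches together" stops being a requirement.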
I'd break out FAANG's advertising operations. Prohibit conflicts of interest, fraud.
I sorta expected the online advertising bubble to pop, mooting the need for remedy. But somehow it keeps not popping.
Set some threshold. Reach a certain size and your ops get divided up. Spitballing: $10m annual revenue, 100k monthly visitors, whatever. So indies like daringfireball can continue to do their own thing.
Let's break up Facebook into Instagram, TikTok, Twitter, Reddit and Hacker News.
Let's be serious: the youth market is strongly moving towards TikTok, and its feed algorithm has already created a notion of 'the different sides of TikTok' for conservative vs liberal content.
In what ways is this different than today’s world, where depending on what you want to do (connect with people with similar interests, talk to friends, look at cat videos) there’s a bunch of options for what service to use?
The services available today don't federate with one another. I can't reply to an Instagram post from my Reddit account, or send a message from WhatsApp to iMessage.
While I'm not sure parent's suggestion is feasible or would solve anything, it is more like the AT&T breakup in that the system is still cohesive, even if operated by separate entities.
Those services don’t, but as a Mastodon user, it doesn’t seem such an unlikely suggestion. I have been impressed with the fediverse since I joined about a year back. I love that the feed isn’t manipulated, and that censorship is often the choice of an individual not wanting to see more from a user they find objectionable, or a community/server-level decision, say when your server doesn’t want pornographic posts from some other server on the fediverse. That frankly reflects sincere human interaction much more than Facebook moderation and their feed algos, and if you did want to see what your community doesn’t, you can both stay and also create an account on another server if you wish.
Federated social media for me has been much less toxic in terms of discussion quality and is much less addictive, I love checking it but I don’t doom-scroll to oblivion…
But the server doesn’t get paid to make me doom-scroll to oblivion, so incentives are much more aligned.
If you maintained the same social graph on each service, you'd still have all the misinformation being shared amongst the same graph members.
It's one thing if the content being served is what some nefarious actor wants a group to consume, for influence's sake. It's another when it's content the group produces and consumes themselves. The current misinformation discourse on FB is the current culture, especially the culture of a large group of people. It originates both on and off FB. 'Q' didn't originate on FB, it originated on 8chan, let that sink in.
Well, there's another option - they are allowed to arbitrate, but aren't allowed to control the rules of the arbitration, at least not directly - instead, an overriding entity would do that. The entity would be, most likely, the government, but as window dressing, an "independent body" whose members are selected by the politicians can be established to give it a veneer of being removed from everyday politics (protip: it never is). While not a lot of people would admit to preferring this model now, I suspect a lot of them are pushing in a direction that can't lead anywhere but there. Of course, once such a body is established, its natural evolution would be first to control all large public speech platforms, and then to attempt to control all speech in general.
It’s not hard to ascertain someone’s business model when they are a public company, nevermind one that you literally can start spending money with in about 5 minutes.
Many alternatives to the Facebook Public Square exist: Discord, Diaspora, Mastodon, er... Ello, erm, Voat.
I’m joking of course. None of those make as much money as Facebook, and I think we can all agree it’s the fact that they are so rich that bothers us most. Far more than any public square argument. Where’s my yacht?!
What's going on with the "western" world? Look at Australia, the UK and now the USA. Citizens are voluntarily giving up hard-won rights like free speech in favor of what? A false sense of security? Saddens me deeply.
If we choose to go with a real world analogy, the "public square", then why are people sharing messages and photos with friends and family via a "public square". That is not what we do in the real world.
IMO, Facebook (or YT, or any big media company) even being in a position to “shut down free speech” is a predictable outcome of the total failure of net neutrality in the U.S.
If everyone had sufficient bandwidth to host their own content from their home Internet, and ISPs were prohibited from also being media companies, free speech would be much stronger.
Maybe you missed the genocide that happened thanks to a rampant disinformation campaign on Facebook? Or the other more benign cases, like antivaxers (the classical ones, not vaccinating their kids against polio and measles) and co.
Can anyone seriously argue that disinformation on Facebook isn't dangerous? The only possible question should be what we can do about it to preserve debate, free speech, etc. while stopping the literally deadly parts.
We have already done this over the last several centuries by defining the line between legal and illegal speech. The point is the line is chosen to trade off these risks. There is no silver bullet, but acting like there is, and that this debate is just beginning now, is fallacious.
I think one of the key things is that what happened in 2016 isn't the same thing that's happening now. Then, actors were able to pay and market content directly to audiences they felt were exploitable. The current deluge of misinformation, especially regarding vaccines, is shared directly with friends or within groups that people explicitly join. If you browse the 'Herman Cain Awards' on reddit, you'll see things like,
- "I'll probably get put in facebook jail for this..."
- A misinformation label attached to the bottom of the post.
- The post completely blanked out, with a statement that this is misinformation.
This seems to indicate some moderation being done. What I don't see from the "facebook should stop this" group is any attempt to get cable companies to do the same with "news" stations that broadcast misinformation, or any attempt to have the FCC take AM radio stations' licenses for broadcasting propaganda.
The underlying fact is, what we label misinformation is just what a large group of people wrongly believe, and they really like sharing it with each other.
Social media data-science wizardry for targeting likely swing voters was applauded by the likes of the NYT when the Obama campaign did it. The cold water was only poured once the left realized the same methodology could be used to win campaigns regardless of which team deployed it. You can argue whether one side acted more ethically than the other in the lines they were willing and unwilling to cross, but it’s Facebook’s business model and platform that made it inevitable that sophisticated actors would be in a perpetual arms race to try to best position themselves to leverage it to gain power, change minds, etc.
All of this was evident many years ago, based on the ad delivery system Facebook built, and the kinds of information their early APIs were exposing to people. It didn’t take a genius to realize having people building more and more sophisticated systems to spy on people to get data to drive the development of products to persuade people was a dangerous flywheel, and one that was held up by good intentions (making services “free”), so it was likely to be sustainable via an “ends justify the means” rationalization.
Yes, that was my point, that in 2016, actors were using the feed and ad algorithms to influence the elections. The point I'm trying to make is that isn't what is happening now.
Now we are seeing a large group of people freely sharing misinformation amongst themselves. The fueling of this, I would argue, is outside of facebook and is being brought there by the people themselves.
It was really easy to blame the lowering of the level of discourse to social media. However, the "our side at any cost" has its modern seeds in the advent of right wing radio with Rush Limbaugh, followed by the rise of fox news. With the internet, the public has learned that they too can be players in the political landscape by commenting on news articles. All of which predates the rise of social media to a large extent. In the early 2000s, Yahoo News political articles would have 10s of thousands of comments, even now, the 2020 election market on predictit had 300k comments on it, of people posting memes and s*t talking to each other.
This is the culture now. To change this, you can't go and regulate a social media company, you have to change the culture.
Both are still occurring. Either way there are lots of things to dislike about FB but enabling speech isn’t one of them. Mis- and Disinformation are political problems to which democracies are highly susceptible, by design. They’re features, not bugs.
"Soft right wing" propaganda? I can't understand from your response what side you think I'm taking. In one point its "right wing" propaganda, in the other, I'm criticizing right wing media.
I also completely understand how you posted a screed, without making a point at all or taking a side at all.
That Facebook has found a way to have a service that provides content, and not have to pay for it, is one of the reasons they are so immensely profitable.
I kind of laugh at your comparison of a "this is fictional" warning in front of a show, as if it were some sort of equivalent warning, to what right-wing cable TV news and radio have been putting on the air nearly 24/7 for the past 3 decades.
My point is that the seeds of the misinformation and right-wing propaganda aren't in the leaf node that is facebook; they come from the right-wing media outlets that are carried on cable or over the air. It's also a concern that no one on the 'left' seems remotely concerned about this, when they can dump on social media instead.
Left wing is plenty upset by the fact that the right wing exists and is allowed to speak. And tries to shut it down at every opportunity. It's not easy to shut down Fox News though, first amendment still being alive. But it's much easier to get a rightwing speaker to be banned from campus, or Facebook or Twitter. So that's what they are doing, for now. Until the government finally passes that "fighting the misinformation" law and packs the court so it wouldn't mess with it - then the sky is the limit.
You are not criticizing right wing media. You are criticizing left wing organizations pushing facebook to place warnings on right-wing facebook posts. It's very convenient how you don't address the main point of the post though.
Your point seems to be that a cable company purchasing shows from content creator companies, is the same as facebook having random users post whatever they want. One of those is a broadcaster purposefully running content they want to run. The other is a platform to let anyone say whatever they want. When you give a platform to someone to say whatever they want, you are free to put limits on that platform, and warnings on that platform. When you are purchasing content, you are showing what you want to show. Things aren't "carried on cable or over the air." Things are bought by the broadcaster, then broadcast by the broadcaster.
I do believe you're being purposely dense here, so this conversation has now come to a conclusion. I won't read your reply.
The arguments that Facebook is evil are really dumb and I'm kind of tired of them.
The core mechanism of political polarization is that the internet inexorably pulled society out of its temporary state of mass media consolidation. For a couple decades, we had an unusual situation where a few companies ran mass media, and those companies all sort of agreed to toe the centrist consensus line politically. Now, we've reverted to something like what we had before radio, which was that socialists read the socialist newspaper, right-wing folks read the right-wing newspaper, etc., if not even worse -- some folks just got their news from the loudest partisans at the bar.
We've gone from an era where news distribution was fragmented because it was very difficult, to an era where it was consolidated because it was easy for large corporations only, to an era where it's so dead simple that anyone can do it, so it's fragmented again.
Legacy media folks hate this. They blame the biggest players helping people share fragmented media sources with one another, rather than recognizing the inevitability of this fragmentation, no matter what products people use to share news over the internet. They demand a return to elite consensus blocking extreme viewpoints. It is simply not gonna happen. That barn was always temporary, and it has collapsed around the horse.
The argument isn’t about Facebook per se, and especially not the people at Facebook. It’s that providing Internet services for “free” on the back of advertising was an obviously great idea when such “ads” were cute product pitches and led to positive cash flow.
But when it became obvious that “ads” was the wrong mental model, and that the products being created were ultimately about the general problem of persuading people by using data collected by spying on them, it should have been realized this was an incentive structure that any sane code of engineering ethics should abandon. Facebook ended up being the best and most successful example of an organization taking this system to its most logical endpoint, but someone always would have unless a code of ethics managed to materialize upon seeing the damage it was causing before it got too far.
I don’t judge people who work at Facebook generally, but do think every one of them at this point should resign on ethical grounds. The situation could be fixed if the company (even just internally) owned up to the malincentives it has fallen into, committed to exiting them, and led on forming a code of ethics for when these kinds of systems ought not to be built.
I stopped using Facebook when they stopped showing you everything in chronological order and started implementing boosted posts and advertising in your feed.
I really don't understand why it's taken almost a decade for you idiots to figure out what that would result in.
"At this point"
What point? The point they started hiding your friends' "boring" posts in order to serve up the most clickbait shit possible?
Or the point they allowed political parties to ram Obama and then Trump down your throat?
Or the point they decided to become arbiters of truth?
Or the point they decided to censor you over your beliefs?
I deleted my Facebook account many years ago so I would say the obligation to resign, to me, probably goes back pretty far. But overall I understand this wasn’t self evident for many for a long time and for many it still isn’t. I wouldn’t peg it at their feed engagement algorithm changes, but I think once people started seeing them creating creepy targeting buckets in their ad system, like “target parents who just had their first child” (how would they know this?) it started to be pretty obvious how fucked up the logical endpoint of that was going to be.
I think it’s important to remember the issue with Facebook isn’t speech it’s amplification. Algorithmic amplification of content that drives engagement. This is what folks have an issue with.
Further bear in mind that “free speech” refers to the government not precluding your speech and Facebook ain’t government. Unless you’re proposing nationalizing it.
It’s a positive feedback loop. Introduce people to outrage. Spur them into “action” where they create more “outrage content” (comments, posts, … etc.) for you in return for attention and praise (Likes, upvotes, … etc.). Outrage content grows exponentially like the spread of an infectious disease or a nuclear chain reaction … as does the “engagement” of the site. The job of the site is just to get the outrage content in front of the viewers and keep the chain reaction going.
Basically the modern version of riling up a lynch mob … for profit.
To some extent, all social media use this feedback loop, even this site.
The introduction to outrage can happen off Facebook, I would argue the 'outrage victim culture' started with daytime AM radio in the US in the early 80s, when people realized that people would argue for days over their sports teams. Then people realized that political parties are just basically sports teams that play all the time.
> They're a damn American company, free speech should be a major component.
Given the constitutional safeguards you may be right on this one.
> Thinking any other country would actually respect privacy is truly delusional.
This, however, I had to read several times to be sure you were saying what I thought you were saying. At which point I just shook my head in amazement.
> Seriously, I think one of the biggest misconceptions people with opinions like yours might have is that we are somehow unified in these decisions or occurrences – we are not – and the healthy dissent is what, over time, hopefully bends us in the right direction.
I laughed a bit at this. I guess a recruiting tactic is to now advertise the exciting possibilities of burnout? "Come join Facebook because you can totally change the ethical direction of this billion dollar surveillance behemoth from the inside, one commit at a time. ;)"
My favorite line of reasoning from the article was that folks hold no civic duty to work on the inside for the megalomaniac that is the Zuck. I do find it very empathetic that the author understands why folks would stay employed there: it pays super well and people have obligations.
This post has the energy of someone shouting at a minimum wage customer service rep because they disagree with the company's policies and feeling smug about themselves afterwards.
That's only if you equate the recruiter to a minimum wage worker and yourself to their customer. There's a superiority bias in that rhetoric. The recruiter is a well-employed tech worker. And you're not their client or superior.
You can easily find the salaries of recruiters at FB. They're half of the base of an L3 with no RSUs. So sure recruiters are tech workers that are better paid than eg the service staff but let's not pretend there's not a two class system at FAANG.
Recruiters at FB seem to earn ~50-60% of the total compensation of Engineers at equivalent levels, doing a bit better at the bottom of the ladder (I don't see a single IC3 recruiter earning under $100k in the US; they'd have to be earning ~60k to be earning "half of the base of an L3 with no RSUs").
>Recruiters at FB seem to earn ~50-60% of the total compensation of Engineers at equivalent levels
...
>they'd have to be earning ~60k to be earning "half of the base of an L3 with no RSUs"
The people that email you are not L3s. They're actually often contractors. When they're not they're definitely L1s (or maybe L2s since I think L1s are actually service staff).
If they're contractors I can't "easily" find their salaries, since they aren't listed on levels.fyi (and other sources e.g. glassdoor are grossly unreliable). And, uh, I think the proper point of comparison there would be to line them up with the compensation of contractor devs; from what I hear they also earn substantially less than FTEs.
but it's the recruiter who decided to play along despite that. At that point OP has no choice but to elaborate and counter the 'criticizing from outside' thing
And who could blame them? I'm sure recruiters at most companies have a litany of responses they use based on the tone of the candidate. They get paid on how many people they can get into the interviews, accept an offer, and stay X months past their start date. Why wouldn't they use form responses on the off chance it works?
> Also he works for google, so he firmly believes his company is so much better than facebook so he need to have a big write-up to educate a hr there.
The author of this post, George Mandis, DOES NOT work at Google.
They are a “Google Developer Expert”, which is a person recognized by Google as having exemplary expertise in one or more of their Google Developers products. GDEs are awarded through the Google Developers Experts program established and administered by Google.
Anyone with a relevant background in web development can become a GDE.
I sent a similar reply to a Facebook recruiter. Something along the lines of, "I disagree with the philosophical direction of the company and cannot in good conscience find myself working there."
This, apparently, was a way to get myself permanently and unceremoniously removed from all Facebook recruiter outbound. Not only did I not get a reply, but where I used to get multiple emails a month from different teams, I haven't heard anything from a Facebook recruiter in more than two years.
All valid points but there is another perspective to this whole mess.
I see it through my family members who all have Facebook, how they connect with relatives across the globe and routinely stay in touch during birthdays and other events.
I also see it at most social functions around me that organize over Facebook, because I am left out of those since I don't have an account.
My best connection to social gatherings is a friend who also hates Facebook but is a lot more social than me, moves around a lot more, so he hears from all the people with Facebook what is going on through word of mouth.
So that's a pretty neat service they're offering to a substantial part of the population. But of course they're suffering from moderation issues, something we can see on all large platforms like Youtube and Twitter.
> So that's a pretty neat service they're offering to a substantial part of the population.
For sure, Facebook is great. It just provides services that shouldn't come from one single company in America.
They should be forced to interoperate. Provide a way to subscribe to Facebook content (pages, events) from outside of Facebook, and a way to pull outside content (RSS, ActivityPub, hell, throw in Twitter while we're at it) into your Facebook stream. A sort of standardized stream for updates and comments.
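For a rough idea of what that standardized stream could look like, here is a minimal sketch of a single update expressed in the ActivityStreams 2.0 vocabulary, the open W3C format that ActivityPub servers such as Mastodon already exchange. The actor URL and content are made up; nothing here reflects an actual Facebook API.

```python
import json

# A hypothetical status update expressed in the ActivityStreams 2.0 vocabulary.
# Any service that speaks ActivityPub could, in principle, render or reply to
# an object shaped like this, regardless of which company hosts the account.
update = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://facebook.example/users/alice",   # made-up actor URL
    "published": "2021-10-10T12:00:00Z",
    "object": {
        "type": "Note",
        "content": "Birthday party at the park on Saturday, all welcome!",
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
    },
}

print(json.dumps(update, indent=2))
```

The envelope is the open part; Facebook could keep its own interface and ranking while still emitting and accepting objects shaped like this.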
This would clear up all my criticisms of Facebook. At that point I would only wish them well. I'm sure they would provide the best interface to it, and most people would stay with them, and that would be fine.
The thing that's insane to me is you can't even access any profiles without being logged in. This is public information; they've locked things down to an incredible degree and I bet every last dollar I have it's for cynical reasons and not "user privacy"
Seriously, I think one of the biggest misconceptions people with opinions like yours might have is that we are somehow unified in these decisions or occurrences – we are not – and the healthy dissent is what, over time, hopefully bends us in the right direction.
Oh, the irony. Well it looks like the "healthy dissent" gets you out the door. Perhaps that is the right direction the recruiter was referring to?
"While Zuckerberg promised that thefacebook.com would boast new features by the end of the week, he said that he did not create the website with the intention of generating revenue. "
This is very much like the promise that Page and Brin made in their late-1990s paper announcing Google. How easily the idealism gave way to greed. These kids were never suited for management, let alone leading a large organisation. There was no "business plan". Even today, they still fall back on advertising. Competition is far too challenging.
There is likely no person on the Facebook dole who wants to "fix" the problem of Facebook if it means losing their paycheck, bonus and stock options. On the contrary, it stands to reason these people will be compelled to act in the interests of self-preservation, which means the preservation of "the business" (high surveillance advertising). The reply from the recruiter is perhaps an example of such desperation.
Bill Gates was recently interviewed on PBS NewsHour about his relationship with Jeffrey Epstein. His answer seemed reminiscent of the younger Gates. Gesturing with his hands, we can see his wedding ring is gone. Perhaps he has come full circle; he is who he always was. He refused to disclose what he knew. He was apparently told to say he regretted having the meetings, which he repeats several times. Eventually, after evading more questions, he is asked if he learned from the mistakes of the past, and his reply is "Well, he's dead..." and then he tries to play up his role in philanthropy. His kids are likely afraid to speak out.
IIRC, back when he was active on HN, PG was a staunch Zuckerberg fan. I also recall HN commenters claiming that Zuckerberg's "dumb fucks" comment referring to Facebook's early users, his fellow classmates, was no longer representative of the person who made it. I wonder if he can do better than Bill Gates in interviews.
I felt the opposite about Gates. He had a few dinners with a guy that turned out to be a sexual predator. I don't see at all how that is "bad". It was simply a media fuelled witch hunt as far as I can see it.
A "public good" approach to digital social graphs is the broadest and most urgent social imperative since, I don't know, nuclear non-proliferation treaty?
It will decidedly not solve all problems associated with online digital platforms. We know for a fact that even the earliest and most innocent online forums would degenerate into flame wars. We know that filtering and struggle for the political control of news dissemination is "tech-independent": print, radio, TV etc all "solve" this in their own arbitrary and varying ways.
What facebook managed to achieve is to combine all those, together with many others (personal data collection, algorithmic profiling of people, leaking of such profiles to third parties etc) and deploy it at an unthinkable scale.
All this negativity around the company and no good way to make a profit on it. Shares are still reasonably to fully priced, markets don't seem to care a lot.
This negativity is mostly a moral panic confined to the elites who feel threatened by Facebook's power to disrupt their world (which they misinterpret as the end of the world). Facebook's actual users don't seem to care much.
If you work for Facebook, you are contributing to a system which will grow into being the most effective mechanism, by far, for a bad actor to deploy capital to successfully alter the beliefs, behavior, and choices of your children. One day, through VR/AR/AI agents, they will even be able to turn your children against you for someone paying the right price, if the development of Facebook, the system, continues on its present trajectory and no ethical framework is put in place to restrain their actions. (Regulation won’t work, it is the wrong solution to this problem imo.)
> the most effective mechanism, by far, for a bad actor to deploy capital to successfully alter the beliefs
TV works better.
> One day, through VR/AR/AI agents, they will even be able to turn your children against you
Again unless your kids hate you, I doubt that's going to be the case. AR is far more likely to give the user "Perfect Memory" which will have all sorts of interesting side effects to how we manage forgiveness and growing up.
AR has a whole bunch of vectors that will fuck with the fabric of society, but this isn't one.
> ethical framework is put in place to restrain their actions. (Regulation won’t work, it is the wrong solution to this problem imo.)
Ok, this is an interesting one. Laws are ethical frameworks; it's just that their values tend to lag society's values.
Facebook's "ethics" are codified here https://transparency.fb.com/policies/community-standards/ they are, much as it pains me, quite good. If I was going to write a guideline to civic engagement, this would be a good thing to base it on.
Facebook's problems are threefold:
1) nobody likes them so _anything_ they do will be negative. They could pay off all medical debt for the US, and it'd still be a negative action.
2) They don't enforce the rules evenly; some of this is down to scale, other parts because they don't want to piss off noisy operators (i.e. Trump, Modi, etc.).
3) people are fucking stupid in groups.
Don't get me wrong, facebook have trespassed on a number of occasions. In practice to the same level as google/apple/amazon etc etc.
>You should resign if you do.
so that facebook can stuff itself full of people who lack ethics? that's going to end well, isn't it.
Eventually, unless something changes, Facebook will be able to commandeer your avatar to allow a third party to speak on your behalf to persuade others. Eventually, you will come to have personal relationships with non-human agents which also can be paid to persuade you via their ad system. TV will be a joke in comparison.
but they can only do that if people still engage with the company.
TV only has high esteem because it is seen as a "verified" medium, i.e. it's had some level of fact checking (whether that's true or not is another matter).
If you undermine an entire new medium by abusing people's avatars to sell shit, it will sink the entire medium, unless the platform offers something compelling enough to overcome the stink.
Well, obviously. Their current system offers enough value for users to keep using it. The question of whether working on something is an ethical pursuit is only partially connected to its demand. Examples abound of things which people happily buy en masse but where the question of whether they are the right thing to push forward as a contributor is unclear.
>you are contributing to a system which will grow into being the most effective mechanism, by far, for a bad actor to deploy capital to successfully alter the beliefs, behavior, and choices of your children.
I believe you are actually describing government employees.
Here's my opinion about a topic, that includes my prediction for the future along with my assumptions of what solutions won't work, and so I know what everyone else should do.
No, my opinion that people should resign from Facebook is not a forwards looking opinion but a backwards looking one. But if you agree with my position on what you're contributing to, it is an additional reason to do so.
If the disruption is the mass distribution of antivax and antiscience propaganda that kills 1000s of people, the manipulation of opinions by hostile foreign state actors, the inflaming of genocide and the utter division of nations into completely antagonistic sides, then I'm sorry, but maybe the elites have a point.
Lol the phrase "we're fighting to change it from the inside" sounds like every student socialist who gets a job in the bank after graduation.
You have the moral backbone of a jellyfish and you're doing it for the sweet smell of green. Just own it, like the arms dealers and hedge fund managers.
I think the topic here is not what FB did wrong, but what's the difference between FB and, say, Twitter or Medium. They are all platforms, but why does only FB get grilled over the fire?
Some meta feedback to the OP: please test your site on mobile. I wanted to read what you wrote but got tired of scrolling endlessly to the right in the block quotes so I gave up.
I like to think of Facebook as a niche horse girl forum and analyze all criticisms of it through that lens. Why should we care about how this strange site is run?
Scale matters. Each power of 10 is a whole other level. A niche horse girl forum might have thousands, maybe 10s of thousands of users? FB has billions. That's 6 orders of magnitude more relevance and impact, which is planet-wide. That's why we should care.
I think scale is a reasonable distinction to focus on. It could be argued that due to network effects that Facebook enjoys a sort of natural monopoly of being "the" social network. Nonetheless, it still has big competition in the ad space with Google and traditional media, and though much smaller and with different emphasis, Linkedin is a viable large social network that can survive alongside Facebook. I think it's hard to define at what scale a company should be subject to public scrutiny/control in part because it's hard to clearly define the market in which it competes. I think the negative impact that Facebook has on society is largely overstated and would like to see more testimony from users who believe that they have been personally harmed by Facebook's practices.
> it's hard to define at what scale a company should be subject to public scrutiny/control
FB is one of the largest companies on the planet with worldwide effects. There isn't another 10x for it to grow. If it's not subject to scrutiny/control at this scale, then it will never be subject to it.
> would like to see more testimony from users who believe that they have been personally harmed
Victims of genocide are notoriously unable to testify.
> If [Facebook]'s not subject to scrutiny/control at this scale, then it will never be subject to it
Yes, exactly, cat's out of the bag. I generally don't think we should use federal democratic decision-making bodies to decide how a company should be run.
> Victims of genocide are notoriously unable to testify
Noted, but I also do not believe that Facebook's policies either equate to genocide or indirectly result in increased genocide as understood in the most obvious/severe definition of the word 'genocide'.
Facebook is a massively evil company, a true net-negative to life on earth, a destroyer of personal privacy, responsible for facilitating genocides, constantly lying to its users, the most egregious example of how corruptive a monopolistic corporation can be on a society at a global level.
If this recruiter is to be believed -- and I wouldn't believe a goddamn word of any of it -- there are all these poor, principled people who really truly want to "make Facebook better" but dangnabit they just don't have critical mass yet.
Bullshit. All of it.
This is a carefully calculated response, like all of Facebook's responses (PR or otherwise), to make you drop your guard and stop using your critical thinking faculties. It's a siren song, with promises of six figures and early retirement.
Listen... if you're at Facebook, you've personally made a decision that you'd rather collect your paycheque than stop contributing to the beast that surrounds you. You're fine with the pervasive corruption and rampant disinformation and hypocrisy because you're making bank. Don't be shy. Own it. Say it with a full chest. At least it would be genuine.
But don't lie to yourself, or your coworkers, or anyone outside the company, and say you're trying to "make Facebook better". It can't be made "better" in its current form. It will never be better. It needs to be dismantled. It needs to be put on trial. It needs to be bled dry.
>Listen... if you're at Facebook, you've personally made a decision that you'd rather collect your paycheque than stop contributing to the beast that surrounds you. You're fine with the pervasive corruption and rampant disinformation and hypocrisy because you're making bank. Don't be shy. Own it. Say it with a full chest. At least it would be genuine.
So true. I would never want to work for Facebook, but I’ve always wanted to reply back to their persistent recruiters with an obscene salary expectation just to see if they’ll consider it.
Companies for the most part aren't evil or good, they're a tool. And they behave like a gas that fills the container they are in.
If not facebook, some other company would make money in some way you would find similarly objectionable.
> a true net-negative to life on earth,
Facebook has really made it easier for the family, friend, community and volunteer groups I'm involved with to connect and communicate. Social media in general has been great at elevating the voice (at least the collective voice) of the common people to something vaguely competitive with the ruling class and their dinosaur media / propaganda corporations, to give a couple of examples.
There are also downsides of facebook and social media in general, but I don't know that it's so clearly been a net negative.
> a destroyer of personal privacy,
Law enforcement and intelligence agencies already destroyed personal privacy before facebook.
> responsible for facilitating genocides,
Language and writing is also responsible for facilitating every atrocity committed in history but they have been overwhelming net positives to humanity. The same is and will be true of computers and mobile phones from now on.
It's a great go-to for the emotional argument and outrage, but anything can be misused.
> constantly lying to its users,
Like virtually all politicians, and every corporation does (or would if they thought it might help them in any way).
I'm not giving facebook a pass on its behavior, I just don't think it's useful to be fixated on them as though they are the source of evil and problems with society, as opposed to an unsurprising product of the environment created by society.
Facebook's lies and actions did not result in the invasion of Iraq that destroyed a sovereign country, resulted in the deaths of hundreds of thousands of people, and took trillions of dollars, for example. That was the doing of the corrupt corporate-political system in the country. The vaunted New York Times was one of the mouthpieces beating the drums for war, no less. Just as they did in the lead up to American involvement in Vietnam. Just as the media corporations did in the calls for the wars and interventions in Syria and Libya. And on and on. So much for disinformation, eh? Yes, internet and social media corporations have or will be pulled into that system (as traditional media companies were) and made to facilitate this kind of thing but again I see it as a symptom rather than a cause. And not really facebook specific.
> the most egregious example of how corruptive a monopolistic corporation can be on a society at a global level.
I don't really think it is at all. The entire military industry is basically arms dealers and war profiteers. The banking industry was complicit in the housing collapse that destroyed many people's assets. Pharmaceutical companies literally create epidemics of drug addiction. Tobacco companies similarly. Fossil fuel companies tried to bury climate science and lobby against externalities created by their product, not to mention the way war follows them around like they're a horseman of the apocalypse. Clothing companies (and many others) infamously use child labor and slave labor. Mining and extraction companies have pretty commonly taken full advantage of the high levels of corruption present in developing countries. The list just goes on and on. I'm willing to hear you out, but it's a pretty damn high bar that your evidence for facebook being the worst-of-the-worst is going to have to overcome here.
I'm not going to defend facebook the company or social media in general here, because I don't know enough in depth about either subject to really offer a worthwhile opinion on it (and I don't work for them, hold their shares, or have any association with them). I just want to give a bit of balance and perspective to the fashionable "facebook is the devil" opinion.
> Listen... if you're at Facebook, you've personally made a decision that you'd rather collect your paycheque than stop contributing to the beast that surrounds you. You're fine with the pervasive corruption and rampant disinformation and hypocrisy because you're making bank. Don't be shy. Own it. Say it with a full chest. At least it would be genuine.
I think it's also on us to stop holding facebook at such a high level.
People are proud to have facebook on their resume, maybe they shouldn't be. Maybe facebook should be seen as an embarrassment. Maybe facebook should be something you want to hide from some employers because they're going to judge you for accepting a job from them. (Exceptions for h1b, first job, etc apply)
It's kinda like how if you tell me you were a cop, I'm going to immediately see you as a power-hungry and abusive thorn in society. The burden of proof is on you to... show me that's not true. Some decisions have consequences, I suppose.
> I think it's also on us to stop holding facebook at such a high level.
> People are proud to have facebook on their resume, maybe they shouldn't be. Maybe facebook should be seen as an embarrassment. Maybe facebook should be something you want to hide from some employers because they're going to judge you for accepting a job from them. (Exceptions for h1b, first job, etc apply)
> It's kinda like how if you tell me you were a cop, I'm going to immediately see you as a power-hungry and abusive thorn in society. The burden of proof is on you to... show me that's not true. Some decisions have consequences, I suppose.
You appear to be in some sort of bubble, where people are ashamed of their engineering work. This is never going to be true; anyone who puts something on their CV that indicates a high level of technical competence is going to be in demand no matter how much you wish otherwise.
Same with cops - you appear to believe that law enforcement is forced upon an unwilling population. This is also not true: people prefer to live in places with active and working law enforcement. If there isn't one, the populace, by clear and almost unanimous majority, creates one.
You are free to move to places that don't have law enforcement officers. However it seems that the rest of the world (outside of your bubble) doesn't care for your ideals. I can't blame them.
While we're on the subject of judging people, what do you think people think of you when you say you want to ostracise law enforcement officers?
Listen, even if you disregard all of the privacy concerns and support of cancerous communities, Facebook is a shitty social network. The platform has decayed into a junk drawer that is legitimately terrible at the things it was originally designed for. It's bad at connecting you with people you know or might want to know. It's bad at showing you updates from your existing friends, and vice-versa. The site at one time felt cool and fun; it feels like AOL now.
Agreed. Twitter also brings out the worst in people, in short schizophrenic bile without much thought. Reddit’s front page is full of outrage videos. TikTok sucks people into the tiniest echo chambers possible. YouTube's comments section is completely insane.
Social media is great if it is limited to say 100 people. When you broadcast to millions, it is toxic.
Has anyone else kinda decided they're done doing technical interviews? If you *really* want me, then just give me an offer and let's negotiate.
I'm getting to the point where I'm just going to say if my work experience doesn't provide you with enough information about my technical ability then you can go hire someone fresh out of school who's done l33tcode 24/7 for the last few months.
The thing about tech interviews is they're not just testing you, but also ensuring your colleagues are at a minimum viable level. The alternatives are:
- You only hire folks with prestige; pulling the ladder up for everyone else.
- You fire fast; potentially screwing up people's lives.
- You carry dead weight that slows everyone else down.
Of course, some of the interviews go a bit far. I had one with ~15 calls, at least 7 of them "interviews". Most of them went well enough, but the architectural one was somewhat egregious: I was asked a bit of a trick question, asked some clarifying questions and got conflicting statements. I asked for a redo, got a nope, then ended up in the weirdest offer stage I've experienced.
I try to keep that in mind when people propose we add more hurdles to our process. :)
same, but be prepared to no longer be offered those kinds of positions
as long as your resume/linkedin shows that you have clearly passed once and have exceeded "leetcode level" for a number of years, you should already be getting emails for "Staff/Lead" positions that end with "got 15 minutes for a chat to see if we're right for you?"
helps to have a site/project/blog/portfolio along with a github to breeze past the phone/culture screens :)
In my mind, and I acknowledge YMMV: without doubt, FB and websites similar to FB are responsible for all the evil in this world. I know they are making money - and while they continue to make money, they will continue to be in the business of facilitating evil in the world. Of course there are good things that come out of it, but nowhere close to enough to offset the problems it creates.
Now if FB stops doing what it is doing, another company will take the lead, so I guess ultimately it's a problem that we can never get away from.
Having said that, FB is the current leader in facilitating evil, so I hate them for that and will never work for them. Ever.
If you work for an immoral organization and are OK with that by virtue of continuing to draw a wage there, YOU ARE AS IMMORAL as the company because you aid and abet it.
I won't have anything to do with you based on that. I'll be polite and arm's length interacting with you but NOTHING beyond that will ever be possible; not personal, not business, not social!
This. I've lost my respect for so many developers who have decided to go to work for these net-social-negative surveillance companies for a fat paycheck and justify their decision by the argument that if they didn't accept the offer, then someone worse would accept.
What annoys me is not that they took the job for the paycheck (we all have to eat at the end of the day), but rather the fact that all it takes to appease their moral conscience is such a weak and flimsy bit of moral acrobatics. Have a spine and accept the real consequences of your actions at least.