If you're a Facebook user and you are unhappy with the way the company strongarms, censors and manipulates its audience, the most effective way for you to express this dissatisfaction is to close your account, block social media bugs and encourage your friends and family to do the same.
Facebook doesn't care how you feel when you use their service; their bottom line simply depends on your contribution to the statistics they use to sell ads. Apathy, or even outrage, is perfectly acceptable provided you express it through channels they control and profit from.
As far as I'm concerned, as long as this conversation is couched in trying to appeal to Mark Zuckerberg's imagined sense of ethical responsibility it will lead nowhere.
Absolutely not. Government should have no say in what a private entity can/can't do in their walled garden. The fact that over a billion people use it is a different issue.
A private individual, sure. I'll accept that argument.
But a corporation should have no such leeway. Companies exist for the mutual benefit of the citizens, and their shareholders. Just because we've lost that sense of ethics doesn't make it not true.
The now-predominant view, that "a company can fuck over whomever it wants, as long as it stays within the confines of the law," is preposterous. Public corporations are already required to be sociopathic, given the legal requirement of profit (exception given to benefit corps).
Or in much more shallow words, "I'll believe corporations are people when they execute one in Texas."
Well, it's an interesting idea to consider a corporation more like a derivative of the state than a derivative of a person, as it comes into existence via state power and is in effect immortal until its existence is ended via state power. Therefore there may be a compelling argument that, just as the 14th amendment extends the bill of rights requirements to all of the states to uphold, a corporate charter extends some narrow portion as well. That is, the corporation can abridge the free speech of employees acting as spokespeople for the company, since keeping a consistent message of what the company "says" is important. But abridging the free speech of its customers seems a lot more problematic: a broad brush to just censor anyone using the service for any reason.
I doubt such treatment will come to pass anytime soon though. It's Facebook's infrastructure, and the EULA everyone agreed to contractually gives Facebook the power to engage in this censorship. If you don't like it, then maybe you shouldn't have agreed to their EULA, which basically says all of your content is theirs, not yours. And this is something the people who actually read these EULAs warned about, and most people just gave it a hand wave as if it would never actually be an issue.
A corporation crosses the line when it has a single person who is protected from the legal actions of the corporation by statute (i.e., a shareholder). At this point the quasi-libertarian theory of a corporation as "just a bunch of guys" completely breaks down, as there is a special legal protection for shareholders.
A corporation isn't a partnership. A partnership is just a group of people who have agreed to do business together. All of them are jointly legally liable for the criminal actions and liabilities of the partnership.
A corporation, on the other hand, has legal protection for shareholders. A VC can bankroll eMurder.com and the VC isn't liable when the officers of the corporation go on a shooting spree. The options are: investors get no special legal protection, and creditors and the law can pursue them and their personal assets for the liabilities and crimes of the corporation; OR a corporation really is a special instrument of the State, and only has particular standing because we as a society believe the corporate veil provides social benefits.
The state can still attack people in the corporate structure individually. For example, if VP Jones decides to go execute a competitor, VP Jones can be brought to trial for murder.
But if that company decides to, say, dump toxic chemicals in the waterways, it's only a few hundred thousand dollars, versus millions to dispose of them properly. Or if the company decides to refuse doing a recall because the cost of recall doesn't equal or exceed its liability, well, too bad.
Corporations can, and do, cause a great deal of harm. And many of those behaviors are collective, spanning multiple levels and people. Yet the most we do is a hand-slap of a fine.
> Or if the company decides to refuse doing a recall because the cost of recall doesn't equal or exceed its liability, well, too bad.
Is that wrong (I think that's what you're implying)? I'm serious. More money can be used to make things safer, but there's a point where the benefits become marginal compared to the costs.
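To make the cost-benefit logic concrete, here's a toy calculation with entirely made-up numbers (units, defect rate, and liability figures are all hypothetical) showing how a purely financial analysis can come out against a recall:

```python
# Toy expected-cost comparison behind a "don't recall" decision.
# All numbers are invented for illustration.

units_sold = 1_000_000
recall_cost_per_unit = 15.00          # hypothetical per-unit fix cost
defect_rate = 0.0001                  # 1 in 10,000 units fails dangerously
liability_per_incident = 100_000.00   # hypothetical settlement per incident

recall_cost = units_sold * recall_cost_per_unit
expected_liability = units_sold * defect_rate * liability_per_incident

print(f"Recall cost:        ${recall_cost:,.0f}")        # $15,000,000
print(f"Expected liability: ${expected_liability:,.0f}") # $10,000,000

# With these numbers the expected liability is smaller than the recall
# cost, so the purely financial answer is "don't recall" -- which is
# exactly the moral question being debated above.
```

The debate is over whether that last comparison should ever be the deciding factor.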
People can, and do, cause a great deal of harm. There's nothing in a corporation that is not the actions of the people who constitute it. Corporations are not able to do anything by themselves; there's always some person who takes a specific action. A "company" never decides to dump toxic chemicals; there's always a person deciding to do it, and a person doing it. At least until we invent AI; then we may have something else. Until then, it's people all the way down.
People seem to lose sight of that: corporations are REQUIRED to be as ruthless as they can legally get away with, or they are literally in violation of the law in terms of making money for their owners.
The law is the ONLY thing that curbs their bad behavior, other than publicity so bad that it (temporarily, until forgotten) causes a sales drop.
Yeah, I'd like to see more corporate death sentences handed out, as well. It may or may not deter any other bad corps, but it would cut down on second offenses :-)
This is false. There is no such law that requires nothing but ruthless profit optimization. I invite you to disprove my claim by pointing to a relevant piece of legislation.
First, your "claim" is loaded with a false premise to begin with.
Laws can be either created (by the legislature) or interpreted (by the judiciary). The basis of corporate law has been founded upon the judicial branch, as well as upon the duty to shareholders.
> The case still most often used in law schools to illustrate a director’s obligation is Dodge v. Ford Motor (1919)—even though an important 2008 paper by Lynn A. Stout explains that it’s bad law, now largely ignored by the courts. It has been cited in only one decision by Delaware courts in the past 30 years.
> Oddly, no previous management research has looked at what the legal literature says about the topic, so we conducted a systematic analysis of a century’s worth of legal theory and precedent. It turns out that the law provides a surprisingly clear answer: Shareholders do not own the corporation, which is an autonomous legal person. What’s more, when directors go against shareholder wishes—even when a loss in value is documented—courts side with directors the vast majority of the time.
Real world example: Tim Cook said outright, in an Apple shareholder meeting, that he did not consider the ROI on every decision. He's still CEO; to my knowledge no one has even sued over the statement.
Do shareholders not elect a board of directors? Do the directors not have hire/fire authority over the executives?
Not saying this chain of authority micromanages every decision, but I suspect it's the rare CEO who goes out of his way to upset the board. Or is there some law that allows the CEO to say "You cannot fire me" (outside of anti-discrimination laws or such)?
(I'm not a lawyer so take this all with a pinch of salt.)
Fiduciary duty is not profit optimization. It basically means you need to be responsible with the company's money. You can't spend it on hookers (well, unless that's your company's business). It's basically a "don't waste money" rule, not an "earn lots of money" rule. Basically, the idea with corporations is that the shareholders own everything and the management is taking care of all the assets; it's not the management's, so the management has to be careful not to waste money.
For example, legal guardians have a fiduciary duty.
> Dodge v. Ford Motor Co.
"The Michigan Supreme Court held that Henry Ford could not lower consumer prices and raise employee salaries. In its opinion, the discretion of the directors is to be exercised in the choice of means to attain that end, and does not extend to the reduction of profits or the nondistribution of profits among stockholders in order to benefit the public, making the profits of the stockholders incidental thereto. Because this company was in business for profit, Ford could not turn it into a charity. This was compared to a spoliation of the company's assets. The court therefore upheld the order of the trial court requiring that directors declare an extra dividend of $19.3 million." (from https://en.wikipedia.org/wiki/Dodge_v._Ford_Motor_Co.)
"Among non-experts, conventional wisdom holds that corporate law requires boards of directors to maximize shareholder wealth. This common but mistaken belief is almost invariably supported by reference to the Michigan Supreme Court's 1919 opinion in Dodge v. Ford Motor Co.[5]
Dodge is often misread or mistaught as setting a legal rule of shareholder wealth maximization. This was not and is not the law. Shareholder wealth maximization is a standard of conduct for officers and directors, not a legal mandate. The business judgment rule [which was also upheld in this decision] protects many decisions that deviate from this standard. This is one reading of Dodge. If this is all the case is about, however, it isn’t that interesting.[6]"
> eBay v. Newmark
"When eBay refused to sell, Jim and Craig deliberated with outside counsel for six months about how to respond. Finally, on January 1, 2008, Jim and Craig, acting in their capacity as directors, responded by (1) adopting a rights plan that restricted eBay from purchasing additional craigslist shares and hampered eBay's ability to freely sell the craigslist shares it owned to third parties, (2) implementing a staggered board that made it impossible for eBay to unilaterally elect a director to the craigslist board, and (3) seeking to obtain a right of first refusal in craigslist's favor over the craigslist shares eBay owns by offering to issue one new share of craigslist stock in exchange for every five shares over which any craigslist stockholder granted a right of first refusal in craigslist's favor. As to the third measure, Jim and Craig accepted the right of first refusal [7] offer in their capacity as craigslist stockholders and received new shares; eBay, however, declined the offer, did not receive new shares, and had its ownership in craigslist diluted from 28.4% to 24.9%."
" eBay asserts that, in approving and implementing each measure, Jim and Craig, as directors and controlling stockholders, breached the fiduciary duties they owe to eBay as a minority stockholder of the corporation."
I didn't understand this completely but it looks like craigslist was trying to restrict eBay's usage of the stock and that's a breach of fiduciary duty because eBay is a minority shareholder of craigslist and so craigslist is harming shareholders by their actions.
Perhaps it's an "emergent property" of the system of laws, and culture, that are in place?
How much "charity", percentage wise, do you think corporate management could actually take part in? (I don't mean P/R posing as charity costing 1% of net profit, I mean things like deciding to do something cleaner/safer which would cut net profit on a major product in half)
Those can also be looked at as cost reduction measures: medical premiums, lawsuits and regulatory fines result from blatant negligence. (as well as other indirect effects such as dissuading anybody competent from staffing your death-trap)
> Government should have no say in what a private entity can/can't do in their walled garden.
I have some sympathy with this point of view. I would like to see a world where anyone can build whatever website they want, and the amount of attention the website gets is determined by its "worth" as defined by how many people choose to use it.
However...
> The fact that over a billion people use it is a different issue.
...there's the rub. The usefulness of a social network depends on how many people are already on it. Network effects mean that it's very hard to build a competitor to Facebook or Twitter.
One possibility might be to require social networks, once they reach a certain size (say, >10 million users), to put all their public data on standard APIs (such as RSS feeds) under terms that allow anyone to re-use it.
This would make it a lot easier for anyone to piggyback a service on top of Facebook/Twitter/etc., and if their censorship or other policies got too cumbersome, there would be much less stickiness preventing people from moving to a competing service.
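As a rough sketch of what "public data on standard APIs" might look like in practice, here's a minimal example that serializes a user's public posts as an RSS 2.0 feed. The user, URLs, and post data are all hypothetical; a real mandate would have to specify the schema, rate limits, and authentication details:

```python
# Sketch: expose a user's public posts as an RSS 2.0 feed,
# using only Python's standard library. All data is invented.
from xml.etree import ElementTree as ET

def posts_to_rss(user, posts):
    """Serialize a list of public posts as an RSS 2.0 document."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"Public posts by {user}"
    ET.SubElement(channel, "link").text = f"https://example.social/{user}"
    ET.SubElement(channel, "description").text = "Mandated public feed"
    for post in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = post["title"]
        ET.SubElement(item, "guid").text = post["id"]
        ET.SubElement(item, "pubDate").text = post["date"]
    return ET.tostring(rss, encoding="unicode")

feed = posts_to_rss("alice", [
    {"id": "1", "title": "Hello world",
     "date": "Mon, 12 Sep 2016 10:00:00 GMT"},
])
```

Any competing service (or aggregator) could then consume such feeds with an off-the-shelf RSS reader, which is exactly what would reduce the stickiness.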
> One possibility might be to require social networks, once they reach a certain size (say, >10 million users), to put all their public data on standard APIs (such as RSS feeds) under terms that allow anyone to re-use it.
I'm genuinely glad that someone made this point about making public data accessible to other people after reaching critical mass.
I'm equally sad that it might never happen.
PS: There are other discussions going on about having a common standard/protocol for social media communications, so that it can be picked up by anyone, like an email provider, without locking in the people using a particular service. I'm now wondering what is stopping this from happening. FB and other private monopolies could be forced to adopt the protocol, and data from them could be shared between other services like FB. How FB would react to this is a whole other matter.
> Government should have no say in what a private entity can/can't do in their walled garden.
Except in matters of de-segregation, food and drug safety, safe and healthful working conditions for working men and women, a minimum wage, waste disposal and environmental protection, truth-in-advertising, Employment and Labor discrimination, privacy of employees and customers, and licensure of trades...
Absolutely yes. Facebook has more members than entire countries and is the primary method that many of those people use to communicate. It has long surpassed any kind of protected walled garden and government absolutely has a say in how Facebook operates.
You don't think there should be laws over obscenity, terrorism or monopoly ownership, then?
The reason I'm asking is that if a Facebook user decided to send out full instructions for how to make a pneumonia virus with home DNA equipment, for example, how would you feel about that?
Let's say they were in another country where the law couldn't or wouldn't stop them. Is it OK for Facebook to keep distributing the information in its walled garden?
> Government should have no say in what a private entity can/can't do in their walled garden.
If an entity has no say what another entity can do within some domain, then the first entity is not the government with respect to the domain, and (assuming that no other entity has such a say), the latter is the government.
Facebook has become a de-facto political "agora" for the politicians and high government officials of my country (I live in Eastern Europe). More exactly, that's where they now first express their views on the current political, social and economic affairs affecting my country. Nowadays I would say that most of the important stuff first gets posted on FB, and only then does it get re-published by the local media (whatever's left of it at this point).
So, even if I wanted to "get off" FB and decided to ignore it completely, I would just have an "inferior" experience as a citizen of my country, i.e. I would not have a first-hand account of what's shaping the things around me. It's not that I like what Facebook has become, but there's nothing I can do to change it at this point.
So once your platform gets popular enough, the government can tell you what to do with it? Why? It's not Facebook's fault that politicians started using it.
Because with great power comes great responsibility, and companies like to cherry-pick only one out of two. It's in the interest of society that when something becomes mainstream enough to be considered "infrastructure" - something with so little competition that ignoring or bypassing it is infeasible given the state of the market - that this infrastructure works within frameworks that we've built and refined over centuries.
Rent-seeking is bad, it's facilitated by ownership of too large a share of the market. In return for allowing Facebook to leverage its size and influence for rent-seeking behavior, we as a society, a.k.a. the government representing us, have the right and responsibility to set the ground rules for what we get in return.
That's not about fault or entitlement, it's about how we can continue to uphold the values that are important to us. Free speech and the ability to publicly disagree are a crucial part of that. If your platform gets popular enough that strengthening or suppressing your voice has a material impact, profit and shareholder interests can't be the only thing that decides how things are supposed to work. There's a lot more at stake here.
Exactly yes. Once your platform becomes this popular, you wield real power over human society. And society should absolutely start to dictate to Facebook what it can and cannot do to prevent it from abusing its power. How many times do we need to go over this?
Any organization, when it becomes sufficiently large, essentially becomes a government.
>And society should absolutely start to dictate to Facebook what it can and cannot do to prevent it from abusing its power.
We have no good mechanism to prevent the government from abusing its power.
Those here defending FB are not arguing that FB can do no wrong, they are arguing that fixing a problem _with another broken system_ is not an improvement, and is, in fact, a step backwards.
I can opt out of FB, and I have. Facebook exerts no influence over me, compared to the influence the government exerts over me.
The problems of power abuse and effective communications are ancient ones. There's nothing that prevents private individuals or corporations from abusing power either, and there's a very long history of them having done just that.
Government, for better or worse, is a vehicle for channeling and aggregating power, in a way that at best benefits society as a whole, expresses the preferences of the majority, and respects the rights of the minority. It's far from perfect. But it's better than most alternatives.
Facebook is granted rights by governments, ultimately also to benefit the public at large. Facebook is not itself a government, though it transacts the online communications of a population larger than all but the very largest countries.
Along these lines, the argument says "Facebook is large, and should therefore be subjected to regulation".
Government is far larger than Facebook, and far more insulated from the people than is Facebook. I can't imagine why we extend trust to the government to fix a situation where we don't trust facebook.
It's all run by people, with their own competing goals.
And let's not forget that facebook could collapse overnight if enough people decided to stop using it. I don't think a population could exert that level of influence over a government.
>Government is... far more insulated from the people than is Facebook
Is it? Facebook doesn't hold elections. It seems entirely unaccountable to me, with no system of checks and balances to rein it in.
>facebook could collapse overnight if enough people decided to stop using it. I don't think a population could exert that level of influence over a government
They could and do, all the time. See: any successful rebellion ever. See also: the collapse of the Soviet Union.
Any given API change, content promotion or censorship, or freebooting support (or restriction) represents a taking of usufruct, or an allocation of it amongst various parties, with little or no recourse for them.
Does the victim of pollution, or of a highway rerouting, or a relocation of a major traffic draw for brick-and-mortar foot traffic, have any contractual claim to their previously-enjoyed benefit?
Power and rights are far more about the ability to effectively press a claim than specifically detailed binding pacts.
The matter was whether or not Facebook could effect a taking, not the rights of recourse of others. You're moving the goalposts.
I disagree that anything on Facebook is "public" in the sense of "public ownership". Facebook is a private platform. Just because things are (maybe) publicly visible doesn't mean they're public property. Should the government be able to force people to put up or take down posters in public-facing windows?
If I buy or rent space for a billboard I am not allowed to put anything I wish on it. That is no different than putting a poster in a public-facing window. It gets more interesting when a website in one country is offensive to another country.
I'm pretty sure that if you put a two-story-tall porn image on your property it's not going to last long. I'm pretty sure that if you post a hugely racist message that big, or some other things, no matter how public that space is, you'll run into some problem with the law, whether at the local or federal level.
Whether it's a private or public platform here is not as relevant as the fact that it is a public forum, it is practicing certain forms of censorship, and it is a medium of information. Other platforms get regulations applied to them in many respects too.
Governments do in fact regulate posting of notices, including both commercial and political speech, routinely. This includes both forbidding specific types of notices in some instances, and compelling them in others.
> once your platform gets popular enough, the government can tell you what to do with it
I'd say there is considerable public interest, yes. Facebook without the users is nothing, but the users without Facebook are the same society, just using another software.
I don't think Facebook has changed society in a way that isn't repeatable. It's a low risk to the system when google+, or any other competitor, could meaningfully take its place given enough pressure.
Huh? What are you talking about? I'm defending the right of everyone to be free of excessive government interference, whether it's an individual or a popular company.
They can tell you what to do with it and not use it too. This is the way it should be. Think of the harm that can be done in all sorts of abusive ways if there was no regulation of sites and services.
Facebook arguably has a monopoly position in the way that your local mall, movie theatre or newspaper does not.
If I don't like the Times, I can read the Guardian instead. If I don't like what's on at one cinema, I can go to another (granted this may be harder in a small town with only one).
Of course I can close my facebook account and move to diaspora, but then I'd lose everything from keeping in touch with my friends' latest accomplishments to the schedule of my local hiking club.
I already find that if I'm away hiking for a weekend without internet access, and couldn't be bothered to catch up on facebook on Monday morning because there's already a pile of e-mails to deal with, then at coffee break it's like I've missed some important social information that everyone else knows: team member A has done something and posted about it on facebook, everyone else has been liking and commenting on it, and everyone expects you to be up to date with this. Without a facebook account you may as well wear a badge that says "antisocial".
I personally think governments should regulate media outlets in proportion to their influence on people's lives. FB is the new IE.
It's perfectly possible to be happy and healthy without a Facebook account. You might lose out on one particular kind of "social" connection, but that doesn't mean you lose actual social connections. You just catch up with people in person, via telephone, via email, via chat, or a million other ways.
Personally, I don't really find the "social" information Facebook has to offer all that valuable. It's just a stream of trivia from extended not-really-friends.
The parent poster just said that he felt like he missed out on some important communication that he was expected to have been aware of. It's not possible to be happy without FB, but it's not possible to be happy with FB either.
It is so possible to be happy without FB. Just decide your life is fine, despite the occasional time lags in you discovering certain information.
FWIW, I deleted my FB account months ago, and my quality of life has gone up. If someone expects me to be available via FB, I give them my email address.
>I personally think governments should regulate media outlets in proportion to their influence on people's lives. FB is the new IE.
IE went into a decades-long decline, because superior products came to market. No regulation required.
Who regulates the governments in proportion to their influence on people's lives?
I wish my government didn't bomb other countries. Can I stop paying taxes, so I can stop supporting the murder of people around the world?
You're asking an organization that backs its wishes with the threat of violence and jail to regulate an organization that lets you keep up with your friends hiking accomplishments.
Facebook could be irrelevant in five years. The laws you're asking to be implemented would be in place for another 100. Are you sure this is a good idea?
> IE went into a decades-long decline, because superior products came to market. No regulation required.
The decline followed behavioral changes in response to regulatory action and ongoing litigation (even if it began before the resolution of the litigation), so I'm not sure you can conclude from it that regulation was unnecessary to the outcome.
Because it is becoming the predominant way people consume media. If it were only used at 10% of its current level, it shouldn't be regulated, but at its current level it should.
As Stalin said "quantity has a quality all its own".
.. both of which are subject to different kinds of government regulation. But Facebook is definitely a medium. In some ways like a telecom or transit operator.
Start criminally charging Facebook for every criminal act they fail to remove. If they're playing the censor, then they need to be punished when they fail in that role.
The other choice is to not censor anything, other than as required by law. Like the telephone system does.
It may be "more effective" to implore the government to crack down on media policies you don't like, but it's also as much a threat to free expression as Facebook's ridiculous and overt censorship of this image.
Free expression already has limits (No "Fire!" in a crowded theater when no danger exists), I'm absolutely fine with government saying what Facebook isn't allowed to do (remove journalistic posts without due process).
Yes free expression obviously does have limits; It probably doesn't have that silly "Fire in a theater" restriction.
Holmes (the supreme court justice you are quoting) said this with respect to this case: https://en.wikipedia.org/wiki/Schenck_v._United_States where they prosecuted Schenck for anti-war pamphlets. This is something that would be protected by the constitution today. This quote is a pretty lousy excuse for suppressing free expression considering Holmes's track record of dismissing free speech.
With regards to Facebook removing posts, why should there be a "due process" for a (specific) company to remove posts. What sense does this make?
FB isn't just a specific company. FB has grown to such a size that it has become a large part of the media/press. So it has to be regulated like press.
The first amendment says: Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press...
Where do you get that we should regulate the press?
Obviously you could argue that the constitution is wrong, and that's fine with me. It's just a document, but when you say "it has to be regulated..." what is the mechanism you're referring to?
For example, there are laws that restrict defamation, incitement to violence and guarantee the right of reply. They can't put false information out, like "The president of X has said Y", or use a cooked up image as real. They have to keep secret the names of minors when they are involved in police-related stories, they are required not to divulge privileged (state secrets) or private information, such as was the case with Gawker. All of these have their exceptions, but in general press is not quite free to do as they like, and it's better that way.
Asking the government to protect freedom of speech is the exact opposite of a threat to free expression. It's analogous to congress passing a law saying that newspapers cannot edit letters to the editor. Some newspapers might complain about having to publish things they find disagreeable, but nobody is being censored anymore.
Imagine if a paper newspaper were told that they had to publish every letter sent to them or no letters at all. They would stop publishing letters entirely. There are a lot of ways to construct laws such that they have the opposite of their nominal effect. It's important to note that the US First Amendment bars the government from passing laws respecting both freedom of speech and freedom of the press (among other related rights).
Wrong analogy. A newspaper does not offer a service of publishing customers' letters. It offers a service of receiving and reading letters, on the premise that it may choose which letters to publish. It's a different story.
If a newspaper published all letters, then the government could act on behalf of citizens to protect freedom of speech, by making sure there's no arbitrary censorship of publications.
The relevant detail is that you can indeed censor a publication by adding a rule that says "you can't censor your publication in the following ways". I don't personally use Facebook at least in part because I don't want to use a centralized, censored system, but the fact of the matter is that it's a bad idea (not to mention likely an unconstitutional one) to tell Facebook what they cannot publish and indeed what they must publish.
Although it's irrelevant to the point, I also don't totally understand the nature of your objection. If you are saying that any publication gets to define what it does and does not publish (i.e. Facebook says "I am a company that publishes non-nude photos" or "I am a company that publishes censored material"), then they merely have to re-define their service in order to censor people while complying with whatever scheme you have in mind.
Facebook is not a newspaper, because it does not produce the content that it makes available; freedom of speech cannot be applied to Facebook itself. FoS applies, however, to the content producers, and limits the terms of contract that Fb may offer to them, by banning arbitrary censorship (the distinction between artwork and offensive content is arbitrary and decided by Fb moderators). Also, when you say "unconstitutional," which constitution do you apply? Fb is multinational and does business in many countries; each one can regulate Fb within its sovereignty.
If you measure everything purely by effect, there are more efficient ways of restricting media organizations from doing certain things. E.g., many news organizations avoided republishing the famous Muhammad cartoons, not because their parliaments prohibited it, but because they were afraid of violent retaliation. If you are OK with abandoning morals, you can be very efficient in preventing people from doing things. The mafia can be very efficient in preventing witnesses from talking, for example.
Now, you could say that employing the threat of government-mediated violence through congressional law is not the same as employing the threat of direct violence. But the only real difference I can see is that you can claim "it's all for the common good". Oh, wait, they claim that too...
That's a very dangerous path on which to travel. China passes strong laws about what media organizations can and can't do. Free speech isn't a right given by governments, but one cultivated and protected by the people.
The marketplace solves these problems. Nobody is forced to use Facebook. If so many people care, perhaps they ought to vote with their dollar and seek alternatives. If they are unwilling to do that, it's on them and not the company that would support censorship.
Don't like tacos? Don't buy tacos. The government ought not tell taco vendors they can't sell tacos, but instead reduce regulation to make it easier for a pizza guy to start a business. Less regulation means, by definition, increased freedom.
How does the market solve the problem that it is morally wrong (or at least dubious) to delete an iconic picture that exemplifies the cruelties of the Vietnam War?
Do you believe that billions of Facebook users will switch to another social network because they are some sort of moral saints who recognize Facebook's mistake and act accordingly? Or do you instead believe that the majority of Facebook users are that much of a moral authority that we should readily concede that Facebook is right, because almost none of their users complain?
"the market solves this" is a funny meme, but it's often plain wrong. Markets can solve certain optimization problems and find price equilibria, but they cannot somehow magically solve the vexing moral question of how much (if at all) a social media company should censor their users' content and who should control this in which way.
There are a few problems with your examples, I think.
There is a difference between bad regulation and good, and some regulation is necessary in many circumstances. China's strong laws over media organisations are an example of over-reach; a rule against calling emergency crews to your house just to complain about how you think they are misusing tax dollars is not.
The marketplace solves some problems, but definitely not all of them. Discrimination is a lot harder to solve just relying on the market, for example. Regulation comes in to help.
> Perhaps they ought to vote with their dollar and seek alternatives

The problem with alternatives is the same problem folks have with seeking alternative utilities. The alternatives are few and far between and offer an inferior service comparatively. In fact, Facebook is more like a telecom that refuses to connect to other telecoms at this point. Your basic options are to participate... or not.
It isn't like buying tacos at all - tacos have a market that actually competes, and social networking just isn't such a thing. And to get to the point where it is easier for other folks to start such a business, you are knee-deep in regulation, including standards so the different services can communicate with each other easily and other monopoly-busting sorts of regulation, which will probably last for many years.
It's not dangerous, it's the only path that society can choose. Facebook is dominant in the social networking market and is subject to anti-trust regulations, which must be adapted to the nature of the service. The same can be said about Google in some countries.
It's almost impossible today for a person to switch social networks, because for this to happen it has to be a group decision - most of their contacts must make the same switch. I have accounts on Path, VK and a few more "yet another" social networks, but they are useless without my social graph, and it's beyond my control to move this graph there. It has to be regulated, because having freedom of speech in the hands of a single corporation is wrong.
For sure. A few dozen bureaucrats should be dictating what is and what isn't acceptable content and the punishment for not adhering to their sense of appropriateness.
That's not really true. I don't use Facebook (I use WeChat, never had anything censored, and the privacy controls are awesome), but I don't think Facebook wakes up in the morning and thinks "why doesn't Hugh use Facebook? We should make it more like WeChat". If you want change, the most effective way to get it is to ask for it.
> is to close your account, block social media bugs and encourage your friends and family to do the same.
Much easier said than done.
First, there is no way to 'close' your account - you can only 'deactivate' it.
I did it in July, but then I noticed that a lot of social activities - like concerts, festivals, parties, etc. - are organised using Facebook, so if you're not on it, you can't participate.
This is very frustrating and wrong, but that's what the world does.
I personally think that Facebook is breaking the Internet and we're just starting to see the first signs of the bad things to come out of it.
Back to account 'deactivation' - if you log in to your deactivated account, Facebook conveniently 'reactivates' your account automatically, so you can't just log in to look at your data, which of course is not yours and is the currency which Facebook exchanges for real cash.
Telling friends and family to do the same is useless - most don't care even for 1 second about 'privacy' or things like that.
So 'closing' your account is more than just ceasing to use a web application. It's a lifestyle choice - do you want to stay secluded, excluded from a lot of social activities and considered an 'introverted loner', or do you go with the flow and get trapped deeper and deeper in this social experiment ...
> I did it in July, but then I noticed that a lot of social activities - like concerts, festivals, parties, etc. - are organised using Facebook, so if you're not on it, you can't participate.
They are organised this way because it is an effective way to organise them; so the only sure way to stop them being organised this way is to make it ineffective (by not participating, by urging others not to do so, and, crucially, by making your reasons known to the organisers). It is true that not using Facebook is an inconvenience, but there is no guarantee of a right to protest without inconvenience!
You can delete it; there are, what, only 5 datacenters? Finding your data and cleansing it should not be too hard, just a little repetitive ;) But the more effective way to make Facebook listen is to hit them in the pocketbook. Keep your account and use an ad-blocker; if enough people did that for long enough...
They'll still profit from additional ad revenue when other people, like your friends and family, view your posts and use Facebook longer than they would have because they are interested in you and your life.
> You can delete it; there are, what, only 5 datacenters? Finding your data and cleansing it should not be too hard, just a little repetitive ;)
I don't know if your comment was tongue-in-cheek, so here goes. Actually, it's not about the number of data centers Facebook has, but the number of CDNs and edge caches around the world and how FB manages those, including third party companies (like Akamai) that provide this service for Facebook. Plus, Facebook has had a lot of trouble, in a very shameful and absolutely incompetent kind of way, in removing the visibility of photos that were "deleted" by users. See this saga spanning from 2009 through 2012 as reported by Ars. [1] [2] [3] [4] [5]
> But the more effective way to make Facebook listen is to hit them in the pocketbook. Keep your account and use an ad-blocker; if enough people did that for long enough…
Facebook is already trying to push more users to use its mobile and desktop apps so it can have more control over the content (read as "ads") shown and collect more information that's not easily wipeable by end users (like cookies, cache, etc.). We will see a time in the coming years when there won't be a browser interface for the platform, and the cat and mouse game between ads that look like content and ad blockers (to block FB ads that look like content) will continue on. Depending on the platform, people may start needing content blockers on their routers (or an internal proxy server) to deal with this.
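The router-level approach gestured at above can be sketched with a DNS-based blocker. A minimal, illustrative dnsmasq fragment follows, assuming dnsmasq is the network's resolver; the domain list here is hypothetical and far from complete:

```
# /etc/dnsmasq.d/block-facebook.conf  (illustrative sketch)
# address=/domain/IP null-routes the domain and all its subdomains
# for every device that uses this resolver.
address=/facebook.com/0.0.0.0
address=/facebook.net/0.0.0.0
address=/fbcdn.net/0.0.0.0
```

Note the limitation this exposes: DNS blocking is all-or-nothing per domain, so once ads are served from the same domains (and in the same format) as content, router-level filtering can only block the whole service - which is exactly the direction the comment predicts the cat and mouse game will take.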
For many users, abandoning Facebook is like abandoning the web. So no matter what shit Facebook will pull they're not gonna leave, unless something better appears.
On the other hand, this whole situation is a bit ludicrous. The journalist doesn't seem to understand how Facebook works and is directly attacking Zuckerberg as if he explicitly asked to remove the picture. Come on guys, it's just an algorithm that detected nudity and decided to censor the pic. You can't write a rant like that without understanding how the damn thing works in the first place. It's poor journalism. And then you make the assumption that just because Mark has power, he's using that power to manipulate the world. How the fuck is that objective reporting?
Or the journalist understands that it is automatic, but does not want a society where algorithms blindly determine what is allowed or not. After all, "it's the algorithms" is a very convenient excuse for Facebook that allows them to be a faceless entity beyond criticism. It is Facebook that decides which algorithms they use to censor content, and they have chosen a rather prudish set of policies they think will avoid offending anyone in any culture, and are very quick to remove controversial content. This is something Facebook has chosen to do, and to little degree something that is enforced on them.
The picture was removed by moderators, not by an algorithm. So yes, even though Zuckerberg did not delete the picture himself, he is responsible for the acts and policies of the company.
Technically it's possible to have an index of all images published in some registered media and assume they are already vetted by the editors of that media. Then you compare the uploaded picture against the index and voila - you do not have this problem at all. Of course, it adds some costs, but it's exactly what M. Zuckerberg can decide to spend money on (before delivering this service to people in Africa and other places where it's easy to abuse Facebook's censorship mechanisms - ask the Russian opposition how Putin's media warriors take down their posts).
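A minimal sketch of that "already vetted" index, under stated assumptions: images published by registered media are fingerprinted once, and an upload is checked against the index before the nudity filter applies. The names (`VettedIndex`, `is_vetted`) are mine, and the 8x8 average-hash is purely illustrative - a production system would use a robust perceptual hash over real decoded images.

```python
def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grayscale grid (0-255 ints)."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

class VettedIndex:
    """Index of fingerprints of images already published by registered media."""
    def __init__(self):
        self.hashes = []

    def add(self, pixels):
        self.hashes.append(average_hash(pixels))

    def is_vetted(self, pixels, max_distance=5):
        # Near-duplicates (recompressed, lightly edited) still match.
        h = average_hash(pixels)
        return any(hamming(h, known) <= max_distance for known in self.hashes)

# Usage: a downscaled press photo is indexed; a near-identical re-upload matches.
press_photo = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
reupload = [row[:] for row in press_photo]
reupload[0][0] ^= 3  # slight compression artifact

index = VettedIndex()
index.add(press_photo)
print(index.is_vetted(reupload))                   # True: skip the filter
print(index.is_vetted([[0] * 8 for _ in range(8)]))  # False: unrelated image
```

The cost the comment mentions is the interesting part: the lookup itself is cheap, but curating which media count as "registered" and keeping the index abuse-resistant is where the money would go.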
I think that it’s much safer to prohibit all nudity than to nitpick what to allow and what not. There are TBs of pictures uploaded to Facebook daily; it’s not a trivial task.
What's wrong with this image? Of course it should be allowed. I see no difference with David by Michelangelo, for example. Banning all nudity is just stupid censorship that has no moral ground.
I believe that a lot of conservative people would find a picture like that offensive. We have to keep in mind that Facebook's audience is in the billions.
That's not a reason for censorship. Facebook could offer parental controls and display explicit-content warnings to serve this audience. After all, people who find nudity offensive can limit their subscriptions to accounts which do not post it - it's not as if they are forced to look at it.
There's no need to jump straight to closing your account. There's a wide range of escalating actions you can take when companies dissatisfy you. Going straight to firing a company every time there's a problem doesn't really help.
Facebook absolutely cares how you feel when you use their service, because certain feelings are associated with less engagement (and thus less revenue) or people quitting the service. Expressing your dissatisfaction to them is a way to let them know that you're moving in that direction.
Do you really think that anyone in a position of power in Facebook is personally opposed to this photo, at least to the extent of feeling moved to change policy if it is shared a lot? (Also remember that, if they do feel so strongly about it, then Facebook has the power to remove it at will.)
On the other hand, don't you think that everyone in a position of power at Facebook cares if they lose users, especially en masse?
Don't make a dent in Facebook. Make a change in your own life, and maybe lives of your friends.
There are a number of large things that you might not like. The smallest effective thing you can do is to not be a part of them, to prevent them from consuming your mental resources and from defining any bits of your agenda.
But I'll need something else to replace it with. Even if we ignore the network effect, it will be owned by some other corporation whose policies I won't agree with. A government-owned one would be worse, for simple reasons. Do we have a truly free system that is open, easy to use, and still safe enough to let a 13-year-old use it?
>Do we have a truly free system that is open, easy to use, and still safe enough to let a 13-year-old use it?
Of course not. "safe enough to let a 13 year old to use it" can only be accomplished through censorship. You can't have a truly free system that also respects cultural norms or legal authority - any system that does so isn't free.
It's too useful to be ignored. Soon, interaction with businesses will be done through their platform. It's easier to set up a FB page with services + marketing included than to create a website.
I closed my account back in 2010-2011, and I haven't missed it in the least. The only problem is that the board game group at work organizes through a Facebook group, but we overcome that by talking to each other.
If a company can only communicate with me through Facebook then it's probably not a company I want to communicate with in the first place.