A fundamental problem with CCPA (and GDPR and other similar state legislation) is the definition of personal information.
For example, IP addresses are not really personal, but treating them as such creates layers of ambiguity that undermine other positive aspects of the law. It's the typical outcome of politicians not really knowing the domain they're regulating.
Also, what's especially interesting is that CCPA was effectively bankrolled by a single person, which should raise some alarms about how much political power a single wealthy individual can wield.
I never understood the cookie-popup mess. Back in the Internet Explorer days, the browser itself showed a popup when a site tried to set cookies. Why do sites need to implement cookie popups rather than letting users configure this in their browsers?
It's a classic example of a legislative loophole. What the cookie law was actually trying to do was provide a way for users to opt out of cookie-based tracking. But someone figured out that if you just ask for permission to use cookies (for any reason) and refuse service if they opt out, you'd still be following the letter of the law (but not its intent).
Arguably, this is one of the reasons why the GDPR was necessary.
It (in theory) should be, but most often if you click "opt-out" they kick you off the site -- hence "refuse service". With GDPR (loosely) that is no longer allowed when it comes to the opt-in nature of data processing disclosures (if you opt-out, they can't refuse you service for not opting-in -- with certain limitations).
Most often there is no opt-out button. Just a big banner that you have to ignore, click accept, or block using ublock origin. Pretty sure that they use cookies regardless of your choice.
Interestingly, a strict reading of GDPR suggests that "consent gating" should not be permitted, but admittedly the wording is quite weak, and it isn't clear cut.
> When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.
This would suggest consent may not be freely given if it was obtained by conditionally providing a service based on consent being obtained for processing of extraneous data.
Recital 42 adds:
> Consent should not be regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment.
I don't think many users have a genuine free choice on many websites, although admittedly it's now mostly the worst offenders who are to blame here - the average site probably does have an opt-out now that actually works (!)
Recital 32 also appears to deal with the annoying, interrupting, semi-modal nature of the prompts we see on ad-laden sites:
> If the data subject’s consent is to be given following a request by electronic means, the request must be clear, concise and not unnecessarily disruptive to the use of the service for which it is provided.
Because browsers make it an all or nothing setting (all, 1st party, none) and there are many more combinations. On top of that, the laws are there to prevent companies from tracking people at will, and the DNT header that was supposed to make that an easy browser setting completely failed.
> Because browsers make it an all or nothing setting (all, 1st party, none) and there are many more combinations.
Browsers can be changed, but instead of picking a reasonable, well-thought-through solution, governments crammed through a rather horrible one. I honestly don't see how it protects the people who are most vulnerable, or anyone else, really.
People don't understand or care about cookies maintaining state in browsers. It was complete folly for politicians to focus on the mechanism instead of the actual personal data being used.
GDPR rectified some of it but added much more granularity which is why cookie popups have now turned into giant selection windows.
This is indeed not a good thing, because it's terribly naive: passing a law that more or less forbids huge companies the very activity that earns their bread and butter. What do they expect to happen? That the companies will surrender their business, just like that?
Fundamentally, the solution to cookie warning spam is simple: Stop letting these companies disclaim their way out of unethical business practices. Start making those business practices illegal and shut down companies built around them.
Companies built on surveillance capitalism should be shut down. Full stop.
What business practices? Cookies aren't just used for ads. You could get rid of adtech and not have any change in cookie notices because of how the laws are written.
Not all cookies are functional or ad related. Also the EU cookie directive requires you to inform users that cookies are being used regardless of the reason why.
I'm pretty sure most sites that asked me to allow cookies in the past week were not "companies built on surveillance capitalism". YouTube rarely asks me about this, and to be quite frank I don't mind ads on YouTube, nor that they are targeted ads - even if they are rather poorly targeted - and I'm not sure why this should be illegal.
I think whatever problems the cookie laws that got us all the popups are trying to solve would be better solved in conjunction with some technical changes. For example, I could declare in my browser what sorts of cookies I allow, or set up some rules. Sure, sites can just break the law and disregard this - but they can do that now anyway, by saving cookies even when I click "don't allow" or "decline" on the dumb popups.
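To make the idea concrete, here is a minimal sketch of what browser-side cookie rules could look like: the user configures per-site and per-category policies once, and the browser enforces them instead of every site showing its own popup. All rule names and domains here are invented for illustration.

```python
# Hypothetical browser-side cookie rules: first matching pattern wins.
from fnmatch import fnmatch

# (site pattern, allowed cookie categories) - made-up configuration.
USER_RULES = [
    ("*.example-shop.com", {"essential", "preferences"}),  # a site the user trusts
    ("*", {"essential"}),                                  # default: essential only
]

def allowed(site: str, category: str) -> bool:
    """Return True if the user's rules allow this cookie category on this site."""
    for pattern, categories in USER_RULES:
        if fnmatch(site, pattern):
            return category in categories
    return False
```

The point of the sketch is that this policy lives with the user, not with each site, which is exactly what the per-site popups fail to provide.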
Yeah, that's basically how it should happen. In California, there's ample precedent for taking products off of the market when they're shown to harm consumers, and if that happens to kill off manufacturers who aren't diversified, then so be it. Are you really going to be sad if Facebook can't survive CCPA?
On the other hand, we also have things like Prop 65 warnings that have become so commonplace that they don't really impart useful information. Any useful signal is totally overwhelmed by noise. Putting the same warning on a restaurant that serves french fries and a pack of cigarettes diminishes the usefulness of the warning (unless parking garages, coffee, and fried food are actually significant carcinogens?)
CCPA isn't just about warnings, though. CCPA also affects the data-harvesting abilities of businesses, and requires that a business be prepared to explain which personal information is stored and for what purpose.
I suppose that, to draw a better analogy with Prop 65, its requirements reportedly did cause some manufacturing materials (certain dyes, rubbers, foams, and plastics) to be removed from the marketplace. The story of lead alone is worth considering; as usual, lead was in the pipes. [0] We are not expecting a wave of cookie warnings, and indeed CCPA's language doesn't allow for it. Some businesses will have to alter their practices; some products may have to be withdrawn from the market entirely. The worry that children might get used to clicking through EULAs and giving away their data has already been borne out by the previous generation of Internet users; at this point, we are merely trying to curb the ongoing damage.
And remember: For every ingredient that needs a warning label, that ingredient also can't be dumped into streams or rivers. It's not just about a prettier warning label on the product, but about real improvements to the manufacturing process.
Agreed — CCPA makes the same primary mistake as GDPR, which is that while the spirit of the law (protecting users’ privacy) is reasonable, the ultimate consequence just ends up being bombarding users with popups, a worse UX for basically everyone.
To me, this indicates that the drafters of the law probably didn’t really think it through.
Which will win the crown for destroying more Californian livelihoods & wasting more citizen time in 2020, CCPA or AB5?
(AB5 is the anti-freelancing bill whose intended targets, Uber & Lyft, may escape its application, while many other California freelancers have their traditional contract work patterns made illegal.)
Some people understand it. We have dug in deep because companies are turning to us to interpret it. Here's our resource for learning more, in case anyone is interested.
Note since these were originally published, the big thing that changed is that employees and employees of clients and vendors are now generally exempt from the CCPA. There is still a general notice obligation (think short privacy policy) for your employees, but that is about it. They also tightened up the FCRA and GLB exceptions, but those are not generally relied on by the majority of businesses out there.
Under CCPA, businesses do not "demand" data because there is no restriction on its collection. Businesses simply need to provide notice of what they collect.
Consumers cannot opt out of this collection.
Consumers do have the option to opt out of having their personal information resold to third parties. The CCPA then specifically restricts businesses from withholding services or providing you with reduced services as penalty for this opt-out.
CCPA may not be perfect or even well-explained, but it's a first step in a positive direction within the United States. I think it's unfair to call it "useless".
> 95% of users choose to be tracked in exchange for access to websites and services
The GDPR explicitly disallows the practice of conditioning access to a site or service on acceptance. Without that it would be rather useless. Once that part is also enforced (current fines sadly seem focused on poor data-safety measures and the like), I think the online ad landscape may actually start to change.
So if I run an online shop - and someone does not accept cookies - I still have to make it work for them somehow without cookies (or any equivalent technology)? Or am I misreading this? And if I'm not misreading this - how would you approach it?
No. Things you need to do to make the site/service function at all such as your login cookies or shopping carts don’t count. Non essential cookies is what it’s about.
Sadly, some sites seem to interpret that (incorrectly) as "well, our business is to show news paid for by ads, so the ad-network cookies are essential". These are the players I wish would be fined out of business.
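The essential/non-essential distinction described above can be sketched as a tiny server-side filter: cookies the service genuinely needs are set regardless, and everything else waits for an explicit opt-in. This is not a real framework API; the cookie names are assumptions for illustration.

```python
# Cookies needed for the service to function at all (logins, carts, CSRF).
# These names are invented examples, not a legal definition of "essential".
ESSENTIAL = {"session_id", "cart", "csrf_token"}

def cookies_to_set(requested: dict, has_consent: bool) -> dict:
    """Filter the cookies a page wants to set down to what may be set.

    Without consent, only the essential subset survives; ad-network
    cookies never qualify as essential just because ads fund the site.
    """
    if has_consent:
        return dict(requested)
    return {name: value for name, value in requested.items() if name in ESSENTIAL}
```

A site taking the interpretation the comment criticizes would simply move its ad-network cookies into `ESSENTIAL`, which is exactly the move regulators consider incorrect.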
Just for interest's sake, do you ever read news on the internet, and if so, how do you pay for it?
I personally read a lot of news online, most of it is such utter trash that I would not want to pay for it. I have paid for some specific sites intermittently - currently I pay about $20 USD a month to one specific content creator that produces news content - but that is mostly because it is rather niche news that nobody else (that I know of) is reporting on. For the rest of the news I consume I feel that it is of such a low quality that the ad revenue is all they deserve.
I read dozens of sites, but I’m not going to allow being tracked if I can help it. I also don’t much care whether these sites survive or not, and I absolutely wouldn’t care if they disappeared because people behaved like me (answering no to tracking and/or using ad blockers).
My thinking is that if everyone blocked tracking ads, then money would return to dumb ads (that now aren’t worth anything because of tracking/targeted ads). So I hope that’s the future. If it isn’t then I guess the less optimistic future is that half of all “free” content online disappears while the rest is concentrated in silos like Facebook and YouTube that can ensure eyes on ads. I think both futures are better than the status quo.
That's an odd thing to say. Websites don't need to allow free access either. I can't comment on any particular revenue model, but I imagine many websites chose personalized ads because they provide better revenue than non-personalized ads.
Would switching to non-personalized advertisements, without taking additional steps, support the website enough? Maybe, but it's hard to say one way or the other without looking at the data.
Correct, they don't have to offer free access. What they do need to do is pick a business model that is legal in countries they wish to operate in. Things like GDPR doesn't prevent ads. Maybe they make less money off non-personalized ads, but companies can't do whatever they want just because they make more money that way. I don't find it odd that laws can hamper the revenue of certain business models.
> That's an odd thing to say. Websites don't need to allow free access either. I can't comment on any particular revenue model, but I imagine many websites chose personalized ads because they provide better revenue than non-personalized ads.
Another way to get revenue, which doesn't itself transgress against these privacy-focused laws, is to charge directly for providing your service. That's totally legal! Well, but maybe some companies would find that they don't get enough subscribers to fund their business—then the solution, in a privacy-focused environment, is that those businesses don't exist, rather than that they get a shadow source of funding by accepting bribes for participating in scummy privacy violations. This would be a very different environment from the one in which we live—clearly better in some ways and clearly worse in others—but it's far from impossible.
> Another way to get revenue, which doesn't itself transgress against these privacy-focused laws, is to charge directly for providing your service. That's totally legal!
Is that what the consumer wants? And is depriving people who can't afford the fee of a service the right thing to do, when they don't mind something like privacy-focused advertisements?
Judging by all the people unwilling to purchase Youtube Premium I would say no.
Why are people not allowed to choose for themselves? Should we start banning what people can share on social media too, for their own protection? And not everyone can pay for content, so by making direct payment the only way forward you're punishing the people who can least afford it.
Privacy laws that remove freedom and opportunity aren't very good laws.
I absolutely think that site owners and visitors should not be allowed to enter an agreement where content is provided based on selling PII of the visitor. The reason is simple: the visitor can’t be properly made aware of what they are actually paying. So it should simply be banned. Yes, at the expense of maybe a majority of content online disappearing. And yes at the expense of people who can’t afford to pay for content in cash being denied it entirely.
You are still allowed to consent to tracking cookies under the GDPR, so nothing has to change for you if you don't want it to. The difference is that you now have a choice, companies are no longer allowed to make that choice for you. What is so terrible about that?
> Privacy laws that remove freedom and opportunity aren't very good laws.
All laws remove somebody's freedom and somebody's opportunity, so either this argument is flawed or it indicates that we shouldn't have any laws.
I'm not a maximalist in allowing people to choose everything—the very notion of inalienable rights indicates something that a person cannot give up, not even by choice—but, even if I were, the problem with the current model isn't that I don't like the particular trade-offs people are making (though I don't), but that people aren't aware of those trade-offs. That, and the unfortunate confluence of companies' lack of desire to educate customers and customers' lack of desire to be educated, means that we're not really in a situation of informed choice.
Privacy isn't a singular action and should be a choice, which is a fundamental part of this legislation.
If you're talking about criminal laws then those are designed around harm and the greater good. There's no harm here because it's up to the individual, allows them to gain value from content, and their decision doesn't affect anyone else.
Informed choice is something else entirely, but people go throughout the day making choices out of complete ignorance and that alone isn't a valid reason for preventing their freedom. Considering the relatively trivial risk, this falls well under personal responsibility. I encourage more education around privacy but am absolutely against making the choice for them, because that's how we got into this mess in the first place.
Cookies and data wrongly treated as PII, like IP addresses, are not only used for personalized ads. Contextual ads would also require them to function properly, which creates an immediate "legitimate use" workaround.
I dunno, man. As you mentioned, all of the people who implement or enforce GDPR compliance seem to think that "our business model relies on ads" is a sufficient reason to require tracking. Maybe the regulators are just biding their time before pouncing, but I'm not sure why they'd want to do that or what they're waiting for after a year and a half. It seems more likely that GDPR as an ad industry killer was just a piece of HN lore that didn't end up panning out.
The wording was very clearly made to avoid any doubt about “we need to track people to survive on ads”. As I said I think it’s sad that so far the fines have been for data security and not yet for tracking-ads-without-opt-in.
I really do hope regulators will take a few high profile sites and make an example with a massive fine for blatant violations.
The rule is: if I visit a site, then tracking is OFF until I switch it on. Seeing the content can't be conditioned on accepting, and the default "ok, close the popup and show me the article" should always result in the minimum cookies allowed - that is, typically no ad networks at all.
The basis in GDPR I've seen people use for your position is that consent must be "freely given", and consent isn't freely given if my other option is not using the site. The line of argument seems pretty sketchy on its face; can it really be true that my consent to an employment contract isn't "freely given" because my other option is not taking the job? It's certainly not clearly made to avoid doubt.
There are plenty of things that you cannot consent to under various circumstances. As an extreme example, you cannot freely give consent to become a slave no matter how much you would like to (except by going to prison in a certain well known country north of Mexico, I suppose). In many cases you cannot consent to intimate acts with authority figures (e.g. boss, teacher, prison guards). I think in the case of the GDPR the desire was for consumers to be able to make a choice about their data, but such a choice would be meaningless if websites were allowed to make a "click here to consent, or here to close this browser tab" popup, and the "cookie law" shows that this is likely the approach 95%+ of websites would have taken. The cookie popups have shown quite clearly the balance of power between consumers and websites: website owners know consumers will get the exact same deal (accept or fuck off) everywhere else, so the consumers have no real power in this "negotiation". The definition of "freely given" consent in the context of the GDPR was probably written the way it is to correct for this power imbalance.
It's more complicated than that for ads but GDPR is incredibly vague and contradictory to the point that it has major issues in effectiveness.
It can be costly to be in perfect compliance, but nobody really is afraid of GDPR risk anymore, especially since most internet companies are not in the EU anyway and are completely unaffected by regional legislation.
No it doesn't. The "freely-given consent" clause is incredibly vague and cannot force a business to provide a value to consumers against their choice and/or for free.
There are also dozens of workarounds from legitimate use of data to contract-in-effect (like email newsletters). This is an example of the poor legislation aspects of GDPR and other privacy laws that are not based in technical reality.
This article makes it sound like this entire industry wasn’t deliberately set up and developed by corporations for years. They know exactly what they’re doing and they know exactly how to turn the whole thing off.
> they know exactly how to turn the whole thing off.
The problem is that no, they don't. Companies built their infrastructure on the premise that users would never ask for, receive, or delete the information stored about them, so it's now genuinely difficult for companies to gather or delete the data tied to a specific user in a safe manner.
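One common mitigation for this retrofitting problem is to maintain an explicit index of every place a user's data gets written, so a deletion request can be fanned out reliably instead of requiring archaeology across systems. This is a sketch under that assumption; the store and key names are invented.

```python
# Sketch: an index mapping each user to every (store, key) holding their data.
from collections import defaultdict

class UserDataIndex:
    def __init__(self):
        # user_id -> {(store_name, record_key), ...}
        self._locations = defaultdict(set)

    def record_write(self, user_id, store, key):
        """Call this alongside every write of user data, old systems included."""
        self._locations[user_id].add((store, key))

    def erasure_targets(self, user_id):
        """Everything that must be deleted to honour a deletion request.

        Pops the entry so a second request sees nothing left to delete.
        """
        return sorted(self._locations.pop(user_id, set()))
```

The hard part the comment points at is precisely that this index doesn't exist for data written before the law: someone has to backfill it, or scan every store for each request.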
Policy trade-off. Small businesses can only do limited damage from a consumer privacy perspective. Navigating policy for small businesses can be toxic as they don't have teams of high powered lawyers, and considering California's startup industry the last thing policymakers want is to hurt small businesses. This is a compromise.
> Surely if X is bad it should be bad for everyone rather than only for big companies.
I worry that regulations like GDPR benefit the existing monopolies, and make it too difficult/expensive for startups. The legal ambiguity of it, and the additional software requirements add a lot of roadblocks if you're a small company. I'd rather the burden be limited to established companies (but I guess that isn't quite fair either).
And for small/side projects, I just don't want to worry about this stuff.
I would just remove the articles. So my version: "California is rewriting rules of internet. Businesses are scrambling to keep up". Does it look better or worse?
Personally I think it is appropriate. California has long been known to do such things, for instance having different requirements for cars regarding emissions.
In general I think California is using their huge size and role (obviously, as home to most of the dominant internet companies like Google and Apple and Facebook) to push the rest of the country forward.
Typically it is forward. That doesn't mean "better", but it does mean that it is moving things in directions that the rest of the country eventually follows.
Can you give an example of something California did that was a move backwards? As in, moving back to a way that we did things in the past but the rest of the country has moved on from?
Well, disposing of excrement in the street was a normal part of life in many cities. Now that rules about that are not enforced in California, it's becoming dangerous to simply walk in some parts of it.
I think you are confusing individual cities with the state as a whole. This is also far more of an issue with lack of resources rather than a general policy.
San Francisco's homeless issue has a lot to do with the fact that tolerance is not equally spread across the nation. If every city was equally tolerant, there wouldn't necessarily be more homeless total, but they'd be more evenly spread out across the nation. As it is, they tend to migrate here from other places.
That said, I've lived in SF for nearly 20 years and seen human poop on the sidewalks maybe twice.
Yeah I'm sure the homeless in New Jersey buy themselves plane tickets to SFO because the weather is better and nobody kicks them off the sidewalk where they would prefer to lay. Blame everyone else for San Francisco problems...
Within limits. There is stuff in the Constitution about interstate commerce. For instance, California had to refund me an "emissions fee" after I moved here with a car bought elsewhere that didn't have the same emissions standards. It was found to be unconstitutional. Other climate initiatives are facing similar objections.
Of course, the question when you want to block California or run different code for users in California is: how do you legally and reliably tell whether a user is in California? E.g., if they're using a VPN through New York, could they sue you or file a complaint and win? What if MaxMind's IP database is outdated and you miss a bunch of California users?
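The lookup being described amounts to a best-effort guess, which is the whole problem. Here is a sketch with a stub standing in for a real GeoIP database (MaxMind or similar); the IP prefixes and the `serve_ccpa_flow` policy are assumptions for illustration, not real data or legal advice.

```python
# Fake prefix -> region table standing in for a GeoIP database lookup.
GEOIP_STUB = {
    "203.0.113.": "US-CA",   # made-up "California" range
    "198.51.100.": "US-NY",  # made-up "New York" range
}

def region_of(ip):
    """Best-effort region guess; None means the database has no answer."""
    for prefix, region in GEOIP_STUB.items():
        if ip.startswith(prefix):
            return region
    return None

def serve_ccpa_flow(ip):
    # A cautious site applies the CCPA flow whenever it cannot rule
    # California out, accepting false positives over compliance risk.
    return region_of(ip) in ("US-CA", None)
```

The VPN and stale-database cases from the comment both land in the "can't rule it out" bucket, which is why many sites end up applying the stricter flow to everyone.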
I suppose so, but it's a similar situation as GDPR/cookie consent when you need to perform 0-interaction data collection like running Ads for incognito users or creating a session that might also be used to track you on other websites. Maybe these use cases will just fall out of play as compliance when using them gets harder (which would not be a bad thing).
This single state contains Silicon Valley and is 14% of the US economy. Jurisdictionally, it could be challenged - but again, a business could just decline to do business with all customers in CA.
btw - nothing new here (fta): "Big companies are signing deals with firms that specialize in compliance".
Big companies already hire compliance companies for a large number of other regulatory requirements. This is just yet another such requirement - and it's still being hammered out in a public comment period prior to going into law.
Yeah, but this is the statement that puts into sharp relief what's going on: This is a trade dispute between two of the Top 10 Economic Powers, more or less, and it can't really be adjudicated in a way that the USA can take advantage of. The USA's primary establishment of data abuse is institutionalized but not written into law, and no legislator would dare do such a thing. California is doing what they have always done in these situations; they are demanding a better quality of life for their citizens, and they know that the USA has no reason to object other than corruption and aristocracy.
It's good that each state should be able to go off and do its own experiment. IMO the state rules which are the most smelly are those over taxes and non-disclosure of financial information.
How is this different from GDPR? Most global companies should already have most of this in place, so it will mainly hurt small businesses that have yet to expand outside the States. Is there some company-size threshold here to avoid that?
Most companies do not comply with GDPR, and it's not about the minutia, it's about just very basic concepts regarding user data and privacy.
I'm a US expat, and honestly it's extremely frustrating to try to use sites I was formerly able to access (with uBlock), due either to absolutely obtrusive and confusing opt-in/opt-out flows (e.g., Oath) or to complete rejection of anyone with an IP outside the US. The latter is exceptionally frustrating: I don't even live in the EU, but because the lazy solution of applying the GDPR block to any non-US address is so common, I cannot read many US news sites.
Global companies have __not__ adapted to GDPR; they have just dragged their feet. I'm curious whether the EU is going to do anything about sites that frequently do business in the EU, or start enforcing actual GDPR policies on the big players (for example, Google is a beast to use in any EU member nation if you decline consent for data collection and rely on element blocking).
The idea that companies have complied with GDPR does not mesh with my experience actually trying to use websites in the Schengen Area. And I don't blame GDPR, I blame companies for the insane amount of data and tracking they want for viewing even a minute amount of content.
50,000 is not a lot. If you set up a YouTube competitor and the view counter on a video has more than 50,000 views, do you qualify? What if you use cookies? It’s a personal ID of all 50,000 viewers.
It’s just a passive-aggressive hissy fit. The cost of writing the middleware to write out your angsty nonsense is higher than the cost of writing the middleware to just remove all cookies from every request. The EU detection logic is the same in both cases.
GDPR has a lot of burdensome requirements, like hiring a commissar (a data protection officer) and multi-month government review periods before feature launches. Do not bet 21M USD that merely deleting cookies and logs will get you anywhere near compliance.
The theory is that Article 3 stops the EU from coming after you (however they might try) if you don't offer goods or services to EU data subjects or monitor their behavior. The GDPR is roughly the size of a novel, and proving you're doing everything it requires is a hell of a lot more work than proving you're out of scope entirely.
It's still extremely vague how many releases will require the eight-week review with six-week extension in https://gdpr-info.eu/art-36-gdpr/, but this is the kind of friction and cost they're imposing even on groups who aren't doing anything bad.
And my point is that I don't see EU competitors jumping into the gap left by companies that simply firewall off the EU.
Now, it could simply be that everybody is waiting to see if the GDPR enforcement is going to become aggressive before doing so. There is no point in trying to compete with an entrenched US company until the EU actually neuters them.
For starters, one thing the big global companies did was carefully segment EU and non-EU users internally, so that they built the technical implementation and business processes to comply with GDPR but apply them only when necessary, keeping the data processing of their (very profitable) USA users the same. A small or medium website most likely went either with privacy-for-everyone or ignore-the-GDPR, but any large global company can afford separate flows and has large enough financial incentives in mining that data to make the differentiation worthwhile.
For example, Equifax has a bunch of subsidiaries in the EU who have to follow GDPR - and mostly are following it, or at least make a decent show of attempting to; but the main USA Equifax business could not be bothered with it, to the great benefit of Equifax itself and the detriment of all the US citizens whose data Equifax mismanaged.