China is doing great. Not saying the UK will do well, just that authoritarian regimes can be successful as states although not great for the commoners.
China only started doing great when they relaxed their ultra-centralized economic rules a little bit in the 1990s.
Read business books and news from the '80s and '90s, and they almost never mention China - it's all Germany, UK, Japan, USA. The stats tell the same story - China spent half a century going nowhere fast.
After liberalizing their economy, China spent the 90's quietly growing, and only started making real waves in the news around 2000.
All this to say that economic authoritarianism has never worked and there's no reason to suppose that the social kind is going to fare any better for anyone either.
Economic liberalism isn't really relevant to the question of social authoritarianism. While an enterprising individual in Guangzhou can sell whatever he wants to the world without much state involvement, he can't really go around discussing Tibetan sovereignty, for example.
For the record, I'm not actually against age verification for certain content. But it would have to be:
1) private - anonymous (don't know who is requesting access) and unlinkable (don't know if the same user makes repeated requests or is the same user on other services).
2) widely available and extremely easy to register and integrate.
The current situation is that it's not easy, or private, or cheap to integrate. And the measures they say they will accept are trivially easy to bypass - so what's the point?
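To make point 1 concrete, here is a hypothetical sketch (field names invented, not any real provider's data model) of the difference between what a typical upload-your-ID check collects and the most a service would be allowed to see under an anonymous, unlinkable scheme:

```python
# Hypothetical sketch of point 1 - not any real provider's data model.
from dataclasses import dataclass

@dataclass
class TypicalIdCheck:
    # What an upload-your-ID provider typically collects: identifying,
    # linkable across services, and a liability if breached.
    full_name: str
    date_of_birth: str
    id_document_scan: bytes

@dataclass
class AnonymousAgeProof:
    # The most a service should receive under point 1: one boolean and a
    # one-shot proof with no mathematical relation to any other proof,
    # so repeat visits and visits to other services can't be tied together.
    over_18: bool
    proof: bytes
```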
I worked at a startup that satisfied point 1 back in 2015. The widely available bit never came off, though - we ran out of runway.
Add to that
3) Verifiable to a lay person that the system truly has those properties, with no possibility of it suddenly being altered to no longer have those properties without that being exceedingly obvious.
This whole concept runs into similar issues as digital voting systems. It's not enough to be anonymous; it must be verifiably and obviously so, even to a lay person (read: your grandma with dementia who has never touched a computer in her life). It must be impossible to make changes to the system that remove these properties without users immediately noticing.
The only reason why paper identification has close to anonymous properties is the fallibility of human memory. You won't make a computer with those properties.
It's easy to demonstrate (3) for an age verification system - practical experience will amply demonstrate it to everyone.
Voting is very different - you do need to be able to demonstrate the fairness of the process verifiably to everyone - not just crypto nerds. Age verification - well, some people might get around it, but if it generally seems to work that is good enough.
>It's easy to demonstrate (3) for an age verification system - practical experience will amply demonstrate it to everyone.
No. Absence of evidence that I am not anonymous does not constitute evidence that I am anonymous. Verifiable unlinkability is also difficult to prove.
It may be possible to create a system like this technically, but all social and economic incentives that exist are directed against it:
- An anonymous system is likely more expensive.
- The public generally does not care about privacy, until they are personally affected.
- You have no idea as a user whether the server components do what they say they are doing. Even if audited, it could change tomorrow.
- Once in place its purpose can change. Can you guarantee that the next government will not want to modify this system to make identification of dissenters, protestors or journalists easier?
Any well-designed privacy system does not rely on the server components doing the right thing. Servers, providers and governments are the main threat actors to be defended against. There should be no way for third parties to compromise that, by design - which almost certainly involves advanced cryptography.
Unlinkability and anonymity are not that hard to demonstrate in the design. At its core it just means each proof or token is unique each time it is presented and has no mathematical relation to the others (and is therefore not tied to any persistent identity either).
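As a toy illustration of the sort of construction meant here (a Chaum-style RSA blind signature, with small well-known primes rather than anything production-grade): the issuer certifies a token it never sees, so when that token later shows up at a verifier it cannot be matched back to the issuance, and two tokens from the same person look like tokens from anyone.

```python
# Toy Chaum-style RSA blind signature - illustration only, not production crypto.
import secrets
from hashlib import sha256
from math import gcd

# Issuer's RSA key. Small, well-known Mersenne primes for readability;
# real keys use large random primes.
p, q = 2**31 - 1, 2**61 - 1
n = p * q
e = 65537
d = pow(e, -1, (p - 1) * (q - 1))

def h(msg: bytes) -> int:
    return int.from_bytes(sha256(msg).digest(), "big") % n

# User: pick a fresh random token and blind its hash before sending it.
token = secrets.token_bytes(16)
while True:
    r = secrets.randbelow(n - 2) + 2
    if gcd(r, n) == 1:
        break
blinded = (h(token) * pow(r, e, n)) % n      # all the issuer ever sees

# Issuer: checks the user's age out of band, then signs the blinded value.
blind_sig = pow(blinded, d, n)

# User: unblind - the result is a valid signature on h(token) itself.
sig = (blind_sig * pow(r, -1, n)) % n

# Verifier: checks against the issuer's public key. It learns only that
# "some issuer certified this token"; the issuer, shown (token, sig) later,
# cannot link it to any blinded value it signed.
assert pow(sig, e, n) == h(token)
print("token verifies; issuer never saw it and cannot link it to issuance")
```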
Client implementations may need auditing of course to make sure they are doing the right thing. But this is not really different to any other advanced technical system which we rely on every day (e.g. TLS).
As you say though, most of the public don't massively care about privacy (unless you mean their visits to porn sites I guess). But they do seem happy to accept crypto coin security assurances without being crypto experts.
As for "the purpose can change" well - so? That is also true or anything else, it does not seem like a reason to avoid having good protection now. Any change that could compromise that would not be undetectable - the fundamental crypto should not allow it. We would know if it happened.
All your arguments are technical. It's the social layer that is the problem.
>Any well designed privacy system does not rely on the server components doing the right thing.
This is more expensive than just throwing everything into a centralized database. The extra cost needs to be justified when explaining the price to voters.
>Servers and providers and governments are the main threat actors to be defended against.
Agreed. And they are the ones implementing the system. Clear conflict of interest.
>As for "the purpose can change" well - so? That is also true or anything else, it does not seem like a reason to avoid having good protection now. Any change that could compromise that would not be undetectable - the fundamental crypto should not allow it.
Introducing an age-verification system requires a lot of political capital (as seen by the repeated failures of introducing it so far). Nudging an existing age verification system to cover new purposes requires far less political capital.
>We would know if it happened.
Only if every technical decision goes the right way, despite all incentives and conflicts of interests pointing the other way. I wouldn't bet on it.
Age verification should be done at the point of buying a laptop or a SIM card, the same way as when you buy alcohol. And there would be no need to send your ID to a company so that it ends up on the black market eventually.
There's some irony that the EU is set to have a fairly anonymous solution next year. They could have waited or tried to use similar tech for this, in theory.
Important to note: Their anonymous solution is reported to be temporary until their digital ID system is released[1], which does not offer that same anonymity, but rather functions as a server-side OpenID-based authentication system.[2] While you can share only your age with an online service, it still creates an authorization token, which appears to remain persistent until manually removed by the user in the eID app. This would give the host of that authentication system (EU and/or governments) the ability to see which services you have shared data with, as well as a token linked to your account/session at that service. There is also no guarantee that removing an authorization will actually delete all that data in a non-recoverable way from the authentication system's servers.
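To illustrate where the linkage in such a server-side OpenID-style design comes from (a schematic sketch with made-up names, not the actual EU protocol): even if only an "over 18" claim is shared, the identity provider has to keep an authorization record per relying party, and that record is exactly what lets it see which services you have used it with.

```python
# Schematic sketch of a server-side OpenID-style age attestation.
# All names are made up; this is not the actual EU eID protocol.
import secrets
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AuthorizationGrant:
    user_id: str              # stable account at the identity provider
    relying_party: str        # the site requesting the age claim
    claims: tuple             # e.g. ("age_over_18",) - nothing more is shared
    access_token: str         # persists until the user revokes it in the app
    granted_at: datetime

class IdentityProvider:
    def __init__(self):
        self.grants = []      # the linkage lives here, on the provider's servers

    def authorize(self, user_id, relying_party):
        token = secrets.token_urlsafe(16)
        self.grants.append(AuthorizationGrant(
            user_id, relying_party, ("age_over_18",), token,
            datetime.now(timezone.utc)))
        return token          # the relying party learns the claim plus this token

    def services_used_by(self, user_id):
        # The provider (and anyone who can compel it) can answer this question,
        # and "revoking" a grant only helps if the record is truly deleted.
        return [g.relying_party for g in self.grants if g.user_id == user_id]

idp = IdentityProvider()
idp.authorize("alice@eid.example", "videosite.example")
idp.authorize("alice@eid.example", "forum.example")
print(idp.services_used_by("alice@eid.example"))
# -> ['videosite.example', 'forum.example']
```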
It's anonymous to the sites or companies you use it with, not to the government, but that would still be more robust than the UK's checks so far. It's only coming at the end of 2026 though; I thought it was at the end of this year instead.
And that really shows the difference in how the EU operates vs the UK.
They see a general need which the market cannot easily satisfy on its own - it needs standardisation to be cheap and interoperable, and it needs an identity backed by a trusted authority. So they establish a framework and legislation to make that possible.
The UK instead just states that it's illegal not to do it, but without any private and not-trivially-bypassable services being available.
Proactive vs reactive.
It is often said that legislation tends to lag behind technology. At last, the UK is beating the world by legislating ahead of it!