As far as I know, eSIMs are usually implemented in physical hardware, close to (or maybe even on) the baseband, but independent from the baseband firmware.
This is because they are considered a trusted execution environment; if they weren't, it would be possible to "clone" eSIM instances.
Exactly. The whole baseband in your phone is considered "trusted" by the network because you can't easily control it. Don't give the carriers ideas - if they thought they could move back to a "Ma Bell" model where you lease the phone from them without ever gaining ownership of it, someone would try! Look at carrier locking, and the world of CDMA, where your phone's ESN (serial number) has to be manually whitelisted before it can join the network... It's a whole different world from general-purpose computing!
The mobile standards are built around the assumption that the baseband does as it's told by the network - your phone's transmit slots get scheduled by the base station, and your phone waits quietly until those slots come around before it speaks. This extends to the wider architecture and design of the ecosystem - the user is not "meant" to be in charge of their device in the mobile ecosystem. With the split between AP and CP (application processor and cellular processor), if you put the CP on a suitable bus like USB, which doesn't give it DMA access, you can build a phone you have sufficient control of (see the Pinephone etc.).
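To make that AP/CP split concrete: on a setup like the Pinephone, the modem hangs off USB and the application processor just talks to it as an ordinary peripheral rather than sharing memory with it. A minimal sketch, assuming pyserial and a typical USB AT-command port (the device path and commands are illustrative, not specific to any one modem):

```python
# Hypothetical sketch: the cellular processor sits on USB, so the AP drives it
# like any other serial peripheral via AT commands -- no DMA into the AP's memory.
import serial  # pyserial

def at_command(port: str, cmd: str, timeout: float = 1.0) -> str:
    """Send a single AT command to the modem and return its raw reply."""
    with serial.Serial(port, 115200, timeout=timeout) as modem:
        modem.write((cmd + "\r\n").encode("ascii"))
        return modem.read(4096).decode("ascii", errors="replace")

if __name__ == "__main__":
    # /dev/ttyUSB2 is just a typical AT port on USB modems; adjust for your device.
    print(at_command("/dev/ttyUSB2", "ATI"))      # modem identification
    print(at_command("/dev/ttyUSB2", "AT+CIMI"))  # IMSI read from the SIM
```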
In the world of SIM, this is back to carrier thinking - they control the SIM because it's "theirs". The keys on the SIM are known only to them, not even to you; you're not trusted to know your own SIM authentication parameters. This can be helpful in some ways, as it makes the threat model different from other systems: you can't unwittingly disclose your keys to someone through social engineering... But it's less helpful in that customers generally don't think like the security architects who designed this, and end up having their physical SIM stolen, or their number ported away after someone social-engineers the carrier...
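For a rough picture of why you never need to know those keys, here's a sketch of the GSM-style challenge-response shape. This is not the real A3/A8/MILENAGE crypto (HMAC-SHA256 stands in for it); the structure is the point: the subscriber key Ki exists only inside the SIM and at the carrier, and all the phone ever sees is a challenge, a short response, and a derived session key.

```python
# Toy GSM-style SIM authentication. HMAC-SHA256 is a stand-in for the real
# A3/A8/MILENAGE algorithms; the structure, not the crypto, is what matters.
import hmac, hashlib, os

class Sim:
    def __init__(self, ki: bytes):
        self._ki = ki  # subscriber key: never exposed outside this object

    def run_gsm_algorithm(self, rand: bytes) -> tuple:
        digest = hmac.new(self._ki, rand, hashlib.sha256).digest()
        sres, kc = digest[:4], digest[4:12]  # response + derived ciphering key
        return sres, kc

# Carrier side: the same Ki sits in the HLR/AuC and the same computation runs there.
ki = os.urandom(16)
sim = Sim(ki)

rand = os.urandom(16)                   # challenge sent by the network
sres, kc = sim.run_gsm_algorithm(rand)  # computed entirely inside the SIM
expected = hmac.new(ki, rand, hashlib.sha256).digest()
assert sres == expected[:4]             # network accepts the subscriber
```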
Where I'm from, phones were always decoupled from carriers. The carrier sells you a SIM card, that's it. It's on you to buy or already have a compatible phone to stick it into. I don't think any of the big carriers ever offered financing as part of the contract the way US ones do. Also we only have prepaid plans.
It's still a shame that you can't have a 100% open-source phone. I'm the kind of person who believes that all of humanity's knowledge must be freely accessible to everyone, including schematics and documentation for every device ever made, ICs included. It's counterproductive when multiple companies have to reinvent the same thing... and then keep it secret like the others.
Not exactly modern, or dystopian - see also paper money, for instance, which may be owned by you but is controlled by another entity and contains features making it hard to modify or duplicate.
The analogy is flimsy, I suppose (paper thin, lol?), but the problem is that the user cannot be trusted to be non-malicious. However, with eSIM technology I had assumed the trust was assured using keys owned by the provider, so I'm not sure whether there's something else going on here?
> with eSIM technology I had assumed the trust was assured using keys owned by the provider, so I'm not sure whether there's something else going on here?
There is trust both ways:
- You trust the provider's keys so that nobody can later intercept your traffic, since the keys delivered encrypted under them will later be used to encrypt and authenticate that traffic. (Of course the networks themselves have ample security holes and allow for lawful interception, but that's another topic.)
- The provider trusts your eSIM never to expose your keys to the baseband or application processor. If it weren't for that, the provider's invoices might not be defensible in court in case of a billing dispute: you could easily claim that you've been subject to malware that stole your authentication keys and then went on to call toll numbers for hours.
Theoretically, the first point only addresses your own risk, but the eSIM designers seem to have decided to enforce it anyway (mandatory GSMA PKI signatures). Unfortunately, this also means that "homebrew eSIMs" are out of reach for now.
The latter is very similar to the idea of chip credit and debit cards: the issuer relies on both users and fraudsters not being able to extract and duplicate a card's keys, so that use of those keys can be taken as proof of the authentic card being involved.
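As a toy illustration of that chip-card idea (real EMV cryptograms are MACs over a defined transaction format using 3DES/AES keys held on the chip; HMAC-SHA256 is just a stand-in here): the card computes a cryptogram over the transaction with a key only it and the issuer hold, so a valid cryptogram is taken as evidence that the genuine card took part.

```python
# Toy proof-of-possession, loosely in the spirit of an EMV cryptogram.
# The card-resident key never leaves the chip; a stolen card number alone
# cannot produce a valid tag.
import hmac, hashlib, os

card_key = os.urandom(16)  # provisioned into the chip and held by the issuer

def card_cryptogram(amount_cents: int, currency: str, nonce: bytes) -> bytes:
    msg = f"{amount_cents}|{currency}|".encode() + nonce
    return hmac.new(card_key, msg, hashlib.sha256).digest()

# Issuer side: recompute with its copy of the key and compare.
nonce = os.urandom(8)
tag = card_cryptogram(1999, "EUR", nonce)
issuer_check = hmac.new(card_key, b"1999|EUR|" + nonce, hashlib.sha256).digest()
assert hmac.compare_digest(tag, issuer_check)
```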
Money is only valuable because society makes it so, and especially because governments only accept taxes in their own currency. But if you own a banknote, it's fully yours. You can spend it on anything, including something illegal like drugs. Or you can draw on it, invalidating it altogether. The government that issued it doesn't have a say in any of this.
But with modern locked-down electronics, you could only do what the manufacturer intends, and nothing more. Continuing with monetary analogies, it's like a credit card that only works for things your bank considers "good" for you.
Sandboxed trusted computing actually offers a way out of this dilemma: rather than having an entire phone/computer etc. locked down (so that some third party can trust it), there is just a small trusted subsystem that can interact with the larger system only in limited and well-defined ways.
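A hand-wavy sketch of what such a narrow interface might look like (the names and operations are made up for illustration, not any real TPM or secure-element API): the secret never leaves the trusted component, and the rest of the system, still fully under the user's control, can only ask it for a few well-defined operations.

```python
# Illustrative-only trusted subsystem: it holds a root key internally and
# exposes nothing but a couple of narrow operations to the rest of the system.
import hmac, hashlib, os

class TrustedElement:
    def __init__(self):
        self.__root_key = os.urandom(32)  # never leaves the element

    def attest(self, nonce: bytes) -> bytes:
        """Answer a remote party's fresh nonce to show this element took part."""
        return hmac.new(self.__root_key, b"attest:" + nonce, hashlib.sha256).digest()

    def seal(self, data: bytes) -> bytes:
        """Bind data to this element (toy, integrity-only: a MAC'd blob)."""
        tag = hmac.new(self.__root_key, data, hashlib.sha256).digest()
        return tag + data

    def unseal(self, blob: bytes) -> bytes:
        tag, data = blob[:32], blob[32:]
        expected = hmac.new(self.__root_key, data, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError("blob was not sealed by this element")
        return data
```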
Microsoft's plans for the TPM back in the early 2000s have given the entire concept of trusted computing a bad reputation, but besides DRM, there are many legitimate use cases for it that are not anti-consumer/anti-freedom.
Sure. Cloud servers are a good one. But I still see no benefit for the end user to lock down any consumer devices like that. It only benefits the device manufacturers themselves. Like, you know, Apple forcing its online services onto people literally by burning stuff into silicon. I don't have a problem when hardware and software are tightly integrated. I do have a problem when said software isn't modifiable and has a hard dependency on servers you can't control and can't self-host.
Let people modify their modem firmware, just make sure they understand what they're doing. But they might interfere with other people's service, you say? They could just as well do that with a $300 SDR, or they could buy a purpose-built cellular jammer. Let governments enforce their laws; don't make something technically impossible just because making it possible might let someone break a law.