
If users do abort their usage of the site, the site is effectively bricked. If they don't, then HPKP accomplished nothing, because users are using the site despite a possible MITM.

That's exactly the same situation as any other TLS failure, not at all unique to HPKP in any way that I'm seeing.

It's still effectively bricked for non-advanced users and partially bricked for careful advanced users in the way you noted, but at least users can choose for themselves how to proceed, and admins of bricked sites can give them guidance that doesn't involve following convoluted instructions to navigate about:config or chrome://net-internals.

It's not that it's hard to understand, it's that it's hard to actually implement. You need to have multiple certs in case one of them gets compromised. And if you mess that up, then you either brick your own site or you have to keep using a known-compromised cert.

More accurately, multiple keys, not multiple certs. All you need is to back up the spare key somewhere without throwing it out, which is a minor annoyance but not at all technically difficult.

If users are having trouble with understanding and/or following through with this, I would start with building a better interface than the openssl CLI (possibly as a certbot command) before deciding that the entire concept of key pinning is somehow inherently too difficult to be useful.
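To illustrate what that openssl workflow actually looks like (my own sketch, not from the parent commenter; the file names and max-age value are made-up examples): an HPKP pin is the base64-encoded SHA-256 hash of the key's DER-encoded SubjectPublicKeyInfo, which the stock openssl CLI can produce as a three-command pipeline.

```shell
# Generate a spare key pair to keep offline as the backup pin
# (hypothetical filename).
openssl genrsa -out backup.key 2048

# Compute the pin: DER-encoded SPKI -> SHA-256 -> base64.
PIN=$(openssl rsa -in backup.key -pubout -outform der \
      | openssl dgst -sha256 -binary \
      | openssl enc -base64)

# The header then lists the live key's pin plus the backup pin;
# "LIVE_KEY_PIN_HERE" is a placeholder, and max-age is just an example.
printf 'Public-Key-Pins: pin-sha256="%s"; pin-sha256="LIVE_KEY_PIN_HERE"; max-age=5184000\n' "$PIN"
```

None of these steps are exotic, which is the point: a certbot-style wrapper would mostly need to automate generating the spare key, storing it safely, and keeping the header in sync.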

HPKP just wasn't worth it for most websites - a reduction in the risk of someone presenting a forged cert in exchange for the risk of accidentally self bricking your website.

Yeah, it should certainly be highly discouraged for almost everyone, but getting rid of it after we already have it is a huge step backwards for the 1% of sites with strict enough security requirements to justify it.



> That's exactly the same situation as any other TLS failure, not at all unique to HPKP in any way that I'm seeing.

Yup. But the feeling I'm getting is that browser vendors see this behavior as non-ideal, since it trains users to basically ignore the error. Yeah, in theory the user gets to make their own decision. My theory is that almost no user is actually equipped to make such a decision.

> admins of bricked sites can give them guidance that doesn't involve following convoluted instructions to navigate about:config or chrome://net-internals.

I see this as a worst-case outcome - explicitly telling users it's OK to bypass a security warning.

> All you need is to back up the spare key somewhere without throwing it out, which is a minor annoyance but not at all technically difficult.

Not technically hard, but still plenty of ways to mess it up. And once it's messed up, there isn't much of a good way to fix it.


Ah, well that's fair, and I think I'd generally agree with that. I don't have an alternative proposal for handling TLS failures in general, but I think it's silly to arbitrarily make HPKP's UX a special case, and then cite that special case UX as a reason for deprecating it.


How is this UX behaviour a special case? HSTS also requires the brickwall UX, so does the OpenSSH key change scenario.
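For anyone following along, both mechanisms are plain opt-in response headers; a minimal sketch with example values (the placeholder pins are not real):

```http
Strict-Transport-Security: max-age=31536000; includeSubDomains
Public-Key-Pins: pin-sha256="<primary-pin>"; pin-sha256="<backup-pin>"; max-age=5184000
```

Once a browser has cached either header, a later violation (plain HTTP, or a cert chain not matching a pin) hits the unskippable brickwall rather than a bypassable interstitial.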

The browsers' original sin is that the initial SSL UI was built by people who had no security UX background, because almost nobody had any security UX background. This was the era when PGP was considered usable security technology.

So when HCI studies started being done (e.g. at Microsoft), they came back with the scary result that real users just perceive TLS error dialogs and interstitials as noise to be skipped. That's a problem, because lots of real-world systems depend upon skipping these errors. I worked for a large Credit Reference Agency which had an install of Splunk, but for whatever insane reason it was issued a cert for like 'splnkserver.internal' and the only HTTP host name that it accepted was 'splunkserver.internal'. So every single user of that log service had to skip an interstitial saying the name doesn't match. For years. Probably still happens today.

Browsers couldn't just say "OK, that was bad, flag day, now all TLS errors are unskippable" because of the terrible user experience induced, so what happened instead is a gradual shift, one step at a time, from what we know was a bad idea, to what we think is a better idea. That means e.g. "Not Secure" messages in the main browser UI replacing some interstitials, and brick walls ("unskippable errors") in other places where we're sure users shouldn't be seeing this unless they're being attacked.

HPKP was new, so like HSTS it did not get grandfathered into the "skippable because this is already so abused we can't salvage it" state. If you went back and asked the HPKP designers "Should we do this, but with skippable UI?" they would have been unequivocal: "No, that's pointless." HPKP and HSTS only improve security if users don't just ignore them, and the only way we've found to make users actually pay attention is to make the error unskippable.

Yes, that means "badidea" and the subsequent magic phrases in Chrome were, as they say themselves, a bad idea. Because users who know them just skip the unskippable errors and end up back in the same bad place.


Thanks for all the interesting context and backstory; I wasn't aware of any of that.

In any case, if it was unclear, my point here wasn't that I necessarily dislike the brickwall UI. In light of the studies you've referenced, I definitely prefer it, and if it were up to me it would be enabled for all of TLS regardless of how many existing services with broken deployments are out there.

My point is that, if the more secure UX is part of the reason for Google's decision, I would rather have HPKP with a less secure UX than not have it at all.


Fair enough. The way that HPKP was deployed and designed and then undeployed was really quite awkward.


The more awkward point to me was that HPKP and HSTS were invented to begin with. It's like everyone is sitting on top of a pink elephant, going "Well everyone, we can't acknowledge the pink elephant in the room, but we can make it a nice hat."



