Hacker News

This is the ultimate nihilistic take on security.

Yes, 'cyber' security has devolved into box checking and cargo culting in many orgs. But what's your counter-proposal for fixing the problems that every tech stack or new SaaS product comes with out of the box?

For most people when their Netflix (or HN) password gets leaked that means every email they've sent since 2004 is also exposed. It might also mean their 401k is siphoned off. So welcome the annoying and checkbox-y MFA requirements.
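The leaked-password check itself can be done safely: the Have I Been Pwned range API uses k-anonymity, so only the first 5 characters of the password's SHA-1 ever leave your machine. A minimal sketch in Python (the pwnedpasswords.com endpoint is real; the helper name is my own):

```python
import hashlib

def hibp_range_parts(password: str) -> tuple[str, str]:
    """Split a password's SHA-1 digest into the 5-char prefix sent to the
    Have I Been Pwned range API and the suffix that is matched locally,
    so the full hash (let alone the password) never leaves the machine."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

# GET https://api.pwnedpasswords.com/range/{prefix} returns "suffix:count"
# lines; a local match on the suffix means the password is in a known breach.
prefix, suffix = hibp_range_parts("password")
```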

If you're an engineer cutting code for a YC startup -- Who owns the dependency you just pulled in? Are you or your team going to track changes (and security bugs) for it in 6 months? What about in 2 or 3 years?
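Tracking known security bugs in a dependency you pulled in can at least be partially automated; a sketch querying the public osv.dev advisory database (the API endpoint and query shape are real; the helper names are mine):

```python
import json
import urllib.request

OSV_API = "https://api.osv.dev/v1/query"  # public OSV vulnerability database

def build_osv_query(name: str, version: str, ecosystem: str = "PyPI") -> dict:
    """Build an osv.dev query for advisories affecting one package version."""
    return {"package": {"name": name, "ecosystem": ecosystem}, "version": version}

def known_vulns(name: str, version: str, ecosystem: str = "PyPI") -> list:
    """POST the query to osv.dev and return its list of advisories (network call)."""
    payload = json.dumps(build_osv_query(name, version, ecosystem)).encode()
    req = urllib.request.Request(OSV_API, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp).get("vulns", [])
```

Running something like this in CI against a lockfile is one way to make the "who tracks this in 2 years" question answer itself.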

Yes, 'cyber' security brings a lot of annoying checkboxes. But almost all of them are due to externalities that you'd happily blow past otherwise. So -- how do we get rid of the annoying checkboxes and ensure people do the right thing as a matter of course?






Actual accountability. Don't let companies say "Well, we were SOC2 compliant, so this breach isn't our fault, despite our not updating Apache Struts! Tee hee." When Equifax got away with what was InfoSec murder with 6 months of suspended jail time, executives stopped caring. This is a political problem, not a technology one.

> So -- how do we get rid of the annoying checkboxes and ensure people do the right thing as a matter of course?

By actually having the power to enforce this: if you pull our SBOM and realize we have a vulnerability, get our Product Owner to prioritize fixing it, even if it takes 6 weeks because we did a dumb thing 2 years ago and the tech debt bill has come due. Otherwise, stop wasting my time with these exercises; I have work to do.

Not trying to be mean, but that's my take with my infosec team right now. You are powerless outside your ability to get us SOC2, and we all know it's theater, so tell us what set piece you want from me, take it, and go away.


It's a two-sided coin though.

We should be stopping leaks, but we also need to reduce the value of leaked data.

Identity theft doesn't get meaningfully prosecuted. Occasionally they'll go after some guy who runs a carding forum or someone who did a really splashy compromise, but the overall risk is low for most fraudsters.

I always wanted a regulation that if you want to apply for credit, you have to show up in person and get photographed and fingerprinted. That way, the moment someone notices their SSN was misused, they have all the information on file to make a slam-dunk case against the culprit. It could be an easier deal for lazy cops than going after minor traffic infractions.


The problem with "identity theft" specifically is that, in itself, it's just a legal term for allowing banks to save on KYC by letting them transfer liability to society at large.

If someone uses your SSN to take out a loan in your name, it shouldn't be your problem - in the same way that someone speeding in the same make and model of car as yours shouldn't be your problem, just because they glued a piece of cardboard over their license plate and crayoned your numbers on it.


> For most people when their Netflix (or HN) password gets leaked that means every email they've sent since 2004 is also exposed. It might also mean their 401k is siphoned off. So welcome the annoying and checkbox-y MFA requirements.

Not true. For most people, when their Netflix or HN password gets leaked, that means fuck all. Most people don't even realize their password was leaked 20 times over the last 5 years. Yes, here and there someone might get deprived of their savings (or marriage) this way, but at scale, approximately nothing ever happens to anyone because of password or SSN leaks. Among cybersec threats, people are much more likely to become victims of ransomware and tech support call scams.

I'm not saying that cybersec is entirely meaningless and that you shouldn't care about the security of your products. I'm saying that, as a field, it's focused on liability management, because that's what most customers care about and pay for, and it's where most of the damage actually manifests. As such, to create secure information systems, you often need to work against the zeitgeist and the recommendations of the field.

EDIT:

> This is the ultimate nihilistic take on security.

I don't believe it is. In fact, I've been putting effort into becoming less cynical over the last few months, having realized it's not a helpful outlook.

It's more like: techies in cybersecurity seem to have an overinflated sense of the uniqueness and importance of their work. The reality is, it's almost all about liability management - and it's that way precisely because most cybersec problems are nothingburgers that can be passed around like a hot potato and ultimately discharged through insurance. That's not the worst state of things - it would be much worse if the typical cyber attack actually hurt or killed people.


This really resonated with me because I'm also working to avoid becoming more cynical as I gain experience and perspective on what problems "matter" and what solutions can gain traction.

I think in this case the cognitive dissonance comes from security-minded software engineers (especially the vocal ones who would chime in on such a topic) misunderstanding how rare their expertise is, as well as the raw scope of the risks that large corporations are exposed to and which mitigations are sensible. If you are an expert, it's easy to point at the security compliance implementation of almost any company and poke all kinds of holes in specific details, but that's useless if you can't handle the larger problem of cybersecurity management and the fallout from a mistake.

And if you zoom out, you realize the scope of risk introduced by the internet, smartphones, and everyone doing everything online all the time is unfathomably huge. It's not something that an engineering mentality of understanding intricate details and mechanics can really get one's head around. From this perspective, liability and insurance are a very rational way to handle it.

As far as the checklists go, if you are an expert you can peel back the layers, recognize the rationales for these things, and adjust accordingly. If you have competent and reasonable management and decision makers, then things tend to go smoothly, and ultimately the auditors are paid by the company, so there is typically a path to doing the right thing. If you don't have competent and reasonable management, then you're probably fucked in innumerable ways, such that security theater is the least of your worries.





