
Early-stage startups with just a couple of founders who are too overworked to give a good answer about data protection are probably too overworked to actually protect the data itself, law or no law. We generally don't think it's reasonable for an early-stage startup to be too overworked to get their security right - you still have to write secure code, patch your servers, set up HTTPS, etc. Why is this different?


> you still have to write secure code, patch your servers, set up HTTPS, etc. Why is this different?

It's very different. Here you're asked to answer fairly detailed and potentially tricky questions with legal implications for your business. This has little to do with how you secure things technically. It's all about jumping through bureaucratic hoops and wasting your time doing it. Answering those questions won't in any way, shape or form improve the security of your business. It's pure distraction.


> It's very different. Here you're asked to answer fairly detailed and potentially tricky questions with legal implications for your business. This has little to do with how you secure things technically.

The difference is that you know offhand how to do one of these things but not the other.


> Answering those questions won't in any way, shape or form improve the security of your business. It's pure distraction.

Answering the questions is not intended to improve the security of your business, it's a form of serving your customers.


I think the legal implications people talk about are overwrought. Regulators are more interested in chasing down people who openly flout the law than in repressing startups that can't cross their legal t's and dot their i's.

My personal experience with the ICO is that they're quite lenient about mistakes, if you can show that you're doing your honest best and getting better.

There's no point crushing companies that are trying; better off going after the ones that just don't care.


I certainly hope so. I'm not sure about Germany though; I have a feeling they stick much more closely to the rules. (My company is based in Germany, but I'm not German, so it's something of an outsider's observation.)

But it's not even just about reaching the point of being fined or coming under some kind of investigation or audit. It could be all those clever customers who use some automated service or a template just to waste your time ... At least that's what the original post is about, but I hope it won't be too common.


If the law relies on the regulator being in a good mood and being sensible, it is a badly written law. The law should give the regulator strict boundaries, not leave it broad leeway in interpretation.

Around here, regulators are prone to scoring easy points by going after the small, naive fish. All it takes is the wrong incentives: the department needs to show results, so it gives bonuses, or establishes quotas for successfully handled cases. Bam, your small business is now investigated because a government employee needs to meet a quota and correctly guesses you can’t afford competent legal defense.


It's not designed to help you. Improving the security is your job. It's designed to reveal, through your answers, whether you're doing it right or not.


Early-stage startups with just a couple of founders who are too overworked to give a good answer about data protection are probably too overworked to actually protect the data itself, law or no law.

People keep making this kind of argument, but it makes no sense.

Personal data isn't protected from leaks and privacy intrusions by documents or emails. It's protected by encryption, or only being processed by software with a clear purpose, or simply not being stored in the first place.
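To make that concrete, here's a minimal sketch of the kind of practical protection I mean: encrypting a personal-data field before it ever touches storage. This assumes Python and the `cryptography` package; the `db.insert` call and field names are made up for illustration.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # in practice, load this from a secrets manager
    fernet = Fernet(key)

    def store_user(db, email: str) -> None:
        # Only the ciphertext is persisted; the plaintext email never reaches the database.
        db.insert("users", {"email_encrypted": fernet.encrypt(email.encode())})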

I suggest that it is not only possible but also quite likely that a reasonably diligent startup will be taking reasonable practical steps to secure personal data but will not have formal documentation or automated processes in place of the kind that would deal with a SAR like this.


I agree that it is likely that a reasonably diligent startup is generally doing the right things just out of general competence. But I disagree that they are reliably doing the right things.

We expect programmers to write working code out of general competence (and we even make sure they know how to write working code in the interview process), but we still write tests and insist that they pass. We expect finance folks to handle money correctly out of general competence, but we still have written policies about how money should be handled. The reason we do these is that good, well-intended people occasionally make mistakes, and in both of these cases, the mistakes have real consequences.

A written policy about how you handle data isn't going to save you if you're messing up in general. But it should be easy to write, and it will save you from "Wait, why did one of our interns add a library that sends stack traces and local variables to a third party? How did this code review even get approved?"
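As a hypothetical sketch of the cheap guard such a policy points you towards, you might scrub obviously personal fields from an error report before it leaves your servers (the key list and the `third_party.send` call here are made up):

    SENSITIVE_KEYS = {"email", "name", "address", "password", "token"}

    def scrub(report: dict) -> dict:
        # Replace anything that looks like personal data before the report is shipped off.
        return {k: "[redacted]" if k.lower() in SENSITIVE_KEYS else v
                for k, v in report.items()}

    # third_party.send(scrub({"email": "a@b.com", "error": "KeyError: 'id'"}))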

The documents don't protect your users' data. Your general technical practices protect your users' data. The documents protect your general technical practices.


That seems like a reasonable argument, but I can't help observing that when we write code, documentation is generally viewed sceptically because it so easily gets out of sync with the actual behaviour of the system. Automated tests have become a more trusted check on whether code is doing the correct thing, because they aren't vulnerable to that same effect, but there doesn't seem to be any direct equivalent in this context.

So I think I would still argue that the security benefits of this law in terms of any documentation and processes it requires are at best unproven, and that a startup could be doing the practically useful things needed to protect personal data regardless of how compliant or otherwise they might be with any documentation requirements.


I wanted to send you an email or a twitter DM, but your HN profile doesn't list contact info. (I'm anonymous because I am a moderately visible figure in the tech community and don't want what I say to result in my company getting flamed.)

I wanted to tell you how impressed I am with how patiently and clearly you've responded throughout this comment section.

I likewise think the intent of the law is admirable: prevent future Equifax-es, give people control over their data, and centralize the requirements so that companies need to comply with a single EU standard, instead of 28 country-specific ones. But the amount of discretion left to regulators and the lack of any sort of proportionality built into the law make this all very scary. We are expecting a fifteen person small business to have a totally impractical degree of _documentation_ and _formal_ processes, which are 1) very expensive to produce, 2) totally unnecessary for an otherwise reasonable and well-intentioned group of people, and 3) crucially, basically orthogonal to actual data privacy and security best practices.

And even if you comply with the letter of the law, just reading and understanding an email like the one in this post will require hundreds of dollars of company time – beyond reading it, it will need to be escalated, someone will need to loop in a few other people to help with any new technical details, and so forth. If the fully-loaded cost of a white collar employee is $75/hr, this all gets expensive very quickly, and that cost can be levied on a company by an email that can be sent in one minute. Nobody is going to bring down Google with GDPR-spam but it would not be hard to do serious damage to a company of ten people.
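Rough numbers only, as a back-of-the-envelope check (the headcount and hours are my assumptions, not figures from the post):

    hourly_cost = 75       # fully loaded cost of a white-collar employee, $/hr
    people_involved = 3    # whoever reads it, escalates it, and digs into the technical details
    hours_each = 1.5
    print(people_involved * hours_each * hourly_cost)  # ~$340 for a single email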

There are a lot of well-meaning thoughts in this thread from people who are frustrated at the status quo but unfortunately don't understand how little this law will do to change it and how huge its costs will be.

When you try to deliver a novel product and build a business around it, you are forced to develop a strong sense of practicality and an understanding of the machinery of a business. Most people have never done this. Despite being very intelligent, a lot of these people haven't experienced the realities of creating a business, and as a consequence they don't really understand just how harmful this kind of law can be.

I admire how patient and articulate you are. (And I think your thoughts are clear and your point of view is correct and badly needed.) Would love to buy you a beer sometime.


Couldn't agree more. Not only can I totally relate to everything Silhouette was saying, but he/she presented their thoughts calmly and thoughtfully, even in the face of quite blatant trolling in a few instances.

Since Silhouette (and gdpr_throwaway) want to keep their anonymity, I opted for virtual beers by upvoting :) But happy to convert those karma points to real food or drink -- and hopefully an insightful conversation -- if you feel like getting in touch (my details aren't so private).


Thank you, that's nice of you to say. The ability to contribute honestly to this sort of controversial discussion is exactly why I have a pseudonymous account, so sadly I won't be able to take you up on that beer, but I do appreciate the thought.


I disagree with you about the burden that GDPR places on a company. If a company takes data protection seriously, handling such a letter would be a matter of minutes, because they already have the processes in place. The GDPR is almost two years old now, and it's essentially an update of the Data Protection Directive, which has been in place since the mid-90s: nobody should be caught by surprise by now, except companies that deliberately decided that making sure you're compliant with the law is something to ignore right up until the cops are knocking at your door.



