
I hold Sony primarily responsible for the release of private data, because they ignored basic security practices. Why were health records stored on Sony Pictures servers along with everything else? Why weren't data silos and graduated access in place? I never see any of these corporate officers held to account for their decisions not to spend resources on security. The only people I have any measure of sympathy for are the rank-and-file employees caught in the middle of decisions made by well-compensated executives who never face the consequences of their disregard for anything other than themselves and their own compensation.

I have to take issue with "norms" for intelligence services as well. These are groups with no morals or ethics; what makes you think they would ever adhere to any sort of "norm"? These are criminals, and criminals do not adhere to norms imposed by anyone other than themselves.




I seem to be in the minority on Hacker News, but as someone in the professional computer security field I know that any company or state/department/organization can be hacked by a motivated attacker. In the case of SONY, the attackers were able to enter the network through spearphishing emails - something that essentially no investment in security is going to prevent. The malware similarly could not have been detected, as signatures for this specific compilation were not known.

I have a hard time blaming the victim of a cyber attack that would have been practically impossible to prevent. I agree that SONY made bad decisions with regard to its hoarding of unnecessary data, but I also recognize that this is hardly unique to SONY, and that minimizing data retention is not standard advice given by security professionals (it should be).

Norms are important so that you can accuse 'groups with no morals or ethics' of doing something wrong. Norms may only discourage rather than prevent behavior, but without norms it's difficult to find common ground for behavior that might otherwise be chalked up to 'culture' or 'tradition' or 'nature'.


> but as someone in the professional computer security field I know that any company or state/department/organization can be hacked by a motivated attacker.

You seem to give Sony too much credit, and also forget that they had a file server with open internal access which had a directory called "Passwords" which contained a plain text file with all the credentials to their internal servers.

That's something I'd expect to see at some small business with no professional IT on staff... certainly not from a multi-billion dollar company with thousands of employees and a full-time professional IT staff.

Sure, the attackers may very well have spearphished their way inside, but once inside, they didn't have to go through any of the normal hassles of island-hopping with more exploits. They just logged in like they belonged.

Motivated attacker or script-kiddy, once inside, Sony made it awfully easy.
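To illustrate why a directory literally named "Passwords" is such low-hanging fruit: a few lines of Python are enough to sweep a share for files whose names suggest stored credentials. This is a hedged sketch, not Sony's actual layout — the filenames and patterns are invented for the demo.

```python
import os
import re
import tempfile

# Filenames that commonly indicate stored credentials (illustrative list only).
SUSPICIOUS = re.compile(r"passw(or)?ds?|credentials?|secrets?", re.IGNORECASE)

def find_credential_files(root):
    """Walk a directory tree and return paths whose names look like credential stores."""
    hits = []
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            if SUSPICIOUS.search(name):
                hits.append(os.path.join(dirpath, name))
    return hits

# Demo against a throwaway directory structure (stands in for a file share).
with tempfile.TemporaryDirectory() as share:
    os.makedirs(os.path.join(share, "Passwords"))
    open(os.path.join(share, "Passwords", "servers.txt"), "w").close()
    open(os.path.join(share, "budget.xlsx"), "w").close()
    print(find_credential_files(share))
```

The point is that defenders can run the same trivial sweep attackers do; if your own audit finds it, so will they.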


> You seem to give Sony too much credit, and also forget that they had a file server with open internal access which had a directory called "Passwords" which contained a plain text file with all the credentials to their internal servers.

FWIW this is my experience with multi-billion dollar companies with thousands of employees and full time professional IT staff.

Perhaps we can get other security professionals to chime in.

Once you get a foothold in a corporate environment, it is the unfortunate truth (I'm sure others will back me up here) that it is very easy to move around without 'island-hopping with exploits'. In my experience, pivoting by passing the hash will work on 99% of networks.
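To make the pass-the-hash point concrete, here's a toy model of an NTLM-style challenge-response login. Real NTLM uses MD4 and HMAC-MD5; this sketch substitutes SHA-256 for clarity, and nothing here is Sony-specific. The key property it demonstrates: the client proves knowledge of the stored *hash*, not the password, so a dumped hash is itself a usable credential.

```python
import hashlib
import hmac
import os

def password_hash(password):
    # Stand-in for the NT hash (real NTLM uses MD4 over UTF-16LE).
    return hashlib.sha256(password.encode()).digest()

def client_response(pw_hash, challenge):
    # Note: the response is computed from the stored hash alone --
    # the plaintext password never enters the protocol.
    return hmac.new(pw_hash, challenge, hashlib.sha256).digest()

def server_verify(stored_hash, challenge, response):
    expected = hmac.new(stored_hash, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

stored = password_hash("hunter2")
challenge = os.urandom(16)

# Legitimate login: the client derives the hash from the password.
assert server_verify(stored, challenge,
                     client_response(password_hash("hunter2"), challenge))

# Pass-the-hash: an attacker who dumped `stored` from one compromised
# machine can authenticate anywhere that hash is valid, without ever
# cracking the password.
stolen_hash = stored
assert server_verify(stored, challenge,
                     client_response(stolen_hash, challenge))
```

This is why defenders now treat hashes as credentials in their own right (unique local admin passwords, credential tiering), not just as something to protect from offline cracking.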

It is also my understanding that the malware that was purchased for this compromise had the capability to persist across the network, to exfiltrate data, and to sabotage computers.


> the attackers were able to enter the network through spearphishing emails - something that essentially no investment in security is going to prevent

I'd challenge that assertion. Employees are often the first line of defense for any company, whether that means spotting something suspicious or knowing when to alert the right people. Investing in phishing-awareness training can be very worthwhile, or at the very least a strict company policy can help ward off the basic forms of this attack.

It's not uncommon to have a company-wide policy that users are not allowed to open attachments in any email from anyone without IT's approval. It's inconvenient, sure, but it protects against multiple email-based attacks (everything from simple viruses to more advanced phishing attacks).
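That policy is simple enough to express in code. Here's a toy version of the gateway rule — quarantine any message carrying an attachment pending IT approval — using Python's standard email module. Real mail gateways do far more (content inspection, sandboxing), so treat this as a sketch of the policy, not a product.

```python
from email.message import EmailMessage

def should_quarantine(msg: EmailMessage) -> bool:
    """Flag any message with at least one attachment for IT review."""
    return any(True for _ in msg.iter_attachments())

# A plain message passes through.
plain = EmailMessage()
plain.set_content("Lunch at noon?")

# A message with an attachment gets held, regardless of sender.
suspicious = EmailMessage()
suspicious.set_content("Please review the attached invoice.")
suspicious.add_attachment(b"\x4d\x5a\x90", maintype="application",
                          subtype="octet-stream",
                          filename="invoice.pdf.exe")

assert not should_quarantine(plain)
assert should_quarantine(suspicious)
```

The blanket rule is blunt on purpose: it doesn't try to decide which attachments are malicious, which is exactly the judgment call phishing emails are designed to win.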

There's even phishing-awareness training specifically targeted at large enterprises (they send phishing emails to your targeted employees, and when someone falls for one, they get a quick lesson and explanation). [1]

[1] http://threatsim.com/how-it-works/


I have never seen corporate policy on attachments and link-following effectively thwart a spearphishing campaign, and I have been privy to studies done at large corporations before and after phishing-awareness training. The short of these studies is that after approximately a week, employees mostly reverted to their regular habits, and that even during the week of high alert many employees fell for the internal audit anyway.

Then again, this is only from two studies done at one large corporation.

I looked around but could not find any studies or data about the long term effectiveness of phishing awareness campaigns (only PR junk), nor could I find evidence that SONY did not engage employees with these sorts of policies and training. Do you know of any such studies?

Do you believe that #GOP would not have gotten in if there were more strict policies and more frequent training?


> In the case of SONY, the attackers were able to enter the network through spearphishing emails - something that essentially no investment in security is going to prevent.

Investment can make spearphishing much harder. Defense is not always absolute, but about raising the cost for the attacker.


I agree that all security is a cost-benefit tradeoff; this is of course folklore wisdom. The important point in the SONY case is that SONY was not the victim of an opportunistic attack but was targeted specifically. In that light, it is highly likely that SONY did invest in training its employees in corporate policy and security awareness (at least as much as any other corporation).

I have trouble thinking of a cost-effective way that SONY could have prevented #GOP from getting in.

IMO SONY had two failures:

1.) The hoarding of data. Again, I don't think this is uncommon. I would expect to see it at pretty much any company of their size.

2.) The lack of an ability to respond to the APT once it was discovered. This is extremely tricky business, but a critical piece of security. It is common now for businesses to assume that they have been compromised and to build out the capability to recover and isolate issues as quickly as possible. Unfortunately for SONY, all of their data had been exfiltrated out of the network by the time they knew there was a problem.


> The important point in the SONY case is that SONY was not the victim of an opportunistic attack but was targeted specifically.

Amazon, Google etc are specifically targeted all the time. What's different?


Nothing is different if they are also targeted specifically.

The context of the discussion is that SONY, even if it 'increased spending on defense', would have been compromised because it was targeted in a deliberate attack rather than being the victim of an attack of opportunity.

Amazon and Google also get hacked. So do Adobe and Microsoft. So do the DoD and the White House. So do JPMorgan and Wall Street.


> Amazon and Google also get hacked. So do Adobe and Microsoft. So do the DoD and the White House. So do JPMorgan and Wall Street.

The difference between Sony and the other companies you listed is the effort they put into security/technology-defense.

Yes, anyone might be hacked. That doesn't mean you just throw your arms up and let it happen. Sony effectively threw their arms up.


Also keep in mind that so-called "hacking" isn't just digital. Social engineering is involved in many hacks. It boils down to discovering vulnerabilities not only in code, but in people as well.


I agree about the lack of basic security, and that's the reason we have security compliance programs. Security-awareness training, classifying health records as sensitive, and properly segmenting those sensitive records from the rest of the environment are all controls that security compliance prescribes. It took me 6 months to decipher PCI and 3 months to implement it. To others, compliance may seem like a joke, but I felt very confident that I had at least done my due diligence in protecting our customers and employees. I think that's all they can ask and all we can give: 100% honest due diligence.
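Segmentation of the kind compliance prescribes boils down to a default-deny reachability policy between network zones. Here's a minimal sketch — the zone names are invented for illustration, not taken from any real environment or standard.

```python
# Toy model of a network segmentation policy: traffic between zones is
# denied unless the (source, destination) pair is explicitly allowed.
# Zone names below are hypothetical examples.
ALLOWED = {
    ("corp-desktops", "file-share"),
    ("corp-desktops", "email"),
    ("hr-workstations", "health-records"),  # only HR may reach health data
}

def can_reach(src_zone, dst_zone):
    """Default-deny: allowed only if the pair is explicitly whitelisted."""
    return (src_zone, dst_zone) in ALLOWED

# Segmentation holds: a compromised desktop can't touch health records.
assert can_reach("hr-workstations", "health-records")
assert not can_reach("corp-desktops", "health-records")
```

The value of writing the policy down like this is that you can audit it: every path into a sensitive zone is an explicit line someone had to justify, rather than an accident of a flat network.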



