> That was 11 years ago.

The problem is, these kids are from college. They don't teach you stuff like "writing a secure web application" in college, or even try to.

(Not that this is unreasonable, though perhaps I'm suggesting that there should be different career paths for CS majors and for people who intend to be professional programmers. (I say this as a CS-educated professional programmer.))




I've always thought it would make more sense for CS degrees to be for computer scientists (i.e., people who want to do more high-level theoretical work), and for software development to be more of a trade-school path, where you learn the languages and are soon thrown into real-world-style projects and apprenticeships.

Imagine if your nurse came out of college having never set foot in a hospital, having only read about how to take vitals and such, but never having done it on a live human being.


I agree with the sentiment, but it's important to note that excellent programming requires some pretty high-level theoretical understanding of Computer Science (e.g., algorithms). There's a slow way to do everything and a fast way to do some things; programmers need to understand the theory behind this. In addition, if you're trying to teach someone how to write secure code, they're going to need at least some understanding of crypto, which has theoretical aspects as well. Basically, once you keep all the parts of theory that are useful to programming, there isn't much extra stuff left; the things you could remove only amount to about one theory course in your standard competitive Computer Science curriculum.

The real reason we have trouble producing quality programmers is more fundamental: we just don't have enough people who are good at teaching it. It's not because we "waste" time on one or two courses on some theoretical aspects.

Another thing to consider is allowing Computer Science majors to opt out of general education requirements in favor of more programming classes. Allowing this would free up a semester or more for most undergraduates (as opposed to the one course saved from cutting theory). Even with just average teaching, a semester can make a big difference.


That's why professional programmers should get a B.S. in CS and then an M.S. in software engineering (or a few years of internship experience such as that required for licensed architects). Unfortunately, this is one of those things you can't say because it increases the opportunity cost of a programming career.


Or a professional programmer could just get a job and learn software engineering that way.


...by getting scolded by more senior programmers for playing amateur hour and allowing authenticated, unauthorized actions?


The theory is that they know you're entry-level and do some mentoring, code review, etc. to teach you about those sorts of things. That sort of depends on you getting a job with a good company, though.


Cooperative Education (http://en.wikipedia.org/wiki/Cooperative_education) is another great option. I think it gave me a huge leg up on the students who didn't participate.


We're conflating a lot of different professions here. Phlebotomists only have trade school, but they're still exposed to some pure science - at least biology, and likely chemistry as well - in high school.

Nurses and especially Doctors have years of pure science before they're ever allowed into their trade schools.

Likewise there are a variety of software careers, from sysadmin to developer to architect, that require varied levels of education (though, much like nurses, developers can only benefit from a better understanding of the principles behind their art).


>Nurses and especially Doctors have years of pure science before they're ever allowed into their trade schools.

This is an (unnecessary) North American tic, whose pernicious influence is spreading.

Medicine has traditionally been an undergraduate degree in Europe and in all former European colonies aside from the US and the countries in its cultural sphere (like S. Korea, which got rid of its undergraduate medicine degrees). My cousin started his medicine degree at 17; it will take him five years. It's not like this is even unknown in the States - IIRC UCSD has a runaround where you earn a bachelor's degree while doing an M.D.

And even in the US there are different types of nurses, some of whom went to college and some who didn't (LPN, RN, and Nurse Practitioner). I understand demanding continuing education and testing to ensure competency, but college is one means of doing that, not the only one.


And demand for doctors is kept artificially high via a small number of available med schools[1], incentivizing the profession to erect increasingly high barriers to becoming a doctor.

[1]: http://www.nytimes.com/2010/02/15/education/15medschools.htm...


I've always felt the problem is that class projects never have to be production ready. You're expected to show that you've learned and can implement the theoretical material, but you never have to turn that proof of knowledge into a complete, well tested project.

That's one of my biggest regrets about my CS degrees (BS/MS). I took so many classes and did so many class projects, but I never had to see one project all the way through to a complete, tested, usable release.

And I don't think it's a CS department's responsibility to expect class projects to be built to a shippable standard. However, I do think that it's the school's duty to encourage students to work on a real, production project, of their own creation or as a contributor, on their own time - even if it means taking on a reduced academic load.


> of their own creation or as a contributor, on their own time - even if it means taking on a reduced academic load.

But that's why the apprenticeship model would work so well. I've always thought the way union electricians are trained (a four year apprenticeship, which includes a lot of practical and theoretical schooling, until becoming a journeyman) would be a great fit for software development.


In Austria there are Höhere Technische Lehranstalten (HTL, http://en.wikipedia.org/wiki/H%C3%B6here_Technische_Lehranst...) which offer vocational education (but the whole educational system is very different from the US and other countries, which is reflected in the problems faced in the conversion to the Bologna system, i.e. bachelor/master etc.).

Over 5 years, students (usually aged 14 when they enter the school) receive practical training in programming as well as a theoretical foundation, though in no way as thoroughly as in any university program. In the first year of the informatics branch they let students enjoy the beauty of programming linked lists in C, which is quite tough for many.


As an ex-student, I would still rather take a CS degree, even though I was in it for the software development. You end up learning about software development anyway, in the beginning of your career, so it's not a good use of your time to devote yourself to this in college.


> You end up learning about software development anyway, in the beginning of your career

...and that's exactly what people are complaining about: the Diaspora devs learning about software development practices in the beginning of their career, while writing code for release.


I wouldn't really call CS degrees "high level theoretical work". Plus, I blame the professors.

I see Master's level students all the time who don't even know the basics of programming. That's just not acceptable and gives the university a bad rep.


Depends on the course - the one I did was heavily theory/maths based. We had to do a lot of development, but we weren't taught that much about good development practices - those kinds of practical issues are pretty much orthogonal to CS and are much better picked up in a practical environment anyway.


If you need a college professor to tell you "Check that the resource is owned by the logged in user requesting a change before you change it", you might never, ever be a good programmer no matter what. This was apparent to me when I'd been doing PHP for like 6 months. These issues are common sense.
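To be concrete, that check amounts to a couple of lines in a typical Rails controller. This is just a minimal sketch with made-up names - Photo, user_id, and current_user are assumptions for illustration, not anything from the Diaspora codebase:

    # Hypothetical controller -- illustrative only, not Diaspora's code.
    # Look the record up, then verify ownership before acting on it.
    class PhotosController < ApplicationController
      def destroy
        photo = Photo.find(params[:id])

        # Refuse the request unless the photo belongs to the logged-in user.
        unless photo.user_id == current_user.id
          head :forbidden
          return
        end

        photo.destroy
        redirect_to photos_path
      end
    end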


My first web app had authorization. I didn't even go to school for CS.

Simply put, they focused too much on trendy tools and libraries like MongoDB and CarrierWave and neglected the basics.


First of all, this type of thing, preventing users from just changing around URLs, shouldn't need to be taught. It is pretty common sense. When you make something like this, if you don't wonder: "Gee, what would happen if somebody changed photo_id=123&delete=true to photo_id=124&delete=true, would it delete photo 124?" then I'd have to say you aren't a very curious individual. That likely doesn't bode well for your programming prowess.

Validating user input is probably the first thing you learn about web application programming, which is frequently taught at universities, or in books titled "web application programming" that you should at least skim if you're going to start a project like this. Don't blame college for this. Just because something isn't focused on in college (it is, though), and they went to college, doesn't mean it was college's fault. Would it be fair to blame college for any other mistake they made, as long as college didn't "focus" on it? No. Some things are common sense.
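For illustration, here's roughly what handling untrusted input looks like in a current Rails app - a hedged sketch with hypothetical model and attribute names (it assumes the user model has_many :photos), not anything from the Diaspora codebase: scope lookups through the logged-in user, whitelist which parameters can be mass-assigned, and validate values at the model.

    # Hypothetical sketch of basic input handling in a current Rails app.
    class Photo < ApplicationRecord
      belongs_to :user
      # Reject obviously bad values before they reach the database.
      validates :caption, length: { maximum: 500 }
    end

    class PhotosController < ApplicationController
      def update
        # Scoping through current_user.photos means a tampered photo id
        # raises RecordNotFound (a 404) instead of touching someone else's row.
        photo = current_user.photos.find(params[:id])
        if photo.update(photo_params)
          redirect_to photo
        else
          render :edit, status: :unprocessable_entity
        end
      end

      private

      # Strong parameters: only whitelisted attributes can be mass-assigned,
      # so a crafted request can't set fields like user_id.
      def photo_params
        params.require(:photo).permit(:caption)
      end
    end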

Most likely, the culprit was time constraints, which is far more excusable.


What a stupid excuse. I'm a "college kid". I don't neglect simple access control in my software.


> The problem is, these kids are from college. They don't teach you stuff like "writing a secure web application" in college, or even try to.

Bingo. I went to Georgia Tech, which has a pretty damn good CS program, and I had to hunt for security classes. One was a "special topics" course that wasn't available very often and didn't have anything to do with application security (was a Net. Sec. course). The other was not a CS course but a Comp. Eng. course and was focused on penetration testing. :/ I actually earned a "Network Security" certificate with my degree which I never even knew was available (it wasn't mentioned anywhere in the course literature).

Since I've graduated they have redone the whole CS dept so I don't know if things have changed, though.

But like someone else said, a lot of this stuff is common sense, especially if you're a programmer and have systems knowledge. And I think most programmers have the habit of imagining all the different ways things could break while they're coding, too - like a hacker's curiosity that most of us share. I know when something looks obviously wrong on a website or in an application I'm using, I start to poke around and see what I can uncover.


They likely didn't learn Rails in college either.


> They don't teach you stuff like "writing a secure web application" in college

We had a class on it. They basically pushed us through OWASP from front to back :)


I disagree; this should be an obvious security principle: don't let people who aren't permitted to modify a given resource modify it.

I might be able to excuse this since they're fundamentally still in alpha (or pre-alpha) and were rushing to get code out.


"I might be able to excuse this since they're fundamentally still in alpha"

I wouldn't. Authorization is the sort of thing that has to be done first.
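One common way to make sure it actually is done first (again a sketch under assumed names, not the actual Diaspora code) is to hang the ownership check on a before_action, so it runs before any destructive action gets a chance to execute:

    # Hypothetical sketch: centralize the authorization check so it can't
    # be forgotten on an individual action.
    class PhotosController < ApplicationController
      before_action :load_own_photo, only: [:edit, :update, :destroy]

      def destroy
        @photo.destroy
        redirect_to photos_path
      end

      private

      def load_own_photo
        # Assumes the user model has_many :photos; a tampered id simply 404s.
        @photo = current_user.photos.find(params[:id])
      end
    end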


Any decent Rails book will have a section on common security flaws and how to avoid them. There are plenty of web tutorials on the same topic. All the flaws in the linked article are very basic and would have been avoided if they'd taken the time to RTFM.



