
I don't think the industry is driven by academia. Plus, no CS degree program worth the paper it's printed on even claims to teach Java. If yours did, I'd ask for a refund.

Typically schools USE Java to teach computer science topics. For instance, my university used it in the intro course, where we learned program flow control and some OO principles, and in our Data Structures course, where it was nothing more than a tool for completing assignments. We used Scheme to explore other programming paradigms, ASM for our architecture courses, C for our OS course, etc. Other than that, we could use any language we wanted for our other courses.

Basically if you come out of school claiming you "learned" Java, or any language, it probably means you are lying or went to a bad school.



All of the 100- and 200-level courses in the CS degree program I attended in the late '90s were taught using C. From what I remember, they didn't "claim" to use any particular language, and if they had, I wouldn't have known what they were referring to anyway.

Java came onto the scene with much fanfare. It was a big thing and all the professors were quick to foist it upon us.

On a side note, it's funny to me whenever anyone says they chose the JVM for speed. It's still ingrained in me that this is not the case. I have to take a second and remind myself that this is not 1998; Java is fast now.


I also remember all of our courses being taught in C (with some C++ for OOP).

Working in a university environment, though, I find it really odd to see the massive number of students in Java courses while IT is trying to limit installation of the JVM on client machines (especially Windows), as it's one of the major infection vectors on campus (another being Flash).

Personally, I learned Java for Android/BlackBerry development (I guess that makes it half dead for me), but I also learned Objective-C for iOS/OS X development. I don't really see myself using either for any other environment, though.

I guess I'm just glad I never really dreaded learning/using any particular language.


You'd be shocked at how many bad schools exist. I graduated from a university that tried to drop "Data Structures and Algorithms" because it didn't translate well to the workplace, and that university is considered okay for Computer Science in the UK. I ended up taking the course, but a lot of the other students in my class weren't happy. We even complained to the British Computer Society, since they accredited the Computer Science degree, but nothing came of it.

My university pushed Java down our throats whenever they could. We also spent a lot of time with Prolog and C, but Java was taught as if it were the perfect language for anything a developer could possibly want to do. I joined a Master's degree programme at a top-ten university in the UK and was blown away by the difference. The facilities were no better, the lecturers no better, and the students no better. The only real differences were the curriculum and the administration. At bad schools the curriculum is focused on raising hiring numbers, and the administration deals with so many kids that they couldn't give a shit about their needs; at good schools they stay true to the subject, and those running the degree programme do a good job of making things run smoothly.

I would say that students from bad universities probably end up just as good at programming as those from good universities. The big difference is how they cope when they leave their comfort zone. I'm a .NET developer, and I've seen some students brought up on Java really struggle, which is fairly shocking when you consider how similar Java and C# are (see the sketch below). A good student will adapt either way.
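
To make the similarity concrete, here's a minimal sketch: a hypothetical Counter class (my example, not from the thread) written in Java, with a rough C# equivalent of each line in the comments. Almost everything maps one-to-one; the differences are mostly casing conventions and library names.

    import java.util.ArrayList;
    import java.util.List;                  // C#: using System.Collections.Generic;

    public class Counter {                  // C#: public class Counter
        // A growable list of items; C# would use List<string> from the BCL.
        private final List<String> items = new ArrayList<>();
                                            // C#: private readonly List<string> items = new List<string>();

        public void add(String item) {      // C#: public void Add(string item)
            items.add(item);                // C#: items.Add(item);
        }

        public int size() {                 // C#: public int Count { get { return items.Count; } }
            return items.size();
        }
    }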



