Hacker News

There's lots of money in computers, which has increased the proportion of graduates who know only how to add an IDE to the factory install of their computer. I saw this a lot back in my desktop support days.


This brings up another interesting point... "back in my desktop support days."

I had a few jobs right out of college (phone support, QA (before automation was much of a thing), sysadmin, back to phone support) before I got a programming job. That was a good two years of learning the other parts of a computer. It was because I knew Perl that I was able to make the sysadmin-to-web-programmer transition.

That wasn't too unusual of a path back in the mid-late 90s.

A good chunk of the developers over in the engineering department, and nearly everyone in the programming wings of IT and customer support, had done tech support or system administration. It was just part of "working your way up" within the career path.

Many of my classmates from college went to "the computer guy" jobs, doing all of the computer stuff for a small company (hardware, software, helping with Excel, writing programs to serve dynamic text to the web server). Being a computer person back then was a much more generalized skill set.

As an aside, I was recently tapped on my team (Java developers) to do work on an older LAMP website. I had to brush up on Linux from the past decade (my package-management knowledge was another decade out of date)... but being able to do the sysadmin work, read Perl (I know of one other Perl coder in the department, but he doesn't have any sysadmin background), and get it running was a skill set that didn't exist much of anywhere else.


I did telephone support for people installing Freeview boxes in the UK in the early 2000s. Same skills. Same skills as programming your VCR without the manual. Same skills as getting Doom to work over a direct PC-to-PC connection. None of that crap worked the first time. We had to learn how to debug to get basic stuff working.

In some ways computers are too usable these days. In other ways the experienced programmers are too sloppy and put in too many barriers to get their tools to work flawlessly in the modern world. We don't live in the tinker world any more.

But ultimately to be a good dev you must learn to debug.

Hard problem to solve.


I'm going to reference it again, since it's one of my favorite essays on programming: How To Be A Programmer - https://github.com/braydie/HowToBeAProgrammer/tree/master/en

The first skill listed in the first section is "Learn to Debug" ( https://github.com/braydie/HowToBeAProgrammer/blob/master/en... )

> Debugging is the cornerstone of being a programmer. The first meaning of the verb "debug" is to remove errors, but the meaning that really matters is to see into the execution of a program by examining it. A programmer that cannot debug effectively is blind.

I don't think it's a coincidence that that's the very first item. It is indeed the most essential of skills.
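The essay's point about "seeing into the execution of a program by examining it" can be sketched with a toy bug (the function and values below are my own illustration, not from the essay): instrumenting the loop makes an off-by-one visible instead of guessed at.

```python
def average(xs):
    """Buggy average: the loop start hides an off-by-one."""
    total = 0
    for i in range(1, len(xs)):  # bug: starts at 1, so xs[0] is never added
        # "See into the execution" by printing intermediate state.
        print(f"i={i} adding {xs[i]} -> total={total + xs[i]}")
        total += xs[i]
    return total / len(xs)

result = average([2, 4, 6])
print(result)  # 3.33..., not the expected 4.0
# The trace shows i starting at 1: xs[0] was silently skipped.
```

A debugger (`pdb`, or `breakpoint()` in modern Python) gives the same visibility interactively; the point is examining actual execution rather than re-reading the code and guessing.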



