I couldn't disagree more with this entire article.
> the self-taught web developer knows surprisingly little about the web’s underlying technology.
I know very few developers who aren't self-taught. In reality the opposite seems to be true: it's the developers who take a course and never continue learning who stall. Self-teaching is something technologists (even non-developers) absolutely _need_ to do. The article even contradicts itself with the very next line.
> Language-oriented courses cannot cover the complete web stack, and students will end up clueless about what an htaccess file does, or how to restart a Unix daemon, or how the different types of POST encoding work.
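For anyone curious about that last point, the two most common POST encodings can be sketched in a few lines of Python using only the standard library (the field names and boundary here are just placeholders):

```python
from urllib.parse import urlencode

# application/x-www-form-urlencoded: key=value pairs, percent/plus-escaped
form_body = urlencode({"name": "Ada Lovelace", "lang": "en"})
print(form_body)  # name=Ada+Lovelace&lang=en

# multipart/form-data: each field sits in its own boundary-delimited part,
# which is why this encoding can also carry file uploads
boundary = "----example-boundary-123"
multipart_body = (
    f"--{boundary}\r\n"
    'Content-Disposition: form-data; name="name"\r\n'
    "\r\n"
    "Ada Lovelace\r\n"
    f"--{boundary}--\r\n"
)
print(multipart_body)
```

The Content-Type header tells the server which of the two to expect, including the boundary string in the multipart case.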
Then, there's this...
> There was a time when looking things up on Stack Overflow whenever you had a problem just wasn’t an option, and many pieces of software had unreadable documentation, if they had any at all ... That is the environment where hackers thrived, and that’s what we are going back to, sooner or later.
It's almost as if he's nostalgic for the way things used to be, as if things were better when the documentation was terrible and you couldn't look anything up. To all of that, I say good riddance.
> If every time you find yourself dealing with a complex issue that affects multiple technologies your first instinct is to search on Google, you should reconsider your working habits.
I'd actually argue the opposite. If you ever find yourself with a complex issue that affects multiple technologies, your first instinct _should_ always be to google it. Sure, you should understand the problem, but there's no need to make your life difficult just for the sake of "being a hacker".
> I have met many programmers that don’t like to code in their spare time, and that has reliably revealed them to be sub-par developers.
Disagree 100%. Some of the best developers I've met understand very well how to separate work from the rest of their lives. The stereotypical hackers who work all through the night on obscure problems tend to pigeonhole themselves and lose sight of the purpose of programming. They also tend to be the ones who burn out to epic proportions. It's a very unhealthy mindset to have, and an even worse one to encourage in others.
> Disagree 100%. Some of the best developers I've met understand very well how to separate work from the rest of their lives. The stereotypical hackers who work all through the night on obscure problems tend to pigeonhole themselves and lose sight of the purpose of programming. They also tend to be the ones who burn out to epic proportions. It's a very unhealthy mindset to have, and an even worse one to encourage in others.
Yes.
I'd add: anyone smart enough to be considered an expert in any subject will eventually find other subjects to explore. I'd argue this helps them become a better developer, more than coding in their spare time ever would.
"Full-stack" was never a reference to the OSI model.
You can spend your whole career deep in the layer 7 weeds nowadays. A legit "full-stack" technologist will dive into layers 6 and 5 occasionally.
Below that you're talking about TCP/IP, BGP, Ethernet, and fiber. The beauty of the model is that each layer doesn't have to care about the implementation details of other layers, and that's reflected in hiring as well.
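That separation is easy to demonstrate: the sketch below, plain Python sockets over loopback, never mentions TCP segments, IP routes, or Ethernet frames, yet all of those layers are doing work underneath.

```python
import socket

# Application code sees only a byte stream; segmentation (layer 4),
# routing (layer 3), and framing (layer 2) are handled by the OS and NIC.
server = socket.create_server(("127.0.0.1", 0))  # OS picks a free port
host, port = server.getsockname()

client = socket.create_connection((host, port))
conn, _ = server.accept()

client.sendall(b"hello")
data = b""
while len(data) < 5:  # recv may legally return a partial read
    data += conn.recv(5 - len(data))
print(data)

client.close(); conn.close(); server.close()
```

Swap loopback for a host across the planet and the application code doesn't change, which is the whole point of the model.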
Nomenclature aside, some of us really do span layers 1-7. But layer 1 is pretty boring, and there's not a lot of value to add in layers 2-4 honestly, except in extraordinarily demanding environments.
Even the people who think they work at layers 3 and 4 are usually just using layer 7 tools to manipulate configurations for layers 3 and 4.