> But there is something becoming more familiar to me as I go back to the wild west of C programming (where very little happens behind the scenes I might point out - no garbage collection here!).
I share this sentiment. I pursued computer science because of the romanticized notion of tracking every bit and byte and talking directly to the hardware. I know I'm not alone given the popularity of old-school fantasy consoles and consumer microcontrollers, like Arduino.
It saddens me to say but if I were starting over today I might not enter the field professionally. Most modern development is gluing components together and wrangling packages. It's so far removed from the machine and the distance is only growing.
> It saddens me to say but if I were starting over today I might not enter the field professionally. Most modern development is gluing components together and wrangling packages. It's so far removed from the machine and the distance is only growing.
I don't know what's sadder: That this kind of programming is becoming a lost art, or that nobody seems to care. I got my first job out of college partially because in the interview, when I was asked to implement some algorithm XYZ in "any language I choose," I went with x86 assembly and nailed it. Well-performing code was highly valued at that company. Knowing how to ensure your code fit into the cache and that your memory accesses stayed on one page was considered a critical programming skill.
Now, outside of a few rare embedded or game engine jobs, nobody seems to give a shit about any of this. It's just "glue these APIs and libraries together and ship the resulting crap as soon as it barely runs." Nobody cares about the size of the resulting executable. Nobody cares about or even knows their code's runtime heap footprint. Nobody cares or even measures how many cache misses they're encountering. I feel like an old grandpa even typing this stuff out.
The worst part is that end users are the ones that end up losing. It's their system resources that are ultimately being wasted.
There's a bit of a renaissance here in Rust, or perhaps this is just my bias. Many people are drawn to the lack of a garbage collector and the efficiency of the runtime code. Lots of packages ship heavily optimized algorithms, even where that level of optimization is unlikely to be strictly necessary for the task at hand.
I think it will remain niche to care deeply about these things, but they aren't forgotten. I'm in my late 30s and find this type of thing more and more fascinating, despite working for companies building criminally inefficient software using Ruby and Python and dozens of microservices.
> That this kind of programming is becoming a lost art, or that nobody seems to care.
I think there are real costs if the level of incompetence adds up over a stack/system. But as a general rule, I don't think this is a sad phenomenon at all. Well, actually, it's fair to feel sad about it, but it should be understood as a form of growth for our society. (Growth is often painful.)
Writing low-level code should be viewed like blacksmithing or any other technology. At one time, it was the cutting edge that enabled new tools to exist. But over time, it's become well understood and abstracted and it's possible for us to build factories that automatically output many of the steel products we need. And now we can mass-produce steel components for more advanced machines that would have been impossible if they had to be forged by hand.
Don't get me wrong: once you go down to embedded you sometimes do need to know, and sometimes it just improves your understanding (and it's kinda cool anyway). But if you do high-level work, perhaps web work, all you need to know is the abstract interface of floats as a data type: what operations you can do, and what is guaranteed about the result of those operations (loss of precision etc.). In other words, why would a Haskell programmer need to know the IEEE FP standard?
You have to understand that floats are built from powers of 2 (including negative powers), and that conversion from decimal is not as straightforward as it is for integers: converting a decimal value often loses precision, contrary to what happens with integers.
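A quick way to see it (TypeScript/JS here, where every number is an IEEE 754 double; the same thing happens with float/double in C):

    // 0.1 has no exact binary representation, so it is silently rounded to
    // the nearest representable double, and the error shows up in arithmetic.
    console.log(0.1 + 0.2 === 0.3);        // false
    console.log((0.1 + 0.2).toFixed(20));  // 0.30000000000000004441

    // Small whole numbers convert exactly, which is why integers never
    // produce this particular surprise.
    console.log(1 + 2 === 3);              // true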
Maybe not everyone needs to know, but it's important in ML, for example. You can hit overflow, underflow, or non-deterministic addition when parallelised, or you may just need to speed up your network.
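For the non-determinism point specifically, a minimal sketch (the values are made up for illustration): floating-point addition isn't associative, so a parallel reduction that changes the summation order can change the result.

    // Summing the same three numbers in two different orders, as two
    // different reduction trees might.
    const big = 1e16;
    const small = 1.0;

    const leftToRight = (big + small) + small; // each small term is rounded away
    const smallFirst  = big + (small + small); // small terms combine before the big add

    console.log(leftToRight === smallFirst);   // false
    console.log(leftToRight, smallFirst);      // 10000000000000000 10000000000000002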
Debugging. If you don't know how your code is translated into CPU ops, you won't be able to recognize the clues that malfunctions almost always provide when code misbehaves. Translators, compilers, libraries, and new code routinely have bugs. Depending on a google search to pin down the source of your troubles often fails, and of course is of no help at all diagnosing your own bugs.
It also depends on your philosophy. I want to master my machine, not be its servant.
Because not caring how FP values are stored and what the implications are (or even not realising that your favorite programming language uses FP by default) is a particularly fast way to become the "minus 10 times programmer".
Storing monetary values as FP is one thing, but I've seen phone numbers and other numeric identifiers (long time ago before PCI-DSS even credit card numbers) stored as FP values, with predictable results.
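The failure mode is easy to reproduce (the identifiers below are made up, but the rounding is real):

    // A 19-digit card-style identifier is larger than 2^53, so the nearest
    // representable double is a different number entirely.
    const id = 4111111111111111111;
    console.log(Number.isSafeInteger(id));   // false
    console.log(id === 4111111111111111112); // true - two distinct identifiers collide

    // Phone "numbers" fail differently: they aren't quantities at all,
    // so leading zeros and formatting are silently destroyed.
    console.log(Number("0015551234567"));    // 15551234567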
As long as you know the implications, you don't need to know how the values are stored.
You may say it's easier to know how they are stored, then you can derive the implications anytime you need them. Maybe that works for you, but most people who I know that got this wrong do actually know how FP values are stored, they are just drawing the wrong conclusions. So better focus on the implications, cause it's those that matter.
I already expressed this in the GP comment, and it's a little shocking to see all the replies that didn't actually pick up on that.
Knowing how the values are stored gives you the "why" behind the practical implications. Another example: roughly half of all the values a "float" can store lie between -1.0 and 1.0. Knowing how those values are encoded in memory tells you why.
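Here's a rough sketch of that "why" (TypeScript, using a Float32Array to look at the actual 32-bit encoding): the 8-bit biased exponent splits the representable magnitudes almost exactly in half at 1.0.

    // Dump the raw IEEE 754 single-precision bit pattern of a value.
    function floatBits(x: number): string {
      const f = new Float32Array([x]);
      const u = new Uint32Array(f.buffer);
      return u[0].toString(2).padStart(32, "0");
    }

    // Layout: 1 sign bit | 8 exponent bits | 23 mantissa bits
    console.log(floatBits(0.5)); // exponent field 01111110 (126) -> 2^-1
    console.log(floatBits(1.0)); // exponent field 01111111 (127) -> 2^0
    console.log(floatBits(2.0)); // exponent field 10000000 (128) -> 2^1

    // Biased exponents 0..126 encode magnitudes below 1.0, and 127..254
    // encode 1.0 and above, so roughly half of all encodable values land
    // in [-1.0, 1.0].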
> Because not caring how FP values are stored and what the implications are (or even not realising that your favorite programming language uses FP by default) is a particularly fast way to become the "minus 10 times programmer".
I don't think that's true at all. You're merely looking at a symptom of someone who is intrinsically a negative performer. But that's rather like assuming that someone with a cough has tuberculosis.
Guilty. I'm pretty good at my job, but I never learned CS fundamentals and don't work in a context where this matters (I currently work in front-end JS/TS).
I don't think it's worthless knowledge to me, but I think you're a bit blinded by your own context. I have seen many comments on HN bemoaning "Developers who don't even know X", and X is always something different.
I don't think there's a smoking gun for bad developers, and it's weird to me when programmers think there is one, tbh. Everyone has gaps in their knowledge, and with something as arcane as programming, it's very easy to have no idea that you're missing some important or "fundamental" piece of knowledge.
Here's one take: good developers learn fast, bad developers learn slow. Software is one of the fastest moving fields, so it takes a quick mind to keep up. Clearly, everyone needs some basic knowledge to program, and equally clearly, there will be gaps in every developer's knowledge. When needed, we fill the gaps as quickly as possible. If someone can learn 10x as fast, maybe they only need 1/10th the knowledge base.
I generally agree, in other languages it suffices to just tell juniors "don't use floating point if you can avoid it". And integer types are less messy to reason with (just need to learn the 32-bit and 64-bit ranges).
But in your specific case (of JS/TS), _all_ the normal numeric variables are in IEEE 754 floating point...
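Concretely:

    // Every ordinary number in JS/TS is a 64-bit IEEE 754 double, even
    // when it looks like an integer.
    const n = 3;
    console.log(typeof n);                // "number" - there is no separate integer type
    console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991 (2^53 - 1), the largest
                                          // integer that is guaranteed to be exact

    // Bitwise operators are the closest thing to an exception: they coerce
    // the double to a 32-bit integer first, which can silently change the value.
    console.log(2 ** 31 | 0);             // -2147483648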
Not to mention most of my time as a "Developer" is spent managing busted CI/CD pipelines, container crap, whatever cloud-yaml things, fighting broken dependencies/build issues, cargo-culting test suites.
I program way more on weekends/nights on toy/tutorial stuff than I do at work, ironically.
> Most modern development is gluing components together and wrangling packages. It's so far removed from the machine and the distance is only growing.
That’s actually what I love most about things today. Hard components I might want to use probably already have a ready-made library I can start with. Time from inception to prototype to production is so short that it really removes a lot of the tedium. I wish it could be even shorter, but a lot of the time, when I’m writing code, it’s because there isn’t something that meets my specific requirements. I wish AI could fill that gap, but it feels like we’re an extremely long way away from that. In other words, the part I enjoy most is the high-level problem solving, or even coming up with what the new requirements might be to solve a problem. The mechanical aspect of realizing the vision can be fun, but it can also be quite frustrating / repetitively tedious.
Same for me as well. I don't want to talk to hardware because the ideas I have in my head are so removed from that problem space. I want to build tools to help normal people do stuff or create automation. I fully understand and appreciate that to make all of that work, the low level stuff also needs to be fast and functional, but I'm not the person to make that optimization.
I really enjoy being a web developer despite most of us being the new butt of the so many programmer-centric jokes.
But does the cut-and-paste model of coding scale to 30+ years of developing code? I don't know of any problem domain that's so deep and compelling that I could stay engrossed in solving high-level tasks for that long. Either I have to change domains (as I have maybe 4 times in my 37 years), or vary my routine by occasionally diving into the nitty gritty of low-level code and O/S services.
Being a _user_ of code doesn't appeal to me at all. I work at a big pharma and know lots of biologists and especially chemists who are proficient programmers. They use code (more than craft it), but they're impassioned by the science itself. Coding is merely a tool to them, the means to a more compelling end.
I don't share their perspective, nor do I want to compete in that space, so I get my jollies by learning the info extraction process and diving into the cool, underserved, often complicated parts, like image quantification and pattern enhancement/recognition in raw and dirty data. That often requires some math and some low level bit twiddling, in code and in signals. I can't imagine cutting and pasting my way through that world, nor would I ever want to.
> I can't imagine cutting and pasting my way through that world, nor would I ever want to.
Personally, I don't do that but I do work with a few people who don't really see a problem with that workflow. I agree with you that it is ultimately flawed since you aren't really learning anything except how to put things together. But we both know what happens when you reach the limit of this workflow, as evidenced by no/low-code tools.
I much prefer to write my own business logic and interface with tools that let me express requirements succinctly and cover the most common pitfalls. If/when those tools are no longer good enough, I have an opportunity to write something tailor made to the problem space that meets the current performance or design needs.
Right now my favorite stack is Laravel, Vue, and Tailwind. Other than Vue, I have very few JS dependencies and I like it that way!
The world I want is where I can build an entire computer by myself from HW synthesis to SW by just telling an AI high level description of how I want the software to be written. Having to domain shift my entire focus is horribly annoying.
When I say cut and paste I mean “import cutting edge compressor” or “import cutting edge probabilistic filter” as building blocks of building a new piece of software.
To be clear, I want to be able to program hardware too. I’m talking about end to end system design. Building databases and operating systems that work drastically different from today. It’s really hard (and insanely expensive) for one person to realize a vision of a wide ranging new way of doing things.
NPM is fine but too many people add what should be 20 straightforward copy-pasted lines as an external dependency (lodash being an extreme example).
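To make that concrete, here's the kind of thing I mean (a hypothetical in-house version of lodash's chunk, not the library's actual source):

    // A few lines of project-local code instead of another dependency.
    // Matches lodash's chunk for the common case.
    function chunk<T>(items: T[], size: number): T[][] {
      if (size < 1) return [];
      const out: T[][] = [];
      for (let i = 0; i < items.length; i += size) {
        out.push(items.slice(i, i + size));
      }
      return out;
    }

    console.log(chunk([1, 2, 3, 4, 5], 2)); // [ [ 1, 2 ], [ 3, 4 ], [ 5 ] ]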
It's not even new. I've been doing this professionally for 15 years, and the "web development isn't real development" attitude has been common since at least when I started. If anything, it was worse back then.
> I pursued computer science because of the romanticized notion of tracking every bit and byte and talking directly to the hardware.
Isn’t computer science mostly math? Figure people in computer science would be more at home with the abstract stuff like Haskell where you can’t even see the underlying hardware and your programs are expressed as a bunch of (declarative) functions where sequential execution of instructions is a very small part of your programs.
The most unfulfilling days for me are when my dependencies seem to fall over.
I have a project that only builds in one version of Visual Studio on my machine. I know it's bad, I know I should get to the bottom of it before it becomes a real problem, but it's soul crushing work.
Interestingly as frustrating as this sometimes is I tend to become obsessed with not letting the computer win. So it feels much less like work than some kind of mythical quest with a very shallow plot.
> The most unfulfilling days for me are when my dependencies seem to fall over.
I tend to pick only stuff that is packaged in a distribution, if at all possible. They usually do a decent job of gatekeeping the countless libraries created by people who have no idea how to make a library.
'System' languages like C / C++ / Rust are a natural fit for hobby programming because you're writing something to run on your machine, to scratch your particular itch. All the better if that language is able to run on a little microcontroller and move an actuator or what have you.
I say that to contrast with 'application' languages like Java, Node.js, or whatever. I'm not saying hobbyist stuff doesn't get written in those, but you're a lot less likely to see an OSS Spring or React app on someone's GitHub than some cool little command-line program that does one small thing.
imo, self-contained components are the same as bits and bytes.
To me, the draw of programming is that it's both a logical puzzle and a constructive endeavor; it brings me a lot of joy to be able to see my stuff work. Whether that's manipulating memory bytes or using a package someone else wrote to do something I want, there's little difference in terms of satisfaction.
I've noticed the same transition away from wrenching your own low-level code to making high-level calls to routines written by others. Maybe that's partly because my employer (a big pharma) tends to attract either IT types or bench scientists, but few who are pure computist CS types inclined to get their fingernails dirty with bit twiddling. After 17 years working in biomedical R&D, each time a low-level CS topic happens to arise, I'm surprised again by how much it resonates with me and attracts my rapt attention. Alas, because pharmas seldom hire CS grads, there seem to be few with whom I can share my glee.
Or maybe today's CS grads speak a different language too.
My entry to the field was a bit odd. I’ve been a hobbyist since I was a kid, but I never wanted to enter the field professionally, at least not the way I did. I knew I wouldn’t like it.
But my immediate post-high-school plans fell through, and I spent a year working crappy jobs until I eventually figured, “you know, I bet I could get a job as a software developer if I tried.” It would surely pay the bills a lot better.
Nowadays I mostly tell people: I didn’t get into programming for the money, but that’s sure as hell why I entered the industry.
Of course, it looks like my little stint is over, but I don’t know what to do now. On paper it’s the only thing I’m qualified for that’s not unskilled labor.
I stopped caring about industry trends and hype years ago. I was able to get away with it by coasting on what I did know for years, but otherwise I’ve been very narrow (not really true, but in the context of the job market it is). Funny enough this was somewhat on purpose as I was planning to leave anyway, but I thought I had a couple more years than I did.
On top of that, the jobs I still see seem to want increasingly obscure specializations I don’t have. I haven’t had a job for over half a year, and there doesn’t seem to be any sign of that changing. The recruiters dried up a while back and I’ve only had sparse interviews, which have been some of the most antagonistic I’ve ever had.
It’s not any particular “type” of company either. Both the tech profit center companies and the tech cost center companies don’t want me.