Anecdotally, what I've seen is that older programmers, in addition to being programmers, also have the skills of a computer helpdesk technician. They can debug Windows problems, hardware problems, know how to mess with the BIOS, and have no problem installing or configuring any piece of software. This helps immensely when setting up your IDE or debugging weird problems with compiling.
Younger programmers seem to commonly have zero knowledge in this area. If they run into some problem with their environment, they are completely stuck. Oh, they're asked to install a VPN client but it doesn't work and they have to debug it? They're completely blocked. They have no idea how to continue.
For me, I learned these basic computer skills first (because I was obsessed with computers as a child). Then I learned to program.
I think this is a side effect of teaching programming as a career skill. You have people that want to be programmers but don't include "computing" as one of their hobbies. So of course they don't know anything about how to debug computer problems. This is totally expected.
Teaching these skills in CS curriculum would probably be a good idea.
I agree completely --- being one of the "older programmers" --- and think it could be phrased thus: a new generation of "coders" is being forced to learn how to write instructions for the machine, when they are barely computer-literate.
Having taught some introductory CS courses a few years ago, the amount of struggling with basic things like file naming, directory hierarchies, and even simple hardware use was a constant distraction.
On one memorable occasion, I had to help a student by turning on the monitor: he would sit down at a workstation, move the mouse and tap the keyboard, and, seeing nothing show up on the monitor after a moment, go to another unoccupied one. I watched, without intervening, as he tried this with all the unoccupied workstations in the lab until he finally came to me and said, in an exasperated tone, "all the computers left are broken!" I led him over to one, and upon pressing the power button on the monitor, the expression he made was quite unforgettable...
There aren't really any clues as to how a computer works on an iPhone. There isn't even a visible filesystem. It's no wonder students today have no idea how a computer works.
In the bad old days computers weren't widespread, but just by virtue of having one you would learn programming / tech skills. Today computers are everywhere, but owning one doesn't teach you anything, as fundamental functionality is hidden/locked away from the user in the name of being "consumer friendly".
I wonder if those born between 1988 and 2002 will be the last of the so-called naturally computer literate.
This! I'm very concerned that we are growing a population that is more and more monitored by their devices and has less and less understanding of how they work and how, or even whether, they are being monitored.
Aye, I second this. It really pissed me off when a fellow CS student told me he has "experience" from using an iPhone extensively, and that it counts towards computing.
Makes you wonder what computing means to people nowadays.
Not OP, but it's the classic Dunning-Kruger effect: they know so little about how to use a computer that they think using an iPhone is an equivalent skill. This usually isn't a problem for most people, but this person is in CS, where you would expect them to have at least a passing knowledge of computers to join the major.
Again, why? FWIW I have a fancy degree in computer science and spent my childhood rebuilding and scraping together old 2/3/486 boxes. Yet I think it’s terrific that someone would be interested enough to enroll in a CS program without having wintel/Linux boxbuilding trivia knowledge.
The perception of what makes a viable CS major needs to evolve past what those born during the “Personal Computer” era think it should be.
I do agree with that, and I have seen quite a few brilliant programmers born in classes, but there is a point where you start to expect them to know some basic stuff, like what a router, an IDE, and a script are, so that I don't have to explain everything in full just to tell them how to make something work.
I was correcting several statements that mobile devices aren't computers. Lots of "computer knowledge" had to be acquired in the 90s due to the poor design of wintel boxes.
Wait, the previous generation had no phones at all. Do you mean that today's generation use phones instead of computers? In that case, I understand your point.
This reminds me of a story I recently heard where a student at a digital arts college reported they were saving their file inside Photoshop instead of in a folder.
It sounds bizarre - because they were using an iPad/iPhone instead of a computer.
I'd argue something a bit different. Coding is far more accessible today and far more visible as a field one could get a job in than ever before. As such, you have way more students trying to learn some introductory CS. Maybe the proportion of CS learners is going up while the proportion with more technical literacy has stayed the same?
I hope you weren't too smug about knowing to turn on the monitor. I can't remember the last time I've had to do so. Remember, there are no intuitive interfaces, only familiar ones.
I do, but that's mainly because of working with older and/or more professional devices which don't (want to?) consume power even when technically off. But for the rest, yup, doesn't really happen that often.
there are no intuitive interfaces, only familiar ones
Most screens in the office, when 'off', look like a matte black box with a single clearly visible button with a dim light. You can claim it's familiar because nearly every technical appliance has a power button, but on the other hand: I am quite sure if you gave this thing to a homo sapiens or similar, arguably not familiar with a button, it would not take too long before that button got pressed.
Or maybe the student just had a brainfart. People make stupid mistakes sometimes, or fail to see a completely obvious solution. It happens to everyone.
> Having taught some introductory CS courses a few years ago, the amount of struggling with basic things like file naming, directory hierarchies, and even simple hardware use was a constant distraction.
Or to put it the other way: users (future would-be programmers) are being distanced[sic] from how the machines work. Even at the most rudimentary level.
Frankly, I'm not surprised. Interfaces are systematically being dumbed down in the name of usability. Users may be buying powerful computers, but they are using appliances. Pieces of shiny machinery that, to all intents and purposes, work by magic. [Cue A.C. Clarke quote here.]
The immediate downside is that when the interface is dumbed down to the level of non-confusion for the large masses, the features that power users would find appealing are being stripped off.
Curiosity may have killed the cat, but what happens when curiosity itself is being killed?
Familiarity with how computers work arguably peaked during the early-mid PC era. The first computer that I used was an IBM 360. Basically, someone showed me how to punch cards, and how to enter my data. After submitting the job, I'd find out the next day whether it crashed, or completed properly. And after a few tries, I got my results.
But nowhere in that process did I know anything about IBM 360s, or how to program or run them. I more-or-less understood the mathematics involved in the data processing. And I could somewhat parse the Fortran. But I had no concept of anything like a directory hierarchy.
With an IBM XT 286, on the other hand, it was all in my face.
> Familiarity with how computers work arguably peaked during the early-mid PC era.
Now there's an interesting observation. I'm not quite sure whether you're right, but quite possibly.
Looking around my local libraries and bookstores, most of the books that deal with using personal computers do seem to be from the 90s/early 2000s era. The question is whether the decline in these books indicates that they are not needed any more, or that they are not wanted anymore (due to "hidden complexity")?
> With an IBM XT 286, on the other hand, it was all in my face.
Earlier, in the pre-PC age, hobbyists had similar experiences with home computers like the Tandy/RadioShack machines, the ZX Spectrum, the Commodore VIC-20 or 64, not to mention the Amigas and Ataris (and all the others I'm old enough to have forgotten).
Early Macs also let you mess around (and mess up) your system too.
It was an exciting time in computing indeed, but it did in no way start with the PC.
Can't help but wonder whether, in another 40 years, this sentiment will be viewed as absurd, similar to how we'd find hypothetical comments from 35-40 years ago absurd: personal computers being nothing more than calculators or hobbies, and "not a real computer since it came already assembled and someone else already wrote the software!"
We recently got a junior-dev straight out of finishing his CS degree. And he didn't know you could "extend" your desktop and show different things on each side by plugging in a monitor. When quizzed, he told me he's "always just had a laptop", and never a monitor next to it I assume. He was then equally perplexed by the resolution and corresponding font-size change that occurred on the monitor because it could now run its own native resolution rather than what the laptop screen was forcing it to mirror.
I think there is something to be said for what classes don't teach you. You can ace a test but have very little practical knowledge if you don't work with stuff outside of class. I honestly think that you learn 10-50x more in practical knowledge from the organizations that you can join in college than from the classes themselves.
I think that's a side-effect of degrees that focus only on coding or applied math, whereas before the computing industry was as developed as it is now and there was less specialization within it, a CS degree would have covered the basic data structures and algorithms that were common knowledge then and still had plenty of time to cover systems things. My degree wasn't in CS - it included basically a CS minor, but it also focused on network protocols and hardware architectures that most CS students had never heard of. So when I was interviewing for jobs around graduation and was asked to explain everything that happens after someone types "google.com" in their browser, I could explain all the network protocols in detail right down to how binary was encoded into specific 5V fluctuations on certain lines, DNS resolution, routing algorithms, the server's filesystem, etc. But when I was doing a coding problem and they asked me what the invariant was in my loop, I had to ask what an invariant was. And yet I had been coding full time for 4 years by that point and never needed to know that term. I knew stuff CS majors didn't, and they knew stuff I didn't. It was different specialities, each of which really filled 4 years of heavy, heavy semesters.
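(For anyone else who had to look the term up: a loop invariant is just a condition that holds at the start of every iteration, and it's what lets you argue the loop is correct. A made-up Python toy, nothing to do with that actual interview question:)

    def running_sum(xs):
        total = 0
        # Invariant: at the top of each pass, total == sum(xs[:i]),
        # i.e. the sum of everything processed so far.
        for i in range(len(xs)):
            total += xs[i]
            # Invariant restored: total == sum(xs[:i + 1])
        return total

    assert running_sum([1, 2, 3, 4]) == 10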
The famous question is "why are manhole covers round?" Manholes being artificial, this leaves some room for speculation about why the shape was chosen.
"Why is a pothole round?" is a more objective question.
Most manholes are round because that is the most practical shape such that the cover cannot fall in. There are actually other shapes, but they are usually hinged and more costly.
I've never thought about it, but a circle of the correct size has no less reason to not fall in than say a square or triangle of the appropriate size. I.e. if it's smaller than the hole, it'll most likely fall in.
My theory: They're heavy, so making it a circle at least eliminates the need to "align" it when putting it back. You can basically just drag it using a hook and it'll slot itself into place.
But manhole covers are always bigger than the hole. Otherwise, they'd fall in at installation and the problem would be immediately obvious.
The point here is more that a circle has the same diameter regardless of orientation. There's no way to rotate it to make it fit through a hole it couldn't before, like you can do with a rectangle.
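Rough numbers, made up just to illustrate the constant-width point: give the cover a 2% lip over the hole, imagine holding it on edge, and compare its narrowest width against the hole's longest chord.

    import math

    hole = 1.00    # width of the hole
    cover = 1.02   # cover is 2% wider than the hole

    square_longest_chord = hole * math.sqrt(2)  # the square hole's diagonal
    circle_longest_chord = hole                 # a circle's longest chord is its diameter

    # A thin cover held on edge can be slipped through when its narrowest
    # width is less than the hole's longest chord.
    print("square cover can drop in:", cover < square_longest_chord)  # True
    print("round cover can drop in:", cover < circle_longest_chord)   # False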
> In geometry, Prince Rupert's cube (named after Prince Rupert of the Rhine) is the largest cube that can pass through a hole cut through a unit cube, i.e. through a cube whose sides have length 1, without splitting the cube into two pieces. Its side length is approximately 6% larger than that of the unit cube through which it passes. The problem of finding the largest square that lies entirely within a unit cube is closely related, and has the same solution.
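Quick sanity check on that "approximately 6%" figure, using the known optimum side length of 3*sqrt(2)/4:

    import math
    print(3 * math.sqrt(2) / 4)  # ~1.0607, about 6% larger than the unit cube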
Manhole covers aren't flat discs or thin cylinders; they are truncated cones. The diameter of the bottom of the cover is smaller than the top. There is no way for a cone to fall through an opening smaller than its widest end.
Potholes are naturally occurring. For any particular pothole, there is a reason it took whatever shape it did.
The only real constraint operating on manholes is that they fit the opening. Unless you can produce a design document specifying the requirements for the manhole you're looking at, there isn't really a "why".
I agree with your observation, but I don't agree with your diagnosis of the cause. I think it's mostly down to the increasing abstraction and ease of using software. When I was a kid, I wouldn't have described "computing" as a hobby, but just getting certain games to install and run on my parent's cheap Gateway PC would require a lot of fiddling around. My younger siblings have no more or less interest in computing than I did, but if they want to play a game they can seamlessly download and run it from Steam or the app store.
Programming is not harder than it used to be. For example, I was programming Macs in the pre-Internet era which meant lots of thick books. If you got stuck there was nobody who was going to help out and you just had to puzzle it out vs the example in one of the few how-to books like the famous Mac Programming Primer. Also, there was little open source, so you couldn't read the library source to figure out how things were supposed to work. Simple data structures that come with every language these days were something you had to pay a significant amount for or handcraft. If you hit a null pointer dereference, the machine crashed and you had to reboot. This happened fairly often.
Maybe programming isn't, but perhaps finishing something is? Because while the hands-on-the-keyboard, moment-to-moment writing of code is almost certainly easier than ever, actually making a thing out of that is nowhere near as simple as it used to be. Building a nontrivial thing in QBasic was really, really easy. Building a nontrivial web app--whether it's strict server-side PHP or strict client-side JavaScript or whatever--is really, really hard, requiring the consideration of a lot more stuff, and the failure cases are much bigger and much nastier.
I do feel that you have to do some careful thinking and digging to find an appropriately sized project, library, and toolset before you can start working on something you have a chance of finishing.
It took me 5 years of screwing around with Java, Python, and C++ "I want to help humanity" projects before I found that I really should just write small game clones in P5.js for fun. I'm a new stepdad, and if I want to be motivated to work on side projects in my tiny amount of free time, it's going to have to be a game.
Project: 80's game clone.
Library: P5.js.
Toolset: Notepad++, Python SimpleHTTPServer.
You can make a Rails app and run it on Heroku within an hour of starting your first tutorial. You can get an interactive JavaScript environment in any web browser.
I cut my teeth on C++ on lab computers. I was crap at it. The compilation and debug cycle was slow and frustrating. I couldn't Google for answers on anything.
It's easier than ever to start programming, without any question, and so many batteries are included everything else is easier too.
People in some ways do expect more from their software now, and to help deliver all that there are tons of open source libraries available that can feel overwhelming at times, but that's moving the goalposts. Somebody learning to code doesn't need all that just to get something basic and interesting working. If you're tasked with making a functionally equivalent piece of software now versus 20 years ago, it's better and easier in every single way.
> You can make a Rails app and run it on Heroku within an hour of starting your first tutorial.
How long does it take to actually understand what all of that does? I can claim to understand it, sure--but I'm also thirty years old and I've been getting paid for this stuff for the last fifteen. On the other hand, I pretty much fully understood how QBasic worked and its interactions with DOS at age 5. (Not, like, random POKEs into memory to make syscalls, for sure, but I understood why stuff like SCREEN 13 worked, along with the general constructs of the language.) I was able to parlay that into C and C++ and an understanding of the thin virtual machine there, and simultaneously into PHP and Java and how those actually worked, by relating them back to lower-level concepts I'd picked up...
Cargo-cult monkeying about doesn't adequately, to me, replace understanding and insight--and "just make a Rails app and push it to Heroku" is fundamentally cargo-culting. If you don't understand how your tools work, you don't understand what you're doing. (I've written everything from CPU emulators to Rack frameworks to prove that I do understand what I'm doing.) I started doing that literally from the jump as a kid, and the systems were built so that it was something you could easily do.
It's "and" not "or" here. The new easy highly abstracted stuff doesn't prevent people from digging deeper, it's just one more potential entry point.
There is nothing stopping anybody from digging into lower level languages now, in fact that is also easier than ever with open source IDEs, free compilers, fast processors, ubiquitous documentation, and helpful communities too. The mere existence of a totally open source operating system like Linux and its ability to function on commodity PCs and embedded devices is huge.
You can argue (and I believe this is the point of the article) that people don't bother going deeper because for most computing purposes interacting with very high level abstractions of really complex systems is sufficient (and indeed becoming the norm), which leaves users at a loss when asked to grok lower levels of abstractions. It's also true that there is now more complexity and there are simply more possible things to learn, making it less likely that any individual has the time to develop deep knowledge of all of the systems they use.
Those assertions may well be true, but I'd argue they are not the same thing as it being harder to learn anything in particular. It's daunting, maybe, because of the sheer possibility space, but all of the knowledge is out there and the tools are better than ever.
have you breadboarded it out with transistors though? when was the last time you designed a board? when was the last time you laid telephone poles or dug tunnels for wires? designed fiber optics for cross ocean communications? launched a satellite for your GPS?
at some level, you're always going to be assuming the layer before reasonably works the way somebody gave it to you.
It's not cargo-culting if you know where the actual dependence is, and both Rails and Heroku make it fairly clear what kinds of things they are doing. Even if you don't know how they're doing it, you know that without running the Heroku part you can't load the page in the browser, and without the Rails part you don't get any files you can change.
Not really the point; if you know the dependencies but do not know the fundamentals, you can build stuff but when things go wrong and the fix is not the first SO hit, you are going to waste a lot of time on something you have no clue how to even start diagnosing.
I don't agree. It took me multiple years to really wrap my head around building web pages (to say nothing of apps) in PHP. There's just too much h*ckin' stuff in the web that you have to be aware of and being able to build stores of relevant information is definitely in some ways easier--Stack Overflow exists--but in others, thanks to the volume of crap you just have to chew through, it is much more difficult.
Oh, if you're talking about anything on the front end more rich than simple classic forms and full page reloads, I agree. It just wasn't clear what you meant by "non-trivial".
> Programming is not harder than it used to be. For example, I was programming Macs in the pre-Internet era which meant lots of thick books.
Same here, and I was a kid who a) had no idea what he was doing b) couldn't afford to buy Inside Macintosh c) really had no idea what he was doing. But honestly, I achieved more with C and some cargo-culted System 7 (6?) Pascal calls than I have with Go, with respect to UIs. I achieved more with HyperTalk than I have with Common Lisp, with respect to full-fledged application environments. Honestly, HyperTalk was by far the single most productive system I've ever used in my life.
Life was pretty great back then. I can only imagine how much better it could have been with highspeed Internet in my home. Or maybe we'd just be on the dozenth round of Macintosh Toolbox wrappers written by sleep-deprived college students by now.
It was a lot harder to be discouraged as a young adult. Some year you'd feel you were outgrowing your happy QBasic hacking and pick up a massively thick Turbo Pascal book and feel a bit humbled, but that was once in a blue moon. Now you are always one search away from some harrowing design pattern depths or cryptic industry standard source code.
> If you got stuck there was nobody who was going to help out and you just had to puzzle it out vs the example in one of the few how-to books
Maybe, ironically, one of the advantages we had is that these difficulties forced us into a much better learning environment, when the answers weren't all immediately available through Google? Being the one who can self-learn extremely well gives us a big advantage when given a new problem or technology stack.
I think this is definitely the case. Have a browse over Stack Overflow. Most of the questions are asking very specific things and getting very specific answers. Rarely is the context explained properly by either party. People use these questions for future reference, which results in them just trying random stuff until it works. If it doesn't work, they just revert the code and go on to the next question instead of learning why.
On the other hand, nowadays if you think (for example) for web development you need Babel and webpack and npm modules and vue.js, it's easy to give up before you even get to hello world.
Now of course you don't need that to do hello world, but a beginner doesn't know that!
I think you're exactly right. I learned my way around computers as a kid because I wanted to do the same stuff that kids today want to do. Play games, prank my brother, talk to random people online. And I just followed that thread until it got me what I wanted. That thread involved some basic shell programming, learning what IRQs are, the difference between TCP/IP and IPX and what they did, how to setup the modem so call waiting wouldn't interrupt the connection, dealing with the rather technical nature of early IRC, usenet, BBSs. I wasn't trying to learn computers, it just happened along the way, and pursuing the same goals today won't teach you anything along the way.
IRQ conflicts! Holy mother of hell, IRQ conflicts! Particularly when dealing with those early-gen Sound Blasters -- and trying to get that Roland MT-32 you borrowed from the greasy weirdo in your big brother's band hooked up and pumping out that sweet, sweet MIDI.
Recently I had this come up: some RT kernel extension we use would display possible conflicts. Despite programming a lot these days, my background is electronic engineering, so I still have a rough idea of what an IRQ is and does, even at the lowest level. However, I had no luck trying to explain that to a 'pure' programmer.
> "However had no luck trying to explain that to a 'pure' programmer."
Interrupt requests are easy to explain to even a novice computer user if you frame them in the context of something they're familiar with.
For example, when you press a key on your keyboard an interrupt request will be created to make sure the computer knows you've pressed the key. So in simplified terms, you could say when you press the "A" key on your keyboard, there is a part of the computer that says "pause what other things you're doing, respond to the "A" key press, then continue with what you were doing before".
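If they've done any programming at all, a loose software analogy sometimes helps too -- Unix signals rather than real hardware IRQs, so this is only a sketch (and the alarm trick is Unix-only):

    import signal
    import time

    def handler(signum, frame):
        # Stands in for the interrupt service routine that reads the key press.
        print("interrupt: handle the 'key press', then resume")

    signal.signal(signal.SIGALRM, handler)
    signal.alarm(2)  # schedule an "interrupt" two seconds from now

    for step in range(5):
        print("doing other work, step", step)
        time.sleep(1)  # gets interrupted once; the loop then carries on as before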
I mean, it pokes the keyboard controller in order to allow the processor to control an additional bus line to the memory controller. Then, thanks to the 8086's overlapping segmentation architecture, you can access 65519 additional bytes that were never intended to be there in the first place.
There's Cthulhu mythos that is more accessible than that.
That is in fact when the A20 gate was introduced. I suspect the poster above was referring to 80x86 in general and not specifically the 8086. The 8086 (and 8088, 80186) could only address 1MB of memory and only had 20 address lines, while the 80286 has 24 address lines. In real mode the 80286 is supposed to be compatible with the 8086. However, it has a bug where the A20 line is not forced to 0 in real mode, which can cause problems with real-mode software that relies upon address-space wraparound using seg:ofs addressing. So Intel introduced the A20 gate, which could be controlled by software and allowed or prevented address line 20 from being seen on the bus. Enabling it allowed your 80286+ real-mode software access to an extra 65520 bytes of memory.
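The arithmetic behind that number, as a quick back-of-the-envelope (my own illustration, in Python rather than anything period-appropriate):

    # Real-mode addresses are segment * 16 + offset, so the largest seg:ofs is:
    seg, ofs = 0xFFFF, 0xFFFF
    linear = seg * 16 + ofs         # 0x10FFEF, just past the 1 MB mark
    wrapped = linear & 0xFFFFF      # what a 20-address-line 8086 actually emits
    extra = linear - 0x100000 + 1   # 0xFFF0 == 65520 bytes above 1 MB (the HMA)
    print(hex(linear), hex(wrapped), extra)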
I agree. My first computer had Windows 3.1, then soon upgraded to Windows 95, and I was probably worse off because of it. I made Angelfire sites with HTML for a summer but it didn’t hold my interest. I still don’t care for making HTML/CSS websites, but eventually learned I love programming. I firmly believe that if I were required to use command line simply to interact with my computer at that early stage, I would have pursued more programming.
I think there's also an aspect of people's expectations changing. Twenty or thirty years ago, no one expected new software to run on a ten-year-old computer and no one expected Windows software to run on a Mac or vice versa. (Linux was pretty far outside the mainstream.)
Now, we have a proliferation of computing environments, and it's hard to give people generic advice about how to install and use a tool that would apply equally across all those various environments. Browser standards seem to be the one thing that's relatively common across platforms, so a lot of that stuff moves online.
I absolutely agree with your first few paragraphs. As a hobbyist before I was a professional (Linux desktop since '95), I'm surprised when I meet people who have a technical skill but not a hobbyist background. I first encountered it with Oracle DBAs who sometimes were amazing at Oracle but had rudimentary Unix skills, and that's becoming more commonplace as people do things like phone app development but without the background "we" expect (whether rightly or wrongly).
That said, my background as a Sysadmin/Ops person also landed me in my own silo. I don't have much programming experience, and running production systems means NO, the environment does NOT give you access to wide-open CPAN, RPM, pip, whatever downloads. It's a regimented release process where someone upstream has to provide the relevant dependencies and you don't just go around installing stuff, running services, etc.
As a result of my background and not being a developer, on HN I hear of new languages/frameworks/libraries/methodologies on a daily basis. Things that, clearly, a large number of readers here must already be familiar with, even though many of them will be out of fashion in 6 months.
And then the process to install a lot of these things is "oh, just pipe a wget of this script into bash, RPM install this, pip install that, don't worry, it's all quite safe for your system..."
> In many ways, it feels like worlds are diverging.
We are specializing.
Enterprise shops have always wanted stability, and so they develop ever more elaborate build processes to control exactly what versions go into production. Embedded systems have always wanted to run only what needed to run, and so we have developed ever-improving systems to pick out only exactly what needed to be run. Developers, in the meantime, have always wanted the newest and shiniest, and so have gotten 10,000 ways to install newer and shinier toys.
I'm an older programmer. I learned how to build a computer because my parents scraped together enough money to buy a computer when I was young, but weren't wealthy enough to get it fixed when I inevitably broke it. This was back when PCs were friggin expensive, as were parts. Fellow computer friends (we had few back then) would let each other borrow hardware (parts) and whatnot, so we had to learn how to install it and configure it (IRQs, COM ports, etc.), but it was fun, and without the internet! It's what I liked to do in my spare time.
The funny thing is the C64 and VIC-20 were self-contained, for the most part. You didn't have to add cards and swap out processors and all that. All you really had to know was how to put a wad of aluminum foil where your fuse should go when your fuse went out. :)
When the dot-com boom came around, I had to learn not only how to write web sites, but how to host a website out of my house, which involves a lot: a DNS server, mail server, database server, server admin, TCP/IP, security, and more. I still host out of my house today.
I have not seen any age-correlation in what you describe.
For sure there are "career" programmers who don't know much about computers beyond their narrow scope of expertise.
You're admittedly obsessed with computers, which presumably means you spend a lot of your free time tinkering and learning.
You've picked up a lot of very useful knowledge, but surely there's room in a profession for those who are not obsessed with their field? The programmer who doesn't know anything about a BIOS perhaps spends their free time bike riding.
Being a programmer without advanced computer skills is like being a paramedic who can't drive. Sure it's possible and there's probably some niche where such a person can fit in, but definitely career limiting.
I don't really think of programming as a career skill - I'm a hobbyist. But I love it, and I love learning about it, even the really low-level stuff. I like reading about algorithms, and data structures, and architecture.
Computers, on the other hand, I despise. Every piece of knowledge I have to learn about how a specific computer works, not in the sense of ALUs and registers and whatnot, but in the sense of 'if Windows gets a BSOD it's probably the boot usb you've left plugged in', I hate - and only let into my brain with a great deal of teeth-grinding and swearing.
Fundamentally, computers are interesting devices. Using them, however, is generally horrible. So I can see why people get into programming without really taking an interest on the devices we normally use to program on. I think programming would be an interesting activity if you were just doing it with pen and paper. For me, it's really a shame that the best device we have available for it is some kind of crazy-house compendium of horrors and bizarre compromises.
Personal anecdote here. It never occurred to me that people who could program couldn't do some sysadmin stuff to some extent, until I worked a small contract at a company. One day I was demoing something and there was a problem, so I fired up PowerShell to do some command-line stuff... and someone asked "why do you have IT skills?". At first, I didn't even understand the question. I mean, installing systems, playing with VMs, and the like was my daily bread, so I thought it was the same for every programmer...
Another day, I went to the university CS student association's room. I was excited about what I had done the previous day, so I said after entering: "yesterday I installed FreeBSD in a virtual machine and did XX". One person replied by asking what a virtual machine is, and another what FreeBSD is. I explained briefly, then left wondering where the people actually interested in the field were. It turns out I met some of them a few months/years later. Both were mostly self-learners, one with formal training in theology and the other in musicology.
Why is that, exactly? I grew up in that era, I remember when floppies were actually floppy, and I've been well into computers ever since, but I don't believe I could tell you why. If I had to guess, I'd guess that FAT used a fixed number of bytes for each filename.
Yep you got it. My first actual PCs (I don't count the Apricot PCs haha) were DOS 3.3, and I still remember the transition away from 33MB partitions although it's so long ago I can't remember if that came in with DOS 4.0 (abort abort!) or 5. I thought I was so cool having a PC with drives A B C D E F G H I.
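If I remember the usual explanations right (and I may be off on the details), both limits fall out of fixed-width fields:

    # FAT directory entries reserve a fixed 8 + 3 = 11 bytes for the name,
    # hence the 8.3 convention.
    name_field_bytes = 8 + 3

    # Early FAT16 DOS addressed a partition with a 16-bit count of 512-byte
    # sectors, hence the ~32/33 MB ceiling.
    max_partition_bytes = 2**16 * 512   # 33,554,432 bytes
    print(name_field_bytes, max_partition_bytes)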
I don't know about PDP-10s. DEC's PDP-11 operating systems used a character set called RAD-50, short for radix (octal) 50, for filenames and various other purposes.
The (decimal) 40 characters were A-Z, 0-9, and four punctuation characters -- I don't remember which. Three characters could be stored in a 16 bit word, so a 6.3 filename required 3 words or 6 bytes. The '.' separator was not stored, of course.
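The packing works because 40^3 = 64,000 fits in a 16-bit word. A sketch of the idea in Python -- with the caveat that I'm guessing at the punctuation characters ('$', '.', and a spare slot), so treat the table as approximate:

    # Pack three RADIX-50 characters into one 16-bit word: a*40*40 + b*40 + c.
    RAD50 = " ABCDEFGHIJKLMNOPQRSTUVWXYZ$.%0123456789"  # 40 characters

    def pack3(s):
        a, b, c = (RAD50.index(ch) for ch in s.upper().ljust(3))
        return a * 40 * 40 + b * 40 + c  # always < 64000, so it fits in 16 bits

    # A 6.3 filename is nine characters, i.e. three such words = 6 bytes.
    print(pack3("FIL"), pack3("E  "), pack3("TXT"))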
The implementers of CP/M or one of those early PC OSes copied the idea but not the encoding and picked 8.3 as a slightly more usable size. Floppy drives were relatively capacious (!) so bigger filenames were reasonable.
The only reason I know this PDP-11 trivia is that end users could recompile the RT-11 kernel to save space, and DEC distributed the OS sources to enable that. The distributed version was written in assembler and had had comments stripped, so it wasn't quite like reading the Lions book, but you could read it if you really wanted to.
I don't think starting early or on your own is a requirement. Like in any field, intrinsic motivation is what counts (IMO).
I started programming in university at... um... 19? As a part of a physics degree. And fell totally in love with it. 20 years later I'm doing it for living and the love really hasn't faded.
Computer graphics is a demanding mistress, you really need to understand why programs and computers work.
Based on my experiences I would consider myself a pretty OK programmer.
Hi, I'm a good programmer (according to my bosses and peers) and I wasn't initially self taught, CS 101 was the first time I programmed ever. Nice to meet you.
> Teaching these skills in CS curriculum would probably be a good idea.
This is the one thing I don't agree with. A better placement for these skills is a generic course required for any degree, or an optional course that anyone could take to shore up these skills (and you could recommend it for your CS major). Importantly though, this isn't part of the field of computer science, and so it doesn't really belong in that curriculum.
Now, if we had more of a "software development" or "software engineering" major, or even trade school, I think that would be a perfect area to cover practical aspects of tooling that are required to work in the field. There's a large gap between what you need to learn to become a reasonably competent programmer at most organizations and what you learn in a computer science major, and it really shows if you've ever interviewed a lot of fresh graduates. Great algorithm and theory skills, but no idea how to set up a build system. There's definitely room for a more practical education that transitions more directly into industry.
There are a lot of useful skills. Teaching people how to fix a flat tire would also be rather useful.
Basic computer administration falls into that huge category of useful skills that are not worth teaching to a general population who have completely viable alternatives.
Specialization is extremely valuable for society and having more competent doctors etc is worth doctors having a more narrow focus.
I disagree. Having a broader skill set means you can be more creative with problems. The separation between doctors and computer science is holding both fields back. Think of machine learning and big data, we are _just now_ starting to apply those ideas to medical problems, while those problems are fundamental to our existence as a species.
Computers in hospitals are a chore and a drain on doctors. They should be empowering, instead doctors spend the majority of their time typing reports. Reports of things that are already registered in other parts of the system. (You're already thinking of a solution right? That proves my point)
Hearing about how hospitals handle technology from my SO (neurologist) makes me want to rip my hair out. Specialization is a great thing but we need generalists to bridge the gap to help specialists empower each other.
I think you're familiar with what it takes to become a doctor vs. a software engineer, education-wise, but let me reiterate:
For a doctor:
1. 4-5 years Bachelors degree
2. A Masters Degree (at least 2 years)
3. Residency (at least 2 years? And this is working weird shifts at hospitals and whatnot)
For a SE:
Degrees are optional. Perhaps a Bachelors.
So if you require the same amount of rigor from SE's, that's probably gonna scare off many who don't want to spend the better part of their youth buried in books.
And since there is such a huge demand for SE's who don't need to know OS fundamentals, there is no push for having more rigor in the curriculum.
Maybe I'm being too pessimistic. I would certainly like there to be more generalists. But that would require the same kind of attention to credentials that doctors are put through. And I think a vast majority of current engineers are opposed to more formal education.
I see zero connections between having computer administration skills and fixing those problems. Knowing how to install and troubleshoot software would be a waste of their time.
Programming, on the other hand, would take a significant commitment to be useful. Sure, having say ~1% of doctors go down that path might be useful, but for the other ~99% it would be a horrible waste of time and cost literally hundreds of billions of dollars over time.
If there were doctors and nurses who have basic computer literacy and entry level coding skills taking part in the acquisition and development process of the IT systems they use, I bet they wouldn't suck as much as they do.
Most specialized software systems, e.g. those that are used in hospitals or for selling you train tickets, are developed by people who don't have the faintest idea how the software is used daily. This ends up being an expensive disaster more often than not.
This is why I think everyone should have basic computer literacy and programming skills. Which is not the same as everyone should be a professional quality software engineer.
Yeah, even basic programming is not part of basic computer literacy. You are looking at a very different skill set that is expensive, in terms of time, to acquire.
Worse, doctors are often not part of these decisions in the first place. Further, there are plenty of doctors with these skills, but they don't work for cheap, and most of this software is built by people who are paid as little as possible. Remember, their workflow is not actually complicated; it's just not helpful when selling this software.
Really, this software is not terrible because it's hard to make, it's terrible because of how it's sold and maintained.
One other factor I've wondered about over the years is how quickly they give up when they hit a problem. In the 90s and early 2000s people had to develop local diagnostic skills, read documentation, etc. more before things like Stack Overflow were so pervasive.
I wouldn't say it's a scientific observation but I've definitely gotten the impression that it's taking people longer to outgrow the cycle of immediately searching for a copy-paste answer and moving on without developing a better understanding of the issue for the next time something similar shows up.
> I wouldn't say it's a scientific observation but I've definitely gotten the impression that it's taking people longer to outgrow the cycle of immediately searching for a copy-paste answer and moving on without developing a better understanding of the issue for the next time something similar shows up.
I wonder if that's a phenomenon on the same spectrum as HN-Driven Development, where whatever one has last seen posted on HN is clearly the New Best Thing Evar. It's a bit like a voluntary outsourcing of one's own agency.
I definitely see a connection between that and complexity: people think that they need what someone else with a bigger, different problem did and amplify the number of things they need to learn.
You're right, there aren't that many top-rated posts on HN about someone running up a small EC2 instance with WordPress and MySQL.
A lot of web stuff is plugging in the DB and not tripping over the cord. Yet think of all those crazy frameworks that keep popping up -- they look so impressive!
Just something to think about as well: when things were simpler and there was less choice, there was also more concentrated documentation. It wasn't a bad investment to buy a book about a language.
I think another big culprit in this situation is the IT department. I visited a research university recently in a computer science related field. The IT department wouldn't allow even graduate students and researchers to have admin access on their machines. Even if the programmers could debug basic issues, they often had to go track down an IT staff member to install some library or program for them. If this is happening at the computer research level, it's happening even more in non-tech company business environments. How do we expect staff at non-tech companies to experiment with new skills in programming if they can't even use their computers at their full potential?
Ya, I had an intern position at my school that did computer lab build support (the hardware and network, not as a teacher). We installed Windows NT 4 Workstation everywhere and it wasn't overly secure. At least 4 times, someone changed the default background to a porn still. Blame that guy.
Has little to do with age, but more with the amount of people in the CS field right now. Only a fraction of them are enthusiasts. I notice that especially with my "front-end" developer colleagues - of whom 3/4th is here to do a job and at home many of them don't even have their own computer. Sure they play video games on their XBox or PS4, but that's the extent of their technical interest. For them it's simply a well-paid 9-to-5 job.
Now my other colleagues are mostly doing back-end and embedded C++/Python stuff, and are in general a lot more interested in the technical side. I think it's more or less the other way around, 1 out of 4 is not really tinkering with tech at home - but even they are more interested in tech outside of their field. I show them stuff like docker, kubernetes, time-series databases and metrics collection stuff, ansible/tower and what we can do with those things, and how they could make changes in the way they design their software to make things more reliable or easier to deploy/monitor - and their reaction is "oh cool!". Our front-end team? Completely the opposite attitude and completely averse to changing the way they work. I hear "we'll do this manually" or "why? this works too"...
Most of the front-end dev roles seem to attract people who're not really into tech, but can code a bit. I think because of the immediate visual feedback? They end up hiding behind their frameworks with a lack of even basic understanding of HTTP or networking. I, as an embedded developer rolled into a sysadmin role, then have to tell them what the hell HTTP/2 is, and catch their screw-ups when they serve mixed http/https content on their site, ... Now they're not all like that, there are some pretty good guys there - but the majority can write basic code, and that's where it ends. Sadly it's no surprise their team-leads all fall in the wrong category. On the other side, I have the last say when stuff is deployed in production, so they are forced to adapt to certain things - but they'll always complain.
But those skills have nothing to do with computer science as a field of study - wouldn't they be more useful to teach in general education like K-12?
Like I remember in middle school, we had an elective option between woodshop and CAD. The 'CAD' class was really more about performing simple automation in programs like MS Word/Excel, but that sort of thing is still useful in just about any job these days - not just programming.
I really like the idea of starting to call 'CS' something like 'Informatica' so people might stop looking at the field as just a valuable career skill. Google tells me that there is now a startup called 'Informatica', but that is also what the field is called or translates to in some other countries like the Netherlands; I think https://www.cwi.nl/ could probably claim prior use.
Speaking of, it seems odd to try to use an existing commonplace phrase as a trademark in the industry that the phrase was born from - it seems like it'd be impossible to argue that anyone was infringing - but I guess I'm not a corporate lawyer.
I think you are trying to make some good points, but your language could have been clearer. If I'm understanding correctly you mean:
1. Treat Computer Science (CS) as a specific field, and don't dilute the curriculum with related items.
2. Introduce a course that covers these broader, non-CS items, and actually push it out beyond the audience of CS?
Did I get that right?
I think you might have meant the term "informatics" (https://en.wikipedia.org/wiki/Informatics). Depending on your position CS is either subordinate, equal, or above informatics. I have a degree from a UK university and there is a much wider spectrum degree types in this field than exist here in the US where I now live.
I'm conflicted on the original post that elicited your response. I have that background where I inherited a broken PC, a bunch of floppy disks, and some manuals. In some ways this gives me an advantage, especially given that I'm comfortable being dropped into a messy project or some problem that needs debugging. I've seen lots of folks who can't handle this, but who can churn out code; they just need more support around the edges.
I now see more and more folks attempting things like devops and at least being able to learn some of these skills later on. I think where it stands me in better stead is problem avoidance, but even that is subjective.
Frankly, I think we would need to know what software engineering looks like, outside of the spaces where engineers (like, ones with PEs) more than developers own the project, before we start teaching it to novices.
I mean, I've been getting paid for this for going on fifteen years now and I'm not an engineer. I try to build resilient, reliable systems, I try to test heavily and comprehensively...but I'm not an engineer, and most of the time I'm worried when somebody calls themselves one.
What "software engineering" is, in the context of a web app or a point of sale system, rather hard to define. Worth taking a crack at. But we shouldn't put the cart before the horse.
Completely agree. The mantle of _engineer_ is not passed on simply via a job title.
Responsible engineers (i.e., those that I aspire to emulate) seem to spend non-trivial amounts of time predicting and avoiding potential-negative outcomes. They also seem to have a near-total understanding of the processes that impact these outcomes.
If only I could figure out how they _know so darned much_ about everything ...
> If only I could figure out how they _know so darned much_ about everything ...
As someone with a degree in Mechanical Engineering, I'd say that your observation is mostly due to (1) several hundred years' worth of head start, (2) the field not shifting under them (much) over their careers, (3) intuition from interacting with the real world being applicable, and (4) you're also unaware of the crazy amount of accumulated computer knowledge you have. Now, due to the changing nature of software development, a lot of that accumulated knowledge is useless, but it wouldn't be useless in a more mature field. Imagine if all of the random arcana I've had to pick up from AppleSoft BASIC, Pascal, Perl 5, Tcl, ANSI C, etc. were directly applicable in my day job using C++, Scala, and Java.
Sure, there's some overlap, particularly from ANSI C to C++, but with Mechanical Engineering the basic tool kit changes much more slowly. I learned about common steel and Aluminum alloys, polyethylene, ABS... advantages and disadvantages of each, things to watch out for.
If a Mechanical Engineer really gets to know 1040 vs 1045 vs. 4130 steel alloys, that's knowledge that's going to pay dividends for the rest of her career. It's not like 10 years ago or 10 years in the future there was/will be a massive shift away from Aluminum and steel and the majority of engineers had/have to throw out most of what they knew/know about metals and learn how to design for Nickel and Titanium alloys. 3D printing brings something new, and heat treatment and machining techniques have improved, but even 3D printed metal parts aren't fundamentally different from the powdered metallurgy that my dad's cousin was researching in the 1940s.
You're also accumulating a ton of arcane knowledge, it's just that most of it has a relatively short useful life.
There's also survivor bias. The engineers that were promoted and kept through economic downturns tended to be the ones that were really interested in their field and readily soaked it in.
ASU has a degree called 'informatics' that's run by the computer science faculty but focused more on <the application of technology to other things> rather than <the technology and creation of technology itself.>
I just transferred into it from computer science, because I spend all my free time making games anyway, and the informatics program lets me take an elective track to get credit for that. I do think CS is probably a better fit for what I see as my long term plan, but I think this program has a lot of potential for people who wouldn't otherwise call themselves engineers.
Calling Informatica a startup is a bit of a stretch given that they've been around for 25 years, were public for a long time, make ~1 billion in revenue a year, and have thousands of employees (though I don't doubt that CWI's use of the word predates the company's founding).
In some other languages it's basically called that: "Informatik" in German. There's still the vestigial title in some department names in the US, "Department of Computer Science and Informatics".
I agree it would be a better name but I think the battle is lost, informatics now refers to an applied discipline (closely related to library science) involving building up big systems to store and organize data.
When I went to college, CS required 3 semesters of Calc and 3 semesters of Physics. It's almost as if CS was trying to fill in some hours. Calc and Physics are great and all, but very narrowly applicable. The only benefit is how it works your brain (and a good weed out course).
I would ditch that and concentrate only on computer-related subjects. Databases would be a big one that is usually missing. Networking as well. Computer architecture (like building a computer and the von Neumann model) should also be included. Basic project management would be helpful too. That's 4-6 courses that could supplant Physics 1-3 and Calc 1-3.
It just seems silly to force CS kids to learn Physics and Calc when it is seldom used in the real world. I would much rather have a grad know the ins and outs of networking and how a computer actually works than Calc and Physics.
However, the explanation behind such a program is probably the distinction between vocational training and academic training.
At least, here in the Netherlands, academic CS taught in academic institutions lies on the theoretical side.
In such an academic CS program there are some practical courses.
Stuff on databases, networking and computer architecture. But those are high level, and quite theoretical.
Were they not, there would be a lot of push-back, because overly practical courses are almost considered demeaning.
Meanwhile, at the vocational institutions, things are very practical as far as I know.
Many of those courses are actually about building practical software, in large projects aimed at real uses.
Meanwhile, in the academic institutions, you'll have multiple projects that boil down to 'implement gradient descent in MATLAB'.
I've heard it said that even though the academic institution is much harder and more prestigious, when it comes to making practical software you are better off taking someone from the vocational institution.
Or at the very least, if you take someone from the academic institution you'll need to invest more time into training.
There is a bit of tension here where 'really smart people' who want to just become really good at a job are still expected to go to the academic institution for the prestige.
Then, those people get less practical knowledge than they want because they are at an academic institution.
In Germany we have two different kinds of universities, the normal ones and the universities of applied sciences. After spending two semesters at a university (with fairly good grades) I switched to the uni of applied sciences.
It was far too theoretical, and I'm not talking about the algorithms-and-data-structures kind of theoretical. More like proving that a nondeterministic FA is not "mightier" / able to compute more than a DFA.
That was not what I wanted and I'm quite happy at the uni of applied sciences now, and since most of our profs have an industry background (and a Dr.) we learn a lot of practical things.
E.g. Git, build management, agile development, a module about usability and user experience, networks, applied algorithms and data structures, some math (proofs were a side note though), etc.
On the other hand, some of my fellow students ... Well, one task last semester was to write a really easy program in Java, connect to a database with JDBC, do something with the data, and write it back. We hadn't heard of JDBC until then. After a module is finished we have the opportunity to fill out a questionnaire about it, and at least half of the class wrote that it was not OK to demand something like this. It boggles the mind.
We should have something like that in the Netherlands.
We do have 'university' and 'technical university' but I just looked at the curriculum for CS at a technical university and it is similarly theoretical.
You really need to go to HBO (hoger beroepsonderwijs, Higher Professional Education) to get that practical approach.
This is sadly considered plain easier than university. It probably is as well, but that leaves a gap for professional training at a higher level. It sounds like the 'Technische Universität' in Germany nicely fills that gap, or is that still considered easier than a normal university?
I've found that some universities have a program like this -- replacing the calc, physics and higher-level math with more hands-on classes. At my university, a BA in CS was like this, while a BS maintained the rigorous math requirements. I've also seen programs labeled Information Systems that's similar.
Not sure I agree with whole-scale removal of it though. I have to use calculus every once in a while in my job. While I never integrate by hand, I certainly need to know what integrals and derivatives are and how/when they are applied.
Also, I would think just one semester each of Calc and Physics, to make room for more applicable subjects, would be a fine compromise. Three of each seems really excessive. That curriculum was created when computers weren't overly complicated; I know I had it back in the 90s. There have just been too many important innovations in computing that are ignored because of Calc and Physics, IMO.
Yes, Information Systems type degrees are close, but at my school they weren't strong on programming, instead required Accounting I+II, Finance, Marketing, Management, Business Ethics and that sort of thing.
My wife got her CIS degree a few years ago and they seem to have improved a bit. She had multiple programming courses in several languages along with more advanced CS courses and database courses.
Asymptotic notation is defined using limits. In fact a lot of concepts in CS involve limits, convergence, etc.
The problem is that most calculus courses were devised before the widespread availability of Wolfram Alpha and focus too much on hand computation rather than understanding of the subject.
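For the record, one common limit-based characterization (the textbook definition quantifies over constants instead, but this is the version being alluded to):

    f(n) = O(g(n))  iff  \limsup_{n \to \infty} |f(n)| / g(n) < \infty
    f(n) = o(g(n))  iff  \lim_{n \to \infty} f(n) / g(n) = 0

So you really can't talk precisely about asymptotics without at least the vocabulary of limits.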
There's lots of money in computers, which has increased the proportion of graduates who know only how to add an IDE to the factory install of their computer. I saw this a lot back in my desktop support days.
This brings up another interesting point... "back in my desktop support days."
I had a few jobs once out of college (phone support, QA (before automation was much of a thing), sysadmin, back to phone support) before I got a programming job. That was a good two years spent learning the other parts of computing. It was because I knew perl that I was able to make the sysadmin-to-web-programmer transition.
That wasn't too unusual of a path back in the mid-late 90s.
A good chunk of the developers over in the engineering department and nearly everyone in the programming wings of IT and customer support had done tech support or system administration. It was just part of "working your way up" within the career path.
Many of my classmates in college went to a "the computer guy" job which did all of the computer stuff for a small company (hardware, software, helping with excel, writing programs to serve dynamic text to the web server). Being a computer person back then was a much more generalized skill set.
As an aside, recently I was tapped on my team (Java developers) to do work on an older website. LAMP. I had to brush up on Linux from the past decade (I was another decade out of date with some of the package management)... but being able to do that sysadmin, read perl (I know of one other perl coder in the department, but he doesn't have any sysadmin background) and get it running was a skillset that didn't exist much of anywhere else.
I was telephone support for people installing Freeview boxes in the UK in the early 2000s. Same skills. Same skills as programming your VCR without the manual. Same skills as getting Doom to work over a direct PC-to-PC connection. None of that crap worked first time. We had to learn how to debug to get basic stuff working.
In some ways computers are too usable these days. In other ways, the experienced programmers are too sloppy and leave too many barriers in the way of getting their tools to work flawlessly in the modern world. We don't live in the tinker world any more.
But ultimately to be a good dev you must learn to debug.
> Debugging is the cornerstone of being a programmer. The first meaning of the verb "debug" is to remove errors, but the meaning that really matters is to see into the execution of a program by examining it. A programmer that cannot debug effectively is blind.
I don't think it's a coincidence that that's the very first bit. It is indeed the most essential of skills.
I think this skill is what kept me going through the frustrating patches after I decided that I wanted to be a programmer. I started programming later than many here (after I had obtained a degree in an unrelated field), but I'd always enjoyed breaking and fixing computers and software as a kid and so I'd picked up "computing", as you put it, as a hobby.
That sort of hobby teaches you to keep ramming your head against the wall until the bricks start to crumble, which is helpful when you're trying to learn a language but the package you want fails to install on Ubuntu, or your program won't run despite you having checked the syntax a thousand times, or a concept doesn't click until you've re-read it every day for a week. You learn to see computers and software as a mess that you will necessarily have to untangle and rearrange for things to work properly, instead of thinking of things that don't yet work as "broken". And you learn to Google until your eyes start to melt.
(btw, I'll use 'older' and 'younger' as a sort of statistical reference, not implying a strict age separation exists and all on either side are clones of a prototype)
I recognize this. My 'younger' colleagues can't install a computer from scratch, and get completely stuck on the slightest network issue. But I don't think it is related to education. The 'older' programmer had to acquire these skills by necessity, as it was completely impossible to use a computer otherwise. For us, getting a new release of Windows or VS usually meant wiping everything clean and installing from scratch, as we all knew how dodgy 'upgrading' was.
My 'younger' colleagues wouldn't know where to begin when they can't hit update and everything works. They have to wait and rely on a system manager to re-image their machine for them.
The youngest guy on my team refuses to reinstall his development environment like the rest of us when major platform changes occur. Then every few months he complains when his environment causes an error in the build.
>>> I think this is a side effect of teaching programming as a career skill. You have people that want to be programmers but don't include "computing" as one of their hobbies.
It may be the result of the competitiveness of the education system. Everything is more competitive. As I understand things, CS is the hottest undergraduate major right now. You can't just sign up because you learned to program in high school and think it's cool. You have to be a top student just to get admitted. So it may be that the kids who want to be computer scientists just aren't able to have hobbies at all, or their hobbies are chosen to look good on their college applications.
One of the things I've noticed between then and now, is that when I was in high school, I learned programming because I had a lot of spare time to kill. My kids have no spare time.
>>> Teaching these skills in CS curriculum would probably be a good idea.
Granted, I didn't study CS, but instead math & physics, decades ago. But I think that some things don't belong in a college curriculum, especially given the price tag. I didn't take college courses to learn soldering, or even programming.
But also, not all CS majors are destined to become hackers, and there has to be room for a student who might be all thumbs but really has a deep interest in the subject matter of the field. In physics, we called them "theoreticians."
A deeper question is why someone is studying theoretical computer science for 4 years in order to become a coder. Programming is getting harder, but that much harder? When my mom taught programming in the early 80s, her students were getting jobs after one year of coursework.
> It may be the result of the competitiveness of the education system. Everything is more competitive. As I understand things, CS is the hottest undergraduate major right now.
Hardly. Looking at the Swedish admissions statistics[1] from this year (because that's what I'm familiar with):
- Of the top 5 programmes, 3 were for physicians (doctors), with the required GPA ranging 22.09-22.29
- The hardest engineering programme to get into was no. 8, technical physics, requiring 21.98
- The hardest computer-related programme to get into was no. 56, software engineering, requiring 20.73
That said, it was above 20.0 (straight As), so the requirements do seem to have increased in the last few years
[1]: https://statistik.uhr.se/uhr.html, to see applications to programmes based on high school GPAs you'll want to pick "Bara program", "Urval 1", and then set "Visa urvalsgrupp" to "BI". You'll want to sort by "Antagningspoäng".
This is a subtle and sublime point - it may not just be a function of age, maybe somewhat about having a wider interest in computers and tech.
10-20 years ago, it was a requirement to learn the layers: supporting your own hardware, network, security, hosting... all for the opportunity to try to build a piece of software. You could spend a few years improving at each and it was perfectly normal. Now there seems to be a rush to get through it all and become awesome by skimming the surface.
As a result of learning multiple layers (because there was little other choice), one picked up a lot of transferable skills. You can sometimes literally traverse all of these layers while debugging a single issue.
In school, computers were taught starting from how a computer physically worked, booted up, etc.
Then, you learned some basic operating system, applications, and how they interacted with the computer.
Once that was completed, you started some amount of creation, whether it was visual design, games, programming, etc. Not to create a programmer, but as a form of creation with the computer.
I too am obsessed with computers, and will push through "helpdesk" type issues to get my work done. However, I don't consider that to be a remarkable skill. Such issues can almost always be resolved with a bit of googling. Perhaps I'm too close to see the foundational knowledge I'm depending on, though.
Actually reading error messages is an apparently very rare skill. Almost always, if you just google what it tells you, you'll hit something... Yet I constantly see people automatically click through popups milliseconds after they appear, only to fume about things not working. It takes a surprising amount of effort to get them to do it over and slow down long enough to figure out what went wrong.
I would just like to get error messages that actually said something more than "something went wrong". Yeah, no shit something went wrong, I already know that, but what, why, and how do I fix it are the questions I want answered. I know the code knows what went wrong, so just tell me.
Which is why I love the older parts of Linux but hate the more recent DE-centric stuff. The older, unix-derived stuff will tell you front and center what went wrong (unless it is so wrong that all the kernel can do is barf a hex dump of the memory region on screen).
The newer stuff will come up with unhelpful messages or just an eternally spinning pointer, and yet it is supposedly bringing Linux to the desktop. F that.
And I wonder how many times now I have fed a Windows error code into a search term and found a bunch of people asking all manner of channels (both Microsoft and otherwise) for help and getting nothing (W10 doesn't like the card reader on a tablet I have, and all I can find is that maybe, just maybe, MS screwed up the driver. But Intel seems to no longer provide any usable chipset bundle...).
I confess, I was very guilty of that at my old job, which was very fast paced. Whenever reflexively closing the pop up resulted in things not going as I expected, I would, indeed, do it over, slow down and carefully read the messages.
IME, you usually can just google things like that and as long as you're willing to read through the first ~5 results to see if the situation is similar to yours, you'll very often find the solution to your problem.
I used to be a teaching assistant for a programming class at a top university, and it was astounding how often my students' issues could be solved by simply copy/pasting the error message into Google.
It seemed like most of my students were often so intimidated by the scary red error text that they didn't even try to solve it themselves, they just gave up and assumed it was unfixable. About halfway through my semester I reached a breaking point and spent an entire lecture where I had a list of previous emails from students asking for help, copy/pasted their questions verbatim into Google search, and showed how often the solution was found within the first few search results.
Another anecdote... My experience with older programmers and sysadmins is that the vast majority of them know a lot about how a computer works (as you mentioned), but many also haven't kept up to date on tools available to simplify and automate processes. This leads to unnecessarily convoluted and confusing configurations that are far from optimised considering today's available tooling. This frustrates me to no end, as it creates a load of needless busy work which at the same time makes the dev/admin feel both smart and productive whilst not optimising for what the business actually cares about.
On the flipside, I see a lot of tools pushed that are more complicated, less flexible, and far buggier than the old tools. And then inevitably in a few years they're deprecated in favour of something newer and shinier and the old tools are still perfectly fine (e.g. Make).
I've had the opposite experience, with older programmers balking at new libraries, techniques, and setting up any environment they're not used to. If it doesn't work right away they go back to what they know. I know one guy who absolutely refuses to learn C++ because it doesn't compile for him and he only "trusts" C. Another writes only in good ol' Python 2.7 (scipy dropping support; subprocess bugs; cough cough). Younger people tend to be aware of and use newer tools. But we're all biased, I suppose.
I dream of the day I'm happy and productive enough with one set of tools that I refuse to switch to another. I still largely feel like I'm wandering the desert when it comes to tooling: a new toolset looks good from a distance, but when I start using it I discover the ugly underside and start looking for greener pastures.
Tell me about it. I'm using tensorflow now and it does amazing things, but the API is so bulky for anything besides setting up the computational graph... Planning to check out pytorch for my next project. IMO most dev environments and software leave a lot to be desired; we should not settle.
> I think this is a side effect of teaching programming as a career skill.
My guess is that you don't even have to go as far as "computing" in order to see the negative effects of this. I've had a couple of encounters where I helped out a peer that was stuck in a particular problem merely by applying some basic deduction. And I had those situations both with fellow students and later on in the workplace.
In all those situations, the underlying problem was that the person in question didn't have a firm grasp of the programming language they were using, but seemed to program in terms of "patterns" -- they knew what a for loop has to look like, but as soon as there was a slightly more nuanced problem (say a parsing expression in Boost.Spirit), they weren't able to see how the underlying syntax was treated by the respective language.
While I get that it's almost impossible (and certainly impractical) to learn every edge case in most modern programming languages, I couldn't help but think of cargo cults.
Edit: On second thought: OTOH, it seems to me that it isn't helpful to view what a software engineer does merely as "programming" in this regard. I don't see a problem if, say, an electrical engineer is missing said "computing" skills but still manages to complete their program. Many modern professions benefit greatly if the respective person is able to program (maybe only in a specific language, e.g. Matlab or LabView), but that doesn't mean they have to know their working machines inside out.
Yeah, I'm always amazed when someone my company hired as a developer doesn't understand what "open a terminal" means on Windows or I have to coach them through digging up the environment variables screen in control panel.
I assume it's because I was a power user before I became a developer and so had that foundational knowledge and interacted with OS APIs to build on it... now people just go right into mashing JS code at web browsers which seem to be hacked into everything.
Well, that's an effect of experience. You don't just become a better programmer with experience, you become better at software development. Better at handling stress, understanding requirements, gathering requirements, navigating political corporate structures, explaining yourself, defending yourself, ... and yes, better at getting your environment sorted.
That said, the specific point you raised is interesting, because I was having the same problem at work. Except it was caused by automation. We have automated pipelines for everything, zero-downtime deployment, containers, ... all of that is working very well - so well that there just isn't enough big stuff to do for a newcomer on the project to build enough experience with the pipeline to master it. The best people achieve is some sort of very rudimentary debugging capability, earned from the monthly minor tweak the system requires. Basically, either you built a similar pipeline on another project, something that is extremely infrequent, or you missed the train.
> Teaching these skills in CS curriculum would probably be a good idea.
It was included in my college CS curriculum, using the most obvious text (Debugging, by Agans) almost 10 years ago -- though I had a head start, since I read a copy of it from my public library about 4~5 years before that.
If other schools haven't followed suit, I have to ask "What on earth is taking them so long?"
The consensus seems to be that those types of skills do not belong in a Higher Learning institution. In Germany, the saying is "You do not need a Computer to study CS", although it sounds wrong in English because CS literally contains the word "Computer" while the German term for CS (Informatik) is a mashed-together version of Information and Mathematics.
Some German Universities (TUM, KIT) pride themselves on that last point so much, they actually pass off the CS degree as an Informatics degree on the English version of their diploma transcripts.
Agreed. The article sort of presents these kind of "computer problems" as a barrier to getting started with programming. I don't quite agree -- or maybe they are, but only in the sense that they "filter out" those who don't really want to be programmers at an earlier point in time.
Don't get me wrong; I don't think there's any point in making new programmers jump through arbitrary hoops because "programming should be hard". But I think that today, programming is necessarily intertwined with computing and problem-solving in general, and all the Dockers and specialised positions in the world don't change that. To be a good developer, you need to have some idea about the systems your code will be running on. A program doesn't just exist in a bubble. (This is more true for some kinds of applications, less for others, of course.)
Perhaps it's more to do with the stock of people who learnt CS.
In the older generation, CS was a relatively obscure and difficult skill with little to no easy guides, so it filtered the set of students down to only those who were insatiably curious and fuelled by learning new things.
In contrast, CS has become so ubiquitous that nearly everything has a "how to do it" guide that eases the entry of nearly anyone, including those with little or no interest or whose motivation might be less than ideal. Thereby possibly leaving room for more anecdotal evidence of this perceived shift in skill.
I know plenty of younger programmers who can diagnose their own problems, so the generalization doesn't hold so well.
I think the issue is more about experience and need: when we were learning to program, we HAD to do those things. For the younger generation, it is totally possible to get by without because the ecosystems are so much easier, coupled with a lot of online knowledge that they can access quickly, which was mostly missing in our age.
On the other hand, those young programmers that are faced with a complex ecosystem will usually learn how to deal with it. Heck, with the web and its many steps backwards in programming environments (e.g. try debugging in Chrome!), many things are actually harder for them and they acquire skills that I didn't need!
That's something that shocked me as I began to help onboard new developers. Some of them can't set up an IDE, local server, or get a build to compile in the first couple of days.
Onboarding documentation is key to smoothly integrating new team members if your process is convoluted. Even without documentation, I feel like people should be able to get projects up and running just by having general familiarity with how most setups are configured. My first "project" as a newbie developer was performing pseudo-devops: setting up continuous integration and then creating a project setup guide for the existing project, with no help.
Interesting, I'm old enough to have grown up before the scourge of Wintel PCs in the 90s. Had a Commodore Vic-20 instead. You'd hit the power button and about 5 seconds later a BASIC interpreter started. There was no complicated debugging to be done to get it working.
Still, I learned to manage/hate computers by the Win '95 era and grow to love them again by the time Linux became tolerable about 10 years later.
I have these skills too and yes, they prevent me from getting blocked when I run into these issues. I hate relying on others for help on small issues like this. But I also get that not everyone would want to waste their time on issues which do not challenge them. As a company owner, I would prefer a dedicated help desk to sort my programmers out with these issues, but as an entrepreneur, I want to be self-reliant.
You couldn't have said it better. I had the same kind of learning path as you did, and I believe it makes everything a lot easier to understand, since you can clearly see the full spectrum of what you are dealing with while programming. It also helps open new doors, so you can explore a problem from a different angle, since you have a deep knowledge of your stack available to you from the start.
The downside of growing up on Commodores was their default imperative and procedural programming environment.
My introductions to object-oriented and functional programming came much later in life, and I can't help but wonder how many bad coding habits I had to "unlearn" as a result of my Commodore foundation.
One thing about non-imperative languages (functional, logic, etc.) is that they tend to encourage a much higher level of abstraction that's useful for dealing with complexity, but difficult for most novices to deal with. In particular, young children have relative trouble with abstract reasoning.
If your standard library makes pervasive use of higher order functions, and you haven't yet wrapped your head around elementary algebra, you're going to have a bad time.
I'm having a hard time thinking of an existing functional or logic programming language that requires a minimum of abstract thinking. You can use many languages in a functional style, but if e.g. you're using Python with a low level of abstraction, then you're likely using it as an imperative language.
> The downside of growing up on Commodores was their default imperative and procedural programming environment.
That could be an upside if you wanted to learn assembly since BASIC really wasn't that far removed from it in terms of what you could do and how you did it.
Interesting point. I'll add that we don't see many desktop computers any more. Kids or those wanting to learn to program don't see the whole machine. Instead we have these slimmed-down devices that require existing knowledge to take apart and put back together again.
I'm a relative newcomer and didn't have a problem downloading Android Studio and setting it up on Ubuntu to run an emulator to develop React Native apps.
Learning to program is getting harder because the programs are far, far more complicated than they were 10 years ago.
This is a good example. I'm in my mid-twenties and just discovered yesterday that (at least on Windows 10) a user and a computer have _entirely_ different permissions.
I always thought it was account based! At least that's how my coworker and IT guy explained it to me.
Creating a program that meets today's users' expectations is what's harder.
But learning programming by itself, is way easier.
We have Python, Ruby, Javascript... We have the Web as a platform and as documentation. We have communities and cheap, fast computers.
The hello world has never been easier to do.
What's hard is to make a real-time offline-first SPA with a beautiful responsive design that uses the same API backend as the mobile app, while integrating with third-party SaaS and leveraging social media.
We also used to have BASIC pre-installed on a lot of different systems, and were able to write a GUI just by writing to the frame buffer directly.
A helloworld was just:
10 PRINT "Hello, World!"
I learned to program in BASIC, with just the on-disk manuals. I can't say that it's easier today - I didn't need anything extra back then, and had a snake game up and running in a couple days from never knowing programming.
Today, we don't even have access to an unbuffered get_char by default, and you need to either modify how the shell is running or reach for a heavy library.
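To be concrete about how much ceremony it takes now: a minimal sketch in Python, assuming a POSIX terminal (the function name is mine):

    import sys, termios, tty

    def get_char():
        # put the terminal into raw mode, read one keypress, then restore settings
        fd = sys.stdin.fileno()
        old_settings = termios.tcgetattr(fd)
        try:
            tty.setraw(fd)
            return sys.stdin.read(1)
        finally:
            termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)

    print("press a key...")
    print("you pressed:", repr(get_char()))

Compare that with a single GET or INKEY$ in the old BASICs.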
Today's modern and accessible example (in Chrome) is to hit ctrl-shift-i, then type out `console.log('Hello world!')`.
Users these days browse the web, so having Chrome/Firefox/Edge isn't anything extra. It may take a bit longer to write snake though; games these days have user accounts, achievements, and multiplayer.
Kids these days have something better than get_ch; a mouse!
It's way easier today with all the free, spoon-fed resources available. It's also way easier to get distracted, with the Internet just a tab away.
If we can manage to throw out preconceived notions about "serious" computing, that is. There are still language prejudiced corners that actually believe Python isn't "real" programming; that javascript is "just" a scripting language.
Meanwhile, people are passing jsfiddle links around and learning to program.
Glitch.com lets users build whole apps (meaning frontend and backend) for free!
> Kids these days have something better than get_ch; a mouse!
Had that too. Had full GUIs from tracking mouse movement and writing to the framebuffer.
Our software is more complex, so programming it is too.
Let's go with JS, as getting up to basic game design is still fairly simple in it.
Scanning for keycodes is done by a callback, so you need to have a basic idea of scoping, objects, and functions before you can get started. Didn't need any of that in the past, because BASIC's procedural model was dead simple.
Then to draw, you have to either learn about the DOM or the canvas, you can't just POKE and PEEK til it works.
There's nothing wrong with using Python or JS or whatever, but the complexity to get going is much higher than my childhood. For good reasons, but still more complex.
My family got our first computer when I was 6 years old. It was a 4K ROM BASIC machine with cassette storage. It didn't come with any software. Instead it came with a book that taught some BASIC and then pages of program listings you were supposed to type in. That's all you could do with the early computers - program them. Now I have a career in software development.
Typing the equivalent one-liner into a modern interpreter's REPL is not that much worse. Granted, you have to launch a shell first. Obviously the programming stack got more complex and users' expectations got higher (which is a good thing), so it is more work to make a program that is useful.
But the things that have been simple then are still simple today - they are just not popular anymore or not installed by default.
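Assuming we're talking about something like Python (a guess on my part), the comparison looks like:

    $ python3
    >>> print("Hello, World!")
    Hello, World!

One extra step to open a terminal, but the one-liner itself is still a one-liner.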
Except with BASIC you turned on the computer and literally the first thing you saw was a prompt where you could type your program. The whole point of this article is that downloading and installing Python to the point where you can contemplate typing your program is a big barrier.
Just wanted to point out that C++ was in reach for a lot of people, thanks to Borland's compilers. They were considered cheap by most.
Example, in my case: I had a TRS-80 200 computer, which cost US$600, about the same as most computers of that era. In the context of owning a computer, the $50 for the compiler wasn't much more.
> were able to write a GUI just by writing to the frame buffer directly.
And, among other problems, that GUI was inaccessible to people with disabilities, e.g. blind people or people who need to use an alternative input method. It probably wasn't internationalized or as good-looking as today's GUIs either.
In the words of Billy Joel (in the song "Keeping the Faith"), "say goodbye to the oldies but goodies, cause the good old days weren't always good, and tomorrow ain't as bad as it seems."
The article is specifically discussing the ease of getting into an IDE on a new PC. When I started, I could turn on the PC, type “basic”, and press enter. Now, at minimum, I would need to know to start notepad, save an html file, point a browser at it, and type some html and js boilerplate. That’s what the article says is harder (except the article goes on to discuss installing other languages, which is even harder).
Is it? When I was learning to program, I had a computer with Windows on it and no money. First, I needed to install Borland's C++ compiler / IDE, because that's what they used in the class and computer lab at my high school and I didn't know of anything else. But I had no money, so first I had to figure out how to pirate it. Then awhile later I heard that there was a free compiler on this operating system called Linux. So, after reading a bunch of stuff to figure out whether I wanted Slackware or Gentoo or SuSe or Fedora, I had to figure out how to download and install Linux. Then I had to figure out how to use GCC, but first I had to figure out how to edit text, which meant figuring out how to use Emacs. Then I finally had a usable (and legal) development environment.
Nowadays it seems like I could just get a Mac and download Atom to get to that same point.
Maybe there are other hard things to figure out now, but it just doesn't seem true that it was easier "back in the day", at least 15 years or so ago.
It's as if you were describing my own experience. With the only difference that I had to figure out how to use vi. Such fun! It took me some time to figure out how to exit the editor, let alone how to make changes or write the file.
We didn't have all the resources at our fingertips back then. Just the books. We couldn't find hideous copy-and-paste solutions on StackOverflow; we had to figure it out on our own. Programming had the air of wizardry.
I do believe that nowadays it's much easier to learn the basics of programming, considering the massive number of resources available just a few keystrokes away. On the other hand, expectations have changed a lot since programming went mainstream.
In my line of work, before I became a consultant, I had interviewed a lot of candidates for web development positions. In my rough estimate only 2 out of 10 are cut out for it. And no wonder - the demand on the market is massive, the money is all right, so more people jump on the bandwagon and try to get by. They don't want to learn a great deal, they don't have a real interest in the domain. They just want to do the job as painlessly as possible and get paid good money.
This also means they have to cut corners. They don't have time to figure things out and build solid foundations. They are after quick results. They just want to become employable. The passion in our profession is hugely diluted by the inflow of people who want to do the job just because it pays nicely. It's pure economics.
And because it is economics, the expectations have changed. Maximising return on investment, where the time spent on learning something is the investment, means that a steep learning curve is no good anymore.
So maybe let's not dwell on how hard it is to become a programmer these days. In my view it's definitely harder to stay one, than to become one, with everything changing so fast. However becoming a programmer is not more difficult today than it was 20 years ago.
I think you are agreeing with sametmax's point actually.
It's not "the learning curve of getting an environment setup" generally, it's more that the specific environment you're thinking about, which is part of your presumption in this statement, is harder.
It is in fact extremely easy to get an environment set up if you're talking about basic Javascript and the browser -- there is no change in this from 20 years ago. You open up Notepad, type some HTML in, include a script tag and some alert("Hello world"); code in there, then double-click the html file to open it in the browser. You do the same thing whether you're on Windows, Mac, or Linux. It will just work; there is nothing to install.
The harder stuff is setting up a Python backend to start your first web application, or creating that SPA with responsive design like sametmax said -- that stuff is the "expectation" now for modern applications. Because the expectation is set so high, setting up the environments to build those things is harder.
Really? I remember it being rather easy circa 2003. On Windows, you could download an installer for Dev-C++ or the Borland compiler, install it like any other program, and you were done. On virtually every other platform, you didn't have to do anything at all: the system came with a C++ toolchain ready to go.
I'd say it's still just as easy to get going with those sorts of tools now, and the same goes for many other languages, like Java and C# for example.
That said, I think the question of whether it was easier to start programming then versus now has a more nuanced answer. The kinds of programs people are likely going to want to write—like mobile apps, web apps, and games—are a lot more complex these days than their equivalents were back then. On the other hand, there are boatloads of free tooling and information readily-available to help people through it. On the third hand, there could perhaps be issues of information overload if you go looking without a guide.
So, getting started is one thing, and I can't say for sure whether it's harder now versus then. Of course, that's a slightly different question than whether it was easier to learn programming then versus now. This question is easy for me to answer: of course not, learning it now is just as easy[1] as it's ever been!
[1]: Or hard, if you're of the glass-half-empty persuasion ;)
You had to install Visual Studio, then the Windows SDK, then probably other libraries. Then spend an afternoon setting environment variables and project settings until it could compile anything.
Dev-C++ and Code::Blocks had an easier start, although they were bugged to death and only used for hello world projects.
Borland was paid software. There was a free edition, but it was very challenging to obtain. That certainly contributed to the death of the company.
If we're talking strictly two decades ago: the internet was barely a thing, so you couldn't just find a tutorial or help page. Books were selling for $50 each and you couldn't Ctrl+F them.
On Linux a newbie can run "gcc helloworld.c -o helloworld" to compile his first C file. Then there is a massive gap until he will be able to compile multiple files with make and autotools, manage external libraries, and package anything to run on another computer. Now it's 2018 and Linux still doesn't have a decent C++ debugger.
How so? Two decades ago was... 1998. You pop in Visual Studio 6.0 click a few buttons and it's done. On Linux we had gcc/g++, which came preinstalled in every distro.
Even if you factor in setting up Makefiles and autoconf and friends, those are nowhere near the complexity of configuring webpack for a SPA, let alone installing everything you need for a functioning stack. We're talking days, weeks, and even months of tedious fighting to get everything setup. You have to worry about setting up web servers, HTTPS, CORS, what type of backend API to use, what ports your various microservices live on, what HTTP framework to use, which babel settings to use. I spend way more time these days on architecture and overhead issues than feature work. It's not even ballpark close. C/C++ I set the Makefile up once for the architectures I need and I'm done.
In C++ in 1998 you were also never in danger of having to upgrade a major component in the middle of development. I've had to upgrade node, React, webpack, and others all within the same year. The shrinkwrap supply chain vanished and so did the costs of pushing out new releases. New releases, I might add, that have a shit ton of bugs (never, ever, ever touch a X.0 npm package, jesus christ do not do it)
> Even if you factor in setting up Makefiles and autoconf and friends, those are nowhere near the complexity of configuring webpack for a SPA, let alone installing everything you need for a functioning stack. We're talking days, weeks, and even months of tedious fighting to get everything setup.
That's an overly rosy view of the past. Makefiles are inscrutable for a newbie. We're talking hours just to figure out tabs versus spaces for indentation. Google launched in 1998 so that was barely Googleable, there was no Stack Overflow (remember expertsexchange.com?). Older C compilers utterly failed at useful error messages on trivial errors like missing semicolons, or closing strings or braces, and 1998 was before Boost, so good luck getting STL Vectors working. These days languages have hashes/dicts available as a built in data structure. If you were lucky in 1998, you had a C string library and no one had to call ::malloc (or worse, ::realloc).
There have always been major component upgrades if you choose to get distracted. Circa 1998... remember that whole mess with the 2.2 vs 2.4 Linux kernel? How about upgrading Debian versions, and that fun with incompatible versions of libc? i386 vs x86_64? STL? How about rewriting to use Boost (version 1 came out in '99)? Getting Boost to build in the first place was at least a multi-day adventure.
As a newbie, setting something up on a modern stack is actually pretty easy - because that's no longer something to setup by hand. Clone the tutorial repo, grab their docker container or `npm install`, and you're up and running.
For decades, "development tools" was a (relatively) standard bit of kit on any Linux distro; in fact, you needed it to compile basic software you needed to run on the machine.
These days, every environment/framework has a different distribution method/model involving running random shell scripts or learning git or whatever.
For example, on OS X (I imagine this is harder on Windows, possibly easier on Linux)
Python:
To install REAL Python on OS X:
* Install GCC
* To install GCC you need XCode
* To install Homebrew, use Ruby to run a random shell script curl'd from GitHub.
* Export a path
* Use brew to install python3
* Brew installs pip for you (and then?)
React Native:
* Assuming you have Node installed (yeah, sure, why not!):
* npm install
* create-react-native-app
* npm start
* Then download Expo client on iOS or android phone and scan a QR code...
I thought it would be easy on Windows, but I was baffled by how bad the instructions on the python website are. The getting-started guide still caters to Windows XP, and choosing between py2 and py3 is still presented as a decision for beginners to make.
1. browse to python.org
2. get confused whether to choose Python 2.7 or 3.6
3. get confused whether to choose gzipped source tarball, XZ compressed source tarball, embeddable zip file x86-64, executable installer x86-64, web-based installer x86-64, and then repeat all those options for x86. T___T
4. open the installer, press Next a few times. Last time I did this there were some confusing options to select there too, like whether to use %APPDATA%, add to PATH or not.
Getting a C++ toolchain in 1998 was pretty easy. You were either on Windows and lived with whatever Microsoft abomination they shipped, or you were on a Unix and you installed gcc and vi/emacs. Truly smart people installed CVS.
In 1988 on the other hand, you sometimes had to figure out how to bootstrap gcc--possibly from the broken garbage compiler the manufacturer shipped and sometimes from nothing at all.
Even for more advanced web-dev stuff I'd argue that the tooling has made things much easier for getting started. For instance create-react-app does all the heavy lifting in basically a one-click package.
A generation ago, many (most?) apps were C++ for Win32.
That means wndprocs, LPARAM, pointer casts everywhere. No STL, buggy templates. Dialog resources, manual layout of controls. Build the app for both *A and *W, ship unicows for win95 support. Crashes on machines without IE4.
A lot of that was just incidental complexity and is covered up now.
They are easier to get started with for several reasons.
1. Automatic GC means new programmers don't have to understand memory management or concepts like pointers up front.
2. Dynamic typing does not demand type declarations or throw type errors so it lets new programmers get up and moving quickly without having to appreciate the purpose and benefits of a type system.
3. No compiling means faster builds so newbies become less frustrated with errors and debugging. Users will also avoid dealing with cryptic compiler errors.
4. The dependency management and module systems for these languages are easy to use and don't require an understanding of header files or object linking in order to get your program to build or run in other environments
5. Package managers with tons of free open source code that are easy to discover, install and use immediately.
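A tiny sketch of what points 1 through 4 look like in practice, Python here purely as an example:

    # no declarations, no manual memory management, no separate compile step
    scores = {}                       # a built-in dict: no headers, no linking
    for name in ["ada", "grace"]:
        scores[name] = len(name) * 10

    import json                       # modules are found by name, not by linker flags
    print(json.dumps(scores))         # run it directly: python scores.py

None of that requires knowing what a pointer, a type declaration, or an object file is.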
But then we're not comparing the same thing. If you were trying to learn manual memory management, type declarations, linkers, cross-compilation and packaging in 2018, would that really be any easier than in 1998?
I don't disagree, but these languages can still be used to create useful programs for hobbyists and industry without having to learn those concepts, so I'd say it certainly counts as "easier to learn", the utility of those other skills notwithstanding.
Agreed. The hard parts of programming have been removed (especially for entry-level people).
> We have communities and cheap, fast computers.
And online tutorials (YouTube, Coursera, etc.).
Learning to program today is much easier than just 10 years ago.
Not only that, the setup to start programming today is much simpler. Not only does almost everyone have a computer, there are IDEs/programming suites available that eases you into programming.
I disagree. We currently have 5 SaaS companies. We are finding that we are writing less and less code. Most of our services have 300-500 lines of code and a product is composed of 5-6 such services. We've not found any difficulty in selling it, but yeah, some niches are crowded with tons of competition. I do both backend and frontend for SaaS at my company.
Not just some book, usually lots of books, from the fundamentals of BASIC to technical references with complete schematics for the entire machine. I had a Commodore VIC-20 and it certainly came with all these.
And even then, experts and resources were available online, for suitably primitive values of "online". Think CompuServe.
Another thing we had back then but less so today was magazines, full of programming and technical tips and info, usually tailored to a particular computer family. (So, Commodore magazines, Tandy magazines, Apple magazines, etc.) Once business computing took off, the dominant players (PCMag, MacWorld, etc.) became more productivity-oriented -- but I still learned Windows programming from the free utility and programming columns in early 90s PCMag. And then once the Web hit, the dead tree publishers were fighting over scraps.
The manual didn't tell you anything. The books didn't tell you anything. They scratched the surface, which if you knew nothing might have seemed profound, even mind-blowing, but if you were trying to do anything non-trivial you were left grasping for answers.
I tried to build games on the C64 using BASIC and, while I had some success, the results were always limited by the fact that it was in BASIC, it was going to be slow, and that planning and editing your code was extremely painful, you didn't even have a proper text editor.
The real way to develop on the C64 was with assembly. Unfortunately, finding information on how to do this was virtually impossible. Almost no books existed, and those that did were rare and expensive. Even when you got the book you were missing huge chunks of knowledge: The CPU itself is just 10% of the problem, the rest is knowing the memory layout, the supporting chips, and so on, stuff that wasn't well known.
The methods employed by actual production games were dark and mysterious, often close to protected secrets. You were welcome to painfully disassemble these games, but you also had no tools, not even a basic assembler, unless you went out and bought one.
Today it's easier than ever to build games for vintage hardware because everything, every little thing, is lovingly documented. This was absolutely not the case in the 1980s.
Why didn't you just buy the Commodore Programmers Reference Guide[0]? It had an entire guide to C64 machine language, and circuit schematics. The book even tells you to buy the 64MON cartridge if you're serious about programming in assembly. This book wasn't rare. A friend of mine had it, we lived in the middle of nowhere. These programs and cartridges were not rare. I had multiple catalogs for buying these things as kid. The best programming books had code listings on page, and line by line explanations on the facing page.
Funny... I didn't even know that book existed until I was a few years into learning how to program my C64. I honestly thought you were supposed to figure things out by randomly PEEKing and POKEing addresses to see what they did. I had complete tech illiterate parents and no mentors existed to point me in the right direction.
I remember visiting some other kid who had a C64, and his dad had given him the Programmer's Reference Manual. He never opened it and didn't care about programming at all (he just used the thing to play games). I cracked that book open and my mind was blown!
I thought the guide that came with the C64 mentioned the Programmer's Reference Manual. I remember asking my dad to get a copy and he told me that I had to finish the first book before we could get it.
> I cracked that book open and my mind was blown!
I really wish I had had the opportunity to read through one when I was younger.
It did. The last page of the User's Guide[0] that came packaged with every C64 says:
> Commodore hopes you've enjoyed the COMMODORE 64 USER'S GUIDE. Although this manual contains some programming information and tips, it is NOT intended to be a Programmer's Reference Manual. For those of you who are advanced programmers and computer hobbyists Commodore suggests that you consider purchasing the COMMODORE 64 PROGRAMMER'S REFERENCE GUIDE available through your local Commodore dealer.
Honestly, until I checked the PDF just now, I thought the User's Guide came with a card to order it through the mail.
My "local dealer" was a Radio Shack and they didn't sell anything Commodore related. If you wanted to know about Tandy, though, did they ever have you covered!
If my local library didn't have it, and the local bookstore didn't have it, the book may as well have not existed. The local bookstore was the size of a shoe store and carried a handful of popular titles and the most technical books they had were about building decks or, if you were lucky, rebuilding motorcycles.
You say the book wasn't rare, but I didn't know anyone who had one, or who even had access to one. I had a friend with a technical reference like that, but it was for the Apple ][, so that was no use. The only access to computers at all was at the local school, and the faculty had no idea how to use them for anything other than to train kids on typing.
Now that book is available online for free which is a sterling example of how much things have changed. Anyone who wants this information can get it if they look hard enough, and anyone with questions can get them answered by experts.
Arghh, I learnt Z80 assembler and Turbo Pascal in the mid 80s without ever seeing any books about either. Not sure how, picked up bits and pieces from computer magazines in libraries.
A few years ago in a 2nd hand bookstore I finally laid eyes on Programming the Z80, which I'd heard about but never seen, and I was like "OMG, I would've given a leg to have had that in 1985!"
Life with LibGen (and StackExchange) is so vastly different to my childhood isolation from the world's knowledge. I'm constantly profoundly grateful. But wish I'd had that access back then! Although I can't imagine it.
Yeah, I'm also not exactly sure how I picked up Z80 asm. I remember leaning heavily on the list of opcodes in the back of my ZX Spectrum manual, but it was literally just a table of opcodes and corresponding numbers, so presumably I learned roughly how to use them from some other source. Owning Programming the Z80 was also one of my unattainable teenage goals, along with owning a disk drive.
I remember when I was first learning Java, whacking my head repeatedly against all these sorts of problems - WTF is a JRE vs. a JDK, why is it asking to install a browser toolbar, WTF is Eclipse and why doesn't it just let me run the file that I'm working on without going through half an hour of researching "Run configurations" and classpath BS first, etc. It was doable, but as a new person to the whole ecosystem, it was seriously painful to even get to the point where "Hello, world!" would run. Then I tried to get OpenGL working (I wanted to do games), and I probably blew an entire week futzing with configuration just to draw a rectangle on the screen. TBH it was harder than getting the same thing done in C++.
And compiling it all into a Real Executable that would run on Windows like a normal program? Forget about it...not even to get into Web Start, applets, Maven and JavaFX (remember that disaster?), oh my!
Then I discovered Processing, which bundled everything (Java runtimes, libraries, etc.) together in a nice tidy package, and its own IDE where you could just open up the thing, type in some code, and hit the big obvious "Run" button. Want a library? Download it from the nicely curated set that are guaranteed to work! Wanted to create an executable, click "Export", and it just works! All the nastiness of linking native dependencies, bundling the JRE, handling classpaths and linker flags, it was handled once by the devs working on that project, and end users could just focus on code.
Things are getting better with other technologies these days, but it's still rare that I see an environment as zero-config as Processing, apart from web-based Javascript playgrounds. All it takes is a pure focus on usability, which I think is something that as devs we often overlook because once we've gotten over the installation problems that we're familiar with ("well of course brew is better than MacPorts, and you'll need to install nvm and pick the right version before this library will work, except for this one dependency that's only available through RubyGems, and this other one that") we forget that they can be confusing and really off-putting to newbies.
Python environments are particularly bad, IMO because of screwy webs of dependencies that more often than not fail to install cleanly and the Python 2 vs 3 split.
On the other hand, Python's "batteries included" means you can go a long way without installing anything. Sometimes I think people forget to look in the stdlib:
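For instance, a few lines of Python 3 with nothing installed beyond the interpreter itself (my example, not an exhaustive tour): an HTTP client, a JSON parser, and a SQL database all ship in the box.

    import json, sqlite3, urllib.request

    # fetch some JSON over HTTPS and stash it in a database -- stdlib only
    with urllib.request.urlopen("https://api.github.com") as resp:
        data = json.load(resp)

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE endpoints (name TEXT, url TEXT)")
    db.executemany("INSERT INTO endpoints VALUES (?, ?)", data.items())
    print(db.execute("SELECT COUNT(*) FROM endpoints").fetchone()[0], "rows stored")

(And "python -m http.server" gives you a working web server without writing any code at all.)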
> Python environments are particularly bad, IMO because of screwy webs of dependencies that more often than not fail to install cleanly and the Python 2 vs 3 split.
Which is why Python virtual environments are a thing, and I feel like more people should be taught to use them routinely.
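A minimal sketch, using nothing but the stdlib venv module (directory and package names are just examples):

    import venv

    # create an isolated environment, with its own pip, in ./env
    venv.create("env", with_pip=True)

    # afterwards, from a shell (POSIX paths shown):
    #   env/bin/pip install requests
    #   env/bin/python myscript.py

Everything installed that way stays inside ./env, so those screwy webs of dependencies at least can't contaminate each other across projects.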
Learning to program just takes a huge amount of frustration-tolerance. Fullstop. It's inherent in the craft and doesn't really ever go away - it's just the problems you face that change. So in that sense, complex setups are perhaps not our biggest problem.
Of course, that's no excuse for terrible interfaces. But I dare say there have been easy-to-use and not-so-easy-to-use programming setups in every era...
I don't understand this viewpoint at all. There's a breathtaking amount of tutorials, Q/A (stackoverflow), forums, chats, that can get someone started with a fraction of the effort it took for me to learn programming 20 years ago. There are one-click installers for programming languages and for IDEs. There are entire free books online to learn specific languages or frameworks.
I think people have gotten lazier and they expect to be taught, rather than seeking out materials. Or perhaps the people wishing to learn, are not as interested in programming as they think they are.
That being said, yes, learning "cloud" is an enormous sink of time that requires some exposure to lots of different silos of information. Doesn't that make sense though? It's a complicated subject.
As relatively young person I agree. Many people my age and/or younger expect quick results and don't really have the patience to see something through. But like with any other facet in life, if people take shortcuts they totally deprive themselves of vital experiences that are necessary for them in the long run if they want to experience any decent amount of success. Especially if they are wishing to learn something for their career.
In other words, if your objective is to learn programming, how can you possibly hope to create or do anything more meaningful than trivial examples without challenging yourself to learn the environment, or learning how to import 3rd party software libraries so that you don't have to reinvent the wheel?
It's part and parcel of doing anything non-trivial.
Learning to be resourceful by visiting Q/A websites, forums, chats and asking questions, and trying things and failing and getting up and trying again just makes you better.
You learn more by failure and experimentation than by following an example to a T but never venturing outside the sandbox.
I'm not that old, 33, but I think I recognize that I wasted a lot of my youth being paralyzed by the thought of failing, when really it is the most successful people who start out by failing. A lot. It is important to get comfortable with failing, and to separate one's failure in trying something from the thought of personal failure. That is "I failed because I need to learn more," versus "I failed because I'm not good enough."
> I'm not that old, 33, but I think I recognize that I wasted a lot of my youth being paralyzed by the thought of failing, when really it is the most successful people who start out by failing
Excellent life lesson! Never mind coding, that is one of the most important things we can teach our teenagers. (Not to deride coding, but you know what I mean.)
My only response is this famous quote: "The children now love luxury. They have bad manners, contempt for authority; they show disrespect for elders and love chatter in place of exercise"
Socrates did complain about the invention of writing, in Plato's Phaedrus:
"...an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality."
I don't think people per se have gotten lazier, but the fraction of the populace that wants to learn programming has increased. When the visibility of programming in society was small, the people who were attracted to it tended to bring along a very strong dedication to the task. As programming became more hip, more and more people were attracted to it, so the average level of determination sank...
Every single forum I frequented "back in the days" (ca. 2004, vBulletin-style) is gone. These were forums that hosted intense/focused technical discussions, in a way that just doesn't seem to exist anymore, with very few exceptions. At least I cannot find anything comparable. Today, there is just Reddit/HN and everything else is mostly commercialized and/or dumbed-down content. Don't even get me started on "educational" youtube videos.
I got my CS degree in 2012. It was kinda scary how many students were on the verge of graduation, and were unable to use a terminal, use SSH, or tell you what git is (not that I know exactly o_o).
Those aren't particularly complex concepts. It is definitely partially the school's fault, but CS is also a field where you have to have some interest in what you do. You can't just go through the motions.
I wasn't able to use version control at all when I graduated; I didn't even realize it was an option. I would just make a tar of my project every now and then in case I lost something.
I suppose I had copy/pasted some cvs commands from sourceforge in order to download the source of some open-source projects, but that was the full extent of my interactions with source control.
I knew how to write some software before I went to school, but learning version control there was an eye-opening experience. It's the kind of thing I think they should teach in school to non-programmers. Anybody who writes anything, collaborative or not, benefits in some way from using it.
100% agree. To me it seems like most students aren't particularly interested in programming, none of them have side projects or care that much about the field.
I don't know if this is because some people are just looking for a good career path (that's not my experience from talking to my friends) or because they weren't sure what to do with their lives beyond liking computers and gaming.
The author is making an excuse for not including an intro chapter and tooling in his book to get the reader to a place where they can start using the book. My sixth grader checked out a Python game programming book from the school library. He was able to install all the tools necessary to follow along and learn from the book.
Today you have Google, Youtube and every programming language available from scratch to assembler at your fingertips. There is no way it is "harder".
Yeah, the C64 dropped you into BASIC. But this was my first experience on a Commodore 64:
"Play A Game"
-Syntax Error
"Play a Game of Chess"
-Syntax Error
"Let's play Global Thermonuclear War"
-Syntax Error
Well programming is one piece of the puzzle. I also miss the days when qbasic was preinstalled and easily accessible.
But what worries me more is the actual deployment of the application you write, which is getting more and more complex quite fast.
It seems I'm in a rare place, where deployment consists of installing a database and extracting a zip file. There's a script you run and the (intranet) service is ready. Updates work exactly the same.
Sometimes I feel a bit ashamed, when clients watch me extracting a zip file, but then I experience deployments and updates of other so called enterprise systems and am completely baffled by their complexity.
Server provisioning and orchestration. Container management. In-memory caches. Sharded databases with master-slave setups. Load balancing with health monitoring and what not... And to manage this mess you need a mix of bash scripts, fiddling with permissions, running Terraform which itself runs Ansible which deploys some Docker containers... And when there's a tiny problem the guys in charge are completely roadblocked for weeks.
I have no idea how some people can live with so many layers of complexity and I fear the day, when I need to dig into this mess, eventually...
To add to this, one other way programming is getting harder is that there is no incremental path to becoming a developer. In the past, you could easily become a programmer by accident. You wanted to edit your MySpace page, google around, learn some HTML and do it. You wanted to mod a game, you started messing around with the game, download other mods, and change them.
I think this incrementality is gone in two ways. One is that programming is so abstracted away from how we use computers that it's almost in a different world. But even if you figure out what programming is, when you actually attempt to do it, the sophistication of our tools makes it a lot harder to get started. Everything that makes us programmers more productive: compilers, build systems, package managers, etc. makes programming less accessible. There is a lack of coherence between all these tools; no one person designed them; they're the product of bottom-up innovation (the bazaar), which has the problem of making things messier.
I think open-source/bottom-up innovation is great, but eschewing "the cathedral" for a wholly bazaar-based approach is making it a lot harder to learn to program.
What I'm working on now (I'm the founder of Repl.it) is taking a lot of these awesome tools and putting a nicely designed experience on top of them. We want to bring back the welcoming cursor that invites you to program the computer and bring back incrementality to programming, you start from a repl, and from there you can move on to web-dev or what have you. You'd be surprised how fun it is for kids to build CLI programs before moving on to build other things. We put together a page where kids can post about what they're building with Repl.it and it's really inspiring to see: https://repl.it/ibuiltthis
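(Not from Repl.it itself, just my own sketch of the kind of CLI program a beginner can finish in one sitting, the sort of thing kids post on that page. It assumes nothing beyond a Python 3 repl:)

    import random

    def guess_the_number():
        # A classic first project: the computer picks a number, you guess it.
        secret = random.randint(1, 100)
        tries = 0
        while True:
            guess = int(input("Guess a number between 1 and 100: "))
            tries += 1
            if guess < secret:
                print("Too low!")
            elif guess > secret:
                print("Too high!")
            else:
                print("Got it in", tries, "tries.")
                break

    guess_the_number()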
I guess it depends on what you mean by "become a developer".
I think a lot of the increased difficulty is the skill of developers. At one point, knowing how to edit HTML would get you an interview. Now, I feel like I might need a successful Python library so I can get an interview...
Not sure if it's a bad thing or not, since the art is just becoming better.
We like to think programming is hard and that there's a barrier to entry that we somehow surmounted because we're oh so smart.
But does anyone here actually know people who learned to program within the last couple years?
My co-founder's girlfriend taught herself enough Javascript and React to help out on our front-end in a couple months. Had no experience except for dabbling in Wordpress. Now closes issues for us.
My buddy learned enough Python to create a GUI for an old card game. Built it to completion enough to challenge himself against the computer player. Hasn't programmed since.
Neither of them had any experience before. These posts really underrate the resourcefulness and intelligence of beginners. And my suspicion is because we like to think it's hard.
People here act like installing Homebrew to download python3 is somehow hard for a beginner when there are hundreds of tutorials online telling them how to do it. Hundreds of tutorials are exactly what I didn't have when I started programming Basic even though the hello-world was a little simpler.
One of the consequences of the transition from command-line interfaces such as DOS and Unix to contemporary GUIs such as macOS and Windows is that we have moved from environments that do not make a sharp distinction between programmer and user to environments that make a sharp distinction. One of the nice things about command-line interfaces is being able to make powerful tools that are composed of smaller tools. Unix has a powerful collection of tools, and if those tools aren't powerful enough even when combined with pipes, there's awk and sed, and then there are scripting languages. Contemporary GUI applications, however, work very differently. Instead of being composed from smaller programs, many contemporary GUI applications such as Microsoft Word, Adobe Photoshop, and iTunes are large monoliths that are expensive to write (just consider how large the teams engineering these products are, and consider how much effort it would take to write a clone of these programs from scratch), difficult to combine, and difficult to script unless they are specially designed to support scripting (e.g., AppleScript, Microsoft Visual Basic for Applications, etc.). Web applications are even harder to write than native GUI applications, and they are even more difficult to deal with with regard to interoperability and scriptability.
It didn't have to be this way, and there's room for a future where it doesn't have to be this way. The Smalltalk-76 environment demonstrated that it is possible to construct composable and interoperable GUI applications. The Genera operating system for LISP machines worked similarly. Apple's OpenDoc initiative would have introduced the wider world to the concept of composable GUIs, although it was canned as a consequence of Apple adopting the tried-and-true OpenStep platform as the basis of future Mac development. An operating system environment inspired by the Smalltalk environment and LISP machines that provides both a REPL for tasks that are equivalent to the Unix command line and an OpenDoc-like API for designing composable GUI or even web applications would create an environment that is both easy to use and programmable. I've been growing more and more excited about the possibility of such an operating system for workstation users that could one day serve as an alternative to contemporary desktop operating systems and environments such as Windows, macOS, and the Linux desktop.
For me it looks like inverted survivorship bias. You won't hear from the people who succeed, so you don't have a way to count them. People who nag or email the author might even be in some special group of 'someone please make it work for me'; I cannot imagine emailing the author of a book. I don't believe that group is really big; maybe it's just more visible than the others. Like 'plz give codez' on StackOverflow.
Buying a book or reading a blog post does not entitle you to anything more, like wasting the author's time. Time is the most valuable asset, and he already spent it on writing the book. If the book is crap you can rate it on Amazon or something, but directly contacting the author?
I work at JetBrains on https://datalore.io/ It's an interactive computational notebook for data science, which automatically recalculates code as you change it. We believe that such an interface lowers the barrier for entry into data science (and programming).
Looks great, though I can't find anything about pricing or pricing plans (looks like it's in open beta?)
While I've dabbled with using Jupyter for teaching programming to high school students, on the whole it's usually too difficult to get up and running for someone who's not already moderately tech savvy. Interface is everything when starting off; this looks quite promising if the pricing is affordable for schools.
I agree that it's harder to learn to program now than in 1982 (in terms of "hello world" level simple stuff - but not for advanced stuff), but I believe that it's way easier now than in 1992 or 2002.
AppleScript might be an exception in the 1992 case, but it was a fairly specialized case and only for Macs. Otherwise, today you can google stuff easily (vs needing books and such) and even Windows supports installing Linux tools out of the box.
The danger, though, is that in the future we could have fewer and fewer general-purpose computers -- iOS is not as dev friendly as OS X.
However, even on general-purpose computers, we don't seem that interested in making it easier, which is sad from a professional pride point of view if nothing else. I set up Android Studio on Linux recently. It wasn't terrible, but even that wasn't a seamless experience - the emulator had a few problems, including bundled libs that were incompatible with Intel graphics drivers, that I had to work around.
Any time you join a company and have to browse a long dev environment setup wiki or readme, you've seen us all fail. We're bandaging it as fast as we can with stuff like Docker, but that's just delaying the problem until you have to debug your container in prod. ;)
My first computer was a VIC-20 (I later got a Commodore 64), where you also start with a BASIC prompt. I too have been thinking about the barriers to starting to code.
Last fall, I helped in my son's grade 8 class to teach them python programming. There we wanted the students to be able to start coding with minimal effort, and minimal magic. They already had MacBook Airs, which had python 2.7 already installed, so there was nothing to install. Then we concentrated on a few core programming constructs (no classes for example), and no libraries. This was an attempt to recreate the environment I had when I learnt to program on the VIC-20.
I think it worked really well. The students were able to write fairly advanced programs (a few pages of code) with that set-up. If you want to do more advanced programming, you'll need to install more, but for a start it was really good.
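To make "a few core programming constructs and no libraries" concrete, here is a small made-up example in that spirit (not one of the actual class exercises); it runs unchanged on the preinstalled Python 2.7 as well as Python 3:

    # Only variables, loops, conditionals, and functions: no imports, no classes.
    def triangle(height):
        for row in range(1, height + 1):
            print("*" * row)

    def fizzbuzz(n):
        for i in range(1, n + 1):
            if i % 15 == 0:
                print("FizzBuzz")
            elif i % 3 == 0:
                print("Fizz")
            elif i % 5 == 0:
                print("Buzz")
            else:
                print(i)

    triangle(5)
    fizzbuzz(20)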
I've written more about it here:
I'll be honest with you. Programming is actually pretty easy to learn.
As long as you have a vague idea of how logic works and are eager to learn with the right approach, you can pick it up pretty quickly.
A lot of the concepts floated around in programming have crazy sounding names but are actually not that bad.
What's really hard to learn, however, is full on software engineering.
Knowing how something works end to end. Sure, you can learn to program well in JS, but that won't translate right away to knowing how to build an awesome website.
You can be an excellent java programmer, but that won't translate right away into architecting a nice Android app.
For me at least, when I was an up and comer, these were the things that were hard to put together.
Every tutorial site wants to jump into a lot of details right away - I wish they started with a high level overview of where you'd need the language, where it fits in in the larger "app" picture (where app can be a site, an app, etc.) and then dive into the specifics.
Hi, I completely agree with you. I am currently learning how to code. I am at that point where I know the syntax of a few languages pretty well and can code up small programs.
However, I do have difficulties in architecting and developing a fully functional application.
Not sure you'll see this, but honestly man, if you have friends in the field just ask them, plain and simple. Not saying this is you, but often people don't go this route because of pride, etc.
If you don't know people in the field, chances are there are software people where you work - some are douchy to people just learning (pfft, you think you can learn this craft in months?) but others are willing and eager to help others. Find them and again, just ask them to point you in the right direction.
Lastly, find a class. Nothing crazy like a 25k bootcamp or anything, but find something like building a site end to end in 2 days or something like that.
A lot of this stuff is tribal knowledge and not captured well in tutorials. There are excellent tutorials for learning the technical things, but not necessarily for putting everything together into a working app.
As someone who came from a completely unrelated background (marketing), here is how I managed to start 5 tech companies:
Your problem can't be unique. So look around and see if someone has already solved it. After looking at their solution, ask yourself: is it good enough, or do you still want to make the effort to implement the same thing, albeit in a different way?
The trick to completing a project: never take on one that doesn't excite you. Since I mostly develop products for my own company, I have the luxury of picking my team, project, and tools.
I started procedural programming with Python, then started picking up OOP. (OOP did not make sense to me since I wasn't working on real projects, just hobby scripts.)
Finally, I learned C and it all began making perfect sense to me!
-> I finally understood how Python does what it does at a low level.
Then I learned more advanced python stuff like decorators, generators etc...
Then I hired a few professional programmers to work for me and learned software development practice; writing code is a small part of actual software project development.
-> I learned how software testing, planning, and decisions are done in teams. We were doing Agile SCRUM.
You can probably pick up a good workflow from some open-source GitHub projects which are developed in the open!
Then there are special tricks/methods which you cannot learn from books, so I hang out on Reddit, Twitch, and YouTube channels and occasionally come across interesting tricks.
Then I learned Haskell, F#, OCaml which helped me in understanding functional programming.
After this, I learned Scheme and then Clojure.
I started writing JavaScript frontends with React and Vue.
Got interested in Reverse Engineering, reading Assembly etc... but then got bored because I was spending hours and hours and achieved very little.
Experimented with Java as my company has Java projects written by other devs and I wanted to contribute but didn't like Java and its ecosystem. (It's too complicated for me)
I learned Rust and moved from VIM to Sublime Text then finally to VSCode.
Then I learned Ruby and OOP magic FINALLY became clear to me.
These days I mostly write Go/Rust/Ruby/Vuejs(Typescript) and deploy on Kubernetes which made deployment soo easy, CI with Gitlab or Drone.
I was you a few years ago. There's two ways forward:
1. Get a job programming and slowly build up your skills/expertise by learning from those around you.
2. Develop a lesson plan for yourself and figure out how to build and deploy different kinds of apps.
I don't know if you're into #1 but I think it can be massively useful. There's so much stuff you don't know about that some dude who's been working somewhere for 5 years does. You can pick up best practices, design ideas, constructive criticism, better coding, better tools, knowledge that's easier to get in person than from a webpage. Today no single person makes a truly great and novel software application, and working at a tech company will make that apparent to you.
As for 2, some ideas assuming you know how to code:
1. Write an HTTP API that talks to several different databases. Learn how the databases work and when to use them. Get good at using them both from their CLI and a programming language.
2. Write a site using React/Angular/etc. and HTML/CSS to query an external API and display the results or organize user input into a fun framework.
3. Find a way to publish both of the above so that mom can interact with them.
4. Figure out how to wire a web app that you wrote to an API you wrote.
5. Find a way to deploy that as a complete system.
6. Figure out how to write a simple iPhone/Android App if you're interested in that.
7. Now make it do something cool, presumably by hooking it up to an API that does the heavy lifting.
8. Now make it all pretty.
9. Now figure out how to get it in people's hands. Collect their feedback and figure out what's working and what's not.
10. Go back to whichever step needs work and fix it.
That's easy to say, but there's a ton of work in learning all of these skills. It's one thing to write an app, it's another thing to get it in people's hands, and it's a third thing to get it to really, really work to the point where people love it. A lot of people spend 40 hours a week doing that for years and don't/can't do everything - that's the reason there's so much specialization in the world of software engineering.
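To make step 1 on that list less abstract, here is a rough, minimal sketch of an HTTP API backed by a database, using only the Python standard library (http.server and sqlite3). The names and routes are my own invention, and a real project would use a proper framework and a production database, but it shows the moving parts:

    import json
    import sqlite3
    from http.server import BaseHTTPRequestHandler, HTTPServer

    conn = sqlite3.connect("notes.db")
    conn.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, text TEXT)")

    class NotesHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # GET returns every stored note as JSON.
            rows = conn.execute("SELECT id, text FROM notes").fetchall()
            body = json.dumps([{"id": r[0], "text": r[1]} for r in rows]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

        def do_POST(self):
            # POST with a JSON body like {"text": "hello"} stores a note.
            length = int(self.headers.get("Content-Length", 0))
            note = json.loads(self.rfile.read(length))
            conn.execute("INSERT INTO notes (text) VALUES (?)", (note["text"],))
            conn.commit()
            self.send_response(201)
            self.end_headers()

    HTTPServer(("localhost", 8000), NotesHandler).serve_forever()

Run it, open http://localhost:8000 in a browser to see the notes, and you already have something a non-programmer can poke at, which is step 3 in miniature.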
> Computer retailers stopped installing development environments by default
This is true, but somewhat ignores web browsers. The big ones (Chrome, Firefox, IE & Safari) all have complete and high quality dev environments.
The article is Python-centric, and so the best answer for that right now is: install Anaconda!
Otherwise, the general argument being made here is solved by JavaScript at the moment.
If you use JS to write code, there’s nothing to install, distribution is built in, it comes with the dev environment. All language and feature comparisons aside, this is the strongest reason to use JS, because it makes sharing programs with other people so insanely easy compared to any other language.
The question here wasn't whether the language is easy to use; the issue raised in the article was whether the language & dev environment are pre-installed, how hard they are to install, and then how hard the programs are to share after you've written them.
No version of BASIC is being shipped with any OS today. You still have to download the language and a dev environment on a brand new (Windows) computer. And when you share a BASIC program, you either have to build an executable that will run on someone else's machine, or the recipient has to install BASIC on their own. (The article was about sharing code & learning to code, so a binary isn't ideal.)
JavaScript and the dev environment, on the other hand, comes with every new computer, on every OS, because browsers are pre-installed.
BTW, is this modern basic? That looks like what I remember from DOS 2.0 in 1980. :P
To be fair, you're comparing HTML canvas API to BASIC's native line drawing, this example doesn't actually say much about the JavaScript or BASIC languages. And it doesn't compare BASIC's graphics to canvas's... with canvas you can draw curves and bitmaps and do transforms. I'm not familiar with whether BASIC can do that these days.
In my experience, canvas is quite easy to use, even compared to BASIC. I know you were joking, but incomprehensible is what I'd use for, say, Vulkan or DX12. JS+canvas is very comprehensible and easy, at all times, in my opinion.
Honestly, if I were teaching someone to program now, I’d just tell them to buy a raspberry pi. For $50 for the nice kit, plus a surplus monitor + keyboard + mouse, you get a fully working toy dev environment preinstalled. You also get piles of programming tutorials written for that exact environment.
I personally know otherwise competent Java backend devs that have no idea how http works at all. Pipelining, gzip/deflate, Cache-control headers, XSS, uri concepts, bandwidth consumption, proxies, etc...not even the foggiest clue. There's just so much breadth, abstraction, inversion-of-control, and pressure to deliver in corporate environments. There's little incentive to be a big picture person when you're rewarded solely for closing out stories and pushing out change requests. The resulting hit in quality is atrocious.
There's another option the author doesn't consider: Change programming, or create fundamentally different programming paradigms that entirely obviate some of the requirements we have now.
One idea that's a fairly natural progression from where we are now is Luna, a WYSIWYG data processing language: http://www.luna-lang.org
A loftier re-imagining is Dynamicland, co-founded by Alan Kay and Bret Victor: https://dynamicland.org
This is why I recommend people learn how to program using javascript at first. There are no downloads, there is no setup and given some simple instructions you can mess around with fully formed pages just by typing into the console what you want to do. The modern browser has all the tools for a programming course that takes place almost entirely in the console.
I lean more towards the python/C++ side of things but for just diving in and seeing something cool for quick, colorful motivation to learn, javascript fills that need better than anything else.
For a REPL/solving an algorithms problem? Sure. And the first time they want to use some code from StackOverflow that uses a library, god forbid, it's off to the races. Require? Script tag? Webpack? Transpile? Minify?
All of those things are important tools, but their use (and presumption of their use, a la "just import 'map' from underscore.js; you know the drill") is so widespread as to present a major stumbling block for first-time programming explorers.
Commodore 64 programmers were never interested in "Bayesian Statistics" and "Digital Signal Processing". Their main interest was pretty much related to systems programming in one way or another.
Today's "programmers" expect to do crazy things like carry out statistical analysis on megabytes of data, in a matter of minutes, with all kinds of insightful plots, and animations and what not. (Same goes with DSP).
Today's programmers are NOT asking for what Commodore 64 programmers were asking for. They're asking for way way more.
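For instance, this is the sort of first "program" a present-day learner typically wants to run (my own illustration; it assumes pandas and matplotlib are already installed and that some measurements.csv exists, which is exactly the environment-setup burden being discussed):

    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("measurements.csv")   # hypothetical data file
    print(df.describe())                   # summary statistics in one call
    df.plot()                              # quick plot of every numeric column
    plt.show()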
Not that there is anything wrong with that. But who is to provide such shiny interfaces to future Bayesian statisticians and digital signal processors?
- System programmers do massive research to automate for today's programming demands, workflows, and activities. And a lot of people are working in this space. But it's full of information overload without a clear winner.
- Or one of their own takes a hit for the team, transitions into system level programming, and uses his past experience to create usable interfaces, processes, workflows, and associated software. It's not an easy task.
I think now we're talking about something I've been struggling with for many years: computational science workflow software. Again, there are hundreds of projects in this space, but that makes it even worse. How do you choose between them? Someone says he/she uses Sumatra and it works for him/her. But it may not work for someone else. Someone else says you should follow the Cookiecutter Data Science guidelines. But that also addresses only a small aspect of the overall problem. Etc, etc. Then there are the emacs diehards who claim you can do everything in emacs, so why leave emacs? Well, I can do everything in linux + shell + C, so why leave linux + shell + C? (Point being, emacs only lets you do everything by becoming an OS in itself.)
I don't have a solution.
I'm working on this problem in my spare time from a very very different meta level angle. But that is a massive research project and cannot be finished in a matter of a couple years, much less without funding.
A computer in 1979, something like a Sinclair ZX80, would be 500 dollars in today's money.
Now computers are cheaper than lunch.
Computers of long ago couldn't do much... the raspi zero wireless can do a lot.
We're looking for treasure - but there's diamonds on the soles of our feet. We just need some curriculum worked out - like the article indicates computer usage + computer programming.
We assume a lot is just absorbed by kids - but it isn't; it must be taught.
Isn't what he said the natural consequence of technology being available to a wider and wider segment of the public, reaching ubiquity? This is to be expected when something originally known only in a small circle now has to be used by essentially everybody. One thing that can definitely help is to incorporate programming as mandatory into national/state school curricula as soon as possible. Programming literacy could become something like writing literacy, though of course we're not quite there yet.
Everybody having a solid grasp of programming principles would also protect democratic society, in the sense that people would have an idea of how the software they're using daily actually works, what is being done to their data, etc. Leaving such a fundamental fabric of society in the hands of a few is self-evidently dangerous.
I love Ruby the language, but I absolutely hate Ruby the platform. I've never run across a more hostile landmine-ridden terrain for new developers. Every time I have to ramp up a new dev, I just resign myself to the fact that it's going to take anywhere from an hour to three or four to get everything installed. God forbid they've actually tried to use Ruby before on their machines.
I've dipped on job prospects which involved code challenges that require me to get their codebase working on my machine. I inevitably wind up having to troubleshoot their broken setup and that takes way more time than doing the actual challenge.
Ruby desperately needs innovation in this space. Bundler and rbenv / chruby / rvm just isn't good enough. Vagrant / Chef just isn't good enough. I understand that dependency management is a hard problem, but other languages aren't half the headache that Ruby is.
Before the tire fire that is rvm became "standard", it was normal to just download the version you needed and build it from source. There's nothing forcing us to use crappy, obscure tools like rvm and rbenv. And I agree with you on all points.
It's a total shitshow. But since ruby manages everything with environment variables, you need some kind of tooling to manage multiple versions. Horrendous.
Ruby should just standardize on a way to point projects to rubies. Instead of a zillion environment variables, just a shebang. It's how everything else works.
The fourth option, developing on the cloud and stay there, looks like a good way to lose whatever freedom we had left.
Would the cloud provider let me write pen testing tools? Would it let me put profanities in my comments? Would it give reports of my work sessions to my boss? Would it get a licence to my code?
> Computer retailers stopped installing development environments by default. As a result, anyone learning to program has to start by installing an SDE -- and that's a bigger barrier than you might expect. Many users have never installed anything, don't know how to, or might not be allowed to. Installing software is easier now than it used to be, but it is still error prone and can be frustrating. If someone just wants to learn to program, they shouldn't have to learn system administration first.
I'm not sure I agree with this statement. macOS provides Python, Java, Ruby, etc. out of the box, and a C compiler is one button click away. On Linux a C compiler is provided. For web languages, every browser comes with a full suite of development tools.
There is a certain amount of systems administration knowledge you need to have if you are going to be an effective programmer. No, you don't need to be a sysadmin, but source code doesn't live in a vacuum. I vote for face the pain.
"If someone just wants to learn to program, they shouldn't have to learn system administration first."
I may disagree. Even though that's maybe an old school opinion, I think every developer should at least understand his own development environment (including the OS), and potentially also aspects of the target environment. The reason is that (1) the abstraction between the OS and any (sophisticated) SDE is still leaky (sometimes for a reason) and (2) a developer that cannot support operations when things turn bad at runtime is just a fair-weather captain.
Every good developer should understand these things, but I don’t think it’s reasonable to demand them from beginners.
I learned to program in Borland Turbo C on DOS, and later in Visual Studio 97. Both were full-featured IDEs, at least by 90’s standards. I’ve only started to learn Windows internals & system administration on my first job in 2000. If knowing system administration would be a prerequisite, I’m not sure I’d become a professional programmer.
The point about cloud IDEs being the future is excellent I think. I worked for a bit over a year on a chromebook with no local storage - only a cloud IDE.
It was a lot to get used to at first, but once you have made the context switch, it's great to have everything in one consistent place with tools you can use from literally anywhere.
Similarly, working on a small display pays the same kind of dividend. When you can't put windows of code all over the place you enforce a kind of mental order that's hard to achieve otherwise.
A couple of things bother me about this. First the idea that someone would be a data scientist and not want to first learn how to do the most basic things with their computer. Second the idea that it's hard to do anything on a computer when you can literally type into Google what you want to do, and seconds later can be watching someone else do it on YouTube or reading detailed curated answers on StackOverflow.
As someone who makes developer courses / training material I think it's a really bad idea to use online services like PythonAnywhere or Cloud9IDE.
It's really important to learn how to set up your own development environment as a developer and if you're making this content, I believe it's our responsibility as teachers to show people how to do this.
Personally I use Docker in my courses (because it's what I use for my own stuff) and it's been a smashing success. I do get occasional people questioning "why" but after I explain what the process would be like to set up a proper Python dev environment without Docker they immediately see the wins.
Of course I do get Docker related installation questions but it's a small percent compared to questions on the course content. One of my courses is building real world web apps with Flask and it's probably 80% Flask / 20% Docker related questions.
That's much better than punting a local development environment off to a third party service / pre-built VM and then leaving people stuck in the dark for getting anything done on their own.
By the way, everything isn't just thrown into a single container. We use Docker to set up the whole app over a series of containers.
So postgres, redis, the flask app and celery all run in their own containers.
Without Docker, you're responsible for getting not only virtualenv set up, but also installing postgres and redis on your box, which is wildly different on Windows, MacOS and Linux.
From a teaching POV, it's easier to level everyone off with Docker, but from a student's POV, it's also easier because now they don't need to worry about any of that. It lets them get to the important material faster. They can just install Docker and run a simple `docker-compose up --build` and they are good to go.
Compare that to listing out a million steps to install python, setup virtualenv, write special clauses for people on Windows or MacOS, etc.. It's a huge burden for everyone, and it's the main reason why I started using Docker for my own projects years ago.
We haven't even talked about installing multiple versions of postgres or redis without Docker too (because apps tend to be developed at different points in time), or deploying an app to production. All of the steps you do to set up a typical real world Flask app on Windows and MacOS are thrown out the window without Docker because your production box is likely running Linux, but with Docker, it's pretty much the same.
The problem described is exactly why I wrote a short "How to Code" primer a couple years ago. It teaches some basics of Javascript. Javascript, at least in the browser, can be run with no extra dependencies. Everyone has a web browser these days.
The question of “where to begin” is certainly a lot harder now. Learning to program the way my generation did, the programming itself was obviously harder - assembly vs Python for example - but the path and the progression was laid out for you.
For me it was BBC Basic, then 6502, then 68k with DevPac on the ST... myself and a friend wrote our own driver for a modem we found in a skip and that is how we first got online... good times
I also think learning to program is getting harder, but at least you can start easy today.
When I started to program in the early/mid 90s it was very difficult for me to start as a teenager. I didn't know anybody who programmed or who had a (copy of a) compiler for me. At least I found QBasic (it was included in DOS). It was very easy to program and debug something. The program could be as short as one line of code. Just hit F5 (as far as I remember) and the thing ran. But it was just an interpreter and slow. The included help was pretty good. Later I tried to program assembler under DOS. But again, no compiler; I had just the DEBUG function from DOS. With that it was possible to write assembler directly in memory (but not save it to a file). Also no labels for jumps... you always had to calculate the address yourself.
My QBasic book had (still has) 416 pages. It included EVERYTHING you could do in QBASIC. My assembler book was a bit slimmer (XT-486, before MMX). A "complete" book today would need over 10000 pages. There are sooo many things you can do. There is an almost infinite number of libraries on GitHub.
I get along pretty well on blind trust, issuing commands provided in tutorials or setup guides, sometimes having little understanding of what the heck it is that I'm doing. There is always an end point that I'm highly interested in and I spend my time learning the ins and outs of it. I spend my time learning what I need to learn when I'm stuck. Sure, this attitude can be disastrous. I keep current and offline backups at all times, and not nuking the production code is a standing order. I know enough to install Gogs, but I am by no means an expert on all things git quite yet. If it weren't for Stack Overflow and search engines I'd still be writing inconsequential stuff in Director Lingo. No one can afford the time to really understand the stack; it's tens or hundreds of millions of interconnected lines of thought. When I take time to learn something I intend to do something useful with it. At this time my mind is like a buffer. For every new thing I learn, stuff from 10 years ago is fading fast. There is a lot to learn, sure. There is even more that is forgettable.
I enjoyed the article, the author has some good points. I started on the Commodore 128 and I'd have to disagree with the author's conclusion. I think the filter is far less for self-motivated, truly interested users today than it was in the Commodore's heyday.
None of the issues presented are really issues, it's just that there's a larger swath of people who have accepted computers as mainstream devices and can now afford one. Or are simply too young and missed the heyday of the truly "personal" computer.
If I were to write a response, it would be titled The Great Filter. I was a computer nerd most of my life; by the advent of the smartphone, when the majority of people realized how much the career paid, that whole stigma went out the window. Unfortunately, the Great Filter is gone; people who would've been filtered out in the 80s are only visible by observing the students who aren't even computer literate enough to have bothered figuring out how a filesystem works before trying to program.
This is a REAL problem. The information overload for becoming a coder is going way up with all these "great" new tools we're getting.
Try messing with the build of a React.js app, and I'll come see how you're doing next week.
Things should be getting easier, and the information overload less. For all scenarios, not just the easy ones. I think more advanced developer tools will be the main driver of this.
I would pay $99 for an Apple TV app that recreates the experience of programming a computer connected to a TV, in BASIC. Something like the Amstrad CPC series computers with lots of built-in commands for drawing graphics, making sounds, etc. I’m trying to get my 8-year-old interested in programming, but there’s just too much shit on the computer that gets in the way and too many unrelated parts (text editor, terminal, compiler/interpreter, etc.) for him to think about. I can’t use parental controls to stop him from switching to YouTube the moment I walk away, because just about every piece of software has a long list of websites it needs to connect to.
He’s used to playing games on the iPad and none of those would be simple enough to inspire him to think that maybe he could make games too.
I seem to spend a lot of my time attacking this problem from one direction or another.
I'm currently writing a browser based editor/assembler/emulator for 8-bit AVR. I've made code-running MediaWiki plugins. I even got to the point where I made an entire client-side browser desktop environment.
At the same time as making tools to ease the entry cost, I'm also wary of making tools that conceal the complexity. Helper tools are beneficial, but once complexity is hidden it can grow unchecked. Being able to get a project set up with ./configure or npm is great, but you have a black box that over time becomes more and more onerous to open up and peek inside.
I think it's just a matter that most people use Windows, and programming software is optimized for linux. I once wrote up all the steps required to get youtube-dl working on windows. https://news.ycombinator.com/item?id=11453506 Which is quite similar to what you need to do to install a programming language. Nontechnical people will get stuck on things like changing path variables, because they have no idea what that is. Let alone compiling something which is frequently required.
Programming certain software is optimized for Linux, but with VS Code and VS Community Edition, and the robust project templates that come out of the box, users can easily get basic C# programs working for free in console apps or even windows store apps with powerful debugging and intellisense tools. When someone is learning to code, C# can get them there quickly and easily once they discover the correct docs.
If you feel programming is optimized for Linux, that’s because it’s your preference. And a valid one at that! Its just going to tint your view a bit.
C# was made by microsoft for windows specifically. The majority of programming languages were not. When I started programming I got very frustrated getting stuff to work right. I remember it used to be near impossible to get pip on windows and eventually found some weird unofficial installer.
To add my take here, this was a problem in 2003 too when I wanted to learn, but C++ was such a hard starting point.
The big takeaway is that modern commercial programming is so far from "learning to program". That's a hard gap to bridge when a newbie has just made a terminal-based calculator and is wondering, "how do I build an app or web app?". One Month, a YC company, tried to solve this, but the market is hard to reach because education is hard.
> Some of them send me email. They often express frustration, because they are trying to learn Python, or Bayesian Statistics, or Digital Signal Processing. They are not interested in installing software, cloning repositories, or setting the Python search path!
None of that is new. Setting up an environment that is conducive to coding takes effort if you don't have someone to set it up for you, which describes most new coders. I had a hell of a time learning to code on cheap hardware running Windows 15 years ago.
This article lacks any depth! It is easier than ever to program and get started. 5 minute tutorials are available on almost anything and sometimes delivered by 8-9 year old kids.
Even low-level stuff is easier to pick up thanks to the explosion of readily available information and IDEs. No one is using Arduino? Perhaps it is true that the newer generation is just less keen to put in the effort to study anything outside the norm, but learning old and new technologies has never been easier.
My mother-in-law, who learned to program in the ‘60s by carrying her boxes of punch cards across campus while getting tear gassed by Berkeley PD, would probably beg to differ.
"For beginners, and especially for people working on their own, I recommend delaying the pain."
No, no, no, no, no! This is exactly what caused the problem in the first place.
That you need an IDE is one of the biggest lies in programming. From the terminal on most Linux systems, you can download and install everything in a single line. Then compiling is another line.
I had this same problem as the OP describes and my solution after a lot of frustration was taking some tutorials on Linux from Youtube. That actually might not be a bad place to start because bash is a programming language (right?). Teaching some simple bash scripts plus navigating the file system wouldn't be the worst intro to programming and it can now be done in Windows.
Yes, bash technically is a programming language - but it's a pretty horrible one in terms of usability, and not the first that I would teach a newbie. Plus, you really don't need it unless you're deep into sysadmin stuff.
As many others, I also disagree. Back in the day, laptops were more expensive and less capable. Now you can program everywhere with a laptop and wifi. More importantly, sites like SO make learning to program so much easier than before. Back then, if you didn't know somebody you might be stuck on some silly issue for a long time. Not to mention Windows had frequent blue screens of death.
Recently, I've experienced working on a remote server. It was a real pain. You need a crazy fast connection to be productive. Otherwise, you're stuck waiting for changes to save, for pages to load, etc. Of course, you can only hope the network doesn't break down or the cloud IDE company makes a buggy deploy.
As a remote developer, I like working in different places. These places don't always have nice internet. Working with a local environment and 4g connection is fine, because 4g is more than enough to look up docs. But working with a distant environment and 4g connection is very slow and unreliable.
Nowadays, a lot of people praise the "cloud" for everything. Use APIs for everything. Use public CDNs to load up Bootstrap and jQuery. Use an online IDE. Make a hosted Git service an essential part of your workflow. The thing is, not only do these services occasionally break, but some might get compromised (https://techcrunch.com/2018/02/12/browsealoud-coinhive-moner...) and some even ask you to install closed source binaries on your system without any idea of what's in them (keylogger, ransomware, anything is possible).
Given all that, when I read that using a local environment is "anachronistic", I can't help but cringe.
I wasn't using a web IDE, but the dev server was remote. While I suppose remote environments do not have to be a huge pain, I'd be interested to know how to mitigate all of the issues I mentioned in my previous comment. The only solution I've been able to setup is a local environment with minimal set of dependencies calling the network.
My simplified workflow is: do work locally using whatever IDE I desire -> create a pull request -> server runs the tests and puts the results in the pull request. Much faster than running the same tests on my local machine as well.
With the tests being done remote you will also likely not need to commit such large amounts of data that it becomes a problem.
> Working with a local environment and 4g connection is fine, because 4g is more than enough to look up docs. But working with a distant environment and 4g connection is very slow and unreliable.
Normally even if you are working remote it's expected that you have a stable, decent, connection. But even without one there are plenty of great options.
> Much faster than running the same tests on my local machine as well.
That sounds odd. Usually, while developing, running the whole test suite is unnecessary. So running tests is very fast, as long as you only run the tests that may fail after recent changes. At least that's my experience.
> not need to commit such large amounts of data that it becomes a problem
Not sure what you mean here. Mind clarifying?
> if you are working remote it's expected that you have a stable, decent, connection
Definitely! And 90% of the time I do have a stable connection. But that 10% where I don't (because the wifi breaks down, or I'm on a train to a nearby town), having a remote environment would not work. That's my use case for having a local environment.
I can't think of a reason a dev server should be remote instead of on your machine, unless you're working with large, live data sets that you can't download...
Well, with web IDEs, the server is remote, isn't it? The reason here was the same: an easy-to-set-up environment, because every dev had the same server image, with the same config, etc. I guess that's what remote IDEs have going for them. I think Koding.com markets this very very well.
Oh, I see. My teams have solved the same issue by having cloud desktops and using RDP/VNC to mirror the screen on their Chromebook/iPad/whatever. It's surprising how well it works on not-very-fast connections, and you get a very powerful machine on the back-end with a full, native IDE.
Most of the time, I don't really need to compile/run the code/tests to know I'm on the right track or that it'll work after this minor cleanup. Then, once every few days I can pull out the remote box/IDE and clean up whatever bits of syntax/imports/typos I got wrong or forgot about.
Personally, I feel it's vastly easier these days. The sheer volume of learning material at your fingertips is staggering. I mean, you can just turn on youtube tutorials and soak it in with barely any effort. Most of the time, you can copy/paste whatever error you're getting into a search engine and get instant answers. It's a walk in the park!
I'm amazed to see that many people, including the OP author, don't think that this is simply the symptom of a field (IDEs) that hasn't evolved in thirty years.
Whenever i open a terminal to type some esoteric command line i don’t feel like a tech genius. I feel like someone living in the stone age.
It's getting harder because it's become fashionable to shun IDEs that make most of these issues obsolete, and have done so since at least the 90's. It's ridiculous to expect newcomers to a language and/or programming in general to slog through a 20 or 30-step environment setup process. This idea that using an IDE will discourage someone from finding out the more detailed workings of the language, compiler, etc. is nonsense. If anything, immediate progress and reward will encourage new developers to keep going and exploring further, whereas an overly-complicated, tangled mess of setup instructions will just make many people throw their hands up and quit.
Case in point: my first experience with the Delphi IDE in 1996 had me immediately writing a basic text editor without knowing anything about Delphi/Object Pascal at all prior to sitting down with it. Drag, drop, consult the context-sensitive help, write some code, and compile/run, all in the same IDE. It also helped that Delphi shipped with the entire source code to its RTL and the VCL (Visual Component Library), but that's more important as you get deeper into the language/product.
I find a modern IDE comes with so much implied mental baggage that new users can get completely overwhelmed. Sure, languages which require complex build environments almost demand one, but in simpler environments it can be a lot easier to reduce the cognitive load in a simplified environment.
One can do a lot of programming and learning with just a simple editor (Nano?) and a Python-like environment (repl, interpreter). With quality online notebook-style environments, we can now accomplish an outrageous amount with very little time or cognitive overhead devoted to build tooling.
As professionals, we ought to be familiar with the underpinnings of our tools though. It can be really painful trying to work with a colleague who's afraid of getting low level to really understand their build issues. I've worked with people who actually become visibly agitated when I open up a command line.
What's worse (I see less of this these days) are those who are so coupled to their IDEs that they don't use proper build tools - limiting reproducability, CI integration, etc. This was a real problem back in the earlier Java days before Ant/Maven/etc came around.
No need to shun an IDE - I certainly don't, but one needs to understand how it works otherwise you risk having incorrect (overly simplified) mental models about how your programs are built and run.
For me it's the opposite. A modern IDE is significantly easier to understand, with far less mental baggage than some hacked together combination of several CLI based tools. The mental drain of clicking some nice green play arrow on screen when you want to execute code is non-existent. The baggage of remembering which tool does what, what weird flag you have to keep adding to one of the commands you need, what keyboard shortcut you needed again to exit your CLI based text editor, all of that takes far more baggage for me. Especially when you have 10 different setups like it, and may not return to some of them for months.
And even if you do somehow remember how all of your CLI tools fit together, you're still missing out on an incredible amount of power and clarity that IDEs provide. A CLI just doesn't have the ability to show information nearly as well as a GUI, it doesn't even come close in terms of the amount of bits worth of information you can show simultaneously, nor in its ability to use positioning and formatting to clarify what's important and related. Add that GUIs provide constant in your face hints that tell you what your options are, and it's no wonder almost everyone finds them significantly more pleasant to use. You don't have to remember anything, it shows you.
Yes, people end up developing bad habits and will run into some problems later. But these issues aren't as significant as people giving up early because of learning curves that are too steep on software which is only vaguely connected to what they actually want to do. "I want to build an app" is what drives people, not "I want to learn all the flags for javac".
CLI tools are all about codeveloping the build environment with the project. Instead of you trying to remember the weird flags, write a one-line shell script and save it with the project.
For example, my current side project is built on C+SQLite. The larger chunks of SQL are plain text files so that I don't have to worry about C string escaping. Instead of trying to read them at runtime, I have the build process wrap them in a .o file that gets linked into the executable.
This is relatively simple because it's just a slight variant on what's already happening (.sql files are "compiled" with objcopy instead of gcc), but I've utterly failed whenever I've tried to do something similar in an IDE-based project.
On the other hand, the standard build process is generic enough these days that most projects don't need to do this sort of customization. For those projects, being locked into the IDE's process isn't really a drawback and you may as well take advantage of the benefits it provides.
Just to give you an idea of how you would do that in Delphi (even in 1996):
1) Drop an instance of your database-specific query class (or a generic database API query class, such as ODBC, ADO, etc.) on a form or data module.
2) Click on the property editor button next to the SQL property for the class instance in the object inspector.
3) Input your SQL in the provided window (or load/save to/from a file, whatever the property editor allows for).
4) Compile your application.
That's it. The SQL is now automatically included as a resource in your executable and is available to you at runtime as if it were assigned at application startup. And this isn't just limited to SQL - you can do this with any text or binary file. It's all only limited by the imagination of the developer of the component(s) that you're using in the IDE.
Visual Studio has always been a bit more obtuse in how it handles datasets/databases, so it's probably more of an example of how not to do something like this in an IDE.
The point is that a beginner can do the above, whereas I'm not sure that a beginner would even know where to begin when it comes to implementing what you've done (which is very well done, putting all of this aside).
I'm glad that at least one IDE gets this particular case right, but I think I missed my broader point.
As a programmer, my job is to produce a set of instructions that a computer can understand. The Unix shell is quirkier than most, but it is a real programming language that I can bring all of my profession's skills to bear on. IDEs, on the other hand, are opaque boxes that work great until you need something they didn't anticipate.
I'll readily admit that my needs are no longer those of a beginner, but my learning path was a pretty steady climb of copy/pasting boilerplate, to making a few small changes, to just starting with an empty file and writing everything from scratch.
I worry[1] that, by hiding complexity from beginners, we may be hindering eventual mastery in the name of making it easy to start learning. Are modern intermediate-level developers asking questions like:
- I've been told to always do X; what happens if I don't?
- This thing I usually ignore seems vaguely related to my current problem; what does it actually do?
The problem is figuring out what to do when the little green play arrow doesn't do anything or is grayed out for whatever reason. Fixing that probably requires a bit of knowledge of both the language toolchain and the particular IDE. Typing "python" at a command line and getting a prompt can certainly seem easier.
My first few years of programming were all in Java/Eclipse, and I had an irrational fear of the java/javac CLI compile & run commands. I was completely incapable of modifying build and run scripts (which were customized by one guy at my company), and it was a huge gap in my knowledge. So I have lived this firsthand and can see the value in retrospect.
That being said, I did get a LOT done with the pure IDE (three years of school programming and a year of part time work). So there's something to be said for both sides.
> It's ridiculous to expect newcomers to a language and/or programming in general to slog through a 20 or 30-step environment setup process
Modern IDEs often require an hour long install, and then you turn them on and are faced with 10,000 buttons and 50 windows, they aren't simple either.
I have had good success with making beginners sit through a 30-minute dev environment setup process, in video form. No complaints, everyone got it working, on Linux, Mac, and Windows. I used VS Code + the Go language plugin. Once set up, you have a pretty simple IDE-like environment.
I do miss the days when every DOS PC just shipped with QBasic, or when you could get a simple, quick-loading IDE like Turbo C++.
VS Code is a modern IDE. And I would really like to know which IDE takes an hour to install, because I tried multiple on Windows and they all required a download and a double click, and that was it.
Visual Studio (proper) has famously long installation times. VS2017 provided a significant installation time decrease but was still on the order of 30 minutes for my company-issued MBP.
In 2015, VS took me multiple hours to install and often minutes to open. The most unpleasant piece of software I have ever used. I hear it's better now.
Installing Visual Studio 2017 in a Windows VM on a Ryzen 1600 with an SSD, 16 GB of DDR4, and a fast internet connection took 3 hours. I just did it a few days ago.
Having to wait an hour for a download/install to complete is completely different from requiring an hour of browsing stackoverflow to figure out why the CLI install instructions given aren't working on your device.
You're not exactly wrong, but are we not still comparing to "turn computer on, get BASIC prompt" (C64), or "turn computer on, type 'qbasic', hit enter"?
VS Code is a modern, extensible programmer's editor; with appropriate extensions, it approximates an IDE, but it's not an IDE like Visual Studio proper, Eclipse, etc.
Now, most actual IDEs don't take long to install either, but VS takes approximately forever.
Well, I don't think anybody is "shunning" IDEs just to be difficult; it's just really, really hard to describe something in the context of an IDE. For one thing, IDEs are (obviously) graphical, so you have to include a screenshot if you want to refer to something. Worse, IDEs change things around a lot (sometimes for no particular reason). Then of course, there are no standards in this area - in Java you have Eclipse and IntelliJ, so if you want to describe something in terms of an IDE you either pick both and double the explanations, or you pick one and freeze out the others... or you just do everything on the command-line which is unambiguous. And don't forget, eventually you're going to have to get out of the IDE comfort zone if you want to interface with a continuous integration system.
Command-line instructions fail as often as IDE-based instructions do. Changes in different versions, changes in the environment, different ways of trying to use the software, different ways projects were set up; all of these and many other reasons cause plenty of issues with command-line instructions. It's never "just do everything on the command-line"; it's "it just spewed out an essay of errors on step 2, and Google is silent".
It's not really that hard, though. Almost everything you see in an IDE is just a pretty version of something in a text file somewhere. IDEs don't get rid of project files, source files, object files, etc. They simply manage the complexity in a way that keeps what you need at your fingertips.
> It's ridiculous to expect newcomers to a language and/or programming in general to slog through a 20 or 30-step environment setup process.
It's ridiculous for anyone really. If you're not using something outside of the standard libraries, it's ludicrous to expect someone to slog through the manual setup process when they can double click an IDE icon and be ready to program in under a minute.
Not to mention the fact that a lot of those setup guides are for specific OSes and the author always seems to leave that part out. So you get to step fifteen and get a weird makefile error only to find out that that particular library is for another OS with no alternative for your own. This forces you to restart at square one with more potential added steps (eg virtual machine setup, OS setup, etc).
I do Java programming by day, and I still find the IDE to be a pain in the ass. It offers me a lot of power (enough that I still use it for all my Java code), but the configuration is so opaque. It's a GUI, but what a confusing, complicated GUI.
In contrast, opening a repl in python/node is super easy, and it was super easy even when I was a beginner.
Building a GUI, like you said, would be hard (though I did a very simple tkinter GUI in my first year), but I think that even for a beginner, opening a REPL and starting to muck around is easier without the IDE in the way.
Indeed. There's no way I would've learned how to program at such an early age if the process had been any more complicated than "install Turbo Turtle." (And later, HyperCard, THINK Pascal, THINK C, and finally CodeWarrior.) It wasn't until I had basic command of a few different languages that I was thrust into the world of UNIX, emacs, Perl, CPAN, and so on. By that time I was able to handle it, but if that had been my first go of it, I'd have run for the hills and never looked back.
Jesus christ, this. I was doing a free online course recently and guess what the forum was full of: self-proclaimed geniuses discouraging beginners from using an IDE because "notepad is enough". In Java, mind you, just about the worst language to write in Notepad.
Delphi was WAY ahead of its time. I agree with your point. Installing the IDE also installs just about any library you would want, even TCP/IP socket libraries in 1996.
It's literally:
1. Install
2. Wait forever (CDs are slow)
3. Run it
4. Code, compile and run.
Yeah if anything it's people moving away from IDEs that's making life much more difficult. The complicated part of an install is rarely ever just downloading some software, clicking the executable and then 'next' a few times.
The hard part is docker throwing out nonsensical error messages into your console, offering 3 equally illegible possible solutions, which after 2 hours of googling and attempts turn out to be wrong, because all you needed was a reboot to get to the next error (which ends up being caused because of one of the 3 attempts to fix the earlier problem).
Yeah sure, with an IDE you might not grasp what really happened, but do you really with the 15-step install instructions? Both GUIs and CLIs only ever show the information that the programmer decided to output anyway, and GUIs are actually much better at showing legible output due to all the extra space, formatting options, etc. So what is it about a CLI that people think helps you grasp what's happening better? The fact that you just copy-pasted some command with 3 flags?
> The hard part is docker throwing out nonsensical error messages into your console, offering 3 equally illegible possible solutions, which after 2 hours of googling and attempts turn out to be wrong, because all you needed was a reboot to get to the next error (which ends up being caused because of one of the 3 attempts to fix the earlier problem).
Docker is a CLI program written by GUI programmers — it's a whole new level of terrible. It's certainly not a stain upon the CLI escutcheon.
>The hard part is docker throwing out nonsensical error messages into your console, offering 3 equally illegible possible solutions, which after 2 hours of googling and attempts turn out to be wrong, because all you needed was a reboot to get to the next error (which ends up being caused because of one of the 3 attempts to fix the earlier problem).
Are you talking about kernel updates breaking docker until reboot? Yes that's annoying but you have to reboot anyway.
Visual Basic was this for many people. These days the Community edition of Visual Studio with C# comes close, with a combination of drawable GUIs, code generation, and generous autocomplete.
Not only that, but the perception is going in the opposite direction: as learning (or actually practicing) programming gets harder, non-programmers seem to think that the proliferation of graphical tools ought to have made programming easier.
You don't need a computer to program. Paper and pencil will get you pretty far. Anyone take CS106X at Stanford? Most everything was written by hand; only the labs or homework were actually done on a computer.
I started with AutoHotkey. Still think it's the best gateway from user to programmer: keyboard and mouse macros. It's the first thing you learn in Emacs and Vim too.
Things are so much easier to program these days. Compare C, asm, or COBOL to PHP, Ruby, Python, or modern JavaScript: things are much easier at a higher level of abstraction.
I've been programming for about forty years -- starting as a teenager on a KIM-1 with 1K of RAM. I was playing with digital electronics for years before that, which is also a good foundation for understanding the basic logic of some programming operations like AND, OR, and NOT. What I feel is also often missing in the programming community today -- as well as in computing education -- is people who have the experience of computers from the wires up.
Nand2Tetris is a course that provides some help in that direction -- but it is still not the same as having been there: http://nand2tetris.org/
Back then, programming was one of the most interesting things you could do with a computer. One of the biggest challenges for programmers starting out now is that, frankly, much of what you can do with computers is so darn interesting (e.g. Kerbal Space Program, Minecraft, or just browsing the Web). So those distractions often seem more interesting than the computer basics of logic circuits, machine language, and fumbling around in confusion (maybe for years in some areas) until things click.
I see that challenge with my own kid learning programming.
Still, there is Redstone in Minecraft -- and even a Forth computer mod for Minecraft. There are lots of programming game-like things like the Smalltalk-derived Scratch now. And great software libraries. And there are places like GitHub and GitLab for sharing code. And even board games like RoboRally.
And Arduinos and Raspberry Pis are very cheap. And you can do so much even on just a cheap Chromebook. I'm writing this on a cheap Acer Chromebook 15 running GalliumOS Linux, where I now do most of my personal coding and writing -- both to save money and also to keep thinking about what is possible with the current low end of personal computing.
JavaScript is still no Smalltalk -- which is a much easier language to learn and use where it is installed and supported, especially since it was designed for kids and went through many years of refinements to make it easier to learn, use, and debug. But JavaScript these days (especially ES6/ES7) is not that bad -- and it is almost everywhere, and almost anybody can start programming in it in their web browser. Here is a browser-based playground I wrote recently for JavaScript using Mithril and Tachyons:
http://rawgit.com/pdfernhout/Twirlip7/master/src/ui/twirlip7...
So programming has never been cheaper or more accessible. And the original article goes into that some, like with the suggestion of starting with a Cloud-based IDE (e.g. https://c9.io ) or simulators (e.g. http://www.visual6502.org/JSSim/ ).
Programming is not getting harder, these are just the reactions of people who probably shouldn't be in this discipline in the first place.
A great deal of any programming, technical, what have you job is just getting stuff to work. A lot of it has nothing to do with the actual core task at hand or your specialization but still needs to happen one way or another. This means using critical thinking skills, deductive reasoning, the scientific method. It means scouring Google, trying things until you have a clear mental model of whatever insane legacy you're working in.
Back in the day, I'd spend my time at home trying random things, looking at files in a hex editor, disassembling software to figure out how it worked because there was simply no other way. There was no Internet, and the only documentation I had was at my library, which was limited to whatever they had and usually was out of date or incorrect. Sometimes they wouldn't even let me bring stuff home, because I was a kid and why on earth is this weird kid wanting some giant technical reference manual? There were no laptops (that I could afford) so I had to read it there, and then go back home and try more things with my hopefully newfound knowledge.
Today, we have Google, we have groups, the Internet, mobile devices, tons of high quality blogs, books, free online courses, and the industry itself is infinitely larger with resources that still boggle my mind to this day. As long as I have the time, I'm usually able to tackle most any problem. I couldn't really say that as a kid growing up in the early 90's, trying to learn programming. Those were the literal stone ages. No internship programs, most people thought of tech jobs as vocational IT help desk related (at least in my hometown). There were times when I was legitimately stuck with no resources or recourse.
So please, excuse me if I don't find sympathy for these people who cannot be bothered to set up their Python environment with the aid of Stack Overflow.
I don't know...if you can't figure out how to install python on a local machine...maybe you don't have any business being a coder to begin with. I mean, come on...
edit: the downvotes from the delicate snowflakes crack me up...
Agreed sort of. Especially when the solution is often easily searchable.
I’ve hired interns and junior folks who have trouble with the command line (special circumstances for FT hire). I offer help, but also tell them that it’s their responsibility to understand any extant skill gap and learn. I’ve worked with at least one older boot camper who threw fits when they had to learn something new on their own to debug a problem. I’ve worked with much older career programmers who’ve done the same.
I don't think it's a matter of being a coder or not; rather, some people aren't problem solvers, some people are lazy, some people are dumb, and some people want to be somewhere else. If I had my pick, I'd never work with these people. However, sometimes the people doing the hiring shouldn't be managers, you could say, and sometimes the people making the decisions have incentives to hire bad people. Then you end up with experienced newbies, and have to fill a position eventually.
A better-articulated post would posit that the problem is not inherent in people not knowing things, but in people not having to.