
I've worked with kids from elementary to high school age. One of the obstacles to getting nearly any kid into coding is conditioning their response to syntax errors and the other obtuse-looking output compilers and interpreters like to spit out.

For example, a kid can spend an hour adding a tree of dialogue to their text-based dungeon crawler, only to get a compilation error they've never seen before. They might have forgotten a curly brace somewhere and I'll help them find it, then they give it another go only to find out they forgot a semicolon or misspelled a variable name. That series of events can be pretty demoralizing, and it takes some coaching to get them through.
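To make that concrete, here's roughly the kind of thing I mean (Java here purely as an illustration, and the names are made up). The snippet is deliberately broken, with the semicolon left off the println line:

    // Deliberately broken example: the println line is missing its semicolon.
    class Dungeon {
        public static void main(String[] args) {
            System.out.println("You enter a dark cave.")
        }
    }

and the compiler comes back with something like:

    Dungeon.java:4: error: ';' expected
            System.out.println("You enter a dark cave.")
                                                        ^
    1 error

That's obvious to us, but to a kid seeing it for the first time it's just the computer yelling in a language they don't speak.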

The above is obviously a teachable moment to test early and often, but the point is kids often lose motivation in the face of errors they don't yet know how to interpret. On the other hand, kids get really interested when they can see the fruits of their labor come to life, which can happen without a lot of friction in Scratch.

So to give students a taste of that success without the slog through syntax errors and stuff, things like Scratch are fantastic, especially for younger kids, whose attention spans are shorter and who get bored or lose interest more easily. We quickly move on to real languages, though, and the concepts they picked up in Scratch are easily portable to Python or whatever else.




In my experience, kids recognize the brightly-colored sanitized tools as "not real" programming. The moment they hit the overwhelming UI or unfriendly error messages of many environments, they feel under-prepared for "real" programming and take a hit to their confidence.

I tentatively agree with them. If I can take a stab at what "real" programming is, it's when you can't expect everything to just work out of the box. You can screw it up in unexpected ways. Sure, that's demoralizing. But you either win or you learn, and I wonder if winning too much, too early prevents cultivating the mental ruggedness it takes to really program.


When I learned to program (I was ~13 years old) using Small Basic, I had no issues whatsoever with the errors the interpreter gave me, even though they weren't even in my native language. It was pretty obvious to me that I had made a mistake somewhere in my code.

For kids younger than that, maybe it's better to learn Scratch, but I wouldn't trade Small Basic; it's a great language to learn. The only bad thing is that it's made by Microsoft and only runs on Windows.


Yeah, in my experience teenagers (especially those that are highly self-motivated) can see stack traces and stuff as just another part of the puzzle. It's the younger and less motivated kids that tend to get frustrated.

Edit: I'll also point out this:

> It was pretty obvious to me that I had made a mistake somewhere in my code.

How a student handles this is also often a sticking point. "Oh no, another error, I'm bad at this, maybe this isn't for me" is an attitude I've had to turn around. It goes a long way to convince a kid that it's actually the computer's fault, because the computer isn't smart enough to understand what you're trying to tell it, so you have to be really specific in how you write to it. Then suddenly error messages are informative clues on the way to making your program work, rather than the computer scolding you for being a bad coder.
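The "clue" framing clicks once they notice the message literally names the thing it doesn't recognize. A contrived Java example (again, just an illustration with made-up names), where a variable gets misspelled:

    class Greeter {
        public static void main(String[] args) {
            String message = "Welcome, adventurer!";
            // Typo on the next line: 'mesage' instead of 'message'.
            System.out.println(mesage);
        }
    }

javac says something along the lines of:

    Greeter.java:5: error: cannot find symbol
            System.out.println(mesage);
                               ^
      symbol:   variable mesage
      location: class Greeter
    1 error

The error is pointing right at the word it doesn't recognize, which is exactly the clue you want the kid to learn to read.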



