drfuchs's comments

And when it finally dies and is disposed of, the mercury in the (ingenious) internal mechanism will likely end up in the wild. P.S. They came in colors? I only ever saw them in tan, which virtually everyone had half a century ago.

The ad mentions you can easily paint it, so I think it just came in "silver-bronze".

Steve Gibson? Now that’s a name I’ve not heard in a long time… Maybe 30 years? SpinRite?


Steve Gibson also does a show called Security Now with Leo. One of the best IT security podcasts out there. It's amazing how much he knows about IT and IT security.

https://twit.tv/shows/security-now

Long Live Steve Gibson


Steve “RAW sockets Will Destroy the Internet” Gibson


The annotations say "... I don't have the exact instruction format for the either the LGP-30 or RPC-4000 ..." but the entire manual for the LGP-30 with this information, and way more, is available at http://www.bitsavers.org/pdf/royalPrecision/LGP-30/LGP-30_Op...

In 1973, Mr. Willoughby taught the Abington (Pennsylvania) High School computer programming class to code in LGP-30 assembly language. We didn't actually have an LGP-30; it just happened to be what he had been taught on when he was young. All assignments were graded by him simulating your code in his head. Of course, we then went on to learn the (slightly) higher-level NEAT/3 language, for which the school actually had an NCR Century-100 mainframe that would run programs that we submitted on punch cards. Mr. Willoughby's (nobody knew teachers' first names back in the day) theory was that it was important to learn the lowest-level machine language first, so you could understand what was really going on underneath. Worked fine for me; evidently it's not quite so universal anymore.


While not universal, I do recommend Charles Petzold's _Code_:

https://www.goodreads.com/book/show/44882.Code

which covers things down to quite a low level and was recently updated:

https://www.amazon.com/Code-Language-Computer-Hardware-Softw...


I think going as far as assembly up front is probably not worth it; it's also hard to keep people engaged at that level, since it's difficult to make the computer do exciting things in assembler.

However, I strongly believe it's good to start with C. You can still do interesting stuff rather quickly, and C is a small language, so learning the language itself isn't a huge barrier.

A big benefit of the small language is that it leaves more time available to explore important concepts: not just the super low-level ones (memory/pointers, etc.), but the really important parts of the stack that pervade everything else. Specifically, things like syscalls and the libc interface that most other languages are essentially built on top of. Working directly with building blocks like pthreads is also very important IMO, both to get a handle on parallelism and concurrency primitives and to learn why high-level languages are so valuable. Similarly for important socket APIs à la select/epoll, and implementing your own event loop.

I was lucky enough to learn all of this early in my career and luckier still to have been able to pass on this knowledge to many mentees over the years.

If there are any aspiring programmers here that want to build from a solid foundation then yeah, ignore the haters, write some C.

Man pages and, ironically, ChatGPT are your friends; use them to explore the foundations of (most) modern code, so that when you start writing modern code in earnest (or for money) you'll be substantially ahead of your peers in actual understanding.


No, go all the way down to assembler; just pick a smaller machine, like a microcontroller, where simple programs can do more things more easily.

When you know assembler, you can always see what compiled programs run as, no matter what language the program was written in; if you start with C, you won't have that ability.


Assembler is a much better approach: one gets to learn how machines actually work (ignoring the microcoded part). Reach for something like an 8/16-bit computer emulator (ZX, C64, NES), or an Arduino-like device with ARM/RISC-V.

Also the realization that C isn't that special; there are plenty of ways to play around with pointers.


If it makes you feel any better, as of 2005 or so Pitt's computer science degree still included learning assembly language, and the particular Turtle-Headed Sadist who taught it at the Johnstown branch campus wouldn't let you use hex opcodes for the first few weeks of the course, so you were literally coding in unadorned 1s and 0s. I hated him for it. I hate him for it. But nothing compares to the dopamine pop I had when I figured out how to do long division using only addition, subtraction, conditionals, and jumps. Not that that's a groundbreakingly difficult problem; it was just one of my favorite "aha!" moments of my entire career.
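That long-division trick reads roughly like this in C (a sketch of the general idea, not the poster's actual assignment): repeated subtraction inside a conditional loop stands in for the missing divide instruction.

```c
/* Unsigned division using only addition, subtraction, a comparison
   (the conditional) and the loop's back-edge (the jump) -- no '/'.
   Illustrative; a real machine-code version would also use shifts
   to avoid O(quotient) iterations. */
unsigned long_div(unsigned dividend, unsigned divisor, unsigned *rem) {
    unsigned quotient = 0;
    while (dividend >= divisor) {  /* conditional */
        dividend -= divisor;       /* subtraction */
        quotient += 1;             /* addition */
    }                              /* jump back to the test */
    if (rem) *rem = dividend;
    return quotient;
}
```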


In the story it says:

> The new computer had a one-plus-one addressing scheme, in which each machine instruction, in addition to the operation code and the address of the needed operand, had a second address that indicated where, on the revolving drum, the next instruction was located

Is this correct? The manual you have seems to have a diagram with a +1 operation on the "counter register", which loops back down.

All instructions seem to include a memory address, but they use it for the operation itself (adding, subtracting, etc.).

Maybe I'm just not understanding the format.


The RPC-4000 (the "big brother" of the LGP-30) had each instruction specify the next instruction address. I believe this was to allow for optimizing your program such that the next instruction would always be right under a read head on the drum when the processor was ready for it, because if you missed it, it took a whole revolution of the drum to get back to it (kind of like a cache miss).
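The scheduling problem can be sketched like this (the field names and drum size are illustrative, not the real RPC-4000 encoding):

```c
/* One-plus-one addressing: every instruction carries the drum
   address of its successor, so the programmer chooses where the
   next instruction lives. */
struct drum_insn {
    unsigned op;       /* operation code */
    unsigned operand;  /* address of the data word */
    unsigned next;     /* drum address of the next instruction */
};

#define DRUM_WORDS 4096  /* hypothetical drum capacity */

/* Word-times spent waiting for address `next` to rotate under the
   read head, given the head is currently at `pos`. Placing `next`
   just ahead of `pos` costs almost nothing; placing it just behind
   costs nearly a full revolution -- the "cache miss" above. */
unsigned wait_words(unsigned pos, unsigned next) {
    return (next + DRUM_WORDS - pos) % DRUM_WORDS;
}
```

Optimizing a program then amounts to choosing each `next` so that `wait_words` stays small along the hot path.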

In any case, it seems that while Mel wrote lots of code for the LGP-30, the actual hack in the story involved code that Mel was porting from LGP-30 to RPC-4000.


Sounds familiar. One of my first classes as an undergrad started with boolean logic and boolean gates, then IIRC (it's been a minute) assembly for a theoretical processor.


I had a class (in the late 90s/00s) where we had to code for a theoretical RISC-ish CPU, but the teacher had built a simulator for it which allowed you to inspect registers and flags, execute instructions step by step, etc.

It was quite entertaining.


Same here. We used LMC [1] before we moved to a real architecture.

[1] https://en.wikipedia.org/wiki/Little_Man_Computer
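For reference, the LMC is small enough that a complete fetch-execute loop fits in a few lines. A sketch, with simplifications: a single input value, the last OUT value returned, and no error checking.

```c
/* Little Man Computer: each memory cell holds a 3-digit decimal
   instruction -- the hundreds digit is the opcode, the rest the
   address. Runs until HLT (000); returns the last OUT, or -1. */
int lmc_run(int mem[100], int input) {
    int acc = 0, pc = 0, out = -1;
    for (;;) {
        int insn = mem[pc++];
        int op = insn / 100, addr = insn % 100;
        switch (op) {
        case 1: acc += mem[addr]; break;         /* ADD */
        case 2: acc -= mem[addr]; break;         /* SUB */
        case 3: mem[addr] = acc;  break;         /* STA */
        case 5: acc = mem[addr];  break;         /* LDA */
        case 6: pc = addr;        break;         /* BRA */
        case 7: if (acc == 0) pc = addr; break;  /* BRZ */
        case 8: if (acc >= 0) pc = addr; break;  /* BRP */
        case 9: if (addr == 1) acc = input;      /* INP (901) */
                else out = acc;  break;          /* OUT (902) */
        default: return out;                     /* HLT (000) */
        }
    }
}
```

For example, the program {901, 105, 902, 0, 0, 42} reads an input, adds the constant 42 stored at cell 5, and outputs the sum.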


In my case, in digital circuits lectures we did the whole thing: starting with boolean logic and gate implementation, using stuff like SPICE, we eventually designed our own toy CPU. The actual implementation on a breadboard was left as an optional exercise, which in my year I think no one did.


But will your grandchild be able to read handwriting?


I’ve already used a computer to interpret old handwriting.


1979 called, and they want their "Intel Magnetics 7110" one megabit bubble memory chips back. At the time, it seemed that bubble memory would supplant disk, tape, and even core memory (RAM to you). Maybe memristors will happen.


There’s a strong connection between President Lincoln and log cabins. He grew up in a series of log cabins, and this fact was widely known during his campaign.


Perhaps strong no-alcohol-in-public laws are related to weak no-guns-in-public laws.


you could open carry in California until about 2010ish


No, they always had legit California "temporary plates" for the allowable (at the time) 6 months. They were very ticketable; his motivation was to keep his car relatively anonymous when driving around. Source: Me, living near his house and walking by regularly.


This article has a picture of one of Steve Jobs' actual cars with no plates at all (temporary or otherwise). It explicitly talks about a "new" requirement for new cars to be issued temp plates. Before that, brand new cars from the dealer had a 6-month grace period.

> "From 2019, California joins most of the other states in the nation by requiring newly bought cars to be issued temporary license plates."

https://arstechnica.com/cars/2016/07/steve-jobs-loophole-clo...


Right, not a plate: the 6 month temporary operating permit was taped inside the windshield, not on the back of the car, but was still ticketable. On the other hand, the car pictured is from after his death; all of his were black.


Thank you for providing the sources; this is probably the article I read and had a vague recollection of.


Oh, ty, that does make much more sense than my absurd simplification.

(Note that they owned the parking, so it's moot if they parked in a reserved spot on their own private property.)

I guess not being locatable by the press/random people is a nice plus if you can afford it.

But didn't he always buy the same model?


You were actually right. See sibling comment to yours.


Yes, I didn't remember, but that's probably the article I had read years ago and was thinking about!

I guess we were both right, at different points in time ;)


> note that they owned the parking, so it's moot if they parked in a reserved spot on private property of theirs

Eh, pedantry, but you'll find that building and occupancy codes dictate a certain number of disabled parking spots. You could argue that a spot that is ostensibly this, but that "everyone knows" is Steve Jobs' spot, is not really a disabled parking spot.

(But yes, odds of the City of Cupertino taking any issue with this whatsoever are entirely zero.)


DVI isn’t suitable, as you’d still have to intuit where the paragraph- and even word-breaks are; what’s body text vs. headers/footers, sidebars, captions, etc.; never mind which math expression a particular jumble of characters and rules came from.


Similarly, “hacker” used to be positive, until the public at large got ahold of it.


They did. But that was back in the 2000s, when nobody really understood the nuance. Today, calling someone a “hacker” to mean “computer criminal” almost feels like a boomer move. We’ve got way better language now: white hat, black hat, script kiddie, scammer (and all its lovely subgenres—pig butchering, refund scammers), phisher, etc. Not to mention whatever we’re calling the folks running dark net markets these days.

And while the general public might not know the fine distinctions between these, I think society does get that there’s a whole spectrum of actors now. That wasn’t true in 2000—the landscape of online crime (and white hat work) hadn’t evolved yet.

Honestly, I’m just glad the debate’s over. “Cracker” always sounded goofy, and RMS pushing it felt like peak pedantry… par for the course.

That said, this whole “vibe coding” thing feels like we’re at the beginning of a similar arc. It’s a broad, fuzzy label right now, and the LLM landscape hasn’t had time to split and specialize yet. Eventually, I predict, we’ll get more precise terms for all the ways people build software with LLMs — not just describing the process, but the people behind the scenes too.

I mean, perhaps the term “script kiddie” will get a second life?

