Ask HN: Why is least significant bit first a thing?
3 points by raasdnil on Feb 13, 2023 | 6 comments
I've had a long desire to get into basic electronics, and recently got the whole Ben Eater breadboard kit to get going (awesome, by the way). But I can't help but wonder: why is there such a thing as least significant bit first? It seems that it just makes it that little bit harder for humans to decode, with no real upside, as surely the computer doesn't care? I took a bit of a look around and couldn't find anything definitive as to the origin story, so I thought some of the boffins on HN might actually _know_.
Historically, order is a little more interesting. Early computers conveniently had instructions, registers, and memory all exactly the same width (perhaps 27, 36, or 60 bits), so MSB/LSB ordering was not a thing. But shortly after the “dawn of time”, computers started to provide double-, triple-, and quad-width data and instructions. Memory was expensive, and if most data/instructions fitted into X bits (maybe 12, 32, or 36 bits), that’s what you used. But some data/instructions needed more space, hence “double” floats, “long” ints, and complex instructions. Now designers (and programmers) needed to consider which part of the data/instruction sits at which address, and order became a thing.
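You can see the two address orderings directly with Python's `struct` module (just an illustration; any language that exposes raw bytes would do): packing the same 32-bit value both ways shows which byte lands at the lowest address.

```python
import struct

value = 0x0A0B0C0D  # a 32-bit value occupying four bytes of memory

little = struct.pack("<I", value)  # little-endian: least significant byte at the lowest address
big = struct.pack(">I", value)     # big-endian: most significant byte at the lowest address

print(little.hex())  # 0d0c0b0a
print(big.hex())     # 0a0b0c0d
```

Same value, same four bytes, just stored at addresses in opposite orders — which is exactly the design decision that became contentious once data outgrew a single word.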
Each designer found advantages to a specific order. They tended to be small advantages and depended on their previous design decisions, but the tech was being pushed to the limit, so small advantages counted. For example, if you store instructions with the opcode at the “start”, then decoding can happen while the normal increment-PC mechanism fetches the rest of the instruction. That saves having extra logic to calculate the address of the rest of the instruction: a few gates here, a register there.
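To make the opcode-first advantage concrete, here is a minimal fetch/decode sketch for a hypothetical byte-coded ISA (the opcodes and operand counts are invented for illustration). Because the opcode comes first, decoding it immediately tells the machine how many more bytes to fetch, using nothing but the ordinary increment-PC mechanism.

```python
# Hypothetical opcode -> operand-byte-count table (invented for illustration).
OPERAND_BYTES = {0x01: 0, 0x02: 1, 0x03: 2}

def fetch_instruction(memory, pc):
    """Fetch one variable-length instruction starting at pc."""
    opcode = memory[pc]               # the first byte at PC is the opcode
    pc += 1                           # normal PC increment
    n = OPERAND_BYTES[opcode]         # decoding the opcode reveals the length...
    operands = memory[pc:pc + n]      # ...so the rest is fetched by just incrementing PC
    return opcode, operands, pc + n

# A tiny "program": a 2-operand instruction followed by a 0-operand one.
mem = bytes([0x03, 0xAA, 0xBB, 0x01])
op, args, pc = fetch_instruction(mem, 0)
print(hex(op), args.hex(), pc)  # 0x3 aabb 3
```

If the opcode instead came last, the machine would have to know the instruction's length before it could find the opcode that encodes that length, which costs extra logic to resolve.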
Computer history is marvellous – I’d recommend Tanenbaum’s “Structured Computer Organisation” and Siewiorek’s “Computer Structures”.