I think figure 1 illustrates a common misunderstanding. People who object to little-endian are often imagining it in their heads as order (3): they imagine that the bits of each byte are ordered most-significant to least-significant (big-endian), but then for some reason the bytes are ordered the opposite, least-significant to most-significant (little-endian). That indeed would make no sense.
But because most architectures don't provide any way to address individual bits, only bytes, it's entirely up to the observer to decide in which order they want to imagine the bits. When using little-endian, you imagine that the bits are in little-endian order, to be consistent with the bytes, and then everything is nice and consistent.
> When using little-endian, you imagine that the bits are in little-endian order, to be consistent with the bytes, and then everything is nice and consistent.
But isn't that kind of at odds with how shifting works? (i.e. that a left shift moves towards the "bigger" bits and a right shift moves toward the "smaller" ones.) Perhaps for a Hebrew or Arabic speaker this all works out nicely, but for those of us accustomed to progressing from left to right it seems a bit backwards...