I recently built a cross-compiler for a used Kindle 3 I got from a friend. Its kernel is from 2008, but you can still read books with it just fine. I was thinking about turning it into a mobile terminal I can use in direct sunlight (e-ink is rad), and using precompiled hacks from the MobileRead forums, I got SSH on the device pretty quickly.
But as soon as I wanted to compile my own binaries, things weren't so easy anymore. The glibc on there is too ancient for modern cross-compilers (like the one installed by apt). It can't be replaced because the kernel is too ancient for a modern glibc. Replacing the kernel would probably fail due to lack of compatible drivers.
The sources for the original cross-compiler, the kernel and various libraries were actually quite easy to find on Amazon's website, but they were incredibly difficult to build. Because that GCC used some GNU C extensions that changed semantics in modern GCCs, it didn't even compile without patching the source. The build step for glibc involved filtering the assembly produced by the compiler through a sed script, which of course silently failed when the newer compiler produced different output.
All in all, it took me about two weeks of experimentation until I had a working cross-toolchain.
I did something similar for the Kindle 4, and it wasn't that hard; I just had to patch a few of the makefiles and I got GCC, but I was then unable to build GCC itself to run on the Kindle.
No. I mean: Teach me how to build a compiler binary, and I'll be able to build pretty much whatever binaries I need whenever I need them for a lifetime. (What's the point of having a compiler at all if you're not going to build your own binaries?)
> the "fish" are free
That's true. But if you know how to build them you can make new kinds of fish. That can come in handy if you don't like the free fish.
> (What's the point of having a compiler at all if you're not going to build your own binaries?)
By 1) minimising the number of times you have to compile things yourself, and 2) making it easy to compile when you do (ideally easier than autoconf/automake, though even those were designed with that goal in mind).
> That's true. But if you know how to build them you can make new kinds of fish. That can come in handy if you don't like the free fish.
It's not difficult to learn how to make these new fish, but learn it when it's the best or required (i.e. only) solution.
A better analogy is that you have a hammer at your disposal, but you refuse to use it because learning to use your feet instead seems like useful practice. It's not; it's only useful on the occasions when you don't have a hammer (which you usually have around). Instead, try to keep your hammer with you.
I found cross-compiling makes linking against weird stuff hard, for example the Raspberry Pi's proprietary Broadcom libs. You can chroot directly into a Raspberry Pi image using a simple script with qemu-user-static, and that guarantees 100% linking compatibility.
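A minimal sketch of that chroot setup, assuming a Debian-style x86 host with the qemu-user-static package and binfmt_misc support installed. The image filename and the partition start sector are placeholders; read the real start sector from the fdisk output for your image:

```shell
# Hypothetical image name and mount point.
IMG=raspios-armhf.img
MNT=/mnt/pi

# Find the start sector of the root partition, then mount at
# that byte offset (start sector * 512-byte sectors).
fdisk -l "$IMG"    # note the start sector of the Linux partition
sudo mount -o loop,offset=$((532480 * 512)) "$IMG" "$MNT"

# Copy in qemu's user-mode ARM emulator so the image's own
# binaries can run on the x86 host via binfmt_misc.
sudo cp /usr/bin/qemu-arm-static "$MNT/usr/bin/"

# A shell "inside" the image: gcc run here links against the
# image's own glibc and Broadcom libraries, so there is no
# cross-toolchain version mismatch to fight.
sudo chroot "$MNT" /bin/bash
```

The trick is that binfmt_misc transparently routes the ARM ELF binaries inside the chroot through qemu-arm-static, so the toolchain behaves as if it were running natively on the Pi, just slower.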
The article mentions the need to account for a different version of glibc in Raspbian. Does this solution also solve this problem, or do you need to also use the same version of Debian on your main machine?
Apt, dnf, or some other package manager (like yum or pacman) comes preconfigured and preinstalled on all but the most esoteric Linux distros, so unless you're doing something really unusual, if you're fighting with dependencies I'd assume something has gone wrong.
Which is to say, it is certainly still possible to fall into dependency hell in this day and age, but the parent response betrays a certain fear of installing things on Linux because of dependencies. I hear Windows still has this problem decades later, and while Linux has its own share of foibles, installing pre-packaged software is generally not the exciting/frustrating part. (Which Docker has made even easier!)
A recent example for me: I wanted to install some software on an embedded Linux that runs opkg and doesn't have apt. All of the instructions for the thing I wanted to build just point to apt installation commands. They do provide instructions to build from source, but then I'm immediately in dependency hell: it isn't clear at all how to get the dependencies installed, much less installed properly, and they in turn come with their own dependency-hell instructions.
And as a developer who primarily develops on Windows: Windows itself doesn't really have any "dependency" problem. It's all the people who basically write their software on and for Linux and then haphazardly try to make it work on Windows through a series of hacks while claiming Windows support.
When you come across maintainers who take both Linux and Windows seriously, you just run an installer and get on with your life.
opkg itself looks like a competent package manager. If the software selection in the repositories for the device you're working with is any good, it might take some work to find equivalent packages, but opkg should still handle dependency resolution for you.
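For what it's worth, the basic opkg workflow mirrors apt's. The package name below is just an example; what's actually installable depends on the feeds configured on the device:

```shell
opkg update                       # refresh package lists from the configured feeds
opkg list | grep -i ssh           # search the feeds for an equivalent package
opkg install openssh-sftp-server  # opkg resolves and installs dependencies itself
```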
Then again, the more esoteric the platform, or the less frequently a particular piece of software is wanted on that platform, the more friction you'll have to overcome.
Find a Windows for MIPS (or whatever that embedded Linux is running on) and tell me that getting equivalent software running is easier.