Crossdev (or alternatives like crosstool-ng) may work, but only if you stick to the most well-tested build and target combinations. Otherwise there are O(ludicrous) possible configurations, most of which are untested.
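For the well-trodden cases the happy path on Gentoo is short. A minimal sketch (the target tuple below is just one example of a well-tested combination):

    emerge sys-devel/crossdev                    # install the crossdev tool
    crossdev --target arm-unknown-linux-gnueabi  # build binutils, gcc, libc for the tuple
    arm-unknown-linux-gnueabi-gcc -o hello hello.c   # cross tools are prefixed with the tuple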
I recently built a cross-compiler for a used Kindle 3 I got from a friend. Its kernel is from 2008, but you can still read books with it just fine. I was thinking about turning it into a mobile terminal I can use in full sunlight (e-ink is rad), and using precompiled hacks from the MobileRead forums, I got SSH onto the device pretty quickly.
But as soon as I wanted to compile my own binaries, things weren't so easy anymore. The glibc on there is too ancient for modern cross-compilers (like the one installed by apt). It can't be replaced because the kernel is too ancient for a modern glibc. Replacing the kernel would probably fail due to lack of compatible drivers.
The sources for the original cross-compiler, the kernel, and various libraries were actually quite easy to find on Amazon's website, but they were incredibly difficult to build. Because that GCC relied on GNU C extensions whose semantics have changed in modern GCC, it didn't even compile without patching the source. The build step for glibc involved filtering the assembly produced by the compiler through a sed script, which of course silently failed when the newer compiler produced different output.
All in all, it took me about two weeks of experimentation until I had a working cross-toolchain.
I did something similar for the Kindle 4, and it wasn't that hard: I just had to patch a few of the makefiles to get GCC built, but I was then unable to build a GCC that runs on the Kindle itself.
No. I mean: Teach me how to build a compiler binary, and I'll be able to build pretty much whatever binaries I need whenever I need them for a lifetime. (What's the point of having a compiler at all if you're not going to build your own binaries?)
> the "fish" are free
That's true. But if you know how to build them you can make new kinds of fish. That can come in handy if you don't like the free fish.
> (What's the point of having a compiler at all if you're not going to build your own binaries?)
By 1) minimising the number of times you need to compile things yourself, and 2) making it easy to compile when you do (e.g. easier than autoconf/automake, though even those were designed with that goal in mind).
> That's true. But if you know how to build them you can make new kinds of fish. That can come in handy if you don't like the free fish.
It's not difficult to learn how to make these new fish, but learn it when it's the best or required (i.e. the only) solution.
A better analogy is that you have a hammer at your disposal, but you refuse to use it because learning to use your feet instead is useful practice. It's not; it's only useful the one time you don't have a hammer (and you usually have one around). Instead, try to keep your hammer with you.
I found that cross-compiling makes it hard to link against weird stuff, for example the Raspberry Pi's proprietary Broadcom libs. You can chroot directly into a Raspberry Pi image using a simple script with qemu-user-static, and that guarantees 100% linking compatibility.
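Something like this is the core of that script. A minimal sketch, assuming a Raspbian image file on a Debian-ish host; the partition offset is image-specific (find the root partition's start sector with fdisk -l and multiply by 512):

    sudo apt install qemu-user-static              # ARM emulation + binfmt hooks
    sudo mount -o loop,offset=$OFFSET raspbian.img /mnt/pi
    sudo cp /usr/bin/qemu-arm-static /mnt/pi/usr/bin/
    sudo chroot /mnt/pi /bin/bash                  # ARM binaries now run under qemu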
The article mentions the need to account for a different version of glibc in Raspbian. Does this solution also solve this problem, or do you need to also use the same version of Debian on your main machine?
apt or dnf or some other package manager (like yum or pacman) comes preconfigured and preinstalled on all but the most esoteric Linux distros, so unless you're doing something really unusual, if you're fighting with dependencies I would assume something has gone wrong.
Which is to say, it's still certainly possible to fall into dependency hell in this day and age, but the parent response betrays a certain fear of installing things on Linux because of dependencies. I hear Windows still has this problem, decades later, and while Linux has its own share of foibles, installing pre-packaged software is generally not the exciting/frustrating part. (Which Docker has made even easier!)
A recent example for me: I want to install some stuff on an embedded Linux that runs opkg and doesn't have apt. All of the instructions for the thing I want to build just point to apt installation commands. They do provide instructions to build from source, but then I'm immediately in dependency hell: it isn't clear at all how to get all the dependencies installed, much less installed properly, as they each come with their own dependency-hell instructions.
And as a developer who primarily develops on Windows, Windows itself doesn't really have any "dependency" problem. It's all the people who write their software on and for Linux and then haphazardly try to make it work on Windows through a series of hacks while claiming Windows support.
When you come across maintainers who take both Linux and Windows seriously, you just run an installer and get on with your life.
opkg itself looks like a competent package manager. If the software selection in the repositories for the device you're working with is any good, it might take some work to find equivalent packages, but opkg should still handle dependency resolution.
Then again, the more esoteric the platform, or the less frequently a particular piece of software is wanted on that platform, the more friction you'll have to overcome.
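The basic flow looks like this. A minimal sketch; the package name is just an example, since names vary between feeds:

    opkg update                  # refresh the configured package feeds
    opkg list | grep -i ssl      # look for an equivalent package
    opkg install libopenssl      # opkg pulls in the dependencies itself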
Find a Windows for MIPS (or whatever that embedded Linux is running on) and tell me that getting equivalent software running is easier.
I found the approach taken by Yocto quite powerful: it can be used not only to compile the kernel and all packages for the target system but also to build a matching SDK (a cross-compiler with matching libraries).
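A minimal sketch, assuming a Yocto build environment is already set up and core-image-minimal is your image:

    bitbake core-image-minimal                    # build the target image
    bitbake core-image-minimal -c populate_sdk    # emit a matching cross-SDK installer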
I really wish I had read this a year ago when I was struggling with cross-compiling Qt for the Raspberry Pi and ended up building it natively instead, which took 3 days.
Well, I've actually spent 3 days trying to get Qt5 to cross-compile for a custom ARM platform. It might be easier for something common like the Raspberry Pi, but still, you may have dodged a bullet there. ;)
The worst thing about cross-compiling Qt is that I always run into different issues with new releases, and it's not even always consistent between builds. I ended up renting a 64 core VPS to build Qt so that I could iterate faster through build failures, but it's still at least half a day's work to get it to build properly.
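For reference, the shape of a Qt5 cross-configure for a Pi looks roughly like this. A minimal sketch; the device mkspec, toolchain prefix, and paths are assumptions that depend on your Qt version and toolchain:

    ./configure -device linux-rasp-pi3-g++ \
        -device-option CROSS_COMPILE=$HOME/raspi/tools/bin/arm-linux-gnueabihf- \
        -sysroot $HOME/raspi/sysroot \
        -opensource -confirm-license -release \
        -prefix /usr/local/qt5pi
    make -j$(nproc) && make install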
I hear ya, especially with respect to upgrades. Like, without fail, the MySQL plugin will always be missing, QtWebSockets won't build automatically, and even now on Ubuntu 18.04, which ships with OpenSSL 1.1.0, I discovered all my applications could no longer use HTTPS properly until I downgraded libssl to 1.0.2.
Also: I'm assuming the 64 core machine is ARM? Which provider do you use for that?
I'm cross-compiling, meaning I'm not compiling on ARM. ;)
It might be helpful for 64-bit ARM though, but my target architecture is 32-bit, and because I need QtWebEngine I suspect it will not compile on a 32-bit platform. The problem is that linking Chromium/WebEngine requires more memory than can be addressed in a 32-bit address space.
There's also crosstool-ng and Buildroot for this, as well as pre-built compilers from Linaro and Bootlin (formerly Free Electrons). Sorry for the lack of links, extremely limited connection right now.
Damn GCC, which you have to build separately for each single architecture and OS combo. Clang, being intrinsically a cross-compiler, is much better and far less of a hassle.
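A single clang binary selects the target via flags. A minimal sketch; the triple and sysroot path are assumptions, and you still need target headers and libraries in the sysroot:

    clang --target=armv7-linux-gnueabihf \
          --sysroot=$HOME/raspi/sysroot \
          -fuse-ld=lld \
          -o hello hello.c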
I'm actively trying to replace GCC+binutils with LLVM-based tools in Android (and beyond that, at Google). Currently working through some issues for the kernel, more with LLVM's assembler and linker than with Clang.
Building a cross-compiling toolchain is a good exercise, but if you need a whole suite of tools/libs, check out LLNL's Spack. Spack automates building compilers/toolchains/libs from source, has hundreds of package definitions, and supports cross-compiling. Kinda like Gentoo, but usable on any Linux/Mac platform.
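The basic flow is along these lines. A minimal sketch; the GCC version and zlib are just examples:

    spack install gcc@8.2.0                             # build a compiler from source
    spack compiler find $(spack location -i gcc@8.2.0)  # register it with spack
    spack install zlib %gcc@8.2.0                       # build a package with that compiler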
Not all software supports being cross-compiled. I can't remember if the problems I had with GStreamer were because it couldn't be cross-compiled or because I was trying to build Raspbian-compatible .deb files.
I've built a GCC-based cross toolchain for m68k (m68k-linux-gnu, IIRC) and it was very easy with crosstool-ng (ct-ng).
I would recommend building a cross-toolchain by hand a couple of times to understand the whole process, but after that just use ct-ng or something similar: it gets the job done and it's very configurable (ABI, target architecture, libraries, etc.).
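The usual ct-ng flow, as a minimal sketch (the sample name is just an example):

    ct-ng list-samples                 # show known-good configurations
    ct-ng arm-unknown-linux-gnueabi    # start from one of the samples
    ct-ng menuconfig                   # tweak ABI, libc, kernel headers, ...
    ct-ng build                        # toolchain lands under ~/x-tools by default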
There's also the Arch Linux ARM way, where you drive the build and linking from the target device, but the compilation of object code is done by a cross-compiler over the network (via distcc).
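A minimal sketch of that distcc setup; the addresses and paths are assumptions, and the host's cross gcc must be arranged (e.g. via a masquerade directory) so that distccd invokes it when the target asks for gcc:

    # on the fast x86 box, with the cross toolchain first in distccd's PATH
    distccd --daemon --allow 192.168.1.0/24
    # on the ARM target
    export DISTCC_HOSTS=192.168.1.10
    make -j8 CC="distcc gcc"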
* https://wiki.gentoo.org/wiki/Crossdev
* https://wiki.gentoo.org/wiki/Embedded_Handbook/General/Creat...
* https://wiki.gentoo.org/wiki/Raspberry_Pi#Crossdev
* https://wiki.gentoo.org/wiki/Mingw