
Why bother?

  apt install gcc-arm-linux-gnueabihf # Debian / Ubuntu
  dnf install gcc-arm-linux-gnu # Fedora
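Once installed, the cross-compiler is just a prefixed gcc. A minimal sketch (the triplet prefix depends on the package: arm-linux-gnueabihf- on Debian/Ubuntu, arm-linux-gnu- on Fedora; hello.c is a placeholder):

  printf '#include <stdio.h>\nint main(void){puts("hello from ARM");return 0;}\n' > hello.c
  arm-linux-gnueabihf-gcc -o hello hello.c   # Fedora's package installs arm-linux-gnu-gcc instead
  file hello                                 # should report a 32-bit ARM ELF, not x86-64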



I recently built a cross-compiler for a used Kindle 3 I got from a friend. Its kernel is from 2008, but you can still read books with it just fine. I was thinking about turning it into a mobile terminal I can use in full sunshine (e-ink is rad) and using precompiled hacks from the mobilereads forums, I got ssh on the device pretty quickly.

But as soon as I wanted to compile my own binaries, things weren't so easy anymore. The glibc on there is too ancient for modern cross-compilers (like the one installed by apt). It can't be replaced because the kernel is too ancient for a modern glibc. Replacing the kernel would probably fail due to lack of compatible drivers.

The sources for the original cross-compiler, the kernel and various libraries were actually quite easy to find on Amazon's website, but they were incredibly difficult to build. Because that GCC used some GNU C extensions that changed semantics in modern GCCs, it didn't even compile without patching the source. The build step for glibc involved filtering the assembly produced by the compiler through a sed script, which of course silently failed when the newer compiler produced different output.

All in all, it took me about two weeks of experimentation until I had a working cross-toolchain.


Did you put this up somewhere?

There are probably at least a couple of other people who might find it useful.


I did something similar for the Kindle 4, and it wasn't that hard: I just had to patch a few of the makefiles and I got GCC. But I was unable to then build that to run on the Kindle itself.


Wouldn't this madness be solved by documenting the versions of the various required tools?


Give me a binary and I hack for a day. Teach me to build a binary and I code for a lifetime.

(Just in case you didn't get the allusion: https://quoteinvestigator.com/2015/08/28/fish/)


You mean:

Teach me how to build a compiler binary, and I'll be building compiler binaries for a lifetime.

which is somewhat less inspirational. Even less so when replacing "compiler binary" by "other people's code".

I think the main problem with your analogy is that in IT, the "fish" are free.


No. I mean: Teach me how to build a compiler binary, and I'll be able to build pretty much whatever binaries I need whenever I need them for a lifetime. (What's the point of having a compiler at all if you're not going to build your own binaries?)

> the "fish" are free

That's true. But if you know how to build them you can make new kinds of fish. That can come in handy if you don't like the free fish.


> (What's the point of having a compiler at all if you're not going to build your own binaries?)

By 1) minimising the number of times you need to compile things yourself, and 2) making it easy to compile yourself when you do (e.g. easier than autoconf/automake, though even those are examples of tools with that goal in mind).

> That's true. But if you know how to build them you can make new kinds of fish. That can come in handy if you don't like the free fish.

It's not difficult to learn how to make these new fish, but learn it when it's the best or required (i.e. only) solution.

A better analogy is that you have a hammer at your disposal, but you refuse to use it because learning to use your feet instead is supposedly useful practice. It's not; it's only useful on the occasions you don't have a hammer around (which you usually do). Instead, try to keep your hammer with you.


Spending all day on a fishing boat when all you want is a fish is an apt analogy to building GNU cross toolchains for embedded targets.


Indeed so. And fishing is not for everyone. But some people find value in it.


So more like teach me how to build a fishing rod?


No. That would be teaching you how to write a compiler.


Yeah but if you are downloading a compiler, then that sort of implies that you know how to build binaries :)


Sure, if you count knowing how to type "make" as "knowing how to build binaries."


Yes, because every project uses the same build tools, right?


Give me a binary and I run it. Teach me to build the binary and I am fixing bugs for a lifetime.


Better than just living with bugs for a lifetime IMHO.


"give a man a lump of sugru..."


Unrelated variant: Give a man fire and he'll be warm for a night. Set a man on fire and he'll be warm the rest of his life.

;)


lol you explained the fishing proverb


I found that cross-compiling makes linking against weird stuff hard, for example the Raspberry Pi's proprietary Broadcom libs. You can chroot directly into a Raspberry Pi image using a simple script with qemu-user-static, and that guarantees 100% linking compatibility.
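For anyone who hasn't done it, a rough sketch of that approach (assumes qemu-user-static is installed, the image file is called raspios.img, and the usual two-partition Pi layout; those names are placeholders):

  LOOP=$(sudo losetup --find --show -P raspios.img)        # expose the image's partitions
  sudo mount "${LOOP}p2" /mnt                              # root filesystem
  sudo mount "${LOOP}p1" /mnt/boot                         # boot partition
  sudo cp /usr/bin/qemu-arm-static /mnt/usr/bin/           # ARM emulation inside the chroot
  sudo chroot /mnt /bin/bash                               # build "natively" against the image's own libs
  sudo umount /mnt/boot /mnt && sudo losetup -d "$LOOP"    # after exiting the chroot

Everything you link against comes straight out of the image, so the Broadcom blobs and the exact glibc are whatever the Pi itself would have.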


I found the same for anything remotely complicated. I prefer just using a VM.


If you need configuration options different from what the packages provide, or a version newer than the packages provide.


The article mentions the need to account for a different version of glibc in Raspbian. Does this solution solve that problem too, or do you also need to use the same version of Debian on your main machine?


From my experience, compiling on a machine with an older version of Linux/glibc should work fine. The other way around, not so much.
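The asymmetry comes from glibc's versioned symbols: a binary records the symbol versions it was linked against, and a newer glibc keeps the old versions around, but an older glibc doesn't have the new ones. You can check what a binary actually needs (./mybinary is a placeholder):

  objdump -T ./mybinary | grep -o 'GLIBC_[0-9.]*' | sort -uV
  # the binary runs anywhere the installed glibc is at least the newest version listed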


Now I need to install apt and whatever dnf is and whatever dependencies they have and so on...


Do people still fight with dependencies?

Apt or dnf or some other package manager (like yum or pacman) comes preconfigured and preinstalled on all but the most esoteric Linux distros, so unless you're doing something really unusual, if you're fighting with dependencies I would assume something's gone wrong.

Which is to say, it's still certainly possible to fall into dependency hell in this day and age, but the parent response betrays a certain fear of installing things on Linux because of dependencies. I hear Windows still has this problem, decades later, and while Linux has its own share of foibles, installing pre-packaged software is generally not the exciting/frustrating part. (Which Docker has made even easier!)


A recent example for me: I wanted to install some stuff on an embedded Linux that runs opkg and doesn't have apt. All of the instructions for the thing I want to build just point to apt installation commands. They do provide instructions to build from source, but then I'm immediately in dependency hell. It isn't clear at all how to get all of the dependencies installed, much less installed properly, as they each have their own dependency-hell instructions.

And as a developer who primarily develops on Windows: Windows itself doesn't really have a "dependency" problem. It's all the people who basically write their software on and for Linux and then haphazardly try to make it work on Windows through a series of hacks while claiming to support Windows.

When you come across maintainers who take both Linux and Windows seriously, you just run an installer and get on with your life.


opkg itself looks like a competent package manager. If the software selection of the repositories for the device you were working with is any good, then it might take some work to find equivalent packages, but opkg should still handle resolution of the dependencies.

Then again, the more esoteric the platform, or the less frequently a particular piece of software is wanted on that platform, the more friction you'll have to overcome.

Find a Windows for MIPS (or whatever that embedded Linux is running on) and tell me that getting equivalent software running is easier.


Uh, why would you install a system package manager when you already have one?


Exactly, why bother?



