I played a ton of EV and EVO back in the day and still have them on a Basilisk II VM, but Endless Sky has really captured the spirit of EV (because everyone who made it also loved EV) and offered it up in a modern incarnation.
Endless Sky seemed like a nice refresh when I tried it a few years ago, did a few initial quest lines.
It was interesting how the different faction technologies had different power/mass/volume/hardpoint production and consumption ratios, so there was a real nudge toward sourcing all your tech from one faction, gently discouraging min-maxing a build with a Frankenstein of gear from the far-flung corners of the galaxy. At least that was my recollection.
Well, if it was that way, it certainly isn't any longer. They keep adding new alien races, storylines, and sectors of the galaxy, and some of my best ships were franken monsters with tech from a half dozen races. There is a core storyline that is primarily human, but if you play to the end of that, you will suddenly discover that there is significantly more around you that becomes reachable with some interesting new technology (trying not to spoil anything).
They have done a good job balancing the numbers so that everything requires some tradeoffs. More species/tech gives more choices and interesting variability. Some species make very efficient drives; others make inefficient, more powerful drives that produce tons of excess heat, but those folks also make good passive cooling; others make great active cooling that is power hungry; etc. The ship hulls tend to match the sizes of the drives and weapon hardpoints of that race, but often work much better when outfitted with different kit (perhaps with some wasted space).
Plus the fleet management is pretty good. You fly your flagship, but you can park ships and switch your flag to different ones. So I might fly a fast little scoutship, then switch to an armor-clad behemoth surrounded by 20 of my heavy-hitting henchmen for some different missions.
Why is this in the least surprising? It's just the natural successor to what everyone used to do with the trade magazines thirty years ago. Back then you filled in a profile questionnaire to get a free subscription, so every basement hacker turned into the manager of a 500-person division with control of a $1m capital budget. The magazine didn't want to check because it would damage the demographic numbers that they pitched to advertisers. The advertisers knew that there was some liar's poker being played but everyone just rolled with it.
Thanks for that. We seem to have lost sight of the importance of "commercial biodiversity" in the past 40 or more years of continuous M&A concentration.
Happily, I saw a little discussion of it in 2008, when the advocates of letting the auto companies fail were pushed back by statistics showing how many second- and third-tier suppliers would be destroyed. But the fourth tier, the Shenzhen / radio-alley-type stuff, is still ignored. Very similar to how most companies want to simply hire skills and assume they will magically appear, when in years past companies took an active hand in creating them by offering a career development path in-house.
Perhaps the AI bubble will be viewed in the future as the last gasp of companies that depleted the soil that they grew in and now struggle to survive without anyone that knows how to do the work anymore. Maybe LLMs will be all that remains, our Moai.
It's a shame the same logic wasn't applied in maintaining a healthy root-level auto company ecosystem. Having a single megacorp at the top inevitably makes it too big to fail. On the other hand, if there are dozens of smaller car companies, the failure of any one of them is insignificant to the wider ecosystem.
A company that knows it is too big to fail will inevitably lead to mismanagement. After all, why bother saving for a rainy day when you can count on corporate welfare handouts? Why bother reducing your risks when you can always rely on a bailout? You can never lose, so the obvious thing to do is to bet as big as possible in an attempt to create as much short-term "shareholder value" as possible.
I can't think of a way to have dozens of smaller car companies, all competitive and all viable long term without ongoing and adaptive regulation.
With Chevron doctrine dead and Congress struggling to pass a budget, I can't see how it is possible to have any meaningful regulation in the US in the short to mid term.
Even the mainland China model of pitting provincial and local governments against each other to foster competition might not work in the long term (it is still early days). We already see specialization among provinces, which to me indicates that in the long run there will be a de facto winning provincial government that captures the auto manufacturers.
One week too late for me. Didn't feel like scratch building a new machine and finding a low TDP mobo with a bunch of SATA ports. Wanted to go Synology but dragged my feet for months watching this play out.
In the meantime, I became enamored with the Jonsbo cases and started seeing white-label N100 ITX mobos pop up with a bunch of SATA ports. Eventually I figured out they were Topton when Brian Moses included them (and a Jonsbo case!) in this year's NAS build.
So my parts are arriving in a few days and Synology has lost one potential new customer.
That's exactly how periodicals worked for a hundred years. A letter to the editor that is published in the next edition. That's how I always mentally viewed blogs and expected most niche sites to eventually do that. In fact, a local "neat things" blog was successfully doing that for quite a while.
Multiple nethack ascender here (~50x in 20yrs). I usually play in the traditional November tournament (originally devnull, now tnnt). Never set aside the time to learn any of the other roguelikes mentioned in the article, but wanted to mention that nethack itself is a class of games. Many people have written variants to scratch a particular itch (the article briefly mentions spork, which was one of the very first), and some of those are wildly different from the base game. There is even a separate tournament in June dedicated just to playing as many of the variants as you can (junehack).
That's one of the reasons I loved Ansible from the moment I saw it. As the OP points out, traditionally machines accumulated ad-hoc changes over a long period of time. Describing the "known good" state and running this "checklist" to make sure it is in that state both documents the checklist and evaluates it.
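For what it's worth, a minimal sketch of that "describe the known-good state" idea as an Ansible task list (the host group, package, and service names here are illustrative assumptions, not from the comment above); running it repeatedly is safe because each task only changes what has drifted:

```yaml
# known-good.yml — declarative "checklist" of the desired machine state
- hosts: webservers
  become: true
  tasks:
    - name: nginx is installed
      ansible.builtin.package:
        name: nginx
        state: present

    - name: nginx is enabled and running
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```

Running `ansible-playbook known-good.yml` both applies and documents the state: tasks report "ok" when nothing needed changing, "changed" when they fixed drift.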
Same reason we haven't typed "cc" on the command line to call the C compiler on individual files for about 30 years or more.
The last time I typed (well, pasted) "cc" on the command line to call the C compiler on an individual file was 26 hours ago. I wanted to recompile a single-file program I'd just written with debugging information (-g) and it seemed easier to copy, paste, and edit the command line rather than to manually delete the file and reinvoke make with different CFLAGS.
I mean, I've surely compiled orders of magnitude more C files without typing "cc" on the command line over the last week. But it's actually pretty common for me to tweak options like -mcpu, -m32, -pg, -Os, or -std=c11 -pedantic (not to mention diet cc) by running a "cc" command line directly.
Similarly, I often run Python or JS code in the REPL or in Jupyter rather than putting it in a file. The rapid feedback sometimes helps me learn things faster. (Other times it's an attractive nuisance.)
But I may be a bit of an odd duck. I've designed my own CPU, on paper. I write assembly code for fun. I've implemented several different programming languages for fun. I like to know what's underneath, behind the surface appearances of things. And that requires experimenting with it.
Of course I cc one-file quickie programs all the time. What I am talking about is a whole directory of source files, and just "knowing" which ones are out of date and building the object files manually.
I still remember years ago trying to convince one dev to use make on a package with 20-30 source files.
Running just cc instead of make is actually a much more reasonable thing to do nowadays than it was 10, 20, or 30 years ago.
https://gitlab.com/kragen/bubbleos/-/blob/master/yeso/admu-s... is the entry point to a terminal emulator I wrote, for example. `make -j 8` can build it with GCC from a `make clean` state in 380ms, but if I, for example, `touch admu-shell.c` after a build and run `make -j 8` to run an incremental build, it recompiles and relinks just that one file, which takes 200–250ms. So the incrementality of the build is saving me 230ms–280ms in that case.
Without -j, a nonincremental `make admu-shell` takes about 1100ms.
It takes 900 milliseconds to compile those 1100 lines of C. This is a little bit faster than building from scratch without -j because I'm not compiling the .c files that go into libyeso-xlib.a that admu-shell doesn't use. So all the work of `make` figuring out which files are out of date and building the object files automatically, in parallel across multiple cores, has saved me a grand total of 600–700 milliseconds.
That's something, to be sure; it's a saving† that makes the compilation feel immediate. But it's really pretty minor. 900ms is small enough that it only affects my development experience slightly. If I were to run the build in the background as I was editing, I wouldn't be able to tell if it were incremental or from-scratch.
Unless it screwed up, that is, for example because I didn't bother to set up makedepends, so if I edit a header file or upgrade a system library I might have to do a scratch build anyway. The `make` incremental-build savings doesn't come without a cost, so we have to question whether that cost is worth the benefit. (In this case I think it's worthwhile to use separate source files and `make` for other reasons: most of that source code is used in multiple Yeso programs, and `make -j` also makes a full build from scratch four or five times faster.)
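For reference, a minimal Makefile sketch of this incremental pattern (object list hypothetical, not the real Yeso build); the `-MMD` flag asks GCC/Clang to emit `.d` dependency files so that editing a header also triggers the right recompiles, avoiding the stale-header problem described above:

```make
CFLAGS += -O2 -MMD
OBJS = admu-shell.o util.o   # hypothetical object list

admu-shell: $(OBJS)
	$(CC) -o $@ $(OBJS)

# pull in the compiler-generated .d files listing header dependencies
-include $(OBJS:.o=.d)
```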
If we extrapolate that 700ms saving backward to 25 years ago when our computers ran 500 million instructions per second instead of 30 billion, it's something like 45 seconds, which is enough of a wait to be distracting and maybe make me lose my train of thought. And 5 years further back, it would have taken several minutes. So `make` was an obvious win even for small projects like this at the time, and an absolute necessity for larger ones.
At the time, I was the build engineer on a largish C++ project which in practice took me a week to build, because the build system was kind of broken, and I had to poke at it to fix the problems whenever something got miscompiled. The compiler and linker were writing their output files to an NFS server over shared 10-megabit Ethernet.
As another data point, I just rebuilt the tcl8.6-8.6.13+dfsg Debian package. It took 1m24.514s. Recompiling just generic/tclIO.c (5314 SLOC) takes 1.7 seconds. So not doing a full rebuild of the Tcl library can save you a minute and a half, but 25 years ago (when Tcl 8 already existed) that would have been an hour and a half. If it's the late afternoon, you might as well go home for the day, or swordfight somebody in the hallway or something.
So incremental builds at the time were totally essential. Now they're a dispensable optimization that isn't always worth it.
______
† 1200 lines of C per second is pretty slow, so probably almost all of that is repeatedly lexing the system header files. I'm guessing that if I took the time to do a "unity build" by concatenating all the C files and consolidating the #includes, I could get that time down to basically the same as the incremental build.
Never MUDed, but November (and now June) are my nethack months going back about two decades. Actually started playing about fifteen years prior to that (hack on SunOS 4 machines) but didn’t get good for a while.
https://endless-sky.github.io/