Went to Drexel for CS, but dropped out in my sophomore year back in 2004. Did PHP webdev in my home state of CT until 2011. Moved to the SF Bay Area and transitioned to doing Erlang and C++ for some F2P games for a while. I'm currently a Staff Engineer at Discord focused on AV and other "native" stuff.
So looking back at the Falcon 9, there were only 4 failures to complete orbital objectives across 503 launches, and one of those was only a partial failure (the main payload was delivered successfully, but the secondary payload was not due to a single-engine failure). These failures were not consecutive (4th, 19th, what would have been the 29th and 354th). Now apart from the first launch or two (COTS Demo Flight 1 had some useful payload, but still seemed pretty disposable), these all had real payloads, so they were less experimental than these Starship test flights.
If we compare to the propulsive landing campaign for the Falcon 9 first stage, it's a bit more favorable. The first 8 attempts had 4 failures, 3 controlled splashdowns (no landing planned) and 1 success. In general though, I think it felt like they were making progress on all of these. Similarly, the Falcon 1 had 3 consecutive launch failures before its first success, but launch 2 did much better than launch 1. Launch 3 was a bit of a setback, but had a novel failure mode (residual first-stage thrust resulted in a collision after stage separation).
Starship Block 2 has had 4 consecutive failures that seem to be, on some level, about keeping the propellant where it's supposed to be, with the first 2 failures happening in roughly the same part of the flight and this 4th one happening during pre-launch testing.
This is a small point, but calling the 33-byte unit a sector in CDDA is a bit misleading and probably incorrect for the quantity being labeled. This is a channel data frame and contains 24 bytes of audio data and 1 byte of subcode data (except for the channel data frames that have sync symbols instead), with the rest being error correction. This is the smallest grouping of data in CDDA, but it's not really an individually addressable unit.
98 of these channel data frames make up a timecode frame, which represents 1/75th of a second of audio and has 2352 audio data bytes and 96 subcode bytes (2 frames have sync codes instead), with the remainder being sync and error correction. Timecode frames are addressable (via the timecodes embedded in the subcode data) and are the unit referred to in the TOC. This is probably what's being called a sector here. Notably, a CD-ROM sector corresponds 1:1 with a timecode frame.
Note: Red Book actually just confusingly calls both of these things frames and does not use the terms "channel data frame" or "timecode frame".
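For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope calculation (a small C sketch using only the figures above plus the standard 44.1 kHz / 16-bit stereo CDDA sample format):

```c
#include <stdio.h>

int main(void) {
    /* One channel data frame carries 24 bytes of audio;
       98 of them make up one timecode frame (1/75th of a second). */
    const int audio_bytes_per_channel_frame = 24;
    const int channel_frames_per_timecode_frame = 98;

    int audio_bytes = channel_frames_per_timecode_frame * audio_bytes_per_channel_frame;
    printf("audio bytes per timecode frame: %d\n", audio_bytes);         /* 2352 */

    /* Cross-check against the raw PCM rate:
       44100 samples/s * 2 channels * 2 bytes per sample / 75 frames/s */
    printf("bytes per 1/75th second of PCM: %d\n", 44100 * 2 * 2 / 75);  /* also 2352 */
    return 0;
}
```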
> The Xbox 360 doubled down on this while the PS3 tried to do clever things with an innovative architecture.
I don't think this is really an accurate description of the 360 hardware. The CPU was much more conventional than the PS3's, but still custom (derived from the PPE in the Cell, but with an extended version of the VMX extension). The GPU was the first to use a unified shader architecture. Unified memory was also fairly novel in the context of a high-performance 3D game machine. The use of eDRAM for the framebuffer is not novel (the Gamecube's Flipper GPU had this previously), but also wasn't something you generally saw in off-the-shelf designs. Meanwhile the PS3 had an actual off-the-shelf GPU.
These days all the consoles have unified shaders and memory, but I think that just speaks to the success of what the 360 pioneered.
Since then, consoles have gotten a lot closer to commodity hardware, of course. They're custom parts (well, except the original Switch I guess), but the changes from the off-the-shelf stuff are a lot smaller.
- one is about the format of symbol information in the actual ELF binaries, which is only an issue if, for some strange reason, you are not using the standard libc functions for looking up symbols
- one is an issue that impacts targeting a lower version of glibc from a higher one, which is a configuration that was never supported (though usually fails more loudly)
- the last one is a security policy change which is legitimately an ABI break, but mostly impacts programs that have their execstack flags set incorrectly
glibc actually goes to a fair bit of effort to be compatible with old binaries, unlike most of the rest of the Linux userspace. The binaries I built for my side project back in 2015 (BlastEm 0.3.0) still work fine on modern Linux, and they dynamically link against glibc. This is just a hobby project, not a piece of professional software, and a build from before this JangaFX company even existed works fine.
I find it really bizarre when people talk about Linux binary compat and then complain entirely about glibc rather than the sort of problems that the manylinux project has had to deal with. glibc is one of the few parts of userspace you can depend on. Yes, setting up your toolchain to build against an old glibc on a modern distro is a bit annoying. Sure, if you do something sufficiently weird you might find yourself outside what glibc considers part of their stable ABI. But from where I sit, it works pretty well.
I think the problem in a nutshell is that it's not trivial(?) to build an executable on a modern Linux distro that links against an old glibc version number (and if it is trivial then it needs to be better communicated).
It is actually quite trivial when building with the Zig toolchain since you can simply append the requested glibc version to the target-triple (e.g. `-target aarch64-linux-gnu.2.xx`), but I think this doesn't work with regular clang or gcc (makes one wonder why not when Zig can pull it off).
Just to add some more context. Zig cc is a wrapper around clang. It can handle cross compiling to specific glibc versions. See https://andrewkelley.me/post/zig-cc-powerful-drop-in-replace...
I imagine it would help with the glibc problems they are talking about. Glibc tries to provide a backwards-compatible ABI.
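As a toy illustration of what targeting a specific glibc means in practice, here's a tiny C program that prints both the glibc version it was compiled against and the one it's actually running on. The zig cc command in the comment is just an example invocation based on the parent comments; the exact version suffix is whatever minimum you want to support:

```c
/* Example build targeting an older glibc via zig cc (per the parent comments);
   the version suffix here is illustrative only:
     zig cc -target x86_64-linux-gnu.2.28 glibc_check.c -o glibc_check */
#include <stdio.h>
#include <gnu/libc-version.h>

int main(void) {
    /* Version baked in at compile time by the glibc headers */
    printf("compiled against glibc %d.%d\n", __GLIBC__, __GLIBC_MINOR__);
    /* Version of the glibc actually loaded at runtime */
    printf("running against glibc %s\n", gnu_get_libc_version());
    return 0;
}
```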
> I think the problem in a nutshell is that it's not trivial(?) to build an executable on a modern Linux distro that links against an old glibc version number (and if it is trivial then it needs to be better communicated).
I wouldn't say it's trivial, but it's not rocket science either. Basically there are two main approaches. One is to just build inside a chroot or container with a sufficiently old distro inside. This is generally the path of least resistance because your build system doesn't really have to have any awareness of what's going on. You just build normally inside the chroot/container. The main downsides with this approach are that it's kind of wasteful (you have a whole distro's filesystem) and that, if you want to use a newer compiler than what the old distro in question shipped with, you generally have to build it yourself inside said chroot/container.
The other main approach is to use a sysroot. gcc and clang both take an optional --sysroot parameter which is an alternative root for header and library lookups. This lets you use a compiler on the normal host, but old headers and libs. You can also bake this parameter in when compiling gcc (and also I assume clang, but less sure there) if you want a dedicated cross-toolchain.
You can ship all of the libraries you use with your executable. This isn't possible to do with glibc. It's the exception, which is why it's talked about the most.
> It's the exception which is why it's talked about the most.
It's definitely not the only exception. libgl is another obvious example since different GPUs need different userland code. I would be surprised if there had never been compat regressions in those.
libgl can be dlopened, glibc can't be. That is the problem. If libgl has some incompatibility, I can try to work around that. If glibc has some incompatibility, my executable won't even launch.
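To make the distinction concrete, here's roughly what that runtime-loading pattern looks like (a minimal sketch; the library and symbol names are just the obvious GL examples). If this load fails the program can handle it, whereas an unsatisfied glibc symbol version stops the dynamic linker before main() ever runs:

```c
#include <dlfcn.h>
#include <stdio.h>

int main(void) {
    /* Load the GL driver at runtime instead of linking it at build time. */
    void *gl = dlopen("libGL.so.1", RTLD_NOW | RTLD_LOCAL);
    if (!gl) {
        /* Recoverable: fall back to a software renderer, show an error, etc. */
        fprintf(stderr, "could not load libGL: %s\n", dlerror());
        return 1;
    }
    /* Look up an entry point by name rather than binding it at link time. */
    void *get_string = dlsym(gl, "glGetString");
    printf("glGetString %s\n", get_string ? "found" : "missing");
    dlclose(gl);
    return 0;
}
```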
> libgl can be dlopened, glibc can't be. That is the problem.
What exactly prevents this for glibc? I assume you'd need a dlopen equivalent from somewhere for bootstrapping, but are there other issues (like TLS or whatnot)?
Yeah, TLS is one reason. I don't remember the details, but last time I looked into it, glibc and the loader have some private interface that they use to load the program correctly, and there are no stability guarantees for this interface.
Ideally, the loader and libdl would ship with the kernel, and glibc would have to use the public interface they expose.
The PEP-600 [0] Rationale section touches on this a bit. The basic problem is that there are things beyond glibc that would be nice to use from the environment for a number of reasons (security updates, avoiding clashes between multiple wheels that depend on the same lib, etc.), but since most libs outside of glibc and libstdc++ don't really have an ABI policy, and the distros don't necessarily have a policy on what libraries are guaranteed to be present, you sort of have to guess and hope for the best. While the initial list in PEP-513 [1] was a pretty good guess, one of the libraries chosen (libcrypt.so.1) got dropped in Fedora 30 and replaced with an ABI-incompatible version. Crypto libraries are an example of something that's actually important to keep up to date, so I find this rather unfortunate.
> By the time you added a monitor and a hard drive to get the system you really wanted, it cost closer to $1,000. At that price, you could get an off-brand PC with a VGA monitor.
So I decided to check some prices in 1992 computer catalogs, and I'm not sure this is really true. Looking at the Tandy catalog[0], you could get a Tandy 1000 RLX with a 10 MHz 286, 512k of RAM, VGA and a monitor for $1K, but it didn't come with a hard drive at that price. The same catalog also lists a Tandy 1000 RL with a 9.5 MHz 8086, 512k of RAM and CGA/Tandy graphics for $500 without a monitor. I think the 600 actually fares pretty well against that machine.
Looking at a Sears catalog from 1992, they have a Packard Bell 386SX with 1MB of RAM and a 40MB HD for $1100, but without a monitor. The cheapest monitor option on that page was another $290.
Obviously, two random catalogs don't give you comprehensive pricing data, but at least from this quick look it does seem like the 600 was reasonably competitive at the $500 price point it was sold at.
Seems to me the bigger problems were higher up the product line. Not having a real response to the rise of VGA (first introduced 1987) until October 1992 was a huge miss.
In Portugal it certainly wasn't, and I am quite sure, given I got a 386SX at 20 MHz (with Turbo!), a 40MB hard drive and an SVGA monitor that year, as an upgrade to the Timex 2068 I had been holding on to since 1986.
It was about 1500 euros (300723 escudos) in 1992's money, which, given our economy, meant paying on credit over three years.
Meanwhile, Amigas were at a quarter of that price, and most folks used the TV anyway; I think I only saw Amigas plugged into monitors at the computer stores.
As an owner of an Amiga 500, I remember watching the 600 flop even though I was mildly interested in upgrading, but he's right that this was the point where a lot of users like me threw in the towel and said "Right, I'm just getting a PC." And I did, and it was probably the best decision I could have made at that time.
For me, Wolfenstein 3D planted the seed and then Doom sealed the deal. Not to mention I was a fan of Sierra games back in the day and the Amiga conversions of the 256-colour VGA games were trash (except KQ6 which I believe wasn’t done by Sierra themselves for the Amiga version). Which was a shame because arguably the Amiga had the best versions of the AGI games (better colour palettes, use of the Tandy 3-voice soundtrack as standard) and great 16-colour SCI conversions (use of sound samples from the MT-32 versions of the soundtracks as far as I can tell looking back).
For arcade-style platform games and shmups, the A600 was competitively priced compared to PCs, at least for a year or so. The biggest problem with the Amiga line in general was that it couldn't feasibly keep up with 486 CPUs and the game complexity they enabled. Consumer-level Amigas were supposed to be cheap and cheerful - that was their selling point. Adding a 68040 (or even a fast 68030) would've pushed the price point into another territory entirely.
From what I remember, at least in Europe, it was only from 1994-1995 that a PC started to be a better proposition than an Amiga/Atari ST.
In order to surpass what you had on an Amiga 500/600 in terms of games, you needed at least a small 386 with VGA.
Before that, the only way to get a configuration at the same price was some Amstrad PC (typically the popular PC1512), far less capable in terms of graphics and music.
Yes, also the lack of peripherals. The Amiga had way better graphics and sound from 1985 to 1987 than almost all PCs without extra peripherals. But as PC compatibles became more established, the range of graphics cards and sound cards grew.
To be honest there's lots of little factual inaccuracies in that piece. The notion that the A600 was "fully software-compatible with the A500" just isn't true.
Similarly, claiming that the SoundBlaster had "22 voice sound" vs the Amiga's 4 voices. Yes, later SBs had a 22 channel FM synth chip, but that's not remotely the same thing!
It would have been more accurate to say that as PCs became faster they became capable of mixing multiple audio samples together before playing them through a stereo (ie. "2 voice") DSP - using the greater CPU power to match (and, eventually, exceed) the Amiga's sound capabilities.
In general, I suspect the author is misunderstanding secondary sources, rather than misremembering first-hand experience.
The Amiga's audio reproduction is an odd beast. Incredibly sophisticated on one hand, then completely hamstrung by the fact that stereo panning has to be done in software. So you either run it as 4 independent mono channels, or you devote lots of CPU time to software panning, which rules that out if anything else CPU-intensive is going on. So you basically can't have stereophonic audio in games.
I agree. The A600 was a cost cut and future-proofing of the A500 with some extra features.
The disappointment was the A1200 and A4000. The A1200 was not enough for gaming, and the A4000 was not enough for workstation use. Both could have been so much better with pretty minor changes.
I remember how we were fuming at the A4000 opting for IDE over SCSI, the other ways it was hampered, and how ugly it was. The A4000 was the PC-ification of the Amiga without the upsides - things like SCSI helped offset the anaemic higher-end M68k CPUs. An A3000 + faster CPU + built-in flicker fixer + AGA would've been vastly superior. Basically Dave Haynie's A3000+ prototype.
And MMU... and CPU daughter cards on the Zorro bus with local RAM for things like "distributed" Lightwave rendering. It would have made performance better for the rather price insensitive pro segment.
Commodore basically catered to no market segment.
Edit: the A1200 could have been better (twice the speed for calculations) if it had just a tiny sliver of fast RAM instead of only chip RAM. And so on. It should also have had a VGA connector, which wouldn't have cost more than a few cents. People used TVs because they couldn't find multisync monitors. Commodore, a fractal of bad business decisions.
The A1200 lacking fastmem out of the box does seem like a strange decision, but in this case I believe Commodore actually listened to game developers. Having more chipmem instead meant that games could effortlessly load more graphics, music and sampled audio.
As for the VGA connector, there was a cheap Amiga RGB->VGA adapter available. Connectivity wasn't the problem. The issue was that VGA monitors couldn't cope with a 15 kHz PAL/NTSC signal. Many didn't work with the 50 Hz PAL refresh, either. In order for a VGA connector to be meaningful, hardware would've been needed to address this, adding to the cost of the machine - and ruining the 50 Hz sync of a massive, pre-existing games library.
If it had even 32 kilobytes of fast mem (it had 2 megs of chip RAM, same as the original PlayStation), it would have made a huge difference for games.
Back in the day, people didn't even know Amiga 1200 could output VGA resolutions. Also, you needed a special adaptor cable which I only saw home-made versions of. If it had a VGA connector people would know they could run productivity software on a regular monitor. Having a TV for gaming and a monitor for word processing would have been fine. There's precedent with Atari supporting both regular (TV-like frequencies) monitors and the monochrome monitor for productivity.
(IIRC you had to press both mouse buttons at boot or something like that to get VGA frequencies. All of this could have been made automatic if the VGA monitor was plugged in or something. BOM cost would have been almost 0. No strategic thinking at all at Commodore leadership, no understanding of the market.)
32 k of fastmem probably wouldn't have made much difference at all. A single routine or two could maybe run faster, but the data for them would still have to be mostly in chipmem. Plus, you'd need to add space and traces for it on the motherboard. Once you've done that, 32 k seems a bit pointless. Then you might as well have added a SIMM socket, too. The initial doubled speed and increased video bandwidth compared to the A500 were already a major step up for the type of Amiga games that were popular when the A1200 was designed.
Whether or not people knew of the scandoubled modes back in the day, well, me and all my friends certainly did, and all the A1200 reviews I've read mention them. Having both a TV and a VGA monitor sort of defeats the purpose of a cheap, compact entry-level machine. Atari users typically had _either_, not _both_: the monochrome monitor was for the DTP and music studio markets.
There was no mode switching in the early startup menu, apart from being able to toggle between NTSC and PAL. Selection of AGA scandoubled modes was made in the Workbench preferences. Adding some kind of auto-sensing hardware would add to the cost of the machine and require a rewrite of Workbench to cope with this in some way without interfering with screenmode preferences (and what if you indeed had both a TV and a VGA monitor hooked up at the same time?).
In hindsight, I think the A1200 was a decent solution to a hard problem: constructing a cheap, worthwhile upgrade while remaining compatible with as much existing software and hardware as possible.
> Back in the day, people didn't even know Amiga 1200 could output VGA resolutions. Also, you needed a special adaptor cable which I only saw home-made versions of.
The Commodore monitors which were released at the time intended for use with A1200/A4000, the 1940 and 1942, had a permanently attached 15-pin VGA cable. I know because I have one for my A1200.
These monitors were dual-sync 15 and 31 KHz devices, and were perfectly usable as a VGA monitor with a PC. Probably not a lot of PC users bought them.
But if you bought the 1940 for use with the Amiga 1200, you couldn't plug it into the computer without a 15->23 pin adapter. I'm sure it came with the adapter in the box, but still...
Have you read Brian Bagnall's series of books on Commodore? It really digs into the full universe of badness that was Commodore management in excruciating detail.
I recall reading that Commodore had a glut of PC cases and tooling they wanted to use up, hence the weird case for the A4000 with the mouse ports cut out inconveniently on the side.
In the US, in the 90s, there were many, many other cottage-shop PC builders setting up shop.
So yeah - the Tandy in your catalog has a 'competitive' price, but you could - in those days - take such a catalog into "Joe's Computers" and regularly get a cottage-shop PC, or at least the parts for it, much cheaper.
Pasadena Computer Trade Show was how I built my mega-PC's in those days.
(Amiga dominated in Hollywood, though, for a while .. I had pals who scoffed at my mega-PC's with their fully tricked-out Video Toaster rigs, and their DEC Alpha render farms were often pitched against my SGI boxes over lunch-time jests...)
My general feeling is that folks tend to have somewhat unreliable memories about how early certain things got cheap. Catalogs are not perfect, but they don't change over time. That said, I managed to find a Computer Shopper from August 1992 on the Internet Archive [0]. There definitely are some systems in there advertised at the $1K price point with HD and monitor (though generally no sound card), but also a lot of sellers with nothing that cheap for a complete system.
Not sure what the proper conclusion is about how competitive the pricing of the 600 was from that, but I think the catalog does reinforce how much further in the hole they were a bit higher up the price chart. If you wanted to spend a bit more, the only 1st party offering from Commodore until the fall of 1992 was the Amiga 3000. Not sure what they were charging for one of those at the time, but apart from the builtin SCSI I don't think it compares very favorably with a 386DX with SVGA and those weren't all that expensive by then. The 4000 did launch a little after this catalog, but it was priced much higher than the 486 systems that were already on the market.
The 90s were days spent looking at advertisements in the free local weekly that were just lists of parts and prices; you could build your own or have one assembled for big discounts compared to pre-made machines.
And for home users and even smaller businesses - that’s just what you did.
> is this actually useful without a cartridge slot?
It has 2 of them actually. One standard one on the back, and an internal one intended for hardware expansion that I think doesn't use the standard connector. Despite what the product page says, the internal slot is mapped to slot 2 and not slot 1 in the MSX mapping system, so it doesn't conflict with the external cartridge slot.
That said, most computers of this era, including the MSX, had a fair bit of software released on cassette tape. There were also disk drives and software released on disk.
Looking at some of the more famous games for the system: Metal Gear was released on cartridge, but Snatcher was released on floppy disk with an expansion cartridge that contained extra RAM and a sound chip. It seems this system has support for that expansion builtin though.
If you like the PDP11 instruction set, then you'll probably enjoy the 68000 as well. It drew a lot of inspiration from the former. You lose the "deferred" addressing modes (though the 68020 adds something similar), but gain access to a few other useful ones. Conveniently, it was used in a lot of consumer devices (Mac, Amiga, Sega Genesis, etc.).