> Did you know that you can’t write to video memory on the Game Boy Advance in units of 8 bits, and if you do, it’ll still work but all your graphics will be messed up with no obvious way to figure it out unless you read one specific paragraph of the documentation? Hey did you know that when you compile a binary optimized for size, memory copies will be by units of 8 bits, but not if you use debug mode?
Andrew (or any Zig mavens) -
What would be the idiomatic way of solving this issue? Can you define your own memcpy with std.mem.doNotOptimizeAway or something?
Andrew’s point is that marking the memory as volatile should prevent this optimization. (i.e. the compiler isn’t allowed to split a 16-bit volatile write into two 8-bit writes.) This is idiomatic because volatile is intended for MMIO.
There’s a proposal for a way to disable LLVM generating builtin calls like memcpy. But I’d argue volatile would still be more appropriate in this case.
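For illustration, here's a minimal sketch (my own, not code from the linked project; the register name and the 0x0400_0000 address follow the standard GBA memory map) of what a volatile MMIO pointer looks like in Zig:

```zig
// DISPCNT is the GBA's 16-bit display-control register. Because the
// pointer is volatile, the compiler has to emit exactly this 16-bit
// store; it is not allowed to split it into two 8-bit stores.
const REG_DISPCNT: *volatile u16 = @ptrFromInt(0x0400_0000);

pub fn enableMode3WithBg2() void {
    REG_DISPCNT.* = 0x0003 | 0x0400; // video mode 3 | BG2 enable
}
```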
> I started using Linux five years ago, mainly because I couldn’t figure out how to get Python working on Windows
Windows being by far the most dominant operating system from '97 to 2015 definitely exacerbated a dearth of knowledge among the young. In the early days internet access was not ubiquitous, and shipping an operating system without even a primitive programming environment definitely led to missed years of programming for me.
I learned a lot about Windows, lots of UIs and, to some extent, how the system was constructed, but no Visual Studio, no functioning compiler in the base install, and the only interpreter being fucking batch files with no docs… come on…
I will also say that starting to use Linux in earnest around the age of 20 was a very humbling experience having been raised on Windows. Back in the 00s being able to deal with Microsoft esoterica by way of registry hacks and MMC snap-in wizardry meant that you were a "smart computer kid" that non-technical folks found impressive. It wasn't until I started using an OS that acted more consistently and predictably that I realized I'd been building a career (and decently-sized ego) out of Microsoft-specific make-work. Being freed up to learn more timeless and universal concepts lined up perfectly with my college coursework centered around writing shells, compilers, etc. and completely changed how I interacted with computers for the better.
Apropos - if I ever win the lottery and don't need a job I'm going to spend some time learning Plan 9. Superficially it seems to be Unix without many of the warts combined with some concepts from NT/VMS that make good sense.
My brother built his company (IT services in the manufacturing sector) on the back of Microsoft starting in the mid 90s. He's done extremely well and continues to work on interesting stuff (these days cloud, complex databases, CRM, etc). Employees are by-and-large long term and well compensated. Not FAANG numbers to be sure, but we're in a city with a reasonable CoL.
But he focused on building a long-term sustainable business, not chasing some buzzword-du-jour, VC funded pipedream with a quick exit, and thus no matter his personal success will never have any HN cred.
I interpreted this more as his friend wouldn’t be respected as much by the HN crowd because he’s an entrepreneur with a sustainable lifestyle business, not one that is based on hot cutting-edge tech, breaking new ground, or having the possibility of a 9-10 digit exit. (note: been working in startups most of my career, no options were ever worth a cent so I take it all with a grain of salt and don’t judge)
It’s also similar to working for a boring shipping company vs working at FAANG. You could be doing an identical job at both, but one is perceived by some to have a special cachet.
> his friend wouldn’t be respected as much by the HN crowd because he’s an entrepreneur with a sustainable lifestyle business, not one that is based on hot cutting-edge tech, breaking new ground, or having the possibility of a 9-10 digit exit
That is indeed what it says, but I've never seen anything to substantiate this. Lots of people are very curious about people who have side hustles, or side hustles that became main hustles. HN is a set of pretty un-herdlike individuals, and you're as likely to see someone who thinks socialism's a great idea as someone who thinks capitalism's lifted billions out of poverty.
Python is actually relatively easy to set up on Windows. (Or at least it was easy to set up on Windows 98 in the early 2000s. I don't know how it's now, or five years ago.)
You just download the official Python installer, and off you go. It even comes with Idle as a reasonably good IDE.
The Python from the Windows Store is not good: it redirects some path writes and will make the debugging experience of tools using Python dreadful. Always install using the win32 installer from the website.
That's not the point. The point is that even after you install Python on Windows using the official installer, and unless you put that early in the %PATH%, running 'python' on a powershell / Windows terminal will pop up the Windows Store. That's just how fucking stupid Windows is these days.
If you install Python using the official installer from python.org and check the "Add interpreter to PATH" box when installing, running `python` in shell will launch it.
Experiencing this for the first time was really jaw dropping. And I believe by default "python3" has distinct behaviour of _telling you_ to visit the Microsoft Store, but not opening it. Bizarre.
It's currently just as easy to use for learning and experimenting on Windows as you describe. I don't use Python on Windows heavily enough to evaluate it for more serious development.
A long time ago I participated in a Python course where it just wouldn't work for half of the students (like 10 people), and the lecturer couldn't figure it out.
As far as I remember, Python was REQUIRED to be installed on C:, in something like C:\Python, where it would drop 5000 assorted files.
Even longer ago, I was in a Java class where my code would work on my computer but just wouldn't work on the lecturer's. I remember the guy spent like 10 minutes looking at the (very simple) code, trying to figure out why it worked on Windows but not on Linux.
On modern Windows there's this weird pre-installed Python stub which (IIRC) asks you to install Python from the app store and which can confuse shell scripts that check whether Python is installed.
In terms of IDE support, VSCode with the MS Python extension bundle is a pretty good Python IDE.
In the early 2000s Python was probably the easiest language to install on Windows after VBScript and VBA. Download the ActiveState Python installer, launch it, and you're done. And this included being able to write GUI applications and interact with COM objects (Office, for example); it was quite nifty, and I used it for automating a number of tasks at my first job.
It’s still easy to set up; it’s just that all the commands are different in PowerShell. So if you’re doing a lot of Google-driven programming, it’s not going to be easy to actually use Python from the command line.
That being said, Microsoft knows Windows is terrible for software development, which is why it’s so easy to use Linux inside Windows with WSL2. Since Docker Desktop for Windows requires WSL2, it’s very common for any developer AD account to have it installed locally as admin.
For better or worse, UNIX has won in embedded and in the server room.
I think this is the HN bubble at work. Maybe you mean something qualitatively or quantitatively different than I do when you say server room but every client I've worked with in my extremely conservative industry in the past ten years has been running on site, coloc, or cloud "server rooms" on windows.
I’ve spent a long time in non-tech enterprise and I think Unix won. 15 years ago you would hire someone and they would be proficient in Windows and Microsoft Office. Now people have grown up with MacBooks, Chromebooks, and mobile/tablets. It’s not a major issue yet, but it’s increasingly becoming one as the generations in the workplace slowly shift.
I don't think Microsoft is stupid; they know they have the lion's share of developers. They're also aware that they are losing to Linux in a big way when it comes to service deployment environments. I even heard Azure itself does not run Hyper-V on Windows anymore (though, who can be sure).
It's a smart play for them to have WSL: it means less friction to target Linux and will keep people writing software on Windows (which increases the likelihood of making software for Windows, especially developer tooling).
I think the parent is wrong that developing software on Windows is hard, my original comment up-thread was largely regarding the fact that in the late-90's and early-00's internet access was not common and getting your hands on a Microsoft Development environment was the furthest thing from easy. Linux, ironically, was easier as it came with all kinds of documentation and interpreters and compilers out of the box.
Lots of tools are only built to run on Linux and are ported to Windows in ways that only just about run. WSL was an enormous step up from Cygwin in allowing Windows developers to use those tools, and WSL2 is another huge leap in that direction.
I don’t think Windows sucks; I think that a lot of people outright refuse to use Windows and design their tools to run on Linux only, making choices that are optimal for Linux but not for Windows. Two examples are processes vs. threads and opening/closing lots of files. Ironically, the Linux-first approach of processes over threads doesn’t really scale to the level we want; Postgres is a great example of having hit the multiprocess limit and struggling very hard to become multithreaded.
It’s possible to accept there are many dev tasks that can only really be achieved on Linux without accepting Windows sucks. The first is pretty much inarguable, the second is emotive and imprecise.
Yeah, back then, `pip install` just didn't work for anything written in C. I seem to remember a Windows-specific site with builds of popular extension packages - Cython, lxml, OpenSSL, and the like. It was unofficial, and you couldn't use a package manager to install them, but after manually downloading them, they tended to work.
Wheels, Conda, and WSL helped - now all the R&D guys at work use Python on Windows with PyTorch, TensorFlow, and all that without problems. I'm also using (mini)conda (on Linux) instead of virtualenv - it's nice because I don't need to worry about all my venvs breaking after an OS upgrade.
Anyway, compared to those days (I started with 2.4 on Windows, switched to Linux around 2.7), working with Python on Windows is at least feasible.
The big problem with installing packages written in C is that on Windows, you needed the matching version of VC++ to build, which wasn't anywhere as easy to obtain as the standard build tools on Unix platforms, usually (where they'd often even be preinstalled, and if not, it's one apt/yum invocation away).
The eventual solution for this was something called "Microsoft Visual C++ Compiler for Python 2.7" - which was pretty much just enough of VC++ command line tools packaged to make it possible to build things.
Yeah, upgrading Python from the ancient built-in version that comes with a Linux distro (usually 3.8, or even 2.x) is actually much harder than doing so on Windows.
You shouldn't use the OS's Python for your work. That Python is for OS utilities' use. It's meant to be old and stable. Instead, use pyenv, conda, asdf, mise, etc. - it'll allow you to install multiple versions of Python and easily switch between them for your projects. You're gonna need it anyway once you try writing integration tests for multiple versions of Python, like with tox or nox.
You don't have to upgrade; you can have more than one Python install at the same time. You only really need to know how to manage your PATH environment variable.
Windows 95 did come with QBasic for MS-DOS. It wasn't the latest and greatest, but it did contribute to a moderately large online scene of kids making QBasic games or Windows-like QBasic GUIs for DOS.
Holy shit, thanks for the tip! I also learned programming from those Usborne books, using QBasic (the Dragon 32 ran a version of Microsoft BASIC, so the code for it was generally quite compatible with QBasic).
In my experience, there was a big difference between DOS, where programming tools were relatively easy to find, and Windows, where they weren't. Until widespread Internet access, programming on Windows as a kid required finding the rare adult who had access to the tools.
In countries where software piracy was the norm (e.g. most of Eastern Europe) I can't say I recall there being a difference. Turbo Pascal and Turbo C++ were pervasive because they were just copied around on floppies as needed - that's how most schools would end up with them, for example. But, a few years later, people would similarly share e.g. Delphi.
My experience was growing up in a semi-rural area in the U.S. in the 90s. At that time, DOS software was plentiful, but rarely purchased and usually shared over floppies. The users tended to be geeks, and so they had some pretty sharp tools on hand. Whereas, in the age of Windows, especially 95 and on, copy protection became stronger, the user base expanded massively, demand shifted to office apps and games, and software started to grow too big to fit on a reasonable number of floppies. This transition happened while I was a young child, so perhaps it was just poor timing, but still it seemed like DOS was a sea of open possibility while Windows required a bank account which I didn't have.
Maybe he meant debug.com. Not easy in the 90s in Spain either. And Visual C++ under Win32 from pirated CDs was a nightmare to start with.
In the 2000s, an early 3-DVD release (Sarge) was like night and day. Properly documented, with all the manuals obtainable either through Synaptic or the bundled magazine-book.
Yeah, but beyond the learning-to-program steps, it was then a matter of acquiring programming environments for Quick/Turbo Pascal, NASM/TASM, Turbo/Quick Basic, Clipper/dBase, ... by whatever means, and the same for the knowledge of how to use them, in an offline-first world.
And for this, one had to be curious, track down people with knowledge, either from computer magazine ads or by hanging around electronics stores, and eventually find people with similar interests to meet in person.
In my case, I had no access to software from other people, but the library had tons of old books on programming all kinds of computers, mostly in BASIC. So first I learned by translating Commodore etc. to GW-BASIC. PEEK and POKE gave a trapdoor to assembly, which granted e.g. mouse access. After I managed to install GW-BASIC as a TSR, I realized I had basically hollowed out the interpreter, and started writing straight in debug.com. It took 3 years before I discovered MIX C in a dusty corner of a bookshop. It cost a fortune for this kid, and frequently miscompiled things, but it was one of the first pieces of software I got that didn't come with the computer.
Exactly this. In Spain, 90s gaming was either the Game Boy or the PlayStation, and the PS2 in the early 00s. Computers were for offices; most people began to use them at libraries, and by the early 00s tons of people had a PC at home but downloaded stuff at cybercafés. Getting developer stuff from magazines was mandatory, but these were pretty expensive and the materials sucked a lot.
That wasn't true of CP/M, UNIX, VMS, Atari ST, Amiga, and many others.
The reality was not the same as with 8-bit home computers, and even on those you were limited by BASIC. Anything else, like machine code, required the same learning effort.
If you wanted to program, you either bought the programming tools, or pirated them, most of the time.
Then you would need to buy books, computer magazines, or have the luck of local library with computing section, to actually learn how to use them.
My mum had a Commodore 64, and before that a ZX Spectrum.
The only programming book in my local library was COBOL.
There was no included disk (it was missing) but in hindsight, it would likely not have worked on windows anyway.
Your premise is that I knew what I was looking for, I kinda didn’t.
My first proper exposure to programming anything was when college told me there was a thing called Visual Studio and .NET; it's only later, when installing Linux, that I found that many of the systems were written in interpreted languages and could be modified (yum, apt, etc.).
My premise is lack of curiosity, or do you think a small town in a country just out of 40 years of dictatorship, and colonial wars, was flowing with computing information all over the place?
Just having access to electronics was revolutionary.
How would we even know what to look for, if it wasn't for our curiosity?
Are you arguing that your experience was helped by that?
My premise is that Microsoft dominance took a few years off my programming journey; obviously I was curious and motivated, because otherwise it would have been impossible to be where I am now without motivation and curiosity.
I’m not entirely sure what your argument against this is other than that you potentially had it worse, which is not my point. Obviously motivation and curiosity are enough eventually, since we’re here.
If Linux was the monoculture then it would have happened sooner for me, much sooner.
My experience was helped by being curious, having the luck to track down people, that provided software and knowledge to learn in an offline world.
UNIX was already around during those days, and it surely wasn't that much easier either, unless one was already at a university with enough budget for its computing infrastructure, or somewhere like a bank, although in many countries the most advanced stuff would most likely be MS-DOS terminals connected via Novell NetWare, running Clipper applications.
Hey now, Windows Script Host (WSH) supported VBScript and JScript scripts. WSH was installed by default on systems up through Windows 7, I believe. Scripts run through WSH had access to a wide array of APIs; a lot was possible. The WSH executable (wscript.exe) was also configured to be the default handler for .js files in Windows Explorer.
Also, HTA apps were an option. Write an HTML file, include a <script> tag with either JScript or VBScript, and you have access to APIs equivalent to a local HTML file opened in IE5, plus some. This was a popular option at one point for powering CD-ROM autorun menus, among other things.
(The company I worked for used embedded HTA in a massive C++ COM app to render some forms where the fields and values had to be dynamic based on data only known at run time. Debugging was horrible until I learned you could inject Firebug Lite into the HTA apps and at least be able to console.log(). It was truly a dark, dusty corner of Windows programming -- of which there were many.)
Yeah, WSH and VBScript have been available since Win98, though the documentation was scarce and not part of Windows itself (compare with QBASIC, which someone else mentioned, where not only did it come with documentation, but the first thing it tells you when you run it is how to open the documentation).
I did use it for an "Introduction to programming" magazine article back in the mid-2000s though, mainly because it was already there and all you had to do was open Notepad and start typing (...and make sure you save with a .vbs and not a .vbs.txt extension...). I even got an email a few years later from someone telling me they got into programming because of it :-P.
When I was first getting started with programming in about 2008, I found it to be so unbelievably frustrating. Any tutorial I was following would invariably run into something that didn't work how it was supposed to. I would try to find some work around, and sort of would, but it would lead to other unforeseen issues down the line. I finally came across cygwin, and that sort of made things start to work.
Eventually, though, I just installed a linux partition and literally never looked back. The entire ecosystem of everything I was learning at the time (Python, JS, PHP) was set up for unix. Things have improved over time, and obviously WSL is nice, but it's still a pain.
Opposite hot take: one of the main reasons I got into programming was that Microsoft Access made it accessible enough that I was confident I could produce value even knowing nothing about how to think like a software engineer.
I lived in bumfuck nowhere where there were basically no developers (if you knew how to reinstall Windows you were called "a genius kid"), no computer magazines, and access to the internet was available through dial-up for an hour or two in the middle of the night a couple of times per month because it was:
1) very slow — my modem was never able to achieve more than 2.5-3 KB/s, and
2) pretty expensive given our measly salaries.
I also didn't speak English at all. I first got access to a programming environment at 16, at an age where my Western peers were starting to write operating systems: it was a copy of "pirated" Delphi that came by pure chance. The first couple of years I was making shit that wouldn't impress anybody because there was nothing to learn from besides the standard library, which would probably be achievable by a computer science version of Ramanujan, but not by some random dude like myself.
Please remember about the rest of the world before accusing others of making "cheap excuses". Not every person on this planet lived or lives in the middle of Manhattan; this may very well include GP.
Indeed, as a kid of 10, I remember learning C/C++ thanks to DJGPP, a DOS port of GCC, being free software. I didn't have any money to buy a commercial compiler, though I never asked my parents; I wasn't sure how to frame the question, I guess. Regardless, getting your hands on a commercial compiler wasn't that difficult in the late 90s/early 00s. Soon after, though, small non-commercial indie games kinda died out and everyone was using DirectX with MSVC on Windows, until SDL came out.
SDL appeared in the late 90s / early 00s; that's pretty much when it became popular (e.g. most early Loki Linux game ports used SDL).
As far as compilers, Borland shipped a free version of their C++ Builder 5.5 compiler right around that time, too, so on Windows we had that in addition to MinGW.
I do have fond memories of batch files, though. I showed my classmates that it only took two lines of batch to make the school library's computers open their disk drives every hour.
> the only interpreter being fucking batchfiles with no docs… come on…
QBASIC was always there, though?
From Win98 onwards, Active Scripting was there out of the box with VBScript and JScript. From Vista onwards, .NET runtime shipped as an OS component, and it includes C#, VB.NET, and (back then) JScript.NET compilers in the box. Although, granted, no offline docs in either case.
> Did you know that you can’t write to video memory on the Game Boy Advance in units of 8 bits, and if you do, it’ll still work but all your graphics will be messed up with no obvious way to figure it out unless you read one specific paragraph of the documentation?
I did. At one time I wrote a program that wrote to video memory in units of 8 bits, but the emulator I was using did not prevent that, so my program worked on the emulator but not on the real hardware, and I could not figure it out until I discovered that this was the problem and fixed my program so that it would work.
This is really interesting. As an aside, someone has created a GBA toolchain for the Nim language as well. It's really cool seeing these projects for a variety of languages.
That fancy __attribute__ output can be achieved quite nicely using a custom pragma/codegenDecl; we did exactly this for the embedded firmware written in Nim at my previous workplace, to get access to some specific features of the ESP32-S3 (things like RTC_ATTR_NOINIT, for example).
Natu has an amazing commercial game called Goodboy Galaxy [0]. In addition to the Game Boy Advance version, they've also ported it to Windows and Linux with Xatu [1] (which is also written in Nim).
> Sometimes, the compiler would get too smart for its own good, and recognize that a function that I wrote just copies memory. It would helpfully replace the function body with memcpy to save space.
I would generally expect this optimization to only be done for programs executing in userspace, not OS/kernel-style programs like this one. I would have expected the equivalent of -nostdlib to handle this automatically. I wonder if the author's build could be tweaked to tell LLVM (which Zig is based on) not to perform this or other "default to stdlib implementation" style optimizations.
This is controllable by customizing TargetLibraryInfo [1] for your target in LLVM. Note, however, that detecting memcpy is always a valid optimization in C, not just in userspace. The C language standard, not POSIX, requires that a function spelled "memcpy" be available.
As a side note, the optimization pass that detects and replaces memcpy is called LoopIdiomRecognize [2]. It actually detects quite a bit, not just memcpy.
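For a hedged illustration (not the article's actual code), a hand-rolled copy loop of roughly this shape is what the pass will recognize and, when optimizing for size, replace with a call to memcpy:

```zig
// A naive element-by-element copy. With optimizations enabled,
// LoopIdiomRecognize may rewrite this whole loop as a memcpy call.
fn copyWords(dest: [*]u16, src: [*]const u16, count: usize) void {
    var i: usize = 0;
    while (i < count) : (i += 1) {
        dest[i] = src[i];
    }
}
```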
Very interesting! I guess I would still expect the optimization of using the library implementation of memcpy to be invalid when interacting with memory-mapped IO. In this case the issue was the size of the writes being issued (byte vs. word), but you can easily imagine other issues that something like memcpy would not account for. I would have thought volatile might have signaled this, even if the issue here is not related to values being cached in registers. Does Zig have a version of volatile?
> IR-level volatile loads and stores cannot safely be optimized into llvm.memcpy or llvm.memmove intrinsics even when those intrinsics are flagged volatile. Likewise, the backend should never split or merge target-legal volatile load/store instructions. Similarly, IR-level volatile loads and stores cannot change from integer to floating-point or vice versa.
Even in a bare-metal/OS context, the compiler is generally allowed to split, combine, reorder, and eliminate memory accesses however it likes as long as the end result is consistent. Being able to do that is very important for performance and code size, which especially matters in embedded or OS dev; for example, inserting memcpy calls can save on code size for things like passing a medium-sized struct to a function. The memory model of any reasonably low-level programming language nowadays requires you to specify whenever you actually need memory accesses that look exactly like what you wrote.
The OP suggests:
> I know that this is a long shot, but it would be nice if there was some way to specify how memory in certain address ranges need to be addressed.
The downside to this approach is that the compiler doesn't usually know what address range an arbitrary pointer will access; it's not resolved until link time or often runtime. So this would require dynamic checks around every single memory access, which would kill performance. Thus, the solution is to use volatile loads and stores for any memory ranges that need special treatment.
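To make that concrete, here is a rough sketch (names are mine; the 0x0600_0000 base is the GBA's VRAM region) of copying pixels through volatile 16-bit stores, so the compiler can neither lower the loop to a byte-wise memcpy nor split the halfword writes:

```zig
const vram: [*]volatile u16 = @ptrFromInt(0x0600_0000);

// Every iteration is an explicit volatile halfword store, so the loop
// cannot legally become a memcpy call or a sequence of byte writes.
fn blitRow(pixels: []const u16, offset: usize) void {
    for (pixels, 0..) |pixel, i| {
        vram[offset + i] = pixel;
    }
}
```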
Even GCC emits calls to memcpy. Even for freestanding nostdlib targets. There is no way to turn it off. I had to implement memcpy because of it. Incredibly annoying.
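For what it's worth, the definition you have to supply is tiny. A minimal sketch (written in Zig to match the thread; the commenter presumably wrote it in C, and Zig normally ships this symbol in compiler-rt) might look like:

```zig
// Supply the symbol the compiler's generated calls expect on a
// freestanding/nostdlib target; `export` gives it the C name and ABI.
// A real implementation needs care so the optimizer doesn't turn this
// loop right back into a (now recursive) memcpy call.
export fn memcpy(noalias dest: [*]u8, noalias src: [*]const u8, len: usize) [*]u8 {
    var i: usize = 0;
    while (i < len) : (i += 1) {
        dest[i] = src[i];
    }
    return dest;
}
```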
But C compilers don't guarantee your struct won't have holes (by default), so you may have to do something like __attribute__((packed)) to ensure they are packed structs.
This is not true of adjacent bitfields, at least for C99:
> An implementation may allocate any addressable storage unit large enough to hold a bit-field. If enough space remains, a bit-field that immediately follows another bit-field in a structure shall be packed into adjacent bits of the same unit.
That’s usually something your ABI will describe in fairly precise terms, though if (as in your example) you want non-naturally-aligned fields, you may indeed want to both use a packed struct and prepare for alignment faults on less-tolerant architectures.
In microcontrollers it's very common to see generated code that defines structs for the registers. They will typically have fields that are a full machine word in size (or maybe in some cases as small as a byte), and individual bits will be addressed with bitmasking (i.e. `my_device.some_reg |= SOME_REG_FLAG_NAME`, or `my_device.some_reg &= ~SOME_REG_FLAG_NAME` to clear it). It is sometimes necessary to be thoughtful about batching writes so that certain bits are set before the peripheral begins some process. A trivial example would be:
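(The following is my own hedged sketch of that idea, written in Zig to match the thread rather than the C vendor headers the commenter probably had in mind. The bit layout mirrors the GBA's DISPCNT register; the names are illustrative.)

```zig
// A register modeled as a packed struct over its backing halfword.
const DisplayControl = packed struct(u16) {
    mode: u3 = 0,
    cgb_mode: u1 = 0,
    page: u1 = 0,
    hblank_free: u1 = 0,
    obj_1d_map: u1 = 0,
    forced_blank: u1 = 0,
    bg0: u1 = 0,
    bg1: u1 = 0,
    bg2: u1 = 0,
    bg3: u1 = 0,
    obj: u1 = 0,
    win0: u1 = 0,
    win1: u1 = 0,
    obj_win: u1 = 0,
};

const REG_DISPCNT: *volatile DisplayControl = @ptrFromInt(0x0400_0000);

pub fn setupDisplay() void {
    // Batch the configuration into one 16-bit store: the video mode and
    // the BG2-enable bit are written together, rather than read-modify-
    // writing the live register one flag at a time.
    REG_DISPCNT.* = .{ .mode = 3, .bg2 = 1 };
}
```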
> Did you know that you can’t write to video memory on the Game Boy Advance in units of 8 bits, and if you do, it’ll still work but all your graphics will be messed up with no obvious way to figure it out unless you read one specific paragraph of the documentation? Hey did you know that when you compile a binary optimized for size, memory copies will be by units of 8 bits, but not if you use debug mode?
Regarding the nit: this seems to be so common these days that I'm wondering if it's a bug in a default CSS theme in a popular static site generator or something
They must have fixed the link, it points to a page with a .gba file now. I would have liked it to point to a page that loaded a js/wasm emulator with the playable game in it.
Interesting post. I wish it went deeper but I guess it's just a post-mortem.
Related, I found this old post that goes into the basics of GBA coding [1]. It doesn't look nearly as bad as I expected. That said I'm sure performance might be the issue once you start having a few things going.
If you have a good resource please let me know (I prefer C).
> Unfortunately, it only has support for one output. Some of the BIOS functions on the GBA that you might use inline assembly for output multiple values in different registers, which is problematic for Zig. This is currently being worked on so it might be improved in the future.
Can you post the code that's impossible to write properly?
> As it would turn out, the Game Boy Advance has quite a bit of “weird memory” that you have to work around in weird ways. I know that this is a long shot, but it would be nice if there was some way to specify how memory in certain address ranges need to be addressed.
Could you elaborate on this? Does LLVM support this feature? If so, it might not be too hard to get it into Zig.
I'm not a compiler dev and I don't know Zig, but I can think of a couple of things that make this tricky:
- The compiler doesn't typically know anything about memory address ranges; that's the linker's business.
- Even if you taught the compiler about memory address ranges, the compiler often won't know what address ranges a pointer could point to at runtime. For instance, suppose you declare a function that takes a pointer as its parameter; and then maybe elsewhere in your code you call it with pointers to various address ranges. What's the compiler supposed to do, check the pointer value at runtime and branch based on its address range? That's certainly not good for performance or code size.
So we clearly need some way to "color" the pointer type itself as belonging to a specific address range -- a function needs to be able to declare what color of memory it wants to operate on, and it should be an error to pass a wrong-colored pointer to a function. Except you can already do this in programming languages today! Create a wrapper type representing a pointer to a specific type of memory (something like 'struct VramPtr(*mut u32)' in Rust), and define read and write functions that issue volatile writes of the correct type (or even inline assembly if you need some specific instruction that volatile doesn't guarantee).
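A rough Zig version of that idea (my own sketch, not the article's API; 0x0600_0000 is the GBA's VRAM base) could look like:

```zig
// A distinct type for VRAM destinations. Every write goes through a
// volatile 16-bit store, so a wrong-sized access can't happen by accident,
// and a function that should only touch VRAM can demand a VramSlot.
const VramSlot = struct {
    ptr: *volatile u16,

    fn at(index: usize) VramSlot {
        const base: [*]volatile u16 = @ptrFromInt(0x0600_0000);
        return .{ .ptr = &base[index] };
    }

    fn write(self: VramSlot, value: u16) void {
        self.ptr.* = value;
    }
};
```

Usage would then be something like `VramSlot.at(120 * 240 + 120).write(0x7FFF)` to set a white pixel in mode 3, and passing an ordinary byte pointer where a `VramSlot` is expected becomes a compile error.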
It arguably fits, because it pairs a 68010 with hardware-accelerated 3D graphics. As far as I can tell, it's not quite modern 3D graphics. If you paid for the geometry engine, you did get hardware-accelerated transform and clipping. And the framebuffer supported hardware acceleration of 2D polygons. But if you wanted texture mapping, that was all done on the CPU.
I don't think we can consider SGI as having "modern" 3D accelerated graphics until they implemented support for hardware-accelerated texturing with the Reality Engine in 1992, long after they switched to MIPS.
Upper right quadrant would be an 8-bit (or maybe 16-bit) CPU paired with a 3D graphics capable GPU. SNES + the SuperFX chip is the closest example I can think of. Similarly, there is the Genesis/Mega Drive + SVP chip.
I can't name a fully 8-bit machine with a 3d focused graphics chip. Maybe there are arcade boards?
It's stretching the definition of "3D-focused graphics chip", but an early example might be I, Robot. An 8-bit 6809 CPU drives a custom polygon-pushing graphics processor. It's primitive but must've been mind-blowing in 1984.
The Mega Drive is probably the earliest that was really capable of it, but the Namco System 21 and especially the Sega Model 1 (1990) were designed with 3D/polygons in mind yet have relatively old chips in them. Programming for those things could not have been easy.
Extremely excited to read through this. I'm fascinated by Zig and the GBA. It's really cool to see people still developing software on such an old machine.
If making the respective loads and stores volatile doesn't solve this problem please feel free to file a bug against the compiler.
https://ziglang.org/documentation/0.13.0/#volatile