I remember seeing a Mandelbrot program for the C64 where half the image was computed on the floppy drive, because the drive is basically the same computer as the C64 itself. I think it had a 6502 instead of a 6510, and much less memory.
I believe the Apple II floppy drive was "dumb", that is, controlled by the 6502 of the Apple II, so the machine couldn't do anything/much while loading/saving data. But the C64 + floppy drive was essentially a two-node distributed system.
That was because of the slow serial interface on the VIC and C64 side. IIRC, the UART required for fast transfers was removed from the 64 as a cost-cutting measure, so it shipped having to bit-bang data to the drive. The smart-drive architecture itself, though, is a very solid design idea.
With a little extra smarts, the drive could deal with ISAM tables as well as plain files and do processing internally. Sorting and indexing tables in dBase II, for example, could happen inside the drive while the computer was busy with things like updating the screen.
OTOH, on the Apple II, the drive was so deeply integrated into the computer that accelerator boards had to slow the clock back down to 1 MHz while I/O operations were running. Even other versions of the 6502 needed the exact same cycle timings to be usable in an Apple.
The designers planned on using a shift register in the 6522 VIA chips to implement fast serial loading, but an undocumented bug in that chip forced them to fall back to the slow bit-banging version that shipped.
I don't know how many of you have seen a 1541 floppy drive in person, but it is massive: heavier and possibly bigger than an actual Commodore 64, and pretty expensive at the time too.
It's fun seeing C64 people on the defensive about it, a nice change from getting lectures from them about how their graphics were the pinnacle of 8-bit computing.
Part of the size was the internal power supply. And that thing got hot, too. I used them at school, but at home only had the smaller 1541-II with an external power brick.
The Apple II disk drives, on the other hand, were not only cheap (Apple was different then!) and fast, but were powered by the ribbon cable connecting them to the computer.
Oh, it's MUCH better than that. Commodore did this because they had incompetent management. They shipped earlier products (VIC-20, 1540) with the hardware-defective 6522, but:
- the C64 shipped with the 6526, a fixed version of the 6522 without the shift-register bug
- the C64 is incompatible with the 1540 anyway
They crippled the C64 and its floppies _for no reason_.
It was not for no reason. When adding a screw hole to the motherboard so it could be mounted in the case, they accidentally removed the high-speed wire, dooming the C64 to the same slow data speed as the VIC-20 with its faulty VIA.
I haven't used LLVM in ages. I think it injects references to CRT functions when you do certain operations in your code. _fltused is one of them, and I think _ftol (or something like that) for floating-point numbers is another. There was also a "security cookie" at some point in the MSVC libs; IIRC it had to do with runtime stack checking. Unfortunately, I don't remember the linker flags to get rid of that reference.
These references do not appear in the .ll file. They are injected when the .ll file is compiled to an object file.
I think something in your code triggers a reference to one of the other injected functions and that pulls in the CRT.
Try compiling your test file into a .o or .obj, that is, without linking. Then dump the symbols in the object file to see what gets referenced. I suspect you'll see other references to CRT symbols, and you'll have to replace those with stubs as well.
Unfortunately, I don't remember the linker flags to replace/suppress the default CRT libs. You might, though, compile to .o/.obj and then link manually on your system. If you're using MSVC, check the options to its "link" executable (I don't remember the exact name of the MSVC linker).
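To make that concrete, here is a minimal sketch of the stub approach, assuming MSVC or clang targeting Windows, and assuming _fltused is the only CRT symbol your code pulls in (the symbol dump will tell you if there are others):

    /* nocrt_test.c -- hedged sketch of the stub approach described above.
     *
     * Compile without linking, then inspect what the compiler injected:
     *   cl /c nocrt_test.c               (or: clang -c nocrt_test.c)
     *   dumpbin /SYMBOLS nocrt_test.obj  (or: llvm-nm nocrt_test.o)
     * Then link with the default CRT libraries suppressed:
     *   link /NODEFAULTLIB /SUBSYSTEM:CONSOLE /ENTRY:main nocrt_test.obj
     *
     * If the dump shows other CRT references (e.g. __security_cookie from
     * /GS stack checking, or _ftol2 from float-to-int casts on x86), those
     * need their own stubs or disabling flags such as /GS-.
     */
    int _fltused = 0;  /* satisfies the compiler-injected floating-point marker */

    int main(void)
    {
        volatile double x = 1.5;  /* any FP use makes the compiler reference _fltused */
        (void)x;
        return 0;
    }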
Possibly unhelpful for OP, but for MSVC try /NODEFAULTLIB:&lt;library&gt; for specific libraries, or the IgnoreAllDefaultLibraries setting (bare /NODEFAULTLIB) to remove everything not explicitly passed to the linker.
On x86 only, if you need to cast floats to integers, try /QIfist (deprecated) to avoid hitting _ftol. Doesn't work for x64 or ARM.
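For reference, the kind of code that drags in that helper is just an ordinary float-to-int cast. A hedged sketch (behavior assumed for classic 32-bit MSVC x87 codegen; x64 inlines the conversion with cvttsd2si instead):

    /* ftol_demo.c -- on 32-bit MSVC builds this cast typically emits a
     * call to the CRT helper _ftol/_ftol2, because C requires truncation
     * toward zero while the x87 FPU rounds by default. /QIfist replaces
     * the call with a bare FISTP instruction (deprecated, x86 only). */
    int truncate_to_int(double d)
    {
        return (int)d;  /* this cast is what pulls in _ftol */
    }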
I second this approach; also check with all the verbose flags enabled (e.g. clang -v) to get a full list of the linker flags used.
One other thing you may want to try is writing a linker script manually, as in the sketch below. If you could provide the linker flags as mentioned elsewhere, that would be useful.
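If you go that route with the GNU toolchain, a minimal hand-written script might look like this sketch (assumptions: an x86-64 ELF target, an entry symbol named _start, and illustrative section addresses; link with something like ld -T minimal.ld test.o -o test):

    /* minimal.ld -- hedged sketch of a hand-written GNU ld linker script */
    ENTRY(_start)             /* entry symbol; adjust to your own */
    SECTIONS
    {
      . = 0x400000;           /* conventional x86-64 ELF load address */
      .text   : { *(.text*) }
      .rodata : { *(.rodata*) }
      .data   : { *(.data*) }
      .bss    : { *(.bss*) *(COMMON) }
    }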
I'm German, I don't speak Dutch. But I was able to follow a Dutch tour guide in Den Haag just fine when she was explaining things in Dutch. She kindly repeated everything in English for my benefit (I was the only foreigner) even though I told her I understood her just fine in Dutch.
You have to "adjust your ears" a bit but I think if you know German and English then you can understand Dutch just fine if it's not slang.
It also depends on the particular dialect a German speaks. Dutch is effectively old German from before the various alterations and "reforms" to the German language. Those reforms, instituted roughly between the 16th and 20th centuries, were driven by aristocrats seeking to fragment the Germanic peoples of Europe (English, Dutch, Germans, Austrians, Swiss, Belgians): by driving linguistic wedges between the peasants of neighboring kingdoms and dukedoms, royal families could define their own nations and ethnicities through language and culture, rule over them more easily, and keep them from associating with one another. This is one of the things that contributed to the fragmentation of Germany before unification; the language barriers even created distinct cultures on opposite sides of a single valley when the two sides lay in different dukedoms.
A similar dynamic underlies the tension between the Germanic and Romance languages along the old Roman border line, running north to south, that divides Europe.
> Does it make sense to change direction at this point? I envy PhDs working on self-driving cars and rockets and AI.
You don’t need a PhD to work on rockets. Well, you might depending on what you want to do.
There are a lot of software opportunities at rocket companies: test systems, real-time measurements, operator interfaces, flight simulation, and various other internal supporting software.
You might be interested in, and have the right experience for, operator interfaces, say, or the various internal dashboards and database applications. That could be your entry into the field, and from there you can try to branch out into other areas.
I believe Kochenderfer et al.'s book "Algorithms for Decision Making" also covers reinforcement learning and related approaches. Free PDFs are available at https://algorithmsbook.com
It’s the title of the blog post and I didn’t want to change it. But yes, it seems to focus on the specific subset of hardware engineering that’s control systems.
Higher order logic was originally developed as a foundation for mathematics. In this paper we show how it can be used as: 1. a hardware description language, and 2. a formalism for proving that designs meet their specifications.
Examples are given which illustrate various specification and verification techniques. These include a CMOS inverter, a CMOS full adder, an n-bit ripple-carry adder, a sequential multiplier and an edge-triggered D-type register.
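To get a feel for the style the abstract describes (devices as relations on time-indexed signals, composition as conjunction), here is a minimal sketch in Lean 4 rather than the paper's HOL system; the names Sig, INV, and double_inv are my own illustration, not taken from the paper:

    -- Devices as predicates on signals, in the spirit of the paper's HOL style.
    abbrev Sig := Nat → Bool   -- a signal: a Boolean value at each time step

    -- Specification of a CMOS inverter: output is the negation of the input.
    def INV (i o : Sig) : Prop := ∀ t, o t = !(i t)

    -- Composing two inverters (conjoining their specs, with the internal
    -- wire m hidden) implements the identity behaviour.
    theorem double_inv (i m o : Sig) (h1 : INV i m) (h2 : INV m o) :
        ∀ t, o t = i t := by
      intro t
      rw [h2 t, h1 t]
      exact Bool.not_not (i t)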