
oh man! every so often for the past decade I've tried to remember "rinkworks". I recognised it immediately from your post. I remember this being one of the first websites I would read as a kid, 20 odd years ago. cheers for the nostalgia buzz!


Not an NFC chip question, but what kind of microscope do you need to get silicon photos of a chip so tiny?


The trick is to use a metallurgical microscope, which shines the light down through the lens. A regular microscope illuminates from below, which works fine for cells, but not for opaque chips.

Specifically, I use an AmScope ME300TZB-2L-10M microscope, which my friends consider an entry-level microscope, but it works for my needs.


I think the title really undersells it; the video is worth a watch. It's a really impressive effort to get .NET 3.5 code running on Windows 95, seemingly just to be able to say he could.

HN is usually good for this kind of thing: it looks like NDP is an internal name for dotnet - does anyone here remember what it stood for?


I played this for hours last night, I'm going to need to block it :) Set myself a target of 2048 for nostalgic reasons, turns out it takes a long time (for me anyway).

You're right about the dictionary - actually, the whole time I kept wondering how annoying it must have been to choose a dictionary for this game. Even though part of the challenge is not accidentally making common words without noticing, accidentally making obscure words is annoying!

Maybe I just don't know enough words - but looking through my game log, I was annoyed by "cony", "smit", "huic", "yipe", "nome", "torii", "agon", "mairs", "imido" and "sial", some of which don't display a definition when you click them, but all of which appear in all the scrabble dictionaries referenced on the website you just linked. Meanwhile I was sad to discover vape is so far only in one scrabble dictionary :) And annoyed to discover "oxalic", which is also in all the dictionaries on that site, was not accepted.

I guess there's a spectrum between "advanced scrabble player level vocabulary" and "fun word game", because I imagine (and suspect you have probably had feedback along these lines) _not_ allowing a word which is obscure but still unambiguously used in the modern era would be worse UX overall - the sort that's more likely to make you rage-quit.

I can see why you'd try to get a bit of wordle-esque shareability out of the daily mode, even though I like the classic mode more myself. But I think the tutorial popup isn't as comprehensive as it needs to be for someone's first game to be fun. The first time I clicked the link I did an abysmal job at the daily challenge; I think it wasn't obvious that swaps didn't need to be between neighbouring tiles like in the given example. Something that might be better is an interactive tutorial for first-time visitors - come up with a 5x5 board that is quickly solved and demonstrates several strategies, then walk the player through clearing it. I also think having the help popup one click away would be useful.

I would also have liked the help popup to let me know that progress is saved if you close the page; I ended up checking in an incognito window because I had no time to keep playing but wanted to come back another time and try to reach the target I'd set myself!

Anyway - criticism and suggestions aside - well done, it is a fun game and concept!


Could be useful / fun to make a USB floppy drive which supports non-standard formats like Commodore and AKAI.


A couple of examples:

- https://decromancer.ca/greaseweazle/

- https://www.cbmstuff.com/index.php?route=product/product&pro...

I think it'd be interesting to connect a high sensitivity / resolution sampling probe directly to the analog output of the drive heads. You could do software-defined signal processing to potentially recover damaged data. These USB-based tools are getting the signal after being amplified in the analog domain and processed by the drive's electronics.
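
As a very rough sketch of what the software side could look like (the sample rate, threshold and naive peak detection below are all assumptions, and a real decoder would do far more): turn the sampled waveform into peak-to-peak timings and bucket them into the nominal MFM intervals - 4/6/8 µs on a double-density disk:

    /* Very rough sketch, not a real decoder: given raw ADC samples of the
     * analog head signal, find the peaks (which roughly correspond to flux
     * reversals), measure the spacing between them, and bucket the spacings
     * into the nominal MFM intervals (4/6/8 us on a double-density disk).
     * Sample rate, threshold and the naive peak detection are illustrative
     * assumptions only. */
    #include <math.h>
    #include <stdio.h>

    #define SAMPLE_RATE_HZ 25.0e6   /* assumed 25 MS/s capture */
    #define PEAK_THRESHOLD 0.3f     /* assumed, in normalised units */

    static int is_peak(const float *x, long i)
    {
        float a = fabsf(x[i]);
        return a > PEAK_THRESHOLD && a >= fabsf(x[i - 1]) && a > fabsf(x[i + 1]);
    }

    /* Bucket a peak-to-peak interval into the nominal 4/6/8 us classes,
     * returned as a multiple of 2 us. */
    static int mfm_run_length(double us)
    {
        if (us < 5.0) return 2;
        if (us < 7.0) return 3;
        return 4;
    }

    int main(void)
    {
        /* In reality this buffer would be filled from the sampling probe. */
        static float samples[1 << 20];
        long n = (long)(sizeof samples / sizeof samples[0]);

        long last = -1;
        for (long i = 1; i < n - 1; i++) {
            if (!is_peak(samples, i))
                continue;
            if (last >= 0) {
                double us = (i - last) * 1e6 / SAMPLE_RATE_HZ;
                printf("%d ", mfm_run_length(us));
            }
            last = i;
        }
        putchar('\n');
        return 0;
    }

The interesting part would be everything before that last step - filtering, adaptive thresholds, retrying marginal regions - which is exactly what you lose once the drive's electronics have already squared the signal off.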


IIRC - he mentioned that someone _else_ had a DEC machine, and actually used it as their dev box. The dev with the DEC box developed the kernel panic code, aka the blue screen of death - and blue was chosen because that's the default screen colour when the DEC box is turned on. The idea was to reset the colour to the default before printing the kernel panic message.

So while DEC NT is sort of a footnote, it did have this pretty profound influence : )


This is a bit of a game of telephone - NT Alpha shipped after NT 3.1 (i386 & Mips) and the port was done almost entirely by DEC. The blue screen preceded the Alpha and was really based on the color scheme of the firmware on the Mips workstation which Microsoft built internally - and, of course, of the legendary SlickEdit, which was one of the original editors available on Win32.

After NT 3.1, Microsoft assumed primary responsibility for NT Alpha, although there were also some great people at DECWest still involved.

source: me, I'm the 'someone _else_' who owned all the Alpha stuff at Microsoft.


Windows 3.x's fatal exception screens were also blue.


Indeed; I think Windows 2.0 even had it. The story might sound plausible until you poke holes in the hypothesis.


So were the color schemes on many 8-bit computers of the '80s, such as Ataris and Commodores.


my shell history has this in it, but it might have been for android firefox ~ `-c:v vp8 -b:v 2000k -pix_fmt yuv420p`


A bit off topic, but the most interesting part of this was clicking through to the link about TCP BBR. It's available in the mainline Linux kernel but not enabled by default; if these related posts are telling the truth, maybe it should be:

https://djangocas.dev/blog/huge-improve-network-performance-...

https://atoonk.medium.com/tcp-bbr-exploring-tcp-congestion-c...
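
For anyone who wants to try it: on a reasonably recent kernel BBR ships as the tcp_bbr module, and switching over is just a couple of sysctls (fq is the usual qdisc recommendation alongside it for pacing) - roughly:

    # illustrative, not a tuning guide
    modprobe tcp_bbr
    sysctl -w net.core.default_qdisc=fq
    sysctl -w net.ipv4.tcp_congestion_control=bbr

    # check what's available / in use
    sysctl net.ipv4.tcp_available_congestion_control
    sysctl net.ipv4.tcp_congestion_control

To make it stick across reboots you'd put the two net.* settings in /etc/sysctl.conf or a file under /etc/sysctl.d/.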


Haha, I had a feeling from the title it would be about the 8051. I've only recently learned about it. I ended up with a bunch of TTL / LVDS / eDP display panel driver boards on hand. They're based around a family of Realtek chips which are hugely powerful for their, in some cases, single-digit dollar cost - the silicon has a whole bunch of peripherals like analogue video decoders, HDMI/VGA/DisplayPort decoders, colour processing, an OSD generator and signal muxer, a DDC/CI interface, an IrDA demodulator, PWM / DAC for audio, and many, many more things; all with parameters configurable by an embedded 8051-compatible processor (on the same principle as memory-mapped peripherals). As someone with next to no serious experience in embedded software, trying to write software to target these devices has been quite the departure and an eye-opener in many ways.

The development tools just feel so antiquated. The Keil compiler mentioned in the article has a per-seat license cost in the thousands and runs on Windows, and it feels like it has not received any serious upgrades since the mid-2000s. It runs fine on Wine (with a free trial license, of course), but has basically unusable UX on a hidpi screen. Of course I can pretty much get away with coding in vscode and writing a Makefile which calls `wine ~/.wine/drive_c/Keil/BIN/C52.exe` but it's not ideal, plus, my trial license will of course expire, and this is a hobby project.

I tried switching to using SDCC. To preface: my honest opinion is that the small handful of people who maintain it are doing a wonderful job and have been for years - it's a thankless task for a small audience. But for serious features that a modern-day user might expect, like code banking, the implementation is inflexible, supports half of the implementation methods that Keil does, and generates larger code. And of course, there are currently very few people capable of making contributions to improve the situation. The documentation is extensive but split across PDF files for the compiler, TXT files for the linker, unstyled HTML files for the simulator, and various README and errata "NOTES" files for other components.

Meanwhile the only copies of the original Intel documentation for the 8051 I could find were scanned images of a printed book. A lot of random entry-level tutorials for beginners are dotted around the net, on websites like http://8052mcu.com/ or in YouTube video lectures uploaded by universities based in non-English-speaking countries; but high-quality written reference guides seem to be difficult to find. Of course, maybe it's not as bad as that and it just was not easy for me to grok; I realised in hindsight I have the assumptions of a von Neumann architecture more or less internalised, so it took a while to get my head around the concept of having three separate address spaces (one for code, another for internal RAM, another for external RAM).
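
To make that concrete, here's roughly what those separate spaces look like in SDCC-flavoured C - the register name and address below are made up purely for illustration, and Keil expresses the same thing with its own `code` / `data` / `xdata` / `sfr` keywords:

    /* Illustrative only: hypothetical names/addresses, SDCC mcs51 syntax. */
    #include <stdint.h>

    __sfr __at (0x80) P0;                 /* SFR space: standard 8051 port 0 */

    __code const uint8_t gamma_lut[4] =   /* CODE space: read-only table     */
        { 0, 16, 64, 255 };               /* placed in program memory        */

    __data uint8_t tick_count;            /* internal RAM ("data" space)     */

    volatile __xdata __at (0x4000)        /* external RAM ("xdata"): on SoCs */
        uint8_t OSD_CTRL_REG;             /* like these, peripheral registers*/
                                          /* are often memory-mapped here    */
    void main(void)
    {
        tick_count = 0;
        OSD_CTRL_REG = gamma_lut[2];      /* compiles to a MOVX write   */
        P0 = tick_count;                  /* direct write to an SFR     */
        while (1)
            ;
    }

(SDCC's double-underscore keywords versus Keil's bare ones is one of the little porting papercuts, too.)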

I would not be surprised if this were the case for an equally old but now-niche chip, like the Z80 and its derivatives. But given the neighbouring comments estimating just how widespread this MCU is (billions of units per year?!) it does seem kinda surprising that modern open source embedded development tools of the kind available for platforms like the RP2040, STM32, ESP8266, etc, just haven't reached the 8051 platform. (edit to add: I don't think it's necessarily bad if development tools are simply "old", fwiw, and I do think software can be finished. But in this case there is something of a gulf between the open source solution and the paid solution, and progress to close this has only slowed, not accelerated.)

My only guess as to why (as a layperson) is that the Harvard architecture plus 8-bit stack address space makes it difficult to target with modern compiler tools or something. Of course the modern derivatives being heavily burdened by IP rights also can't help; I suppose the only people who have access to datasheets detailed enough to implement simulators / advanced compiler features, have a day job which affords them access to the "good enough for enterprise" Keil solution : )


You say it might be the same for the Z80? I think, happily, the situation there is a lot more open and well-documented.

There are specific sites such as http://z80.info which are good resources describing the chip. The Z80 was also the main processor of a lot of computers - in the UK the Sinclair Spectrum sold in the millions - and there are an awful lot of coders who started their careers, or childhoods, playing with them. People of my age would be reasonably familiar with assembly and the things it can do.

The fact that the Z80 was also used in consoles also provides another avenue for information. There are a lot of emulators out there which have implementations if not necessarily good documentation.

(On a similar note there's a lot of legacy code out there, as the Z80 was one of the targets for CP/M, which is itself source-available these days.)

Compiling CP/M, Pascal, C, FORTH, and running BASIC on a Z80 are all trivial - with the right supporting hardware/circuitry for I/O.


The web stack might give you more transferable skills, but how did you calculate the trade-off vs the 50% pay cut? Was the salary not actually very high and at a low ceiling compared to web work - i.e. you expect your salary to soon exceed what you made working in HFT? Or was there a serious risk of layoffs happening far in advance of your retirement? If the goal is to support yourself after retirement, are you saying that in certain circumstances halving your salary 10 years into a tech career ultimately optimises your entire-career earnings?


It wasn't such a purely material calculation. Less stress, more time to spend with family and friends, learning more things and doing something that you know actually adds value to the world -- those things matter.

