HDMI ISA graphics card for vintage PCs by improving the Graphics Gremlin (yeokhengmeng.com)
271 points by yeokm1 on Sept 11, 2023 | 53 comments



I've worked on something similar while in college. HDMI is actually a fairly simple protocol, based on VGA. It still has scanlines and such, but the signals are digital instead of analog. So instead of a RAMDAC, you have an HDMI encoder chip. You typically need to control some I2C interface to put the HDMI encoder chip into the correct mode, but that's about it.
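
As a rough illustration of that last step (not the original project code; the bus number, device address and registers below are made up), poking such an encoder chip over I2C from an embedded Linux host could look something like this:

  # Hypothetical sketch: put a DVI/HDMI transmitter into a basic RGB
  # pass-through mode over I2C. Bus number, device address and register map
  # are placeholders; a real chip (TFP410, IT6613, ADV7513, ...) has its own
  # datasheet-defined registers.
  from smbus2 import SMBus

  ENCODER_ADDR = 0x39  # placeholder 7-bit I2C address

  with SMBus(1) as bus:
      bus.write_byte_data(ENCODER_ADDR, 0x08, 0x35)   # e.g. power up, 24-bit RGB input
      bus.write_byte_data(ENCODER_ADDR, 0x09, 0x00)   # e.g. no de-skew, latch on rising edge
      status = bus.read_byte_data(ENCODER_ADDR, 0x0A)  # e.g. read back a status/ID register
      print(f"encoder status: {status:#04x}")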


It's worth mentioning that HDMI encoder chips are actually quite difficult to get hold of as a hobbyist; the HDMI association prohibits their sale to anybody who has not signed the relevant NDAs, paid membership fees and whatnot. The project shown here gets around this by using a freely available DVI encoder chip instead, which works fine (the HDMI specification mandates backwards compatibility with DVI) but lacks support for a number of HDMI-only features such as audio and higher resolutions.

The only HDMI encoder I have seen so far with easily accessible (i.e. leaked) documentation is the CAT6613/IT6613 from ITE, which also happens to be available for purchase in single quantities from a number of Chinese retailers. It seems to be used in the OSSC and several FPGA development boards, so it's about as close to being an unofficial standard for open source projects as it could be.


This is a good reason to bit-bang the protocol. It is in fact possible to get HDMI including audio out of an RP2040, as is demonstrated by infones[1], a NES emulator.

[1]: https://github.com/shuichitakano/pico-infones


Based on what you said, you didn't actually look at the HDMI protocol, only at the protocol exposed by your HDMI encoder chip. You could have the same kind of interface on a Thunderbolt 3 encoder chip.

FWIW, yes, HDMI is still pretty simple, but not as simple as you describe it. Even though there are 3 data pairs, it's not one pair for R, one for G and one for B (highest-bandwidth HDMI uses 4 pairs); it's just one data bus. The data pairs don't only carry color data, they also convey audio and info frames (which include various stuff like HDR or VRR metadata). Of course there is also the matter of DRM: the content will often be encrypted (but negotiation of that encryption happens separately, over I2C).


That's only the case in the (relatively) shiny new 2.1 spec with FRL. Prior versions are TMDS with 3 channels for red, green and blue, along with a clock. The audio and InfoFrames slot into the data islands in the blanking periods on those signals.


Thanks for the complementary info. YCbCr 4:2:0 (which you can't just mux onto the channels directly?) dates back to HDMI 1.4 though.


As the source device, you can simply choose not to send YCbCr420.


Absolutely - YCbCr 4:2:2 and 4:2:0 use different encodings across the lines to balance bandwidth. Complexity then continues to stack with deep colour pixel packing. There's a beautiful pocket of simplicity for RGB 4:4:4 8bpc, which is essentially a direct digital encoding of ye olde RGBHV signals. More than OK for any retro-comp needs, and a fun starting point for general hacking.


> Based on what you said, you didn't actually look at HDMI protocol, only the protocol exposed to your HDMI encoder chip.

You are correct. Though as part of my studies (and curiosity, of course) I did end up analyzing the signalling protocol. A side effect of standardizing line protocols is that they offer an abstraction for the engineers working with them. I didn't have to understand the signalling methods per se to use HDMI in my project.

> FWIW, yes HDMI is still pretty simple, but not as simple as you describe it.

I should have added that at the time I used it, HDMI was still in the 1.0 spec (1080i, 60Hz max), which was effectively DVI. Much has changed since then.


While that's true, you really only need to blast out RGB data to get an image on screen. Most of what you are talking about is layered on top and optional.

I did a tiny HDMI implementation in an FPGA for a project; the TMDS implementation was what took the longest.
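
For anyone curious what that encoding actually does, here is an untested Python sketch of the per-channel 8b-to-10b TMDS algorithm as described in the DVI 1.0 spec (a software reference model, not the FPGA code from either project):

  # Untested reference model of DVI/HDMI TMDS 8b->10b encoding for one data
  # channel. 'balance' is the running DC-balance counter kept per channel;
  # pass 0 at the start of each video period.

  CTRL_SYMBOLS = {0b00: 0b1101010100, 0b01: 0b0010101011,
                  0b10: 0b0101010100, 0b11: 0b1010101011}

  def tmds_encode_pixel(d, balance):
      """Encode one 8-bit video byte, return (10-bit symbol, new balance)."""
      bits = [(d >> i) & 1 for i in range(8)]
      # Stage 1: minimise transitions with an XOR or XNOR chain.
      use_xnor = sum(bits) > 4 or (sum(bits) == 4 and bits[0] == 0)
      q = [bits[0]]
      for i in range(1, 8):
          q.append(1 - (q[-1] ^ bits[i]) if use_xnor else (q[-1] ^ bits[i]))
      q.append(0 if use_xnor else 1)        # bit 8 records which chain was used
      ones = sum(q[:8]); zeros = 8 - ones
      pack = lambda bs: sum(b << i for i, b in enumerate(bs))
      inverted = [1 - b for b in q[:8]]
      # Stage 2: keep the running disparity near zero by optionally inverting.
      if balance == 0 or ones == zeros:
          out = ((1 - q[8]) << 9) | (q[8] << 8) | pack(q[:8] if q[8] else inverted)
          balance += (ones - zeros) if q[8] else (zeros - ones)
      elif (balance > 0 and ones > zeros) or (balance < 0 and zeros > ones):
          out = (1 << 9) | (q[8] << 8) | pack(inverted)
          balance += 2 * q[8] + (zeros - ones)
      else:
          out = (q[8] << 8) | pack(q[:8])
          balance += -2 * (1 - q[8]) + (ones - zeros)
      return out, balance

  def tmds_encode_control(c1, c0):
      """During blanking each channel sends one of four fixed control symbols."""
      return CTRL_SYMBOLS[(c1 << 1) | c0]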


I remember i2c support was added to the Linux kernel. I never understood what it is.

Next in line is BPF, which is also unclear.


I2C is basically "USB for things where USB would be overkill". It's a standardized protocol [1] that allows for one host to communicate with up to 128 different devices on a shared serial bus consisting of two wires. The specification does not define any discovery mechanism or higher-level functionality on top of basic packet sending and receiving, so I2C is meant to be used in cases where the host already knows what is connected and has the appropriate drivers set up (hence why embedded Linux devices typically use a "device tree" or similar configuration file to tell the kernel about their I2C hardware). Many of the chips used in modern electronics are either controlled entirely via I2C or use it as a configuration port in addition to a separate high-bandwidth interface for data; the accelerometer, power management chip, audio DAC, display and cameras in your phone all have I2C interfaces.

Another use case for I2C is device identification: since I2C-interfaced ROMs are so cheap, it's common to embed them in various types of peripherals and have the host read them out to retrieve information about the peripheral. This is how your PC gets to know which resolutions are supported by your monitor [2] (VGA, DVI and HDMI all have dedicated I2C pins for this purpose) and which type of RAM you have installed [3], for instance.

[1] https://en.wikipedia.org/wiki/I%C2%B2C

[2] https://en.wikipedia.org/wiki/Extended_Display_Identificatio...

[3] https://en.wikipedia.org/wiki/Serial_presence_detect
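
As a rough sketch of the EDID case in [2], reading the monitor's first 128-byte EDID block over the DDC pins on Linux is just a plain I2C read from address 0x50 (the bus number below is an assumption; which /dev/i2c-N the DDC channel appears as depends on the GPU driver):

  # Sketch: read and sanity-check the 128-byte EDID block that monitors
  # expose at I2C address 0x50 over the DDC pins.
  from smbus2 import SMBus

  EDID_ADDR = 0x50
  EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

  with SMBus(3) as bus:                     # bus 3 is an assumption
      edid = bytearray()
      for offset in range(0, 128, 32):      # SMBus block reads max out at 32 bytes
          edid += bytes(bus.read_i2c_block_data(EDID_ADDR, offset, 32))

  assert edid[:8] == EDID_HEADER, "not a valid EDID block"
  # Manufacturer ID: three 5-bit letters packed big-endian into bytes 8-9.
  raw = (edid[8] << 8) | edid[9]
  mfg = "".join(chr(((raw >> s) & 0x1F) + ord('A') - 1) for s in (10, 5, 0))
  print("manufacturer:", mfg)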


If you're talking about (e)BPF, the (extended) Berkeley Packet Filter, the easiest way to think about it is like a tiny virtual machine running inside the kernel, which can execute "simple" commands that would otherwise be very slow or very complex from within userspace. The traditional example would be counting the number of packets being sent out by a network interface. But it turns out that eBPF is massively more general purpose than that, allowing people to develop all kinds of monitoring applications.
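
That packet-counting example looks roughly like this with the BCC Python bindings (a sketch, assuming the bcc package is installed and the script runs as root; it hooks the kernel's net:net_dev_xmit tracepoint):

  # Sketch of the classic eBPF example: count packets leaving the network
  # stack by attaching a tiny program to the net:net_dev_xmit tracepoint.
  from bcc import BPF
  import time

  prog = r"""
  BPF_HASH(counts, u32, u64);

  TRACEPOINT_PROBE(net, net_dev_xmit) {
      u32 key = 0;
      counts.increment(key);   // a single global counter is enough for the demo
      return 0;
  }
  """

  b = BPF(text=prog)           # tracepoint probes attach automatically
  time.sleep(5)                # let some traffic flow
  for k, v in b["counts"].items():
      print("packets transmitted in 5s:", v.value)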


I2C is a simple serial communications protocol typically used for things like configuring on-board devices and reading sensors.

BPF is anything but simple.


How does its complexity compare to DisplayPort?


HDMI is simpler if you want a simple implementation; DisplayPort is simpler if you want a fully featured implementation.

HDMI keeps normal analog-style video timing with vblank/hblank, etc. Stuffing anything other than video into it is pretty complex and a mish-mash of finding weird gaps in the video data.

DisplayPort is its own packet-based protocol. Sending things other than video is just a matter of other packet types over the same link.



If you only want to drive the monitor in DVI compatibility mode, you don't need very much because the interface is (as you describe) fairly simple and electrically compatible - if you actually want "real" HDMI then it's much more complex.


I imagine a PC with a Graphics Gremlin and either a Snark Barker or a Snood Bloober for audio is going to be quite the hipster toy to have, one of these days .. all these great old, retro PC hardware designs!


I refuse to believe Snark Barker and Snood Bloober are real device names, what a funny time in computing that was.


"Sound Blaster" was the original funny name - these are both derivations in a modern style, given to clones of that ol' sound card .. and yeah, they are hilarious names, and I hope the trend sticks, personally .. the world tires of X3000-style naming conventions, I'd wager, and for fun things like sound cards and video cards, a bit of naming whimsy would be welcome.


Kids nowadays will never understand how cool it was to have a Sound Blaster, an Athlon Thunderbird and Voodoo graphics.

I might be nostalgic, but to me it sounds much cooler than having an 11700K with a 4090 nowadays.


There was a time before the Pentium when hardware was named not by trade name but by the standard it implemented. For instance you had your 486 with VGA; whether it was a Cyrix CPU with a Trident graphics card or an Intel CPU with an S3 graphics card didn't matter much.

For consumers, only sound expansions broke this mold, because they came with cool names such as Sound Blaster, Gravis Ultrasound, etc. They carried that over from the audio market, where equipment had had "cool" names for some time already (since the 70s at least).

Interestingly, Intel created the "Pentium" trademark to distance and distinguish themselves from other x86 CPU vendors, but Creative failed to protect the Sound Blaster trademark, to the point that by the mid-90s everyone called every OPL3 card "a sound blaster". In that age real Sound Blasters had a quality synthesizer, a big wavetable with a quality soundfont, and a DSP for effects on top of it - a feature set that became less and less relevant as the 90s went by. In the mid 90s games were already moving to a MIDI+CD Audio combination, and by the late 90s MIDI-based soundtracks were largely gone. When you were playing Duke for the first time, the type of "sound blaster" you had and the hardware around it mattered. A clone would render worse MIDI than real Creative hardware, and then there was external hardware such as the Roland SC series, etc. The differences were real. When you were playing Half-Life or UT for the first time just a few years later, there was no difference.

So Creative never managed to actually attach "Sound Blaster" to anything in particular. It always remained a synonym for the entire standard, something even cheap hardware was described by. The "Sound Blaster Audigy" was mostly referred to as the Audigy or Creative Audigy, and it didn't carry that ubiquitous meaning because it was clearly separated from all the junk audio cards of the day.


I thought the Sound Blaster AWE32 was nicely named .. Advanced Wave Engine, indeed .. helped them differentiate from the same technology being touted by Yamaha at the time, if I remember correctly ..


They're modern clones of the SoundBlaster 1 card (a real late-80s sound card).


Sounds well Jackson.


I see that you are a self facilitating media node.


I got a Wasp T12 Speechtool - it's well weapon


I have a couple of vintage computers. One is my original first computer. While there is a cheap expansion that would allow me to hook up any VGA screen to it, part of the experience is the screen.

I've been hooked on retro computing for a while, and I've flattened out somewhat. It all depends on what you want to do. If you want to play games or run software, I suggest you take a good deep look at the emulators. I had a task to do with my XT, and that was pulling the data off it. I did some original programming for that purpose. I spent days loading software from the internet onto it in minutes, which was kind of miraculous. I programmed some graphics demos on it. I upgraded it with an XT-CF, etc. In the end there is no need to keep that machine up and running on a desk somewhere. The best purpose would be aesthetic, because it is really beautiful as a package, but even if it were in mint condition, I wouldn't risk running it for a couple of hours daily for a useless purpose. Although it would be nice to have an 80s-style terminal displaying the current weather, RSS feeds, etc., it's just a bad way to use up your historic machine's working hours.


My main trouble with the emulators/virtual machines is that late-90s and early-00s Windows/DirectX games are a huge dead spot.

There are a ton of titles from this era that just don't work on current Windows, even with dgVoodoo. I just want to be able to comfortably get rid of this Win98 SE / WinXP dual boot box I have lying around...


> The frequencies and connectors used by CGA and MDA are no longer supported by modern monitors hence it is difficult for older PCs of the 1980s era to have modern displays connected to them without external adapters.

Some CGA ran over composite, and there's plenty of modern small TVs with a composite input. It's perfectly fine to use a TV as a computer monitor. (I do!)

> This analog-to-digital conversion will also lead to an inevitable loss in video quality.

Oh, now we're splitting hairs! This is super-low resolution, super-low colorspace. CGA was (at most) 16 discrete colors. In many situations it was 4 colors with 2 palettes to choose from. The CGA port was also digital, so I don't understand where the "loss in video quality" argument comes from.

IMO: I don't "get" this. You're no longer running "vintage" hardware; yet a lot of vintage hardware has limited lifespan and may become unrepairable if/when there's degradation inside the chips themselves.

If someone is going to go through all this trouble, it makes a lot more sense to emulate the whole computer.


Count me in as someone who likes vintage hardware but modern displays. In my case it's old game consoles.

The drop in video quality with composite is real. It has less to do with the resolution and more with the fact that the hardware that upscales this to an HD or 4K panel needs to make an educated guess about where pixels start and end, and gets it wrong.

It looks quite ugly in practice, and switching to something with crisp pixels is usually well worth it.

For old game consoles it's often enough to switch to RGB or Component and you don't have to go full digital. Composite (and RF) are quite bad.

This is not an audiophile type of distinction, it's very visible and obvious to almost anyone.


> The drop in video quality with composite is real.

Remember, this is CGA. Some games specifically take advantage of composite: https://en.wikipedia.org/wiki/Composite_artifact_colors

Likewise, remember that the monitor connector is digital. If you build an HDMI (or DVI, DisplayPort, whatever) converter, you're starting with a digital signal, not an analog one: https://en.wikipedia.org/wiki/Color_Graphics_Adapter#Specifi...


> Count me in as someone who likes vintage hardware but modern displays

I should also add that, if you change the graphics card in an old PC, you're inserting a huge variable when it comes to playing a game on "vintage hardware."

Some games resorted to various tricks, whatever, that might not be emulated correctly on the new graphics card.

Again, that's why I find the idea of a "modern" CGA silly: When you use it, you no longer have vintage hardware, and the CGA connector is digital so there's no need to worry about a loss in quality.


FWIW, the lifetime of a CRT display is generally less than the lifetime of the rest of the PC. I don't think it's much trouble to install a card on a PC that didn't happen to have composite out, or to want something better than composite on a modern LCD display. You don't have to get it. It's not for you. That's OK.


So, this is probably a super NIMBY moment, but the growing excitement around "retrocomputing" (really just old machines) means a lot of hardware is becoming super expensive. Like, good luck finding an IDE hard drive under 500 MB that works; people are pulling them out of working old machines, putting them on eBay for hundreds to even a thousand USD, then shipping them in afterthought packaging that has them falling apart on the way there. This hobby is fast becoming a rich man's game for speculators and hucksters.

It's a tragedy really. I feel like efforts like the OP's are great because they pull pressure off the literally limited stock, thereby making the speculators go elsewhere (Sega Saturn games and the like, mostly).


Are there any recommended made-to-order PCB services if you just want a card? Googling around turns up a lot of options all over the world, but maybe some HNers have experience with some of them.


JLCPCB has worked for me. There are others, but I haven't had the need to look elsewhere. OSHPark is another I recall.

https://jlcpcb.com/ https://oshpark.com/


JLCPCB is great as long as your components don't get too small. They made a few boards for me with a tiny component that just didn't work. I think they're overly optimistic about their minimum trace size.


And PCBWay.


The retrocomputing community has struggled with making display controllers because it's not so practical to make one with 74xx/54xx parts (which have just a few gates per chip). I once saw an ad, circa 1978 in Byte magazine, for a circuit board about as big as an IBM PC expansion card where both sides of the board were packed with chips. Something like that costs about the same today as it did in 1978.

Home computers/game consoles of the time mainly had ASIC display controllers but projects like

https://www.commanderx16.com/

don't really have the volume to justify making an ASIC, so they wind up using FPGAs (like this card) or microcontrollers as display controllers. Note that the super low-end

https://en.wikipedia.org/wiki/ZX80

did not have a video ASIC but instead tricked the microprocessor into functioning as a video controller, which meant that it could only show video when it was done thinking; see

https://www.tinaja.com/ebooks/cvcb1.pdf

though that technique can still be used today to turn a (secondary) microprocessor into a display controller.


A related development for those interested in using old hardware with new displays: there is an ISA HDMI add-on card in development that will add an HDMI port to your existing old display card: https://www.vogons.org/viewtopic.php?t=92512


I’d love to see something like this for late 90s through mid 00s computers, because many of those machines can still find modern uses but are awkward to use with modern high-resolution displays, even if they can sometimes be adapted digitally (DVI-equipped machines). Of course you can always grab an old monitor to pair with said machine, but that comes with heavy picture quality concessions in the case of LCDs, and good CRT monitors are becoming rare and expensive.

So for example it’d be super cool to be able to drop a new GPU into a PowerMac G4 tower and allow it to drive a modern 2560x1440 display under both OS 9 and OS X.


Just be aware that old machines can bog way down when you start asking them to push millions of pixels. Their old graphics subsystems were never designed to handle that much data, bus widths are a major bottleneck.


Yeah, that’s a good point. There are definitely some machines in that bracket that are capable, though, since with the right GPUs they could drive comparable contemporary displays (a G5 tower with a circa-2005 2560x1600 Apple display, for example), and it’d be great not to be restricted to only those old GPUs to gain that capability.

For older machines even cleanly driving a new 1920x1080 display without a dongle would be a nice upgrade.


"the brown colour is displayed incorrectly as dark yellow"

I understand what it means in an RGB context, but it is the first time I've seen someone mention dark yellow as a color.
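
For context, here is a small Python sketch of how the 16 RGBI colour codes on the CGA connector are usually mapped to RGB: a naive converter turns colour 6 into dark yellow, while period monitors special-cased it into brown by halving the green channel (the 0xAA/0x55 levels below follow the common IBM 5153-style convention):

  # Sketch of the 16 CGA/RGBI colours as RGB triples. A naive converter gives
  # "dark yellow" (0xAA, 0xAA, 0x00) for colour 6; real CGA monitors
  # special-case it by halving green to produce brown (0xAA, 0x55, 0x00).
  def cga_palette(fix_brown=True):
      palette = []
      for i in range(16):
          r = 0xAA if i & 0b0100 else 0x00
          g = 0xAA if i & 0b0010 else 0x00
          b = 0xAA if i & 0b0001 else 0x00
          if i & 0b1000:                # intensity bit brightens all channels
              r, g, b = r + 0x55, g + 0x55, b + 0x55
          if fix_brown and i == 6:      # the lone exception in the RGBI scheme
              g = 0x55
          palette.append((r, g, b))
      return palette

  print(cga_palette()[6])        # (170, 85, 0)   -> brown
  print(cga_palette(False)[6])   # (170, 170, 0)  -> dark yellow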


There's a lot to say about the color brown :-) https://www.youtube.com/watch?v=wh4aWZRtTwU


Awesome work, Yeo!

Sorry if I'm asking a duplicate question, but have you considered submitting this to Hackaday?


They have already written an article about it :)

https://hackaday.com/2023/09/10/upgraded-graphics-gremlin-ad...


What kind of command.com do they use? I don't remember any with autocomplete.


I used Enhanced Doskey. http://paulhoule.com/doskey/


4DOS had it.


thank you!



