I run a media lab at an art university and both HDMI and USB-C are flaming garbage. What you want is a digital video standard that simply pushes an A/V stream over the wire and negotiates the acceptable resolution on the fly. What you get is something that does too much, doesn't work half the time, and does things nobody cares about. Last time I plugged in an HDMI source and the darn "smart" television showed the image for 0.5 seconds before displaying a menu that asks me to press a button on the remote to show the image. And don't get me started on DRM/HDCP.
The number of broken HDMI cables (as a fraction of cables rented out) is far higher than for any other connector, which suggests it is a completely unsuitable, broken design.
Whenever I can go with SDI video, I do. You plug it in and it works. Why "consumer" technology has to be so much more painful than pro gear is beyond me.
> Last time I plugged in an HDMI source and the darn "smart" television showed the image for 0.5 seconds before displaying a menu that asks me to press a button on the remote to show the image.
That's entirely the fault of your crappy smart display with its crappy OS and has nothing at all to do with HDMI as a standard.
I would think that as a plug-and-play standard for A/V gear, HDMI is one of the farthest along the "just works" spectrum for the vast majority of people. Occasionally I see a device that does something stupid, like switching to a different HDMI source without switching the audio source, so you have to dig through some dumb, deeply nested OSD menu to get to the audio settings, but again, that's not HDMI's fault.
I have had quite a few broken HDMI cables in lecture halls at uni and in meeting rooms at various workplaces, but I think that's the reality of any connector that gets plugged and unplugged tens of times per day (especially by people who don't care and don't have to pay for them when they break). They just need to replace the cables more often.
> That's entirely the fault of your crappy smart display with its crappy OS and has nothing at all to do with HDMI as a standard.
Sure, but I don't buy it. If you create a standard that is so complicated or so feature-creeped that it can't be implemented fully, and that lack of full implementation means the fundamental role of the standard breaks down, then that standard might be part of the problem.
I too could envision a solution that works perfectly in theory, where everyone is simply doing it wrong whenever it doesn't. But standards have to be made with reality in mind. USB-C is another one of those. Cool – now I have a ton of USB-C cables, none of which tell me on the cable what capabilities they have. One doesn't support USB Power Delivery, another doesn't carry video above a certain resolution, etc.
I get that more data means higher frequencies and that this directly translates into more problems, but nobody (at least no consumer) asked for the complexity of the HDMI spec. We want to connect a cable and see the picture in 99.99% of cases. If that doesn't work 100% of the time, the standard is at fault. The base functionality needs to be so dumb and so clear that it just works, even if the other side doesn't even know what an EDID is. That was the task, and the result is catastrophic failure.
I think an awful lot of this could be solved by requiring ports to export the information they get to the device, and requiring that devices which can reasonably display that information actually do so. PCs, phones, and tablets would all tell you about the cable and the connection. Things without screens and interfaces would not be required to add them, though.
It's not that the cables support varying specs (which I actually have no problem with--you shouldn't have to pay for features you don't need, and some features trade off against cable length), but that we have no easy way to find out what a given cable supports, or to test it.
> What you want is a digital video standard that simply pushes an A/V stream over the wire and negotiates the acceptable resolution on the fly.
I think even that is more than it should be. I have made up my own kind of video connection (not implemented yet). It carries only video; the audio is a separate cable with balanced analog audio, although the standard arrangement of the connectors is supposed to be defined so that you can clip them together when wanted and unclip them if you want to route them separately. It also does not negotiate the resolution (for several reasons).
> Last time I plugged in an HDMI source and the darn "smart" television showed the image for 0.5 seconds before displaying a menu that asks me to press a button on the remote to show the image.
At least a part of that is a problem with the smart TV (which has many other problems, too) rather than with HDMI itself, although HDMI also has many problems.
SDI is better in some ways, but it is not good enough either, I think.
> What you want is a digital video standard that simply pushes an A/V stream over the wire
HDMI is just that: it's the direct evolution of VGA signaling, with each color channel pushing pixels left-to-right, top-to-bottom. It even has blanking periods (stretches where no pixel data is transmitted, originally used to steer the electron beam on CRTs back to the start of the next row/frame), and the same EDID-based mode negotiation over I2C. The works.
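For what it's worth, that EDID handshake really is about as dumb as it sounds: the sink hands over a 128-byte block describing what it can do, and the source picks a mode. A minimal sketch of decoding one, assuming a Linux box where the DRM subsystem exposes the raw block (the sysfs path below is just an example and varies per machine and port):

    # Minimal EDID 1.3/1.4 base-block check plus preferred-mode decode.
    # The path is an assumption; adjust "card0-HDMI-A-1" for your setup.
    EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"

    def parse_edid(block: bytes):
        assert len(block) >= 128, "need the 128-byte base block"
        assert block[:8] == bytes.fromhex("00ffffffffffff00"), "bad EDID header"
        assert sum(block[:128]) % 256 == 0, "bad checksum"
        # The first detailed timing descriptor (offset 54) is the preferred mode.
        dtd = block[54:72]
        pixel_clock_khz = int.from_bytes(dtd[0:2], "little") * 10
        h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
        v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
        return pixel_clock_khz, h_active, v_active

    with open(EDID_PATH, "rb") as f:
        clk, w, h = parse_edid(f.read())
    print(f"preferred mode: {w}x{h} @ {clk} kHz pixel clock")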
What makes it crap is the absolute flood of cheap garbage HDMI cables/repeaters/KVMs which barely work even at the best of times and shouldn't even be allowed to be sold, as they are out of spec, but online vendors have flooded their stores with this cheap no-name garbage for some reason.
Unfortunately, neither the apparent build quality of a cable nor its price tells you anything when you're trying to find one that works.
Yeah, I get that in theory, but then my 10x more expensive pro stuff works worse than the cheap stuff. Sure, because it follows the spec, etc. But then it turns out that even name-brand laptops (or their GPUs) do it wrong. My point was that the standard is crap. It is way too complicated and wants to be too many things to too many people (most of whom are trying to sell stuff to consumers).
HDMI tries to be a video link, an audio hub, a remote-control bus, and a content-police checkpoint all at once. Strip out the DRM, kill the optional-but-mandatory feature soup, and let the cable do its one job: move bits from A to B. I have had Apple laptops not working with 3-digit pro A/V gear from reputable vendors because of HDCP. This is fucking bullshit. By this point I am starting to consider analog video superior to whatever this is supposed to be.
I kinda get where you're coming from. Back in college I had a lab where we had to make a NES-style game console from scratch on an FPGA, including peripherals and video output, which used VGA; that's how I got to familiarize myself with the technology. It was super simple: you just output the correct voltages on the RGB pins and sent the proper HSYNC and VSYNC signals, and that was it. It worked like a charm as long as you didn't mess up too badly; the monitor figured out the rest.
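To give an idea of how little there is to it: the standard 640x480@60 mode is just two counters running off a ~25.175 MHz pixel clock. Here's a rough software model of that sync generator, using the standard published timings (not our actual HDL):

    # Software model of a VGA sync generator for 640x480@60 (~25.175 MHz
    # pixel clock). Standard industry timings; a real design is two counters in HDL.
    H_VISIBLE, H_FRONT, H_SYNC, H_BACK = 640, 16, 96, 48   # pixels
    V_VISIBLE, V_FRONT, V_SYNC, V_BACK = 480, 10, 2, 33    # lines
    H_TOTAL = H_VISIBLE + H_FRONT + H_SYNC + H_BACK        # 800
    V_TOTAL = V_VISIBLE + V_FRONT + V_SYNC + V_BACK        # 525

    def vga_frame():
        """Yield (hsync, vsync, active, x, y) once per pixel-clock tick."""
        for y in range(V_TOTAL):
            for x in range(H_TOTAL):
                # Sync pulses are active-low in this mode.
                hsync = not (H_VISIBLE + H_FRONT <= x < H_VISIBLE + H_FRONT + H_SYNC)
                vsync = not (V_VISIBLE + V_FRONT <= y < V_VISIBLE + V_FRONT + V_SYNC)
                active = x < H_VISIBLE and y < V_VISIBLE
                yield hsync, vsync, active, x, y

    # 800 * 525 ticks per frame at 25.175 MHz comes out to roughly 59.94 Hz.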
Recently I dusted off this project and bought myself a new FPGA board with an HDMI output. Getting that thing to work was a nightmare. Not only did I have to configure the HDMI transceiver, I had to build proper EDID support, get I2C to cooperate, and get the timing of the signals exactly right. And even then, out of the 5 screens I could get my hands on, only 2 worked, one of which would inexplicably go to sleep despite video being displayed on the screen.
Getting HDMI working was more painful and took about 10x as long as the rest of the original project did, and even then, it was a partial success.
Yeah, we did do some games, though nothing crazy - we did Breakout and Asteroids, if I remember correctly - not exactly groundbreaking.
The interesting thing about it was that we had very little RAM but the FPGA was quite fast, meaning that instead of a framebuffer we scanned tiles and sprites out to each scanline in real time, at a relatively high resolution (1024x768).
This, combined with the low color depth (the 'DAC' was a resistor ladder per color channel, giving us 2 bits per channel, or 6 bits in total), meant we had that dithered, Transport Tycoon-y look, with tilemaps and animation, so it looked quite okay by the standards of 2D games.
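If anyone is wondering what "no framebuffer" looks like in practice: you only store the tilemap and the tile bitmaps, and the color of each pixel is looked up at the moment the scanline reaches it. A rough illustrative sketch (assumed 8x8 tiles and a 6-bit color format, not our exact design):

    # Illustrative tile-based scan-out: no framebuffer, the pixel color is
    # computed from the tilemap and tile data as the scanline is generated.
    TILE_W = TILE_H = 8                    # assumed tile size
    MAP_W = 1024 // TILE_W                 # tiles per row at 1024 pixels wide

    def pixel_at(x, y, tilemap, tiles):
        """Return a 6-bit RRGGBB color for screen position (x, y)."""
        tile_id = tilemap[(y // TILE_H) * MAP_W + (x // TILE_W)]
        # Each tile stores one 6-bit color per pixel, matching a 2-bit
        # resistor-ladder DAC per color channel.
        return tiles[tile_id][(y % TILE_H) * TILE_W + (x % TILE_W)]

    def dac_codes(color6):
        """Split the 6-bit color into the three 2-bit DAC codes (0..3)."""
        return (color6 >> 4) & 0b11, (color6 >> 2) & 0b11, color6 & 0b11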
I also thought that HDMI has many problems, like you do. My idea is: video, audio, and other stuff are separate cables. The video cable only sends the digital RGB video data, with wires for the pixel data and pixel clock, as well as power, ground, and vsync. I think it is helpful for the video signal to be digital, but it does not need to be as complicated as HDMI. (The audio cable can carry a balanced analog audio signal.)
It is not only HDMI; many other common things are more complicated than they should be and often have other problems too (you mention some of them, such as HDCP, but there are many more problems than that).
(Someone else wrote: "The sad fact of the matter is that people play politics with standards to gain commercial advantage, and the result is that end users suffer the consequences. This is the case with character encoding for computer systems, and it is even more the case with HDTV.")
I think it makes sense to wire the video and audio separately, e.g. you might connect the audio to a speaker or amplifier, or to an audio recorder, etc. However, my idea was to be able to clip them together and to have a standard arrangement of them so that they can be used like one cable if you do not want to deal with two cables.
Um, you just hit why HDMI sucks. You have a "default broken" state that is required by the standard.
Look, every single interface could have been an evolution of Ethernet (and most basically are--HDMI and USB-C are essentially enshittified Ethernet). But they weren't, because everybody wants to put their fingers in the pie and take out a chunk for profit by being a rent-seeking middleman.
The rated mating cycles on an HDMI connector is something like 10k. I imagine some of the cables in a lab like yours would plausibly hit that without too much trouble (then apply the usual spread: some will have broken much earlier, and some just won't quit).
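Back-of-the-envelope with that figure (the 10k is the rating quoted above; the plug rate is just an assumption):

    # How long until a heavily used cable exhausts a ~10k mating-cycle rating,
    # ignoring abuse and early failures. Both numbers are assumptions.
    RATED_CYCLES = 10_000
    plugs_per_day = 10          # assumed: busy lecture hall / rental use
    years = RATED_CYCLES / (plugs_per_day * 365)
    print(f"{years:.1f} years to hit the rating")   # about 2.7 years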
You don't want to know what my headphone extensions (3.5mm TRS male -> 3.5mm TRS female) or my XLR3 cables go through. That is way, way worse than anything the HDMI cables experience, based purely on the look of the cables that come back.
I get that HDMI runs at higher frequencies and that smaller faults show up earlier, but the plug is just inadequate. The plugs get levered off by the stiff cable; the thickness of the cables would call for at least something like a Neutrik D-norm connector, yet they pretend something much smaller is fine. At this point I am just glad that the receiving side seems to be sturdier 90% of the time, but I also wonder why the heck we don't just use BNC connectors and coaxial cable.
HDMI is a piece of shit designed to keep device owners hostage to the spec consortium and manufacturers, and USB-C is a badly branded collection of specs with infinite diversity that shouldn't even work but somehow sometimes does.
But there is a reason nobody puts analog signals in cables anymore. Beyond a certain bandwidth, the only way to keep cables reasonably priced and thin is to accept errors and correct them in the digital domain.