Hacker News

Some programs had embedded graphics subroutines that executed when run under a debugger. They would set the CRT frequencies too high and blow up the display.



Sounds like a myth. No CRT manufacturer would allow such a case, and even if it were true, I would expect lots of lawsuits.


I'm not sure, but I never saw it as a trap payload.

Rather, it's something that could go wrong when overclocking, since the crystals were often locked to the video sync rate (my Falcon's "Nemesis" was a real bugger for it); maybe it'd try to change into a mode whose expected clock wasn't actually there any more!

I don't know about "blowing up", but yes, if you put a bad signal into them, some of them might break. I had (probably still have, actually, somewhere, albeit modified for SCART with an LM1881 and my crappy soldering!) an Atari SC1224 RGB monitor into which you fed the horizontal and vertical sync frequencies separately (rather than composite - hence the LM1881 being needed to split the sync). And at one point I had a Falcon, with Nemesis and Videlity. The monitor did NOT like it if you fed it a horizontal sync outside the 15.6-15.8 kHz range (like, say, VGA's 31468.5 Hz; oops!).

The result was the big transformer in the back (line output transformer?) heating up and whining and the caps building up voltage, the screen's black level warming to an alarming dull green… I don't think I'd have wanted to keep the power on another few seconds! Although it'd probably only have burnt something out, I didn't want to break anything.

Point is, some CRTs are a bit more… direct than others. I hear the vector monitors (as used in Tempest) are particularly hairy beasts.
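For context, the mismatch above is easy to quantify: the horizontal scan rate is just the pixel clock divided by the total pixels per line. A minimal sketch (the 15.6-15.8 kHz window is the SC1224 range quoted above; the VGA figures are standard 640x480@60 timings, not from the original post):

```python
# Horizontal sync frequency = pixel clock / total pixels per line.
# 15.6-15.8 kHz is the SC1224 window mentioned above; the 25.175 MHz
# clock and 800-pixel line total are standard VGA 640x480@60 timings.
def hsync_khz(pixel_clock_mhz, htotal):
    """Horizontal scan rate in kHz for a given video mode."""
    return pixel_clock_mhz * 1000.0 / htotal

def safe_for_sc1224(hsync):
    """True if the scan rate sits inside the monitor's safe window."""
    return 15.6 <= hsync <= 15.8

vga = hsync_khz(25.175, 800)          # the "oops" case from the post
print(round(vga, 2), safe_for_sc1224(vga))   # 31.47 False
```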


While this case might be a myth, there are plenty of documented "killer pokes": http://en.wikipedia.org/wiki/Killer_poke


There were failure modes they didn't always think about.

I once killed off an '80s-era TV I was using as a display for a microcontroller. At the time, I was writing low-level NTSC driver code, and was abusing the standard a little for some colour advantages.

A combination of bright screens, overly dark ones (sync in the visible raster), and misplaced vertical blank pulses popped that TV. It never displayed the same again.

Still worked, but was just crappy.

As monitors age, they can become vulnerable to this kind of thing. Not so much a single value, or poke, but ugly code intended to stress a display could take a few of the edge case ones out.
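The timings being abused in that driver all derive from the NTSC colour subcarrier. A quick sketch of the standard relations (these constants come from the NTSC spec itself, not the original post):

```python
# NTSC timing falls out of the colour subcarrier: fsc = 315/88 MHz,
# with 227.5 subcarrier cycles per scanline and 262.5 lines per field.
# Misplacing sync relative to this grid is the "abuse" described above.
FSC_HZ = 315e6 / 88          # ~3.579545 MHz colour subcarrier
FH_HZ = FSC_HZ / 227.5       # ~15734 Hz horizontal line rate
FV_HZ = FH_HZ / 262.5        # ~59.94 Hz field rate
LINE_US = 1e6 / FH_HZ        # ~63.56 us per scanline

print(round(FH_HZ), round(FV_HZ, 2), round(LINE_US, 2))
```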


It was a fairly common thing, I think? Modern CRTs can detect out-of-range inputs, but at one point you just had to be careful not to supply out-of-range frequencies. No more onerous than putting the right fuel in your car...


I blew up a 14-inch VGA display in Linux by setting the wrong horizontal and vertical frequencies.
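This was a classic hazard of hand-written XFree86 modelines: the server derives hsync as dotclock / htotal and the refresh rate as hsync / vtotal, and older CRTs would attempt whatever they were fed. A hedged sketch of that arithmetic (the modeline shown is a standard 640x480@60 entry, not the poster's actual config):

```python
# An XFree86-style modeline:
#   Modeline "name" clock hdisp hsyncstart hsyncend htotal
#                          vdisp vsyncstart vsyncend vtotal
# hsync = dotclock / htotal; vrefresh = hsync / vtotal.
def parse_modeline(s):
    """Return (hsync_khz, vrefresh_hz) derived from a modeline string."""
    f = s.split()
    clock_mhz = float(f[2])
    htotal = int(f[6])
    vtotal = int(f[10])
    hsync_khz = clock_mhz * 1000.0 / htotal
    vrefresh_hz = hsync_khz * 1000.0 / vtotal
    return hsync_khz, vrefresh_hz

m = 'Modeline "640x480" 25.175 640 656 752 800 480 490 492 525'
h, v = parse_modeline(m)
print(round(h, 2), round(v, 2))   # 31.47 59.94
```

Typo a larger clock or a smaller htotal into a config like this and a fixed-frequency monitor gets a scan rate it was never built for.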



