
"terrifying and clever" describes a lot of the Woz's work...

So I am a little fuzzy on the NTSC standard, because it has been a while since I looked at it... but the hack goes something like this:

Monochrome TV puts video on an upper vestigial sideband, with the picture carrier 1.25 MHz up from the lower edge of the 6 MHz channel. (The high-frequency parts of the lower sideband of the video signal are whacked off with a filter.) The FM sound subcarrier sits 4.5 MHz above the picture carrier, 0.25 MHz below the top edge of the channel.
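The channel arithmetic above can be sketched in a few lines. These are the standard NTSC-M numbers; treat them with the same "from memory" caveat as the rest of this post:

```python
# NTSC-M channel layout, as offsets from the lower channel edge (MHz).
CHANNEL_WIDTH    = 6.0
PICTURE_CARRIER  = 1.25                       # vestigial-sideband AM video
SOUND_CARRIER    = PICTURE_CARRIER + 4.5      # FM audio subcarrier
COLOR_SUBCARRIER = PICTURE_CARRIER + 3.579545 # just below the sound carrier

# the sound carrier ends up 0.25 MHz below the top edge of the channel
print(CHANNEL_WIDTH - SOUND_CARRIER)
```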

Then color came along... so they added a color subcarrier 3.58 MHz above the picture carrier, just below the sound subcarrier. This reduced the bandwidth available for luminance, but got you color in return. The color signal is quadrature modulation: two double-sideband signals on the same suppressed carrier, 90 degrees out of phase with each other — one skinny (Q, roughly 0.5 MHz) and one fat (I, roughly 1.3 MHz), with the fat one itself run vestigial-sideband so the pair fits under the sound subcarrier. Or something like that.
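The quadrature trick — two independent signals sharing one carrier because 90-degree phases don't interfere — is easy to demo. A toy sketch, assuming ideal carriers and a crude averaging low-pass (all names here are made up for illustration):

```python
import math

F_SC = 3.579545e6   # NTSC color subcarrier frequency
F_S  = 8 * F_SC     # sample rate: 8 samples per subcarrier cycle

def qam_modulate(i_val, q_val, n):
    """Two DSB signals on the same carrier, 90 degrees apart."""
    return [i_val * math.cos(2 * math.pi * F_SC * k / F_S)
            + q_val * math.sin(2 * math.pi * F_SC * k / F_S)
            for k in range(n)]

def qam_demodulate(sig):
    """Multiply by each carrier phase and average (a crude low-pass).
    The cross terms integrate to zero over whole cycles, so each
    component comes back out independently."""
    i_acc = q_acc = 0.0
    for k, s in enumerate(sig):
        i_acc += s * math.cos(2 * math.pi * F_SC * k / F_S)
        q_acc += s * math.sin(2 * math.pi * F_SC * k / F_S)
    n = len(sig)
    return 2 * i_acc / n, 2 * q_acc / n   # factor 2 restores amplitude

sig = qam_modulate(0.6, -0.3, 800)        # 100 full subcarrier cycles
i, q = qam_demodulate(sig)
```

A real receiver uses filters rather than a block average, but the orthogonality is the same.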

Anyway... if you send a monochrome signal with high-frequency components (high frequency == high dot rate) that fall in the range of the color subcarrier, you will get color artifacts. This is because the color decoder simply responds to whatever energy it finds near 3.58 MHz, whether it came from a proper subcarrier or not.
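You can see why this gives controllable colors with a toy demodulator. The sketch below assumes the Apple II's hi-res arrangement of a dot clock at twice the subcarrier frequency (so a 10101010 dot pattern is a 3.58 MHz square wave), and the helper names are invented for illustration:

```python
import math

F_SC = 3.579545e6
DOT  = 2 * F_SC   # dot clock at 2x subcarrier (Apple II hi-res assumption)

def artifact_phase(bits, samples_per_dot=8):
    """Correlate a repeating 1-bit dot pattern against the subcarrier
    and return the apparent chroma phase in degrees. Phase relative to
    the burst is what the receiver turns into hue."""
    n = len(bits) * samples_per_dot
    f_s = DOT * samples_per_dot
    i_acc = q_acc = 0.0
    for k in range(n):
        level = bits[(k // samples_per_dot) % len(bits)]
        i_acc += level * math.cos(2 * math.pi * F_SC * k / f_s)
        q_acc += level * math.sin(2 * math.pi * F_SC * k / f_s)
    return math.degrees(math.atan2(q_acc, i_acc)) % 360

# the same alternating pattern, shifted by one dot, lands 180 degrees
# away in chroma phase -- i.e. a complementary artifact color:
a = artifact_phase([1, 0] * 4)
b = artifact_phase([0, 1] * 4)
```

That one-dot shift flipping the hue is exactly the even-column/odd-column color behavior Apple II programmers exploited.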

Another thing I used to know but am fuzzy about: I am pretty sure you need to put out the standard colorburst on the back porch of the horizontal sync. Otherwise the receiver's color oscillator has nothing to phase-lock to, and it will not decode the "color subcarrier" that you are faking with artifacts.
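For reference, the burst is just a short piece of the subcarrier itself, sent during the back porch so the receiver's oscillator can lock to it. A minimal sketch, assuming the usual roughly-9-cycle burst and one common phase convention (the exact phase and sample rate here are illustrative, not authoritative):

```python
import math

F_SC = 3.579545e6   # NTSC color subcarrier
BURST_CYCLES = 9    # spec calls for roughly 8-10 cycles on the back porch

def colorburst(f_s=4 * F_SC):
    """Sample just the colorburst portion of a scanline: the subcarrier
    at the reference phase the decoder locks to."""
    n = round(BURST_CYCLES * f_s / F_SC)   # 36 samples at 4x oversampling
    return [-math.sin(2 * math.pi * F_SC * k / f_s) for k in range(n)]

samples = colorburst()
```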



You are correct. Something like a colorburst must be there. Turns out you can abuse that very considerably and still get stable, useful displays.



