
Watching SD videos and dragging them around on the 1024x768 screen.

People forget that a modern computer's screen is so large that its frame buffer alone would barely fit in the memory of these old computers.

It’s a lot of work to manipulate all those pixels, and it requires much larger assets as well.



Right, but a 4K monitor only has about 10.5 times as many pixels as a 1024x768 one. In 2000 a high-end GPU might have had 32MB of VRAM; these days even Intel's lowest-end integrated graphics has access to over 1GB, more than 30 times as much, low-end gaming GPUs have 4GB, and higher-end cards have 8 or 12GB. Plus the GPUs themselves have gotten much faster, and OSs have moved more GUI rendering from the CPU to the GPU.
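For a rough sense of scale, here's a quick Python sketch of the arithmetic (the 32-bit color depth is my own assumption, and the VRAM figures above are approximate):

    # Rough check of the figures above, assuming 32-bit color
    # (4 bytes per pixel).
    resolutions = {
        "1024x768": (1024, 768),
        "4K (3840x2160)": (3840, 2160),
    }
    base = 1024 * 768
    for name, (w, h) in resolutions.items():
        pixels = w * h
        frame_mb = pixels * 4 / 2**20  # one frame, in MB
        print(f"{name}: {pixels:,} px, "
              f"{pixels / base:.1f}x 1024x768, "
              f"{frame_mb:.1f} MB per frame at 32 bpp")

This prints about 10.5x for the pixel ratio and roughly 32MB for a single 4K frame at 32 bpp, i.e. one frame would fill the entire VRAM of that hypothetical high-end GPU from 2000.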


This is very far off topic, but a user of an Alpha workstation might have had much higher resolutions than 1024x768 available with, for example, Intergraph-produced video hardware. The workstation in question (we've actually wandered well past the "20 years ago" mark with this particular machine) might have been old enough that hardware MPEG-2 decoding wasn't yet a thing, but just playing a bunch of AVIs on a 1920x1200 or 2048x1152 desktop would have been possible, for a price.


The Alpha may have been using only 8 bits per pixel; it all makes a difference.
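To illustrate how much bit depth matters, a small sketch of the frame buffer footprint of a 1024x768 desktop at a few common depths (the depths listed are illustrative):

    # Frame buffer footprint of a 1024x768 desktop at different
    # color depths (bits per pixel).
    w, h = 1024, 768
    for bpp in (8, 16, 24, 32):
        print(f"{bpp:>2} bpp: {w * h * bpp / 8 / 2**20:.2f} MB")

At 8 bpp the whole desktop fits in 0.75MB of VRAM, a quarter of what the same resolution needs at 32 bpp, which is one reason 8-bit modes stayed common on workstations for so long.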



