Laptop graphics have always been a minefield of barely adequate hardware. I really hope this, along with Nvidia's ION chipsets, puts an end to these dark days.
Because "adequate" can have vastly different meanings, depending on what you use your GPU for. I use it mostly for eye-candy at the GUI level. I play a bit with Google Earth and that's about all. My Intel GMA can do that just fine.
If, however, I were into heavy gaming, I would find this setup unusable (not least because I don't run Windows).
I am not sure how well an Intel GMA would fare as a number cruncher. I suspect it wouldn't excel, but compared to x86 vector hardware, even a lowly GPU should hold its own.
That's a problem inherent in the nature of HD codecs. Decoding with a general-purpose CPU is too inefficient. Any platform that has acceptable power consumption for the ultra-portable market is going to have to use dedicated decoding hardware. Once you take HD decoding off the list of things the CPU has to be fast enough for, you have the option to use a small, low-power CPU like the Atom. I don't see any room for improvement by using a different arrangement. Software writers will just have to get used to using the OS-provided decoders so that any accelerators present can be used.
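To make that concrete, here is a minimal sketch of what "use the OS-provided decoders" looks like in practice. It assumes FFmpeg's libavutil (a reasonably recent build; nothing in this thread mandates FFmpeg specifically) and simply probes which hardware decode back-ends the platform actually exposes, so a player can route H.264 through VAAPI, VDPAU, DXVA2, or whatever else is there:

/* hwprobe.c -- list the hardware decode back-ends this machine exposes.
   Illustrative sketch only; build with:
     gcc hwprobe.c -o hwprobe $(pkg-config --cflags --libs libavutil) */
#include <stdio.h>
#include <libavutil/buffer.h>
#include <libavutil/hwcontext.h>

int main(void)
{
    enum AVHWDeviceType type = AV_HWDEVICE_TYPE_NONE;

    /* Walk every accelerator type this FFmpeg build knows about. */
    while ((type = av_hwdevice_iterate_types(type)) != AV_HWDEVICE_TYPE_NONE) {
        AVBufferRef *ctx = NULL;
        /* Opening the device only succeeds if the OS and driver actually
           provide the accelerator, not merely if FFmpeg was built with it. */
        if (av_hwdevice_ctx_create(&ctx, type, NULL, NULL, 0) >= 0) {
            printf("usable accelerator: %s\n", av_hwdevice_get_type_name(type));
            av_buffer_unref(&ctx);
        }
    }
    return 0;
}

A player that runs this probe at startup and prefers whichever back-end succeeds gets the acceleration for free on hardware like ION, and falls back to software decoding everywhere else.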
Getting used to the OS-provided codecs can be a problem, because it locks you into specific codecs you may want to avoid, and keeps you from using the ones you actually want.
For example, I can't imagine Bink being hardware accelerated anytime soon.
ION is the integrated GPU chipset; the CPU is usually an Intel Atom. ION does a remarkably good job of decoding HD video if hardware acceleration is enabled within the media app.
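For example, on a Linux ION box with NVIDIA's VDPAU driver and an MPlayer build that includes the VDPAU output and codecs (the filename here is made up), something like

mplayer -vo vdpau -vc ffh264vdpau, movie.mkv

hands the H.264 decode to the ION GPU instead of the Atom. The trailing comma after ffh264vdpau tells MPlayer to fall back to its software codecs for anything the GPU can't handle.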