Pure rumor and speculation, but Silicon Alley Insider is reporting a tip they've received stating that Apple will be adding "QuickTime encoding/decoding chips built into their products." Just like the MPEG-2 decoders that deal specifically with DVD playback, these chips would presumably handle only MPEG-4, specifically H.264, the codec behind Apple's core video technologies. Does it make sense? Well, yes and no.
If anyone could/should include a QuickTime-exclusive chip in their hardware, it's Apple, which uses QuickTime in iTunes, iMovie, Final Cut Pro, and a slew of other programs (as well as for plenty of functions within the core of OS X). The chip could be small and cheap, take a load off the CPU, and provide silky-smooth playback of any and all Apple-based A/V content. Nothing beats the quality and speed of a chip dedicated to one particular video function, which is why such solutions are still huge in the professional video industry.
Then again, such a chip would serve a niche purpose that video cards already handle pretty well, and it wouldn't have the functionality to replace a 3D graphics card. Plus, in most Apple products the CPU isn't burdened by QuickTime playback anyway, because the GPU is handling the work. And at WWDC, Apple said it was developing QuickTime around multi-core technologies, which points toward software spread across CPU cores, NOT a dedicated chip.
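As an aside, the hardware-versus-software split this rumor hinges on is something developers can actually query on today's Macs: the VideoToolbox framework (which postdates this rumor) will report whether the platform exposes a dedicated hardware H.264 decoder at all. A minimal sketch, assuming a modern macOS SDK:

```swift
import VideoToolbox

// Ask VideoToolbox whether this Mac has a hardware H.264 decoder.
// Returns true when decode is offloaded to dedicated silicon or the GPU,
// false when playback would fall back to a software decoder on the CPU.
let hardwareH264 = VTIsHardwareDecodeSupported(kCMVideoCodecType_H264)
print("Hardware H.264 decode available: \(hardwareH264)")
```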
Nonetheless, it's a pretty juicy rumor that's fun to think about. [Alley Insider]