The biggest update to the new MacBooks—on the inside, anyway—is their graphical muscle, which has been hooked up with some Barry Bonds-level steroids. Apple ditched Intel's crummy integrated graphics and chipset (basically the traffic controller between the processor and everything else) entirely, opting for a new one from Nvidia that combines the chipset and a GPU on a single chip—the GeForce 9400M. The MacBook Pro, being more Pro than the MacBook, now rocks two graphics cards: the integrated 9400M and a separate, beefier GeForce 9600M GT. If that swirl of numbers, letters and BS is confusing, here's what's up.

Two graphics cards? It sounds crazy, even preposterous. It's actually not, and it's not unique to the MacBook Pro at all. PC users might be more familiar with Nvidia's Hybrid SLI, which pulls similar dual-card wizardry. In a nutshell, it lets you use the less power-hungry integrated graphics processor when you're doing lighter stuff to save battery, and then when you want a lot of video-crunching Mr. T powah, you can flip on the discrete graphics card. Of course, there's balls-to-the-wall full SLI too, which uses two entirely separate graphics cards in one notebook for Hulk power and about 45 seconds of battery life, like in one of Alienware's beasts.

Nvidia's standard Hybrid SLI for PC actually uses both the integrated and discrete GPUs at the same time when it goes into turbo mode, and it'll let you switch on the fly or have it automatically flip between the two depending on the power source. But the MacBook Pro uses Apple's spin on Nvidia's tech, which simply lets you pick one or the other (not both, booooooo), and you have to manually flip the switch in System Preferences, then log out and back in. Pretty annoying.
Battery life is apparently an issue with the new MacBook Pro: the integrated 9400M now nets you five hours of go-time, the same as the separate, more power-hungry 8600M GT in the previous model, whereas the new discrete 9600M GT gets you only four.

The other major reason for the huge upgrade to more proficient graphics cards in both the MacBook and Pro is Snow Leopard, which will be big on parallel processing and offloading work to the graphics card—graphics cards are particularly adept at parallel processing because of the way they're designed and the fact that they have a buttload of cores. (Here's a more in-depth explanation of that.) And if graphics cards are driving more and more of the general computing experience, the truly shitty ones in the last generation of MacBooks just won't cut it.

Nvidia's actually been heavily investing on its own in general-purpose computing on graphics processing units (GPGPU)—again, using the graphics card for more general applications—for a while. When they demoed their latest, most badass cards for me a few months ago, the pitch was heavily tilted toward those types of applications, including in-game physics and Folding@home. They have their own development kit called CUDA that lets programmers leverage graphics cards using a standard programming language—PhysX, a physics engine for games, is probably the most well-known application of it so far. (Nvidia isn't sure when PhysX will come to Mac, but they're looking at it.) Not so coincidentally, CUDA for Mac came out in August. These cards also support OpenCL, the parallel-computing framework that Apple developed and is pushing as an open standard.
So even if you're the type of person who browses the net, edits Office docs and fiddles around in Photoshop, rather than the type who plays WoW: Wrath of the Lich King or cuts video, graphics cards will matter to you almost as much as they do to those people: they're going to be critical not just to a lot of the awesome stuff you'll see coming out in the next couple of years, but increasingly to the way operating systems run, whether they come from Apple or Microsoft or anyone else. So get ready to hear a lot more about them. Something you still wanna know? Send any questions about games, snow kitties or pancakes to firstname.lastname@example.org, with "Giz Explains" in the subject line.