We may earn a commission from links on this page

Giz Explains: Mac OS X 10.6 Snow Leopard, Parallel Processing and GPU Computing


As you've probably heard, the next version of OS X, Snow Leopard, will not wow us with a crazy circus of features like Time Machine and Boot Camp. So why would Apple spend a year programming an OS that they can't boast has over 300 new features? Here's a quick rundown of how Apple is totally rebuilding OS X to take advantage of Core 2 Duos, graphics cards and parallel processing, in order to deliver serious performance gains. And yes, that is a big deal.


This is not going to be a super technical breakdown of parallel computing for the super nerdy, just a rough overview for my mom. Basically, parallel processing is what it sounds like: multiple computations, processes or, um, just "things" are carried out simultaneously, in parallel (at the same time!). Multi-core processors like Intel's ubiquitous Core 2 Duo have quickly become mainstream, and they're really good at doing several things at once, since each processor core can crunch away on something: more cores, more simultaneous Captain Crunching, more speed. A brilliant consumer taste of this was actually Rosetta on OS X: on a dual-core system, one core would be "translating" the code from the PPC version while the other ran the program (roughly speaking).
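To make the multicore idea concrete, here's a toy Python sketch (purely illustrative, nothing to do with OS X internals): with two worker processes, two chunks of number-crunching run at the same time, one per core.

```python
from multiprocessing import Pool

def crunch(n):
    # One chunk of busywork; each worker core grinds through its own call.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Two inputs, two worker processes: both sums run at the same time,
    # one per core (assuming at least a dual-core chip like a Core 2 Duo).
    with Pool(processes=2) as pool:
        print(pool.map(crunch, [3, 4]))  # [5, 14]
```

Same answers as running the two sums one after the other — you just get them roughly twice as fast when a second core is free to help.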

Sounds gravy, right? Well, as Steve Jobs alluded to in his explanation of Snow Leopard, parallel programs ain't easy to write; they're definitely harder than sequential ones, because they require the kind of math that can be broken up into little parts you can solve independently and then put back together again. Artificial intelligence, for instance, isn't a great fit. On the other hand, something like tomography, a technique for creating 3D images, totally is, because it's highly vectorizable. Video encoding is too (since you can easily divvy up the chores), as are videogame graphics and physics, generally.
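The break-it-up requirement is easy to see in code. In this hypothetical Python sketch, the sum can be chopped into independent chunks and recombined at the end, while the recurrence can't, because every step needs the previous step's answer first:

```python
def chunked_sum(data, chunks=4):
    # Parallelizable: each chunk's subtotal depends only on that chunk,
    # so the subtotals could be computed on different cores and then
    # put back together with one final addition.
    size = max(1, len(data) // chunks)
    parts = [sum(data[i:i + size]) for i in range(0, len(data), size)]
    return sum(parts)

def recurrence(x, steps):
    # Not parallelizable: step n can't start until step n-1 finishes,
    # so extra cores sit idle no matter how many you throw at it.
    for _ in range(steps):
        x = (x * x + 1) % 97
    return x

print(chunked_sum(list(range(100))))  # 4950, same as a plain sum
```

The first function is the tomography/video-encoding shape of problem; the second is the shape that stays stubbornly sequential.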


No surprise that modern graphics cards are actually really good at parallel processing, because of the way they're architected and because they usually have a buttload of cores: Nvidia's latest high-end GeForce card, the GTX 280, has 240. (It's why they're suitable for cheap supercomputers.) Nvidia, for instance, showed me some of the insane physics jujitsu the GTX 280 can pull off, and it and ATI both have crazy new graphics cards built for "general purpose" supercomputing (the Tesla 10P and FireStream 9250, respectively). Sony's Cell is sorta like this, with multiple cores, but none of these are very good general processors the way stuff is designed now. (You don't see any computers running on an ATI Radeon CPU, or Cell handling the main workload on Toshiba's new laptops, do you?)

You'll note that part of Snow Leopard's feature list is OpenCL, an easy way for developers to tap the parallel processing power of graphics cards, in addition to the OS being optimized for multiple cores courtesy of its "Grand Central" technology. So Snow Leopard is pretty much all about parallel processing. (Microsoft hasn't been overly vocal about Windows and parallel computing.)
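Real OpenCL kernels are written in a C dialect, but the programming model is simple enough to fake in a few lines of hypothetical Python: the same tiny "kernel" function runs once per data element, and since no invocation depends on any other, a GPU can spread them across hundreds of cores at once.

```python
def kernel(global_id, a, b, out):
    # Each "work-item" reads its own slot and writes its own slot;
    # nothing here depends on any other work-item, which is exactly
    # what lets a GPU run them all in parallel.
    out[global_id] = a[global_id] + b[global_id]

def launch(kernel, n, *buffers):
    # A real OpenCL runtime would fan these n invocations out across
    # the GPU's cores; we just loop to show the semantics.
    for gid in range(n):
        kernel(gid, *buffers)

a = [1, 2, 3, 4]
b = [10, 20, 30, 40]
out = [0] * 4
launch(kernel, 4, a, b, out)
print(out)  # [11, 22, 33, 44]
```

This is only a sketch of the idea, not the actual API — OpenCL's draw is that it lets one kernel target GPUs, multicore CPUs and other chips without rewriting it per device.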

From what Apple has said, and from the whole "Grand Central" deal (it "takes full advantage by making all of Mac OS X multicore aware and optimizing it for allocating tasks across multiple cores and processors"), it's clear that Apple is totally re-architecting Snow Leopard around parallel processing, with Grand Central acting much like the real one: organizing, assigning and scheduling a whole bunch of tasks/trains along a bunch of different paths/tracks. It's a major undertaking (Intel and Microsoft are throwing a ton of money at parallel computing themselves), and we're pretty curious about how Apple is going to make parallel programming easier for programmers in a way supposedly no one's done before.
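Apple hasn't published Grand Central's internals, but the train-station metaphor maps onto a classic pattern: a shared queue of tasks with a pool of workers (ideally one per core) pulling from it. A minimal, purely illustrative Python sketch:

```python
import queue
import threading

def worker(tasks, results):
    # Each "track" (worker thread) pulls the next "train" (task) off
    # the shared queue until the station is empty.
    while True:
        try:
            func, arg = tasks.get_nowait()
        except queue.Empty:
            return
        results.append(func(arg))

def run_all(jobs, workers=4):
    # Load up the queue, then let the workers drain it in parallel.
    tasks = queue.Queue()
    for job in jobs:
        tasks.put(job)
    results = []
    threads = [threading.Thread(target=worker, args=(tasks, results))
               for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

done = run_all([(lambda x: x * x, n) for n in range(8)])
print(sorted(done))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The appeal of this arrangement is that the programmer just drops tasks on the queue and the scheduler worries about which core runs what — which is roughly the convenience Apple is promising.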


Something we missed, or you still wanna know? Send any questions about processors, prostates, bananas or anything else to , with "Giz Explains" in the subject line.