I've seen amazing demos of DirectX 11, but this is mind-blowing. I have a hard time believing this video is all rendered in real time on DirectX 11 hardware, but that's exactly what it is.
The new DirectX 11 GeForce 500M models aren't a GPU breakthrough, but the tried-and-true Fermi architecture gets boosted graphics and processor core clocks, meaning laptop gaming that doesn't suck and better application acceleration without taxing your CPU.
We've known AMD's Fusion GPU/CPU hybrid was coming, but now we know exactly what to expect from the tiny chips: clean(er) video, low power consumption, and a die the size of a fingernail.
The new graphics API comes with new buzzwords. We'll tell you what they mean and why they matter to your gaming experience.
AMD has been trying to crack its Fusion technology, which combines a CPU and a powerful GPU on the same chip, for years now. Today, the company showed off working Fusion chips in a demo that got Intel and Nvidia's attention:
The timing and price are up in the air, but Acer's next high-end gaming notebook will be the first to use DirectX 11 graphics. It's also going to be insane.
We didn't see much of a difference between DirectX 10.1 and 11, but if you're a Windows Vista user who did, and you've been waiting impatiently, be happy: DirectX 11 is now finally available through Windows Update. [DailyTech]
As with the jump from DirectX 9 to DirectX 10, you'll have to concentrate hard to see what's changed between the two versions, if you can even tell which is which.
Good Lord, that is badass. What you're seeing here is AMD's next-gen DirectX 11 graphics cards showing off Eyefinity, a feature that lets you use multiple monitors as a single display.
The first graphics cards supporting DirectX 11 (the next version of Microsoft's gaming API, with more fiyapowah) will apparently arrive from both ATI and Nvidia in the next couple of months.
DirectX 11 is coming, and it looks pretty awesome. Sure, you get advancements in shading and better support for multi-core machines, but what's really got our heads turning is the concept of letting programmers use the GPU in your video card to do some of the heavy lifting, meaning your graphics chip becomes a second,…
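That GPU-as-coprocessor idea is what DirectX 11 calls DirectCompute: compute shaders that run general, data-parallel work on the graphics card instead of the CPU. As a rough sketch of what that looks like from a programmer's side, here's a minimal C++ dispatch; the workload is a made-up toy job (doubling 1,024 floats) and error handling and the readback step are omitted for brevity, so treat it as an illustration, not production code:

```cpp
// Minimal DirectCompute sketch: run a toy compute shader on the GPU via D3D11.
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstring>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3dcompiler.lib")

// Hypothetical toy shader: each GPU thread doubles one element of the buffer.
const char* kShader = R"(
RWStructuredBuffer<float> data : register(u0);
[numthreads(64, 1, 1)]
void main(uint3 id : SV_DispatchThreadID)
{
    data[id.x] = data[id.x] * 2.0f;
}
)";

int main()
{
    // Create the device and immediate context on the default hardware adapter.
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    D3D_FEATURE_LEVEL fl;
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION, &device, &fl, &ctx);

    // Compile the HLSL source as a compute shader (cs_5_0 = DirectX 11).
    ID3DBlob* blob = nullptr;
    D3DCompile(kShader, strlen(kShader), nullptr, nullptr, nullptr,
               "main", "cs_5_0", 0, 0, &blob, nullptr);
    ID3D11ComputeShader* cs = nullptr;
    device->CreateComputeShader(blob->GetBufferPointer(), blob->GetBufferSize(),
                                nullptr, &cs);

    // A structured buffer of 1,024 floats the shader can read and write.
    float init[1024];
    for (int i = 0; i < 1024; ++i) init[i] = (float)i;
    D3D11_BUFFER_DESC bd = {};
    bd.ByteWidth = sizeof(init);
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.BindFlags = D3D11_BIND_UNORDERED_ACCESS;
    bd.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
    bd.StructureByteStride = sizeof(float);
    D3D11_SUBRESOURCE_DATA sd = { init };
    ID3D11Buffer* buf = nullptr;
    device->CreateBuffer(&bd, &sd, &buf);

    ID3D11UnorderedAccessView* uav = nullptr;
    device->CreateUnorderedAccessView(buf, nullptr, &uav);

    // Hand the work to the GPU: 1,024 elements / 64 threads per group = 16 groups.
    // Reading results back would take a staging buffer and a Map() call, skipped here.
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
    ctx->Dispatch(16, 1, 1);
    return 0;
}
```

The point of the example isn't the arithmetic; it's that the same device and context a game uses for rendering can dispatch thousands of parallel threads of non-graphics work, which is exactly the "second processor" pitch above.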