This year, I bought myself an Nvidia GeForce GTX 970 graphics card. It was time to upgrade. And I was pleasantly surprised to find I could buy a mini version of one of the best cards ever made. Now, I can potentially fit my beefy gaming PC into a console-sized case. But a new card from AMD is about to do small and…
Every couple months I pull out my old gaming laptop from college: a busted-ass Dell m1210 that's had its motherboard replaced four times under warranty because Nvidia couldn't get its shit together. It never works right. I've had enough. I'm sticking this sucker in the oven.
Two years ago, Nvidia announced its original Titan graphics card, a bad-boy built on the company's Kepler technology and for a time the most powerful card out there. Now, the Titan X is here and it's ready to reclaim the throne.
PSA: Apple has launched a repair program for MacBook Pros from 2011 to 2013 displaying video problems. People with video issues who have the affected models can get them fixed for free, and people who already paid to get their laptops fixed can get a refund. This includes MacBooks with scrambled video, video that…
Nvidia is only just starting to put out cards that run on its new Maxwell architecture, but its eyes are already on the future. Today at its annual GPU Technology Conference, Nvidia announced its next, far-future architecture: Pascal.
Intel's integrated graphics have taken plenty of heat over the years, and most of it deserved. But the climb to respectability that started back with Sandy Bridge is about to get a turboboost. Meet Iris, the biggest generation jump in Intel's integrated graphics to date. Get ready to game.
Today Nvidia is pulling the wraps off the GK110-based GeForce GTX Titan, a single-GPU card that is expected to easily capture the title of Baddest Ass GPU in the world when benchmarks are released this Thursday, February 21st. The Titan is Nvidia's "Big Kepler" GPU, and has double the transistors and almost double the…
Right now, ARM graphics aren't exactly bad. Samsung's Galaxy S III with its quad-core Mali-400 GPU leads the pack in mobile prettiness, but upcoming ARM GPUs, scalable to eight cores, could blow it out of the water.
We've been hearing for years that integrated graphics—meaning your computer doesn't have its own, separate graphics card—won't catch up to the beefier cards, but it'll be good enough some day soon. Hasn't happened yet. But these reported benchmarks of Intel's new Ivy Bridge processors from CPU World look pretty…
Anandtech benchmarked Samsung's refreshed Galaxy S II phone over the weekend and discovered that its Mali-400 quad-core GPU contained within its EXYNOS chipset is not only powerful, but nearly 2x faster than any other Android device—phone or tablet.
The next Tegra chipset (codename: Kal-El) is still in development, but today we learned that it will feature a quad-core CPU and 12-core GPU that are 5x more powerful than the Tegra 2 innards. It can output 1440p video to a 2560x1600 display as well.
Nvidia's GeForce GTX 580 is what the original should have been: quieter, full-featured, faster and more efficient.
Nvidia gave us a taste of what its Fermi-based notebook graphics cards would be like with the GeForce GTX 480M, but now it's time to meet the whole family. That's seven Fermi GPUs, running the gamut from face-melting to face-singeing.
The new graphics API comes with new buzzwords. We'll tell you what they mean and how they matter to your gaming experience.
Hey! Everyone's favorite video playing software, VLC, jumped to 1.1.0 today and added some GPU decoding for Windows and Linux users as well, giving your CPU a break when you're trying to play those gigantic HD files you've been feverishly torrenting. One caveat: the Windows version only works with Nvidia GPUs, as…
AMD has been trying to crack its Fusion technology—combining a CPU and a powerful GPU on the same chip—for years now. Today, it showed off working Fusion chips in a demo that got Intel and Nvidia's attention: