You might not be a big graphics card PC buff, but in the under $100 sector, the new ATI Radeon HD 5570 is definitely a product to know. Tom's Hardware shares their excellent, extremely thorough review of the product here.
And to note, the 5570 is basically the cheapest DX11 card that you can actually play games on, unlike the 5450, which is meant for HTPCs and not so much for pew pew.
In September of 2008—almost a year and a half ago—ATI surprised everyone on a budget with the launch of its Radeon HD 4670.
Released at $80, the card was priced to fight the entry-level GeForce 9500 GT, and yet the 4670's specifications were comparable to the previous-generation's Radeon HD 3870 flagship.
To make a long story short, the Radeon HD 4670's performance humiliated its competition. With 320 shader cores at its disposal, the Radeon HD 4670 changed the game at its price point. The card's presence forced Nvidia to create the GeForce 9600 GSO from high-end parts that were more expensive to manufacture, also causing the company to drop the price of its GeForce 9600 GT.
Since its inception, the Radeon HD 4670 has remained one of the best budget gaming cards on the market (and a staple recommendation in our Best Graphics Cards For The Money column). It is also notable that it held the distinction of being the fastest reference card that didn't require a dedicated PCIe power cable for over a year, until Nvidia introduced its GeForce GT 240, later bested by ATI's Radeon HD 5670.
ATI truly raised the bar on what we now expect from an $80 graphics card with its Radeon HD 4670. And it just so happens that today, AMD is releasing the spiritual successor to that venerable card in its Radeon HD 5570, also priced to compete at $80.
With the Radeon HD 5450 too slow to provide enthusiast-class gaming performance on a budget, and the Radeon HD 5670 priced at $100, we certainly can't help but have high hopes that this new card might be the Holy Grail: an offering able to deliver usable triple-monitor Eyefinity gaming performance on an entry-level budget.
In the conclusion of the Radeon HD 5670 review I wrote last month, I mentioned that I hoped the Radeon 5500-series would include a DDR3-based version of the Radeon HD 5670. It looks like it's my lucky day:
Yes, the new Radeon HD 5570 is a DDR3-equipped Radeon HD 5670 with a 125 MHz-lower core clock rate. If you look at the data rate, you can see that the Radeon HD 5570 offers less than half the memory bandwidth of the 5670. This is because DDR3 theoretically delivers half of the bandwidth that GDDR5 memory provides at the same clock speed and on the same memory bus. As a result, we can expect a significant difference in performance between these two closely-related cards.
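The bandwidth gap is easy to sanity-check with a little arithmetic: DDR3 moves two transfers per clock while GDDR5 moves four, and both cards sit on a 128-bit (16-byte) bus. A quick sketch:

```python
def bandwidth_gbps(clock_mhz, transfers_per_clock, bus_bits):
    """Peak memory bandwidth in GB/s: clock x transfers/clock x bytes per transfer."""
    return clock_mhz * transfers_per_clock * (bus_bits // 8) / 1000

hd5570 = bandwidth_gbps(900, 2, 128)   # DDR3: double data rate -> 28.8 GB/s
hd5670 = bandwidth_gbps(1000, 4, 128)  # GDDR5: quad data rate  -> 64.0 GB/s
```

At 28.8 GB/s versus 64 GB/s, the 5570 indeed lands below half of its sibling's bandwidth, since it gives up a little memory clock on top of the DDR3 deficit.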
Here's a look at the GPU block diagram:
The Radeon HD 5570 GPU, like the Radeon HD 5670, contains five SIMD engines, each with four texture units and 16 stream processors. Of course, each stream processor sports its five ALUs (ATI calls them Stream Cores). As a result, this GPU boasts 400 total stream cores and 20 texture units. Note that there are two 64-bit memory controllers sharing two render back-ends. Each render back-end contains four color ROP units, resulting in a total of the eight specified ROPs and a 128-bit memory interface.
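The totals above follow directly from the block diagram; multiplying out the per-unit counts confirms the specified figures:

```python
simd_engines = 5
sps_per_simd = 16        # stream processors per SIMD engine
alus_per_sp = 5          # ATI calls these "Stream Cores"
tex_per_simd = 4
render_back_ends = 2
rops_per_rbe = 4         # color ROP units per render back-end
mem_controllers = 2      # 64-bit each

stream_cores  = simd_engines * sps_per_simd * alus_per_sp   # 400
texture_units = simd_engines * tex_per_simd                 # 20
rops          = render_back_ends * rops_per_rbe             # 8
bus_width     = mem_controllers * 64                        # 128-bit
```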
Let's compare this to the Radeon HD 4670, which the 5570 will likely replace:
At first glance, the 5570 looks impressive because of its shader processor increase. But when we dig a little deeper, we can start to see some chinks in the new card's armor. The older Radeon HD 4670 has a 100 MHz core clock speed advantage over the new 5570, and this almost makes up for the 5570's higher shader core count. The older 4670 also includes more texture units and a higher reference memory clock. Granted, it has been our experience that most of the Radeon HD 4670s in the wild actually come equipped with an 800 MHz memory clock (200 MHz under reference). But, from the reference specifications alone, we expect the Radeon HD 5570 to be a roughly lateral move from the 4670.
At this point, we're really not surprised. The sub-$100 Radeon HD 5000-series cards are not here to raise the bar with regard to game performance. Instead, they offer roughly similar 3D alacrity at any given price, but with the value-adds inherent to the product family: DirectX 11 support, Eyefinity, bitstreaming Dolby TrueHD and DTS-HD Master Audio, and ATI Stream. This is good news if you're upgrading from integrated graphics, and less so if you're already rocking a Radeon HD 4000-series card.
It's hard to avoid sounding like a broken record when it comes to the functionality found on the 5000-series cards: this is the eighth ATI card we've reviewed in six months, and the features are identical across the entire line. For an in-depth look at the Radeon HD 5000-series features, the best place to turn is probably our Radeon HD 5870 launch article, but we'll quickly go through the checklist to refresh your memory:
DirectX 11 Compatibility
Until Nvidia launches its GeForce GTX 470 and 480 cards, based on GF100, ATI's Radeon HD 5000-series is the only game in town if you're interested in DirectX 11-based hardware.
Up until now, the list of DirectX 11 game titles has been somewhat sparse. But the eventual proliferation of the API is inevitable as time passes and more developers start working with it. Thus far, we haven't fallen in love with any of the DirectX 11-optimized titles (DiRT 2 being our most recent exploration). However, we are looking to Aliens Vs. Predator as the potential killer app that will make DirectX 11 hardware a must-have for gamers. These expectations are primarily based on the examples of tessellation we've seen demonstrated pre-launch.
Eyefinity Triple-Monitor Gaming
Having experienced Eyefinity gaming, I can say that it is a lot more impressive than I assumed it would be; it really saturates the player's peripheral vision. Having said that, there are still issues associated with multi-monitor gaming on the Radeon HD 5000-series: high resolutions that aren't necessarily playable on mainstream cards, a developing game support ecosystem with some odd aspect ratios, and the need to use one DisplayPort monitor (or an active DisplayPort adapter for older displays). These considerations have the potential to take some of the fun out of Eyefinity, but we expect most of these issues to work themselves out over time. As usual, early adopters will take the brunt of the teething pains as ATI optimizes its drivers.
Bitstreaming Dolby TrueHD and DTS-HD Master Audio
Home theater enthusiasts who wish to send an intact Dolby TrueHD or DTS-HD MA stream directly to their receivers for decoding have a few different options for enabling such functionality. The Radeon HD 5000-series cards are one (and they're ideal if 3D performance is a priority). Intel's Clarkdale-based CPUs with integrated graphics are viable as well, though much-less capable of playing even mainstream games at 1080p. A sound card like Asus' Xonar HDAV 1.3 works as well, but is less of a value now that ATI and Intel support similar functionality. Both companies fully accelerate Blu-ray playback too, whereas the sound card option requires addressing video through some other means.
ATI Stream and DirectCompute
In this author's opinion, ATI Stream and DirectCompute support are the least relevant features that the new Radeons offer today, but they have the potential to make a large impact. It's all a matter of application support, and at this time there isn't enough of that to get excited about. When DirectCompute is used for advanced physics calculations in games, when ATI Stream is used to accelerate everyday applications, then this author will be excited about it. But we're still waiting for that critical mass, despite the fact that both ATI and Nvidia like to show off the few mainstream titles that can be accelerated via GPU-based computing right now.
The Radeon HD 5570 certainly doesn't look like a card sporting 400 shaders—likely a side-effect of a move to 40nm manufacturing, cutting back on die size and thermal output, thereby requiring a more conservative cooler and smaller PCB. Frankly, it looks more like the entry-level Radeon HD 4550 with an active cooler. But we mean that with all due respect; fitting this into a microATX (or even mini-ITX) HTPC enclosure in the living room is a real win.
The Radeon HD 5570 doesn't need a dedicated power connector, which is no surprise since the more powerful Radeon HD 5670 doesn't either. Of course, we do expect this card's idle and load power consumption numbers to be even lower. That's another bit of good news since, again, the Radeon HD 5670 already demonstrates impressive results in this regard.
Notice how small the reference cooler is. The impressive part is that it does a great job keeping temperatures in check, as we will demonstrate in the benchmarks.
Our Radeon HD 5570 lacks a CrossFire bridge, but AMD let us know that these low-end Radeons will work quite well in CrossFire without the bridge connector; in fact, it's one of those designs able to enable CrossFire operation over the PCI Express bus. The thing is, with 400 shader cores per card, it is difficult to imagine a scenario where dual Radeon HD 5570s would make sense. The Radeon HD 5770 costs less than two Radeon HD 5570s, but sports 800 shader cores and comes with faster GDDR5 memory. This is one of those scenarios where a single board is a better value than two less-expensive derivatives.
The card's small size allows for a half-height output bezel swap, as long as you're willing to give up the analog VGA connector. This is interesting because half-height versions of respectable gaming cards, such as the GeForce 9600 GT, are usually accompanied by notable price increases, since they are often custom-designed by board vendors. The Radeon HD 5570 should give half-height card buyers access to some low-cost hardware capable of decent gaming.
This reference model came with VGA, DVI, and HDMI outputs. This is a little perplexing because DisplayPort output is necessary for triple-monitor Eyefinity use. Thus, our sample is not triple-monitor capable. As with most of the 5000-series cards, each manufacturer has some flexibility as to the output options it wishes to include, so a version with DisplayPort should not be a difficult find post-launch.
The memory on this reference card is Samsung K4W1G1646E-HC11, rated for 900 MHz operation. We found it was willing to go a lot farther than that in our overclocking tests. Of course, the memory on retail boards will vary based on what each manufacturer decides to use.
Looks a lot like the Radeon HD 5670 GPU, doesn't it? That would make sense, since it's the same thing.
When choosing cards to test against the Radeon HD 5570, we looked at options in the same price league, from the $70 Radeon HD 4670 to the $95 Radeon HD 5670. We also included the Radeon HD 4650 for reference, as this is the lowest-end graphics card we recommend to gamers in our monthly Best Graphics Cards For The Money column.
Many of our test units are factory-overclocked models. To better represent a level playing field (and to address some of the concerns we've seen in the comments section), we have underclocked all of these cards to reference clock rates. The only exception to this is the Radeon HD 4670, which comes with 800 MHz memory (compared to the reference 1,000 MHz). But this was not modified, as most Radeon HD 4670 models are actually sold with 800 MHz memory.
CPU: Intel Core i7-920 (Nehalem), 2.67 GHz, QPI-4200, 8MB Shared L3 Cache
Overclocked to 3.06 GHz @ 153 MHz BCLK
Motherboard: ASRock X58 SuperComputer Intel X58, BIOS P1.90
Networking: Onboard Realtek gigabit LAN controller
Memory: Kingston PC3-10700 3 x 1,024MB, DDR3-1225, CL 9-9-9-22-1T
ATI Radeon HD 5670
725 MHz Core, 1,000 MHz Memory, 512MB GDDR5
Gigabyte GeForce 9600 GT
650 MHz Core, 1,625 MHz Shaders, 900 MHz Memory, 1GB DDR3
Zotac GeForce GT 240 512 MB AMP! Edition
600 MHz Core, 1,460 MHz Shaders, 1,000 MHz Memory, 512MB GDDR5
Underclocked to reference speed: 550 MHz core, 1,360 MHz shaders, 850 MHz memory
Gigabyte GeForce 8800 GT (representing GeForce 9800 GT)
700 MHz Core, 1,700 MHz Shaders, 920 MHz Memory, 512MB DDR3
Underclocked to reference speed: 600 MHz core, 1,500 MHz shaders, 900 MHz memory
Diamond Radeon HD 4670
750 MHz Core, 800 MHz Memory, 1GB DDR3
Sapphire Radeon HD 4650
600 MHz Core, 400 MHz Memory, 512MB DDR2
Hard Drive: Western Digital Caviar WD5000AAJS-00YFA 500GB, 7,200 RPM, 8MB cache, SATA 3.0 Gb/s
Power: Thermaltake Toughpower 1,200 W, ATX 12V 2.2, EPS12V 2.91
Operating System: Microsoft Windows Vista Ultimate 64-bit 6.0.6001, SP1
DirectX version: DirectX 10
Graphics Drivers: AMD Catalyst 9.12, Nvidia GeForce 195.62
Crysis:
Patch 1.2.1, DirectX 9, 64-bit executable, benchmark tool
Low Quality, Medium Textures, Shadows, Physics, Shaders, Water, and Sound, No AA
Far Cry 2:
Patch 1.02, in-game benchmark
Medium Quality, No AA
Call Of Duty: Modern Warfare 2:
Version 1.0.0, Custom THG Benchmark
Highest Settings, no AA
DiRT 2:
Version 1.0.0, Custom THG Benchmark
Run 1: Ultra High Settings, No AA, DirectX 9
Run 2: Ultra High Settings, No AA, DirectX 11
World In Conflict:
Patch 1009, DirectX 9, timedemo
Medium Details, No AA/No AF
Tom Clancy's H.A.W.X.:
Patch 1.02, DirectX 10 & 10.1, in-game benchmark
Low Shadows, Sun Shafts
Medium View Distance, Environment, SSAO
High Forest, Textures
HDR, Engine Heat, and DOE On, No AA
Left 4 Dead: Version 126.96.36.199, Custom THG Benchmark
Run 1: High Settings, no AA, no AF
Run 2: High Settings, Medium Shaders, 4xAA, 8xAF
Resident Evil 5: High Shadows and Textures, Medium Overall Detail, Motion Blur On, no AA, no AF
Fallout 3: Patch 188.8.131.52, Custom THG Benchmark, High Quality, No AA, No AF
Synthetic Benchmarks and Settings: 3DMark Vantage Version: 1.02, PhysX Off, 3DMark scores
We begin with 3DMark, the only synthetic gaming benchmark we employ. According to this application, the new Radeon HD 5570 will perform somewhere between the GeForce 9600 GT and its predecessor, the Radeon HD 4670.
Far Cry 2 is the first game in our suite, and it seems to correspond with 3DMark Vantage fairly well. The new Radeon HD 5570 performs a little better than the 4670, but below the GeForce 9600 GT. In any case, performance is acceptable all the way up to 1920x1200.
In Crysis, the new Radeon HD 5570 continues to fulfill our expectation by performing a little better than the Radeon HD 4670, but still below the GeForce GT 240 and 9600 GT.
Call of Duty gives us no surprises, but shows us a tighter playing field. The CoD: MW2 engine is somewhat forgiving as far as performance is concerned, considering how great the visuals look.
The Radeon HD 5570 clearly suffers from its combination of a 128-bit memory interface and DDR3 memory. All of the GDDR5-equipped cards with 128-bit memory interfaces are performing notably better so far. Nevertheless, the Radeon HD 5570 is doing well enough, even at 1920x1200.
The Radeon HD 5570 continues to perform exactly where we'd expect it to.
Fallout 3 is another game that really seems to take advantage of memory bandwidth that the 5570 can't provide, although the 5570 delivers smooth performance all the way up to 1920x1200 in this title.
The Radeon HD 5570 runs well in Left 4 Dead at 1920x1200 with the highest visual settings, but some of its contemporaries run even more fluidly.
H.A.W.X. demonstrates what we consider to be the only real puzzling result in our test suite, with the new Radeon HD 5570 performing right on par with (or a little below) the Radeon HD 4670. Regardless, performance is acceptable across the board.
In DiRT 2, the DirectX 11 effects are hard to notice and definitely not worth the performance hit that the sub-$100 Radeon cards suffer.
The Radeon HD 5570 incurs a performance hit with 8x anisotropic filtering enabled that the other cards don't encounter. Aside from this, it is interesting to see how well the 5570 fares against Nvidia's GeForce GT 240 when anti-aliasing is employed, despite its throughput disadvantage.
Aside from gaming results, we can see that the Radeon HD 5570 brings extremely low power usage to this class of gaming card—even lower than the entry-level Radeon HD 4650. There is certainly nothing to complain about here, and this data makes it very easy to recommend the 5570 as an upgrade to folks who don't want to swap power supplies, don't want to add heat to their existing machines, or are looking to integrate the card in a small form factor chassis.
Because we have a number of graphics cards in play sporting non-reference cooling solutions, this isn't a comparison of standard cooler performance. But it does show us that the tiny active cooler on ATI's Radeon HD 5570 does its job more than adequately. A 31 degree load temperature over ambient is a very good result, and speaks to how little heat this 40nm GPU generates at the clock rates the company is using.
It has not escaped us that the Radeon HD 5570 shares the same GPU as the Radeon HD 5670, and that the more expensive card employs a clock rate 125 MHz higher. This gives us a nice round target to aim for as we overclock.
While the Catalyst Control Center's Overdrive tool caps overclocking to 700 MHz core and 950 MHz memory settings (versus the stock 650 MHz core and 900 MHz memory clocks), we wanted to exceed this imposed limitation. Thus, we employed MSI's Afterburner overclocking tool. Altering the config file for this utility allows us to overclock Radeon cards past any artificial ceilings.
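For readers curious what that config tweak involves: in public builds of MSI Afterburner from around this period, unlocking the extended clock range on Radeon cards meant editing MSIAfterburner.cfg by hand. The section and key names below are illustrative, drawn from later public builds, and may not match the exact build we used:

```ini
; MSIAfterburner.cfg (excerpt, illustrative) - unlock clock limits on ATI cards
[ATIADLHAL]
; The EULA line must be typed in verbatim to acknowledge the risk;
; the mode flag then enables "unofficial" overclocking ranges.
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```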
Indeed, we were able to take our Radeon HD 5570 sample to 750 MHz core and 1,000 MHz memory—a 100 MHz increase over both reference specifications. While the card doesn't come with a beefy cooler, even our Crysis benchmark runs didn't push the GPU past 63 degrees Celsius. If MSI adds voltage modification support for the Radeon HD 5570 in its software, we'd expect to see additional headroom opened up without running into thermal issues.
In any case, we benchmarked Crysis and Far Cry 2 to see what the overclock would yield:
The overclock doesn't give us a ton of extra performance, but it certainly brings the results closer to Nvidia's stock GeForce 9600 GT and ATI's Radeon HD 5670.
What about Eyefinity? Is the Radeon HD 5570 a viable option for a low-budget triple monitor setup? We wanted to try three 17" 1280x1024 monitors, but as we mentioned, our sample did not have the DisplayPort output needed to enable Eyefinity-based display configurations. To get a rough idea of what triple-monitor performance might look like, we used two monitors, yielding a 3840x1200 desktop resolution. This is extremely close to the 3840x1024 resolution we'd get from three 17" displays.
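Comparing the raw pixel counts shows how the two-monitor stand-in relates to a true triple-17" surface (if anything, our substitute is the harder workload):

```python
dual_24in   = 3840 * 1200  # two 1920x1200 panels side by side (our test surface)
triple_17in = 3840 * 1024  # three 1280x1024 panels side by side (the target setup)

ratio = dual_24in / triple_17in  # ~1.17: our stand-in pushes ~17% more pixels
```

So the dual-monitor numbers below should, if anything, slightly understate what three 17" displays would achieve.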
Eyefinity might be viable for desktop productivity, but our results suggest that gaming might be a stretch on a mainstream card like the Radeon HD 5570. Perhaps low-quality settings would be more attainable. But at that point, we'd rather experience a demanding title's recommended image quality options versus toning everything down to get playable frame rates at higher resolutions. Eyefinity is a great feature, but truly taking advantage of it means buying a powerful-enough GPU. This one falls short of that mark, even if you're using relatively-small 17" monitors.
What do you get when you mix DirectX 11, a trio of display outputs (though, bear in mind, the card on our test bench doesn't support three displays), bitstreaming, and a $10 price increase over the Radeon HD 4670? You get a Radeon HD 5570, more or less. This card is a tad faster than its predecessor, but performance is really quite similar.
This isn't necessarily a bad thing. Adding $10 to the Radeon HD 4670 for the extra features seems reasonable if you're coming from the world of integrated graphics. It's a less-attractive option if you're already running a discrete card and would need to sink another $80 into your next card for similar performance. In that case, you'd be much better served by a straight-up upgrade to something like the Radeon HD 5750 or 5770. Instead, the Radeon HD 5570 maintains the status quo when it comes to price/performance, and unfortunately isn't able to out-do the similarly-priced GeForce 9600 GT.
With the 5570's game performance so close to that of its predecessor, you're faced with the same conundrum we've seen with all of the other sub-$100 Radeon HD 5000-series cards thus far: do you want game performance or a bundle of value-adds? Are you willing to sacrifice detail settings and anti-aliasing in favor of Eyefinity and DirectX 11 support (which is of questionable utility on a mainstream GPU anyway)? Just as the buyer with $100 has to choose between the feature-rich Radeon HD 5670 and the fast Radeon HD 4850, enthusiasts with $80 have to decide whether they'd prefer the feature-rich Radeon HD 5570 or quicker GeForce 9600 GT.
Aside from raw gaming performance, it's important to acknowledge that the features being offered by ATI's Radeon HD 5570 are, in fact, compelling today. Consider that this is a half-height reference card, able to transform even the smallest systems into viable gaming machines. Power usage is extremely low for the performance offered, and no auxiliary power connector is needed. And yet, the Radeon HD 5570 manages playable frame rates in every one of our game tests at 1680x1050 (and sometimes 1920x1200). Triple-monitor Eyefinity gaming could be viable in less-demanding titles, such as World of Warcraft, at a price substantially lower than the Radeon HD 5670. Just make sure the board you buy has a DisplayPort output first.
All of these features make it easier to recommend the Radeon HD 5570 over the slightly less expensive (but notably slower) Radeon HD 5450.
Conversely, we can't deny the appeal of Nvidia's GeForce 9600 GT for the gamers sticking to optimal performance on a single display for $80.
We'll also point out that, in this crowded price segment, it's amazing how much difference a few dollars can make. The new Radeon HD 5670 can already be found for $95 online, representing a modest $5 drop from the MSRP within a month of its release. If the Radeon HD 5570 follows suit and distances itself from the GeForce 9600 GT by $5 or $10 more, then it becomes a more attractive buy. With AMD's monopoly on DirectX 11 hardware for the time being (and near future, given the lack of detail on any mainstream refresh to Nvidia's lineup), this might not realistically happen until the competition can deliver an updated feature set.
At this point, AMD's DirectX 11 portfolio is now complete, from top to bottom, $80 to $680. The only missing piece of the puzzle might be a Radeon HD 4650 counterpart in the 5000-series, though we've heard no mention of such a card. It's been a long journey.
Reprinted here courtesy of Tom's Hardware.