Intel's 11th-Gen Processor With Iris Xe Graphics Is Really That Good

Intel’s Tiger Lake processor as it appears in pre-production systems.
Photo: Intel

Intel’s Tiger Lake could be the leap in integrated graphics and processing power Intel’s needed for the last half-decade. My immediate thought when Intel announced its long-awaited Tiger Lake 11th-gen mobile processors earlier this month was, “Of all the times for Apple to move to its own CPU.” The claimed performance in video editing and work productivity tasks definitely grabbed my attention, but it was the promise of 1080p gaming at 60 frames per second that really piqued my interest. For graphically demanding games, 60 fps at 1080p is a great achievement for an integrated GPU. I’ve only spent a little time with Intel’s latest mobile processor, and it hasn’t always matched that promise: 60 fps at 1080p continues to elude it. But what Tiger Lake can do overall is still so impressive that it’s likely one of the best integrated graphics processors out there.

Tiger Lake isn’t in any available laptops at the moment, but when Intel offered me the chance to check it out and test its biggest claims, I naturally said yes. This isn’t a full review of Intel’s latest mobile CPU. I didn’t have time to run a complete round of tests or see how its battery life compares to predecessors and competitors, but I did get a decent idea of Tiger Lake’s potential.

For testing, Intel sent me a reference laptop with a Core i7-1185G7. It’s not a finalized laptop design from any manufacturer, and the results here might change based on final shipping designs. Intel’s 11th-gen mobile processors are systems-on-a-chip (SoCs), which integrate the CPU and GPU onto a single chip, with the RAM soldered onto the board. This Core i7 with Iris Xe Graphics has 4 cores/8 threads and a max frequency of 4.8GHz. By comparison, AMD’s Ryzen 7 4800U with integrated Radeon graphics has 8 cores/16 threads and a max boost clock of 4.2GHz.

The test unit came pre-installed with the same programs Intel featured during its announcement stream, which focused on video- and photo-editing tasks like colorizing and encoding. The benchmarks use automated scripts to run those tasks in the actual programs, so they’re a more accurate measure of real-world performance than a completely synthetic benchmark. Don’t worry, I did my own tests to complement the ones Intel suggested. More on those in a bit.

To break down the tests Intel provided: Adobe Premiere encoded a 402MB MP4 video down to a 37.1MB file; One Compute Photo Workflow measured how long it takes to colorize and upscale several photos of various dimensions; and One Compute Productivity Workflow used Microsoft 365 programs to measure how long certain tasks take.

While I didn’t have a laptop with a comparable 10th-gen Intel processor on hand, I was able to run the same benchmark scripts on a Ryzen 7 4800U with integrated Radeon graphics, which happens to be the same chip Intel used for a side-by-side comparison during its event. This probably isn’t a surprise, but Intel easily beats AMD in the benchmarks Intel provided. It’s much faster than AMD at resizing photos, photo tagging, and video encoding, but only slightly faster at colorizing and at exporting files in Word and PowerPoint.

For photo tagging, the discrepancy between Intel and AMD was so large that it wouldn’t fit into a chart properly. The results, for analyzing and tagging 1,000 photos, were 30 seconds for Intel and a lengthy 110 seconds for AMD.

But remember, these are benchmarks picked by Intel to show off Tiger Lake. That’s why I ran it through the standard battery of tests we run every laptop through here at Gizmodo. And that’s where Intel starts to lose some of its lead. To be fair, the Intel CPU has fewer cores and threads than the AMD chip, which affects how it handles thread-heavy workloads.

The Ryzen 7 4800U was faster than the Core i7-1185G7 at rendering images in Blender. Geekbench 4 scores were par for the course: Intel had the faster single-core score (by a lot), and AMD had the faster multi-core score (by a lot). In this area, at least, the tug-of-war between the two chip makers continues.

But AMD’s integrated GPU is no match for Intel’s Iris Xe Graphics when it comes to gaming. Intel had about a 12-13 fps lead over AMD in the more graphically demanding games. Far Cry 5, one of the more forgiving games in our usual benchmark suite, averaged 35 frames per second at 1080p on low settings; at 720p on low, it averaged 55 fps. The AMD chip just couldn’t keep up.

But crucially, the Xe integrated GPU’s performance is also a huge jump over Intel’s current UHD Graphics 630, which averages 17 fps, and still a major jump over the 25W Iris Plus Graphics in Intel’s Core i7-1065G7, which averages 33 fps. The Xe doesn’t hit 60 fps at 1080p, but matching the previous-gen Iris graphics’ framerate at a higher resolution is impressive. This is an integrated GPU! It’s not something you pay a premium for, like the discrete Nvidia graphics in gaming laptops. This is the kind of chip you’ll find in a typical $1,000 Dell or HP laptop, and it makes gaming, albeit at lower visual quality, an actual possibility.

Intel didn’t say all games will be playable at 1080p with Iris Xe Graphics, but it seems like many more could be than just the titles it showed off at its August 2020 press event: PlayerUnknown’s Battlegrounds, Grid, Mount & Blade II: Bannerlord, Doom Eternal, and Battlefield V.

I popped on Doom and it averaged 45 fps at 1080p on low graphics settings; at 720p on low, it ran well over 60 fps. Again, this is a massive improvement for the entire integrated graphics space. While it’s not the 1080p/60 fps ideal, playing Doom at 45 fps looked and felt super smooth.

Battlefield V hovered between 34 and 54 fps at 1080p on low graphics settings, but the gameplay was choppy. Dropping the resolution to 720p stabilized the game and bumped the framerate to 64 fps. Less graphically strenuous games, like Life is Strange: Before the Storm, managed a surprising 60 fps on the “Hella High” setting at 1080p, and Overwatch cranked out 100 fps at 1080p. The screen on Intel’s reference laptop is only 14 inches, so playing games at 720p didn’t look all that different from 1080p. Even with my glasses on I couldn’t tell the difference, but then again, I was running around shooting things, not studying the finer details.

My only concern at this point is how quickly the chassis heats up. Manufacturers will need to manage heat dissipation carefully or risk a chassis that gets uncomfortably warm, too warm to game on after about 10-20 minutes. I didn’t see the CPU temperature go above 80°C while gaming, and heat is far less of an issue if you’re just typing up a Word document, but it’s something to keep in mind.

Overall, I’m damn impressed by what I’ve seen so far from Intel’s 11th-gen processor with Iris Xe Graphics. It’s definitely not going to replace my desktop PC. But after its recent shortfalls in the desktop CPU space, Intel pulled out all the stops this time. As if laptops weren’t already about to get more interesting with Apple Silicon and AMD taking a larger share of the market, Intel has jumped back into the ring swinging dual maces. I’m looking forward to getting my hands on some manufacturer devices.

Staff Reporter, Reviews at Gizmodo. Formerly PC Gamer, Maximum PC.
