Nvidia GeForce RTX 4090 Founders Edition Review

It’s been a long time since I could start a graphics card review without the caveat that global supply chain problems, semiconductor shortages, and a cryptocurrency boom made it all but irrelevant, because most people wouldn’t be able to find one, or afford one even if they did. But Nvidia’s new-generation powerhouse, the GeForce RTX 4090, comes at a time when those problems are mostly behind us, and we can get back to marveling at ridiculously high frame rates at prices that are top-end but not outrageous. It feels good! I’ve spent the last week putting the RTX 4090 Founders Edition through its paces, testing performance in both synthetic benchmarks and real-world gaming. It’s a big fella, and it comes with an even bigger price tag – $1,599, to be specific – but when we’re talking about a piece of gaming hardware capable of running something like Cyberpunk 2077 in 4K (with ray tracing) at a frame rate that can actually take advantage of a 144Hz monitor, it’s hard to argue that it isn’t worth every penny.

Nvidia GeForce RTX 4090 – Design and Features

First up, we have to address the elephant in the room – that is, the RTX 4090 itself. This is a massive graphics card, roughly the same size as the equally huge 3090 Ti before it. It’s a three-slot graphics card, and weighs just over five and a half pounds. But it is big and beautiful, with the same silver-and-black color scheme that debuted on Nvidia’s 30-series cards two years ago.

Nvidia’s engineers have turned in an impressive spec sheet, even compared against the still-awesome RTX 3090 Ti. It features 16,384 CUDA cores, up from 10,752 on the RTX 3090 Ti. It has a base clock of 2.23GHz that boosts up to 2.52GHz, and 24GB of GDDR6X VRAM, making it capable of 83 shader teraflops of compute power, up from 40 on the 3090 Ti; 1,321 Tensor teraflops, up from 320; and 191 ray tracing teraflops, up from 78.
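If you want to put those spec-sheet jumps in ratio terms, a few lines of Python (using only the numbers quoted above) do the math:

```python
# Rough generation-over-generation ratios, RTX 4090 vs. RTX 3090 Ti,
# using only the spec-sheet numbers quoted in this review.
specs = {
    "CUDA cores":         (16384, 10752),
    "Shader TFLOPS":      (83, 40),
    "Tensor TFLOPS":      (1321, 320),
    "Ray tracing TFLOPS": (191, 78),
}

for name, (rtx_4090, rtx_3090_ti) in specs.items():
    print(f"{name}: {rtx_4090 / rtx_3090_ti:.2f}x")
```

The CUDA core count grows about 1.5x, while the Tensor teraflop figure quadruples – which is why the shader, Tensor, and ray tracing numbers scale so differently from one another.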

Those are all huge jumps. However, as we’ve seen many times before, simply doubling the core or shader counts doesn’t automatically mean that you’ll see double the performance in games. In this case, though, Nvidia has produced some substantial results, and the improvement I’ve seen in most of my tests is definitely more than just an incremental upgrade. This is a real generational leap forward.

Clearly, this is a card designed to dominate 4K gaming, but creators and professionals should find a lot to love in its 24GB of GDDR6X video memory. It’s overkill for even the most demanding games today, but if you’re creating 3D models, that capacity pays dividends in rendering time. With 24GB on tap, OctaneRender will rarely have to tap into your system’s memory and can instead use the card’s much faster frame buffer, dramatically increasing rendering speed. Likewise, in Blender, you’ll be free to continue working in the viewport while your render completes in the background.

Working within these applications is also significantly improved. Denoising renders is much faster, thanks to the GPU’s AI cores. But if you’re a game developer working in Unity, Unreal Engine, or Nvidia Omniverse, the most impressive improvement may just be the support for DLSS 3. The same benefit to games applies here: real-time, ray-traced renders at a fraction of the performance cost. Nvidia claims up to four times faster rendering performance compared to the RTX 3090 Ti.

If you’re a professional 3D artist, a dedicated workstation GPU will still be the best fit. In the same way RTX cards are optimized for gaming, Nvidia’s workstation cards have specific optimizations for professional creators. But those optimizations come at tremendous price increases. If you’re working for a small firm or are flying solo on a work-from-home project, the RTX 4090 is an excellent middle-ground that will save you thousands of dollars compared to its workstation equivalent.

Another major upgrade this generation is support for AV1 encoding. AV1 is a compression codec that allows you to render and stream video at higher quality with less bandwidth. While the RTX 30-series supported AV1 decoding, encoder support is huge for the creation process. It’s incredibly efficient, so you can increase the video quality of your live streams without also increasing your bandwidth. Likewise, video rendering times in DaVinci Resolve can be cut by up to 50%.

With great power comes great heat, but Nvidia has made advancements in its cooler design too. From the outside, things look mostly the same: it’s a similar dual-axial design to what we saw last generation, with the large, overlapping heatsink that lets the rear fan blow straight through and shed hot air up and out of the case. This time around, Nvidia has opted for larger fans with fluid dynamic bearings to increase airflow and decrease noise, and the spacing between the heatsink fins has been optimized to improve airflow by up to 15%. We were impressed by this design when it debuted on the 30-series; it worked well then, and the same is true now. In our testing, the 4090 peaked at 64.7°C, which is impressive for such a high-performance card using a stock cooler.

Nvidia GeForce RTX 4090 – Gaming Performance

Test system: Z390 Asus ROG Maximus XI Extreme Motherboard, Intel Core i9-9900K CPU (stock), Corsair H115i PRO RGB 280mm AIO CPU Cooler, 32GB Corsair Vengeance RGB Pro DDR4-3200, 1TB Samsung EVO Plus NVMe SSD, Corsair HX1200 1200-watt power supply.

Starting off with our synthetic benchmarks, the 4090 didn’t just arrive on the scene – it positively crashed through the wall like the Kool-Aid Man hopped up on his own juice. In 3DMark Fire Strike Ultra, the 4090 delivered an eye-watering score of 21,872. To put that in perspective, that’s around 50 percent more than the RX 6950 XT’s score of 14,512 – the best we’d recorded in-house until now – and about on par with the previous overclocking world record for Fire Strike Ultra, set with not just one but two GPUs at once.

Unigine Heaven tells the same story: the 4090 finished leaps and bounds ahead of every card we’ve ever tested, with an improvement over the 3090 Ti of 26% at 1080p, 39% at 1440p, and 31% at 4K.

The ray tracing synthetics continue the trend, with the 4090 nearly doubling – and in some cases actually doubling – the scores of the 3090 Ti. These are some seriously impressive numbers.

Moving on to our gaming benchmarks, the 4090 continues to dominate. Our standard four-game test suite consists of Borderlands 3, Gears Tactics, Metro Exodus, and Total War: Three Kingdoms. All tests are run at the highest available graphics preset, with ray tracing and DLSS enabled, if available.

Just as in the synthetic tests, the 4090 takes a considerable lead ahead of every other card we’ve tested, across every game and resolution tested. Of course, it should. This is the top of the line card of a new generation, so only the 3090 Ti – which is the half-step upgrade over the 4090’s generational predecessor – should even have a chance at staying competitive with the new card.

The question is, how much of a lead does the 4090 jump out to? Looking at a wider selection of games specifically at 4K, the 4090 holds a large margin over the 3090 Ti, ranging from a 14% improvement in Metro Exodus all the way up to a 90% margin in Shadow of the Tomb Raider. That means framerates at or above 100 fps across the board – and remember, this is at 4K, max settings, with ray tracing turned on. Again, those are some seriously impressive numbers.

Nvidia GeForce RTX 4090 – DLSS

The 4090 undoubtedly owes some of its success to Nvidia’s continued improvements to DLSS, or Deep Learning Super Sampling. That’s the AI-enabled tech that lets games render at a lower resolution, in this case 1440p, but still output 4K with visual quality that is nearly indistinguishable from native 4K. Early versions of DLSS came with some sacrifices in fidelity due to the AI upsampling wizardry happening behind the curtain, but the tech has since advanced to the point where games look as good, if not better, with DLSS on than with it off.

The latest version, DLSS 3, is exclusive to the 40-series cards, and it introduces a new feature that generates entirely new frames on its own. There’s an extremely technical explanation on Nvidia’s site if you’re interested, but in short, the GPU looks at two sequential frames, calculates the differences between them, and then uses its AI chops to generate a frame in between them.

This is a major advancement compared to last generation and can be downright game changing for fps performance. While DLSS 2 looked at motion vector data and individual frames to intelligently upscale images and improve performance, the RTX 4090’s optical flow accelerator determines paths of motion and creates new frames all by itself, separate from the native frames of the game engine. It works in tandem with Nvidia Reflex to offset the latency that might otherwise make the game feel sluggish. Nvidia’s figures claim that DLSS 3 has the potential to deliver up to 4x the fps when it’s turned on.
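Nvidia hasn’t published the actual algorithm, so treat this as a toy illustration only: a “frame” below is just a grid of brightness values, and the generated frame is a naive per-pixel midpoint of two rendered frames. The real optical flow accelerator and neural network are far more sophisticated, but the basic idea of synthesizing an in-between frame looks something like this:

```python
# Toy illustration of frame interpolation (NOT Nvidia's actual algorithm).
# A "frame" here is a 2D grid of brightness values; the generated frame is
# a simple per-pixel midpoint of two engine-rendered frames. DLSS 3 instead
# uses optical flow and a trained network to handle real on-screen motion.

def interpolate(frame_a, frame_b):
    """Generate a synthetic frame halfway between two rendered frames."""
    return [
        [(a + b) / 2 for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

frame_1 = [[0, 0], [10, 10]]    # rendered by the game engine
frame_2 = [[10, 10], [30, 30]]  # rendered by the game engine

generated = interpolate(frame_1, frame_2)
print(generated)  # [[5.0, 5.0], [20.0, 20.0]]
```

The key point the sketch captures is that the in-between frame is manufactured by the GPU, outside the game engine – which is exactly why it boosts fps without the engine doing any extra simulation work.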

In testing Cyberpunk 2077 running at 4K resolution with max settings, including ray tracing, I saw a significant uplift when using DLSS versus not, and even more so with frame generation turned on. With DLSS disabled, the 4090 scored 41.9 fps in the in-game minute-long benchmark. With DLSS enabled, but frame generation turned off, the score jumped up to 84 fps. With frame generation on…an incredible 136 fps, while still looking absolutely magnificent.
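For the curious, here’s the arithmetic on that uplift, using the benchmark scores above:

```python
# Relative uplift in the Cyberpunk 2077 benchmark, from the scores above.
native = 41.9      # DLSS off
dlss = 84.0        # DLSS on, frame generation off
frame_gen = 136.0  # DLSS on, frame generation on

print(f"DLSS alone: {dlss / native:.2f}x native")           # ~2.00x
print(f"With frame generation: {frame_gen / native:.2f}x")  # ~3.25x
```

So DLSS upscaling alone roughly doubles the framerate, and frame generation pushes the total to well over three times native performance.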


Now, frame generation does come with a slight tradeoff: since the GPU is generating additional frames, it introduces a tiny bit of latency as it inserts those frames into the stream. Remember, these are GPU frames created outside the game engine. On its own, DLSS 3 could potentially look like 120 fps while feeling like 60. But fear not: Nvidia Reflex largely compensates for this latency. In fact, the added latency was imperceptible to me in the bit of Cyberpunk I played with frame generation turned on. That might not be the case in every game, though, and it could be more of a problem in fast-twitch esports titles where every millisecond counts. Still, playing a game as visually demanding as Cyberpunk – at 4K, with ray tracing – at framerates in excess of 120 is truly impressive.

Another thing to be aware of is the fairly steep system requirements Nvidia recommends for the RTX 4090. To push the GPU to its full potential, Nvidia suggests pairing it with the latest processors and DDR5 memory. It also requires DirectX 12 and hardware-accelerated GPU scheduling, and it’s incompatible with v-sync (G-Sync is fine). As you can tell from our test bench, though, the hardware recommendations are not a hard rule, and you, like us, can still benefit if you haven’t upgraded your system recently.

The final piece of the puzzle is, of course, price. Typically new graphics card generations launch with the top end of what I’d consider the mainstream cards – in Nvidia’s case, the 3080, 2080, and 1080 of generations past. This time, though, the 4090 is the first to arrive, meaning I only have the previous generation as a point of comparison. And while we know how the 4080 will be priced – $1,199 for the 16GB variant, $899 for the 12GB variant – it’s impossible to know how those cards will perform until we’ve had a chance to test them ourselves.

So of course the 4090 is going to blow everything out of the water – it’s the enthusiast-level card of a brand-new generation. But is it a good value? That remains to be seen.

Still, compared to everything else you can buy right now, including the 3090 Ti, its numbers are impressive – and that card only just launched earlier this year with a retail price of $2,000, though it has since seen its price drop to as low as $1,100 on sale. I said before that the 4090 holds a lead of at least 14 and up to 90 percent over the 3090 Ti at 4K. Considering those numbers, a 45% premium over the former king’s best sale price doesn’t seem so bad.
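Here’s how that price math shakes out, using the figures quoted in this review:

```python
# Price comparison using the figures quoted in this review.
rtx_4090_msrp = 1599
rtx_3090_ti_msrp = 2000
rtx_3090_ti_sale = 1100

# The 4090 actually undercuts the 3090 Ti's launch price...
print(f"vs. 3090 Ti MSRP: {(rtx_4090_msrp / rtx_3090_ti_msrp - 1) * 100:.0f}%")     # -20%
# ...and carries roughly a 45% premium over its best sale price.
print(f"vs. 3090 Ti on sale: {(rtx_4090_msrp / rtx_3090_ti_sale - 1) * 100:.0f}%")  # 45%
```

In other words, against launch pricing the 4090 is the cheaper card; the 45% premium only appears when you compare it to the 3090 Ti’s deepest discounts.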
