
Far Cry 6 Benchmarks and Performance

Far Cry 6 has arrived, ushering in a new API and enhanced visuals compared to the last two games in the series. This marks the first time Far Cry has used the DirectX 12 API, a requirement for the DirectX Raytracing options. We’ve benchmarked over two dozen AMD and Nvidia GPUs, including most of the best graphics cards, to see how they stack up. Surprisingly, performance isn’t significantly worse than Far Cry 5, at least not on our test system, which appears to run into some CPU bottlenecks.

Officially, the Far Cry 6 system requirements look quite tame, at least for the minimum setup. A Radeon RX 460 or GeForce GTX 960 (both with 4GB VRAM) should suffice for more than 30 fps at 1080p low. Stepping up to 1440p ultra at 60 fps moves the needle quite a bit, with a recommended RX 5700 XT or RTX 2070 Super. Meanwhile, ray tracing at 4K ultra drops back to a 30 fps target, and an RX 6800 or RTX 3080 is recommended — and you really do need at least 10GB VRAM to handle native 4K ultra with DXR.

TOM’S HARDWARE GPU TEST PC

We’ve used our standard graphics test PC, equipped with a Core i9-9900K CPU, to see how the various GPUs actually stack up. That’s a faster CPU than the recommended Core i7-9700 / Ryzen 7 3700X, though we still ran into some bottlenecks.

We used Nvidia’s most recent 472.12 drivers, which are not ‘game ready’ for Far Cry 6 — we asked about getting updated drivers but were not provided any in time for this initial look at performance. Meanwhile, AMD provided early access to its 21.10.1 drivers, which it released to the public on October 4, and those are ‘game ready.’

Like most recent Far Cry games, Far Cry 6 includes a (mostly) convenient built-in benchmark. We’ve used that for our testing, as it minimizes variation and keeps things simple. We tested each card multiple times at each setting, discarding the first run after launching the game — that run usually scores better as the GPU hasn’t warmed up yet and can boost slightly higher.
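For anyone curious how those repeated runs turn into the numbers in the charts, here's a rough Python sketch of the aggregation; it's not our actual tooling, and the fps figures are placeholders:

```python
# Rough sketch of how repeated benchmark runs can be aggregated.
# Not our actual test harness; the fps values below are placeholders.

def aggregate_runs(runs):
    """Discard the first (post-launch) run, then average the rest.

    Each run is a dict with 'avg_fps' and 'min_fps' from the built-in benchmark.
    """
    if len(runs) < 2:
        raise ValueError("Need at least two runs so the first can be discarded")
    kept = runs[1:]  # the first run after launching the game gets thrown out
    avg = sum(r["avg_fps"] for r in kept) / len(kept)
    low = sum(r["min_fps"] for r in kept) / len(kept)
    return round(avg, 1), round(low, 1)

# Placeholder example:
runs = [
    {"avg_fps": 101.2, "min_fps": 84.0},  # discarded first run
    {"avg_fps": 98.7, "min_fps": 81.4},
    {"avg_fps": 99.1, "min_fps": 82.2},
]
print(aggregate_runs(runs))  # (98.9, 81.8)
```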

For this initial look we used a preview version of the game provided by AMD and Ubisoft. We’ve since gone back to retest a few cards with the public release, and didn’t notice any significant changes in performance. We’ve tested at 1080p, 1440p, and 4K using the medium and ultra presets. We left off the HD textures at medium and enabled them for ultra. We also tested at ultra with DXR reflections and DXR shadows enabled. We enabled CAS as well, mostly because I like the way it looks and it has a negligible impact on performance, left off motion blur, and only tested at native resolution — though I did run some additional tests with FidelityFX Super Resolution to see what sort of scaling users can expect.
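Since FSR will come up again later, here's a quick reference for what the standard FSR 1.0 quality modes mean in terms of render resolution, using AMD's published per-axis scale factors (how Far Cry 6 maps its menu options onto these is an assumption on our part):

```python
# FSR 1.0 renders at a reduced internal resolution and upscales to the target.
# Scale factors below are AMD's published per-axis values for FSR 1.0; how
# Far Cry 6 labels its modes may differ slightly.

FSR_SCALES = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def fsr_render_resolution(width, height, mode):
    scale = FSR_SCALES[mode]
    return round(width / scale), round(height / scale)

for mode in FSR_SCALES:
    print(f"{mode}: {fsr_render_resolution(3840, 2160, mode)}")
# Ultra Quality: (2954, 1662)
# Quality: (2560, 1440)
# Balanced: (2259, 1271)
# Performance: (1920, 1080)
```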

We’ll provide a deeper dive into the settings and image quality below, but let’s get straight into the graphics card benchmarks.

(Far Cry 6 1080p medium benchmark results. Image credit: Tom's Hardware)

Did we mention that Far Cry 6 is an AMD-promoted game? We can’t say for certain what’s holding the Nvidia GPUs back, but the fastest cards can’t even hit 144 fps at 1080p medium. Actually, if we’re being frank, none of the cards maintain close to a steady 144 fps experience — AMD’s fastest GPUs average around 144 fps, but minimums all hover in the low 100s. Nvidia’s GPUs currently top out at around 130 fps.

There are also some weird variations among the GPUs at this admittedly low setting. Basically, the CPU bottlenecks combined with different GPU core counts and other elements mean that cards that are normally slower can end up outperforming higher-end models. We see that with AMD’s RX 6700 XT as well as the RTX 2080 Ti. Again, updated Nvidia drivers, in particular, will likely improve the situation, but this is how things currently stand.

We don’t have an RX 460 or GTX 960 still hanging around, so our slowest card for testing is the GTX 1050 Ti. That’s generally a bit faster than the GTX 960 4GB, depending on the game, though judging by the GTX 980 and GTX 1060 6GB results, we suspect the 960 would land relatively close to the 1050 Ti. Anyway, it manages a reasonable 44 fps at 1080p medium, and dropping the settings to low should give it an added boost.

If you’re after 60 fps or more, just about any reasonable mid-range or higher GPU released in the past five years should more than suffice. The GTX 1060 and RX 570 both clear that mark and most of the other GPUs hit triple digits — which is good, since GPU prices are still all kinds of messed up.

(Far Cry 6 1440p medium benchmark results. Image credit: Tom's Hardware)

Most of the tested GPUs take a relatively small hit going from 1080p to 1440p at medium settings. We also start to see performance rankings line up more as we’d expect, though the RX 6800 XT still took top honors for AMD, and the RTX 3080 Ti edged out the RTX 3090. Everything from the RTX 2060 on up clears 60 fps, which means cards like the GTX 1080 and 1070 should manage okay as well at these settings.

I should also note that the GTX 1080 Ti showed some odd behavior initially, and I’ve now retested. I’m not certain what went wrong before, but retesting showed a 20-35% improvement compared to the original numbers. It seems my well-used sample was acting up, though even with the improved results the RTX 2060 still delivered a superior experience at 1080p, which is not normally the case.

This is also the last setting where most of the 4GB cards can deliver a decent experience, and the GTX 1050 Ti basically can’t go much further. It performs about the same at 1080p ultra, but the card was never really intended for high-resolution gaming.

(Far Cry 6 4K medium benchmark results. Image credit: Tom's Hardware)

Finally, we get the usual range of performance and rankings that we’d expect, with a few minor quibbles. The RTX 3080 Ti still edges out the 3090 — and we’re using the Founders Edition models, so it’s not like we have an exceptionally overclocked 3080 Ti, or even a great cooler for that matter — but nearly everything else lands in the usual spot. It’s finally worth discussing the AMD vs. Nvidia standings, as CPU bottlenecks aren’t a factor here.

The top three cards all hit 109 fps, with AMD still holding a slight edge with the RX 6900 XT over Nvidia’s RTX 3080 Ti and RTX 3090. This is also a resolution and setting combination that still runs okay on cards with 6GB VRAM, provided the GPU has enough gas in the tank. We’re not using the HD textures for our medium quality testing either, which helps a bit.

4K does tend to penalize AMD’s RDNA2 architecture a bit more relative to lower resolutions, as raw memory bandwidth starts to win out over the Infinity Cache. The RX 6700 XT falls below the RTX 3060 Ti, whereas it basically matched the RTX 3070 Ti at 1440p medium. The RX 6600 XT likewise falls behind the RTX 3060, though it still beats the previous generation RTX 2060 at least.

At the bottom of the chart, the RX 5600 XT and above easily break 30 fps, but both the RX 580 8GB and GTX 1060 6GB come up short. None of the cards with only 4GB VRAM could manage even 20 fps, so we dropped them from the charts.

(Far Cry 6 1080p ultra benchmark results. Image credit: Tom's Hardware)

Kicking the settings up to ultra quality and enabling the HD texture pack generally results in slightly worse performance than 1440p medium, though interestingly, there are some exceptions to that rule. Roughly the bottom half of the chart consists of GPUs that performed better at 1080p ultra than at 1440p medium, while the cards in the top half appear to hit a different bottleneck.

Ultra quality increases the load on the CPU, so the maximum fps on our i9-9900K drops from around 144 at medium to 123 fps at ultra. Of course, other factors may also come into play, but for the most part, you can choose between 1440p medium and 1080p ultra — or perhaps 1440p ultra with FSR enabled — and get a similar experience.
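A simple way to think about those numbers: the frame rate you actually see is roughly the lower of what the CPU and GPU can each sustain on their own. Here's a minimal sketch using the approximate CPU ceilings from our testing and made-up GPU figures purely for illustration:

```python
# Back-of-the-envelope bottleneck model: effective fps is roughly the lower of
# the CPU's and the GPU's individual limits. CPU ceilings are the approximate
# values from our i9-9900K testing; the GPU numbers are hypothetical.

def effective_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu_ceiling = {"1080p medium": 144, "1080p ultra": 123}
hypothetical_gpu_fps = {"1080p medium": 170, "1080p ultra": 115}

for preset, cpu_fps in cpu_ceiling.items():
    gpu_fps = hypothetical_gpu_fps[preset]
    fps = effective_fps(cpu_fps, gpu_fps)
    limiter = "CPU" if cpu_fps <= gpu_fps else "GPU"
    print(f"{preset}: ~{fps} fps ({limiter} limited)")
# 1080p medium: ~144 fps (CPU limited)
# 1080p ultra: ~115 fps (GPU limited)
```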

(Far Cry 6 1440p ultra benchmark results. Image credit: Tom's Hardware)

Depending on your GPU, 1440p ultra roughly equals 4K medium performance, but now we get even more oddities, particularly with the minimum fps on the Nvidia GPUs. Again, these are pretty clearly driver issues that are likely to be ironed out, as the RTX 3080 had the highest average fps but a lower minimum fps than the next six Nvidia GPUs.

AMD’s GPUs continue to hold the pole positions, and while the gap isn’t quite as egregious as in Assassin’s Creed Valhalla — another Ubisoft and AMD-promoted game that favors AMD’s latest architecture — it’s still larger than in most other recent games.

I also left in one 4GB card here, the RX 570, as an example of how badly performance can tank when you exceed the VRAM. That card did okay at 1080p ultra, but performance drops by more than half moving up to 1440p. Use a card with 6GB, and performance only drops about 25–30%.

(Far Cry 6 4K ultra benchmark results. Image credit: Tom's Hardware)

As you’d expect, 4K ultra punishes a lot of the GPUs. In fact, anything with less than 8GB VRAM basically fails to break 30 fps, often badly. In GPU-limited situations, you can generally expect 4K ultra to run 40–50% slower than 1440p ultra, and that’s when you don’t run out of VRAM.
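That 40–50% figure lines up with simple pixel math: 4K pushes 2.25 times as many pixels as 1440p, so if performance scaled perfectly with pixel count you'd expect roughly 44% of the 1440p frame rate in the worst case, and real results usually land a bit above that.

```python
# Pixel-count arithmetic behind the resolution scaling. Actual scaling is rarely
# perfectly linear with pixel count, so treat the ratios as a rough upper bound
# on the performance hit.

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["4K"] / pixels["1440p"])        # 2.25x the pixels of 1440p
print(pixels["1440p"] / pixels["1080p"])     # ~1.78x the pixels of 1080p
print(1 / (pixels["4K"] / pixels["1440p"]))  # ~0.44, the worst-case fps ratio
```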

This is also the first time the RTX 3090 has delivered better performance than the RTX 3080 Ti. While 12GB VRAM should be enough, Far Cry 6 at ultra settings starts to benefit from having 16GB or more.

Sixty fps at 4K is still possible on quite a few cards, though nothing that nominally costs under $500. And we haven’t even maxed out the settings yet with ray tracing, so let’s just go ahead and do that next.

(Far Cry 6 1080p ultra with DXR benchmark results. Image credit: Tom's Hardware)

Considering Far Cry 6 has clearly been hitting some CPU bottlenecks at 1080p so far, the penalty for enabling ray traced shadows and reflections perhaps isn’t that bad. Of course, part of that is thanks to the hybrid reflections that the game uses, combining SSR (screen space reflections) with RT to reduce the performance hit.

Considering the developers decided to not support ray tracing on the latest consoles, we were more than a bit surprised to see just how well Far Cry 6 runs with all the settings cranked to 11. Even the old RTX 2060 still managed to break 60 fps, which makes us wonder why Ubisoft didn’t include RT (and FSR) on the PS5 and Xbox Series X — both of those should be more potent than the RX 6600 XT, which also delivered over 60 fps.

Compared to running without the DXR effects, performance drops about 20% on most of the GPUs. There are also still CPU limits in play, as evidenced by the wall at around 110 fps on the Nvidia GPUs and 120 fps on the top AMD GPUs. That’s more than a bit surprising since ray tracing usually pushes the bottleneck so far toward the GPU side of the equation that the CPU no longer matters. But in open world games like the Far Cry series, even with all the graphics effects turned on, 1080p can still end up being CPU limited.

We’re planning to do some additional testing and look at CPU scaling, but we couldn’t get that done in time for the initial launch. Plus, we’ll likely need to redo a lot of these tests once updated drivers come out and the game gets a few patches, but so far, it looks like Far Cry 6 continues the series’ legacy of being CPU limited.

(Far Cry 6 1440p ultra with DXR benchmark results. Image credit: Tom's Hardware)

All GPUs technically remain playable even at 1440p with DXR enabled, though the RTX 2060, RTX 3060, and RX 6600 XT all fall below 60 fps now. The RTX 2080 Ti also shows yet again that the Ampere architecture really did improve performance in a lot of ways, with the RTX 3070 coming in 3 fps above the former heavyweight champion.

We should note that the RT effects really aren’t all that visually impressive. That partly explains the relatively small performance hit. Certain things look better with DXR enabled, but you can easily turn off all the ray tracing stuff and not really feel like you’re missing out. SSR still provides simulated reflections on puddles and such, so all you get are a few other objects reflected that SSR doesn’t handle.

The shadows, on the other hand, remain mostly a waste of computational effort in my book. Sure, they’re likely more accurate, but the default shadow mapping techniques still look fine to my eyes. The result of doing less complex ray tracing work is that Nvidia’s RT hardware in the RTX 30-series cannot shine quite so brightly, so AMD’s top RX 6000 cards still hold the top spots.

And not to beat a dead horse, but drivers are yet again a concern for Nvidia. The RTX 3080 takes the top spot among the Nvidia cards at 1440p with DXR, ahead of the RTX 3080 Ti and 3090, which basically makes no sense. But the Nvidia GPUs are limited to under 90 fps by the CPU or some other factor, while AMD’s GPUs can hit closer to 100 fps.

(Far Cry 6 4K ultra with DXR benchmark results. Image credit: Tom's Hardware)

Last but not least, we have truly maxed out settings at 4K ultra with DXR. For the first time, Nvidia’s RTX 3080 Ti and RTX 3090 claim the top spots, and the RTX 3080 edges the RX 6800 XT thanks to its higher minimum fps — though minimum fps still varies quite a bit more than we’re used to and will likely improve with a patch or two.

Far Cry 6 says you need at least 11GB VRAM for 4K ultra with DXR, and it’s mostly correct, though 10GB apparently will suffice. The 8GB cards, meanwhile, all dropped below 20 fps, or even 10 fps in several attempts to get the benchmark to run. You pretty much don’t want to even try 4K with ray tracing in this game unless you have a card with at least 10GB VRAM. Or maybe we’ll see drivers and patches improve this as well.

The game does have a memory usage bar, which suggests 4K ultra DXR only needs 6.82GB VRAM. That clearly isn’t accurate, however, based on what happened to the 8GB GPUs we tried.
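If you want a rule of thumb, here's a small sketch that encodes the VRAM floors we observed in testing rather than the game's own estimate; the numbers are our empirical take, not official requirements:

```python
# Practical VRAM floors based on what we saw in testing, not on the in-game
# estimate (which claimed 6.82GB for 4K ultra with DXR). Rough observations,
# not official requirements.

OBSERVED_VRAM_FLOOR_GB = {
    ("4K", "ultra + DXR"): 10,  # 8GB cards fell below 20 fps; 10GB was fine
    ("4K", "ultra"): 8,         # cards under 8GB couldn't break 30 fps
    ("1440p", "ultra"): 6,      # 4GB cards tanked, 6GB cards were okay
}

def likely_ok(card_vram_gb, resolution, preset):
    floor = OBSERVED_VRAM_FLOOR_GB.get((resolution, preset))
    return floor is not None and card_vram_gb >= floor

print(likely_ok(8, "4K", "ultra + DXR"))   # False
print(likely_ok(10, "4K", "ultra + DXR"))  # True
```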

Far Cry 6 Settings Analysis

(Far Cry 6 settings performance charts for the RTX 3060 and RX 6700 XT. Image credit: Tom's Hardware)

Far Cry 6 has about a dozen graphics settings you can tweak, depending on how you want to count and what GPU you’re using. Here we’ve taken the RTX 3060 and RX 6700 XT — both with 12GB VRAM, so that we won’t hit memory bottlenecks — and tested performance using the ultra quality preset with HD textures enabled. Then we’ve gone through and turned each individual setting to its minimum value and run the benchmark again to see how performance changed. In some cases (motion blur, DXR reflections, and DXR shadows) we’ve instead turned on a setting that was previously disabled, which typically drops performance.

The charts are color coded with the ultra preset in green, settings that caused more than a 4% increase in framerates in blue, and settings that caused more than a 4% drop in performance in red. All the gray bars are settings that caused less than a 4% change in performance, which basically means you shouldn’t worry about tweaking those unless you specifically don’t like the way they make the game look.
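For clarity, here's a minimal sketch of that bucketing logic; the setting names and fps values are placeholders, not measured results:

```python
# Bucketing a toggled setting's result against the ultra-preset baseline.
# Anything within +/-4% is treated as noise (gray). Values are placeholders.

def classify(baseline_fps, setting_fps, threshold=0.04):
    change = (setting_fps - baseline_fps) / baseline_fps
    if change > threshold:
        return "blue (more than 4% faster)"
    if change < -threshold:
        return "red (more than 4% slower)"
    return "gray (within 4%)"

baseline = 80.0  # ultra preset, placeholder
examples = {"Shadows: Low": 88.0, "Motion Blur: On": 79.5, "DXR Reflections: On": 70.0}
for name, fps in examples.items():
    print(f"{name}: {classify(baseline, fps)}")
# Shadows: Low: blue (more than 4% faster)
# Motion Blur: On: gray (within 4%)
# DXR Reflections: On: red (more than 4% slower)
```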

(Image gallery: 18 Far Cry 6 settings menu screenshots. Image credit: Ubisoft)
