
Far Cry 5 Benchmark Performance Analysis

W1zzard

Far Cry 5 takes the fight to crazy Christian fundamentalists in rural Montana who are just waiting for the US Government to collapse. We tested 15 cards using today's latest Game Ready drivers, with actual in-game gameplay for the most accurate picture. AMD Radeon performance results are impressive, often beating their NVIDIA counterparts substantially.

 
Thanks for including the RX 480 in the charts. I feel it gets overlooked a lot because of the RX 580, and your charts show the performance isn't identical :)
 
Feel the difference:

Far Cry 5, an AMD-sponsored game: runs decently on both manufacturers' cards.

Far Cry 4, an NV-sponsored game: "Ubisoft continued its partnership with nVidia into Far Cry 4, featuring inclusion of soft shadows, HBAO+, fine-tuned god rays and lighting FX, and other GameWorks-enabled technologies. Perhaps in tow of this partnership, we found AMD cards suffered substantially with Far Cry 4 on PC."

Great work Ubisoft and AMD!
 
Well, it ain't using the wonderful, wonderful GameDoesn'tWork, so of course it's gonna perform well. Heck, both sides will do exactly that.
 
Always enjoyed this series.
 
"unsimilar"

"requimeents"

I am disappoint.
 
I see Ubi still hasn't figured out what a modern CPU is. Long live single core!

They probably use the in-game benchmark; we don't.

Those are fantastically Nvidia-biased for whatever reason. I swear it's a night-and-day difference in most games.
 
They probably use the in-game benchmark; we don't.

Well, should that not be the standard? If everybody does whatever they please, what is the point?
 
Well, should that not be the standard? If everybody does whatever they please, what is the point?
Is your experience with the game going to be the game itself or the benchmark?
 
Is your experience with the game going to be the game itself or the benchmark?

The game, obviously, but how can we believe test results if there is no standard?

Then please upload the test (the demo that was used), so that independent systems can verify the results?
 
Feel the difference:

Far Cry 5, an AMD-sponsored game: runs decently on both manufacturers' cards.

Far Cry 4, an NV-sponsored game: "Ubisoft continued its partnership with nVidia into Far Cry 4, featuring inclusion of soft shadows, HBAO+, fine-tuned god rays and lighting FX, and other GameWorks-enabled technologies. Perhaps in tow of this partnership, we found AMD cards suffered substantially with Far Cry 4 on PC."

Great work Ubisoft and AMD!

Dude, FFXV with GameWorks runs the same as FC5 on a 1080 Ti at 4K. So how does FC5 have awesome performance while FFXV doesn't?


I see Ubi still hasn't figured out what a modern CPU is. Long live single core!

Those are fantastically Nvidia-biased for whatever reason. I swear it's a night-and-day difference in most games.

Why isn't it AMD's fault for not using DX12 and bringing awesome multi-core scaling? :laugh:

Let me guess: somehow Nvidia managed to add a benchmark tool that doesn't represent real gaming performance to an AMD-sponsored game?
 
The game, obviously, but how can we believe test results if there is no standard?

Then please upload the test (the demo that was used), so that independent systems can verify the results?
I prefer not to publish details of my test scenes. You'll have to trust me that I try my best to properly represent typical game performance in my benchmarks.
 
I prefer not to publish details of my test scenes. You'll have to trust me that I try my best to properly represent typical game performance in my benchmarks.

Plenty of YouTubers upload gameplay vids with an FPS counter if dingaling doesn't believe you.

Don't you know that you are an Nvidia, Intel and AMD shill at the same time?! Lol
 
I see Ubi still hasn't figured out what a modern CPU is. Long live single core!
Actually, Ubi can use more than a single core, as AC:O proves.
The thing is, they have three equally active engines: Snowdrop (The Division), Anvil (Origins), and Dunia (FC5).
Dunia, which grew out of CryEngine and the original Far Cry, has been used (in different forms) in every FC game since; hence its poor multithreading. I'm guessing they didn't bother too much with further CPU-side upgrades.
 
Great performance and optimisation there. Thanks for the test, @W1zzard. I hope more games are this well made in this respect from now on.
 
Hmm, the numbers don't match my Vega 64 and 1800X.

At 5K ultra settings I got 34 FPS with dips to 26, and at 4K I am getting 58 with dips to 43 FPS.

My 5K numbers match your 4K results. Maybe it's CPU performance.

Vega 64, 64 GB of 3200 MHz CL14 RAM, and an 1800X CPU on a Crosshair VI Hero. OS on M.2, and the game drive is 3gbx2 RAID.
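(A quick sanity check, assuming "5K" here means 5120x2880: that's 1.78x the pixels of 4K, and 58 FPS / 1.78 ≈ 33 FPS, right around the 34 FPS reported, so the 4K-to-5K scaling itself looks GPU-bound.)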
 

Attachments

  • 6332084D-AE99-4AC3-A3C8-D372C8FF4FD2.jpeg
  • 35BF742C-0255-440E-9D78-F0DF35F4BFC7.jpeg
Impressive showing from the GeForces on an AMD title. Lost heroically =)
 
Impressive showing from the GeForces on an AMD title. Lost heroically =)

No gimping involved... unlike when it's the other way around. *cough* Crysis 2 *cough*
 
No gimping involved... unlike when it's the other way around. *cough* Crysis 2 *cough*

Speaking with insight and backed up with facts. Vega 56 beating the 1080 suggests otherwise :)

I agree though, the Crysis 2 over-tessellation was just silly. I wouldn't be surprised if the developer is "required" to provide superior performance for the sponsor.
 
Speaking with insight and backed up with facts. Vega 56 beating the 1080 suggests otherwise :)

I agree though, the Crysis 2 over-tessellation was just silly. I wouldn't be surprised if the developer is "required" to provide superior performance for the sponsor.

There are plenty of AMD titles where AMD cards come out only marginally ahead, if at all. Nvidia simply GimpWorks the shit out of everything else, or the devs are lazy as hell. These results are what you get from something that has been polished (and it should be, after the number of games Ubi has made on the same engines).
 
I find some of the results curious as well. My 1080 Ti produces a constant 50-60 FPS at 4K, so I find the conclusion that you need CF or SLI for 4K a bit misleading. I have been playing at 4K Ultra for several hours now and the experience is butter smooth.

Actually, the 40 FPS reported for the 1080 Ti in the review is roughly what I got when I set the resolution scale to 1.2.

Really curious to know where the difference could be coming from! Maybe a really heavy actual-gameplay scene, since it's not the in-game benchmark being used?
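(For what it's worth, a back-of-the-envelope check, assuming the resolution scale applies per axis: a 1.2 scale renders 1.2 x 1.2 = 1.44x the pixels of native 4K, and 58 FPS / 1.44 ≈ 40 FPS, which lines up with the review's 40 FPS almost exactly.)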
 