
Horizon Forbidden West Performance Benchmark

Why is the 4080 Super missing from the benchmarks? I bought one a few weeks ago and would love to see how it stacks up.
 
Is this test system reasonable? It seems madly overclocked to me. 330 W and 6 GHz P-cores? This is chiller territory. I can't imagine how that 280 AIO can cool it.


330 W is nowhere in Intel's specs, and 6 GHz is the maximum frequency for two selected cores; the CPU will never run at that frequency under a gaming load.

When the test system specs said 6 GHz, I presumed it was overclocked to 6 GHz on all P-cores, which IS A MAD OVERCLOCK!

I assume the author simply listed Intel's spec boost clock for two cores, which is 6 GHz. By default it will run 5.7 GHz on all P-cores while gaming, and it won't consume anywhere near the raised PL1/PL2 limit of 330 W, especially in a single-player title putting out FPS this low; likely under 125 W most of the time. Even my 14900KS at 5.9 GHz on all P-cores runs around that or lower in single-player titles. It only pushes the wattage up during first boot and shader compilation, or if I'm running 1080p low settings at 350+ FPS. Either way, it's easily cooled with an AIO, not even close to chiller territory lol.
 
I am curious why the 14900KS is used as well; the 7800X3D is still faster, and it's surely a much more commonly used CPU.
 
Do you guys like how this game looks? It feels weird to me at 4K Ultra; some scenes are amazing, but there are many very ugly textures, like tree roots, some vegetation, and rocks. Sometimes it feels like you're looking at PS3/PS4 textures.
 
GameGPU doesn't actually test these; the numbers are approximated using math.
Funnily enough, most people don't know they simulate the numbers.

Because it is 2% better than the normal 4080; they are almost the same.
Just add 2% to the numbers and you have the results.
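If that's really all they do, it's trivial to reproduce. A minimal sketch of that kind of scaling (the FPS numbers here are made up purely for illustration):

Code:
# Hypothetical illustration of deriving "4080 Super" results by scaling
# plain 4080 numbers up by the ~2% average gap between the two cards.
rtx_4080_fps = {"1080p": 152.0, "1440p": 118.0, "2160p": 67.0}  # invented numbers

SUPER_UPLIFT = 1.02  # ~2% faster on average

rtx_4080_super_fps = {res: round(fps * SUPER_UPLIFT, 1)
                      for res, fps in rtx_4080_fps.items()}
print(rtx_4080_super_fps)  # {'1080p': 155.0, '1440p': 120.4, '2160p': 68.3}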
 
I am curious why the 14900KS is used as well; the 7800X3D is still faster, and it's surely a much more commonly used CPU.
The 14900K performs well; it's the second-best CPU for gaming.

I would be interested in how the RATIOS BETWEEN GPU PERFORMANCE change with different CPUs.

Say we used a 14900K, 14600K, 7950X, 7800X3D, and 7800X and wanted to judge just how each GPU compares to the others: would using some of these CPUs make any significant difference in the results?
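One way to sanity-check it: normalize each GPU's FPS to a baseline GPU separately for every CPU, and see whether the ratios move. A rough sketch (all FPS numbers invented for illustration, not real benchmark data):

Code:
# Hypothetical results: {cpu: {gpu: avg_fps}}
results = {
    "14900K":  {"RTX 4090": 120.0, "RX 7900 XTX": 102.0, "RTX 4070": 64.0},
    "7800X3D": {"RTX 4090": 124.0, "RX 7900 XTX": 105.0, "RTX 4070": 64.5},
}

for cpu, fps in results.items():
    base = fps["RTX 4090"]  # normalize to the fastest card
    ratios = {gpu: round(v / base, 3) for gpu, v in fps.items()}
    print(cpu, ratios)

If the ratio columns match across CPUs, the test is GPU-bound and the CPU choice barely affects the comparison; diverging ratios would mean the slower CPU is capping the faster cards.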
 
Not the worst port, although I don't consider an RTX 4060 a reasonable requirement to hit 1080p 60 FPS. Still not as bad as other recent titles. Can't help but feel that if there were no DLSS/FSR cheat mode, something like a GTX 1060 would manage 1080p 60 FPS instead.
 
Not the worst port, although I don't consider an RTX 4060 a reasonable requirement to hit 1080p 60 FPS. Still not as bad as other recent titles. Can't help but feel that if there were no DLSS/FSR cheat mode, something like a GTX 1060 would manage 1080p 60 FPS instead.

At max settings, that is. Lower them to High or Medium and the game still looks very good, and more cards will reach 60 FPS. And that's before optimizing individual settings.

And have a look at the GTX 1060 6 GB review here: at launch there were already a number of games it couldn't run at 60 FPS on max settings. It wasn't even that good when it launched 8 years ago, in 2016, and the 1060 was my first GPU. It was great, but it still had limitations.
 
72 vs. 60 CUs at similar average clocks, twice the Infinity Cache.

Considering how AMD most probably missed their target clocks on the N4 GCDs by a fair margin, I think the 7800XT only really pulls ahead of the 6800XT when there's driver optimization for RDNA3.
If there's no hand-written code from AMD in the driver to make use of the dual-issue FP32 ALUs, the 7800XT will stay behind the 6800XT.
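The on-paper numbers illustrate why. A back-of-the-envelope in Python (the clocks are approximate boost clocks):

Code:
def fp32_tflops(cus, clock_ghz, dual_issue=False):
    # 64 FP32 lanes per CU, 2 FLOPs per FMA; RDNA3 can issue a second
    # FP32 op per lane only when the compiler finds pairable instructions.
    lanes = 64 * (2 if dual_issue else 1)
    return cus * lanes * 2 * clock_ghz / 1000

print(fp32_tflops(72, 2.25))                   # 6800 XT              ~20.7 TFLOPS
print(fp32_tflops(60, 2.43))                   # 7800 XT single-issue ~18.7 TFLOPS
print(fp32_tflops(60, 2.43, dual_issue=True))  # 7800 XT dual-issue   ~37.3 TFLOPS

Without dual-issue the 7800XT is slower than the 6800XT on paper, which lines up with the results here.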
You are right about the 50% for the Infinity Cache. In the 7800 XT the Infinity Cache is clocked at 2.5 GHz compared to 1.94 GHz in the 6800 XT, so it has higher bandwidth, and the difference is visible in the new games built on Unreal Engine 5.
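Assuming the cache bus width is unchanged between the two (an assumption on my part), the clock ratio alone gives the bandwidth gain:

Code:
ic_7800xt_ghz = 2.50
ic_6800xt_ghz = 1.94
print(f"{ic_7800xt_ghz / ic_6800xt_ghz - 1:.1%}")  # ~28.9% higher cache bandwidth at equal width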

As in the first game, AMD's platform produces better numbers with SMT off.
 

Attachments

  • Annotation 2024-03-25 045515.png
I wish there was a built-in benchmark mode; so few games have one while also being a good port (and a good game, with upscalers etc.).
 
I'm surprised to see that the 6800XT is consistently faster than the 7800XT
Because you've been conned by the naming. The 7800XT is the 6700XT replacement, plain and simple, and is the true 7700XT, but AMD had to play Nvidia's egregious renaming game. If the 7800XT had been released as the 7700XT at $479, it would have been a smash hit: it would have delivered more memory, more bandwidth, and a significant jump in performance. Calling it a 7800XT and seeing a few % in raster and maybe 10% in RT over the 6800XT makes it a bit of a joke IMO. Just not as big a joke as Nvidia's 4050, erm, 4060 offerings.

Now that the 7900 GRE has been ungimped a bit, it's a fair bit more appealing than the 7800XT, but it also isn't allowed to perform in line with its hardware.
 
Now that the 7900 GRE has been ungimped a bit, it's a fair bit more appealing than the 7800XT, but it also isn't allowed to perform in line with its hardware.
IDK about that; the 7800 XT can OC by a similar % compared to the GRE, so with both OCed, the two cards are back where they were relative to each other.
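That follows from simple math: if both cards gain roughly the same percentage, the ratio between them is unchanged (the FPS numbers below are hypothetical):

Code:
# (a * (1 + x)) / (b * (1 + x)) == a / b, independent of the OC headroom x.
gre_fps, xt_fps, oc = 100.0, 95.0, 0.07  # invented stock FPS and ~7% OC
print(gre_fps / xt_fps)                            # ~1.053 stock
print((gre_fps * (1 + oc)) / (xt_fps * (1 + oc)))  # ~1.053 after equal OC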
 
the 7800XT will stay behind the 6800XT.
The most recent 7900 GRE review gives a slight edge to the 7800 XT in raster across all resolutions. In RT that gap widens.

I also don't know who in their right mind would buy a 6800 XT today, when it's likely to have its driver support cut short three years before the 7800 XT's.
 
HFW came out in 2022 though as a PS5 exclusive
Well, I bought it on PS4 at release; it was a cross-gen title (the last with a free PS5 upgrade, which explains the lack of RT). Only Burning Shores was a PS5 exclusive.
 
One thing I've noticed is that using DLSS in anything other than Quality mode breaks SSR, especially Performance and Ultra Performance mode. Additionally, Ultra Performance DLSS removes a lot of shadow detail. So 8K is off the table for now, until performance improves, this stuff gets fixed, or the 5090 arrives.

Such a shame, because 8K does look gorgeously detailed vs 4K.

4K DLAA: [screenshot]

8K DLSS Ultra Performance: [screenshot hosted on OneDrive; see note below]

Imgur kept changing the resolution of that image to 5K with dogsh1t quality, so I uploaded it to OneDrive instead, which apparently can't be embedded as an image on the forum, but it does show the full quality once opened and fully loaded.

PS: the images are meant to show the loss of shadow detail on grass and rocks, and SSR half-breaking on the water, in the 8K Ultra Performance DLSS picture.
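For reference, the internal render resolutions explain part of it: DLSS Ultra Performance renders at 1/3 scale per axis, so 8K UP reconstructs from fewer pixels than 4K DLAA renders natively. A quick check:

Code:
# Per-axis DLSS scale factors: Quality 2/3, Performance 1/2, Ultra Perf 1/3;
# DLAA runs at native resolution (scale 1.0).
def internal_res(w, h, scale):
    return round(w * scale), round(h * scale)

print(internal_res(3840, 2160, 1.0))    # 4K DLAA -> (3840, 2160), ~8.3 MP
print(internal_res(7680, 4320, 1 / 3))  # 8K UP   -> (2560, 1440), ~3.7 MP

So an 8K Ultra Performance frame starts from less than half the pixels of a 4K DLAA frame, which is plausibly why SSR and shadow detail suffer.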
 
One thing I've noticed is that using DLSS in anything other than Quality mode breaks SSR, especially Performance and Ultra Performance mode. Additionally, Ultra Performance DLSS removes a lot of shadow detail. So 8K is off the table for now, until performance improves, this stuff gets fixed, or the 5090 arrives.

Such a shame, because 8K does look gorgeously detailed vs 4K.

4K DLAA: [screenshot]

8K DLSS Ultra Performance: [screenshot hosted on OneDrive; see note below]

Imgur kept changing the resolution of that image to 5K with dogsh1t quality, so I uploaded it to OneDrive instead, which apparently can't be embedded as an image on the forum, but it does show the full quality once opened and fully loaded.
IMO, DLAA at 4K is more practical than 8K set to whatever, and it looks better than 4K DLSS set to Quality.
 
IMO, DLAA at 4K is more practical than 8K set to whatever, and it looks better than 4K DLSS set to Quality.

Lol, more practical... yeah, and 720p is more practical than 1080p.

Obviously, when I'm using 8K, it's because image quality is what matters to me.
 
Lol, more practical... yeah, and 720p is more practical than 1080p.

Obviously, when I'm using 8K, it's because image quality is what matters to me.
I recall Linus did a blind test between 4K native and 8K, and the result was that there was no perceptible visual difference across the small sample provided.


I mean, most enthusiasts today would prefer 4K DLAA at 120 Hz over 8K DLSS set to Ultra Performance at 60 FPS. There is also no 8K display above 60 Hz today. The industry is moving towards 4K 240 Hz before 8K is even considered. Sure, you can subjectively prefer 8K DLSS set to Ultra Performance, at the cost of wasted resources, with objectively no superior image quality over 4K native DLAA.
 
I recall Linus did a blind test between 4K native and 8K, and the result was that there was no perceptible visual difference across the small sample provided.

I mean, most enthusiasts today would prefer 4K DLAA at 120 Hz over 8K DLSS set to Ultra Performance at 60 FPS. There is also no 8K display above 60 Hz today. The industry is moving towards 4K 240 Hz before 8K is even considered. Sure, you can subjectively prefer 8K DLSS set to Ultra Performance, at the cost of wasted resources, with objectively no superior image quality over 4K native DLAA.

Blessed are the ignorant (and evidently blind) - got nothing more to add.
 
Blessed are the ignorant (and evidently blind) - got nothing more to add.
I wasn't trying to trigger you, but can you at least fix the broken links so we can be biased together?

Update: the link works, my bad. Here is 4K DLAA at max settings.

 
I wasn't trying to trigger you, but can you at least fix the broken links so we can be biased together?

Update: the link works, my bad. Here is 4K DLAA at max settings.


It simply gets tiresome to get comments like this every time I post something, and always from people who haven't actually seen 8K themselves.

It's exactly the same situation as 10 years ago, when I was posting 4K shots and everyone was like "4K is so stupid, it barely looks any better than 1080p!!!", and just like now, they were talking out of their arses.

The fact of the matter is that there is a large difference in image quality between 4K and 8K (which is very evident in the screenshots I posted, even if you have to zoom in a lot to see it on a crappy 1080p display...).
Whether or not you can see the difference, however, is entirely down to your eyesight, and lots of people have poor eyesight: they either need glasses or have poorly fitted ones. Reminds me of getting a new TV for my mother: I replaced her old 720p TV with a nice 4K one, and she couldn't tell the difference at all. And I don't doubt that she couldn't; she simply has poor eyesight.
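Viewing distance matters as much as eyesight; pixels per degree is the usual way to quantify it. A quick sketch (the display size and distance are assumed for illustration):

Code:
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    # Horizontal field of view subtended by the screen, in degrees
    fov = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / fov

# A 65" 16:9 panel is ~1.44 m wide; ~20/20 vision resolves roughly 60 PPD.
print(pixels_per_degree(3840, 1.44, 2.0))  # 4K at 2 m -> ~97 PPD
print(pixels_per_degree(7680, 1.44, 2.0))  # 8K at 2 m -> ~194 PPD

Both are past the classic 60 PPD threshold at that distance, so seeing the extra 8K detail really does come down to sitting closer or having sharper-than-average eyes.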

And I'm well aware of what 4K DLAA looks like, thanks.
 
It simply gets tiresome to get comments like this every time I post something, and always from people who haven't actually seen 8K themselves.

It's exactly the same situation as 10 years ago, when I was posting 4K shots and everyone was like "4K is so stupid, it barely looks any better than 1080p!!!", and just like now, they were talking out of their arses.

The fact of the matter is that there is a large difference in image quality between 4K and 8K (which is very evident in the screenshots I posted, even if you have to zoom in a lot to see it on a crappy 1080p display...).
Whether or not you can see the difference, however, is entirely down to your eyesight, and lots of people have poor eyesight: they either need glasses or have poorly fitted ones. Reminds me of getting a new TV for my mother: I replaced her old 720p TV with a nice 4K one, and she couldn't tell the difference at all. And I don't doubt that she couldn't; she simply has poor eyesight.

And I'm well aware of what 4K DLAA looks like, thanks.
Which display do you have? Also, how close do you sit with your magnifying glass?
 
Which display do you have? Also, how close do you sit with your magnifying glass?

Classic case of "I haven't seen what you're talking about, but what you have is deffo not better than what I have!!!!!11".

Not gonna entertain your ignorance any further.
 
Classic case of "I haven't seen what you're talking about, but what you have is deffo not better than what I have!!!!!11".

Not gonna entertain your ignorance any further.
Sure, let others decide which is subjectively superior: the images you posted vs. the 4K DLAA video I posted.
A still image vs. 4K DLAA at 90 to 100 FPS; this should be interesting. I love this.
 