
Final Fantasy XVI Performance Benchmark

I'm guessing something is off with the transparency or lighting. Even at 1080p, the game seems to be VRAM bandwidth starved, not capacity starved.
 
From what I can see in the images, the performance (or lack of it) is not justified by visual fidelity.
 
Final Fantasy XV ran just as dodgy on my Vega Frontier Edition back in the day (the 16 GB Vega 64); next-gen cards should have a much easier time.
I remember "testing" how it runs on a 980 Ti years ago and ended up testing for about 80 hours. :D
 
I played it to completion - my save has over 416 hours on it, Noctis and his gang at level 120, etc.
 
I'm glad I stepped down from 4K to 1440p ultrawide, as even my 4090 struggles in the latest games at 4K unless I use DLSS and frame generation. I'd rather use DLAA and no frame generation personally.

Even though this isn't as technically impressive as the latest UE5 games and uses pretty terrible SSR, I still find the image a lot more stable than BMW, and the jump over the disgustingly blurry PS5 version is nice.
I don't blame you; the 4090 still struggles in some titles at 4K.

For me personally, I prefer 1440p UW at 100+ fps over 4K at 60.
 
I can't stand it when mediocre-looking games are stupidly demanding. Square up your game.
 
That's kinda been the MO over the past 2-3 years: minor visual improvements for massively increased hardware requirements.

The best looking games no longer perform well on any hardware without upscaling. Honestly, the last visually impressive game that actually performed well is probably CP2077, and that was 4 years ago already.

To be honest, Final Fantasy XV was more impressive at launch, and it came out 8 years ago...

With the majority of developers switching to UE5, I don't expect that to change much going forward.

Visuals are subjective, though, and something doesn't have to be a technical marvel to be visually appealing.

To me, this game looks good, but I'm also playing on a nice OLED with DLAA and very good performance. I did notice that even at 1080p the requirements are high, but Square Enix isn't known for the best PC ports, and even this one requires a mod. I can happily say it stutters way less than any recent Unreal Engine game.
 
Barely 50 fps at 4K in a game that looks like this is actually insane.
 
Looking at the comparison, there really isn't that much of a difference in detail between low and ultra, which kinda makes running ultra pointless. In some screen comparisons you can see a lot, but others show a minimal amount of difference. I doubt most would notice if it wasn't compared side by side like that.
 
The bulk of the compute load seems to come from SSR and AO - disabling those has allowed me to retain 4K/60 fps at DLSS Quality or XeSS Ultra Quality Plus, without using the dynamic resolution feature or frame generation. The fps is lower, but the gameplay feels better that way - without the motion artifacts.

Speaking of motion artifacts, both frame generation options suck once the game gets more hectic in the later maps: DLSS-G has motion artifacting problems if the fps becomes unsteady, and AMD FSR 3 FG has extreme judder issues on my 4080.
 
Yeah, water areas seem to be the heaviest.

DLAA 1440p UW

DLSSQ 1440P UW

CPU thread usage seems pretty good, loading the 3D V-Cache CCD of my 7950X3D pretty evenly.

Loving it in UW.
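If you ever want to force the game onto the V-Cache CCD rather than trusting the Windows scheduler, a quick way is an affinity mask. A minimal sketch, assuming SMT is enabled and CCD0 (the 3D V-Cache one) is enumerated as the first 16 logical processors on a 7950X3D — worth verifying against your own topology, and the executable name is a placeholder:

```python
# Build an affinity mask covering the first 16 logical CPUs.
# Assumption: on a 7950X3D with SMT on, CCD0 (the 3D V-Cache CCD) is
# enumerated first as logical CPUs 0-15 -- check your own topology.
VCACHE_LOGICAL_CPUS = 16
mask = (1 << VCACHE_LOGICAL_CPUS) - 1  # 0xFFFF

# Windows cmd usage (hypothetical exe name):
print(f'start /affinity {mask:X} "ffxvi.exe"')
```

Tools like Process Lasso do the same thing persistently, but the one-liner is enough for testing whether the cache CCD alone changes your frame times.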
 
I've been tinkering with it all afternoon and found settings I'm happy with: disabling SSR and AO, and using DLSS-Q targeting 4K/60 on the RTX 4080. It won't drop below 60 fps even in the most intense combat scenes and areas. The water quality setting by itself isn't the problem; the SSR and AO settings (especially SSR) drop the fps like a brick. I think the game must be reflecting water sources like lakes with a high degree of precision, and that wreaks havoc on the fps. Disabling these two doesn't make the graphics much worse, but it makes the frame rate stable.

Dynamic resolution + DLSS-G frame generation tends to cause massive, and I mean massive, motion compensation problems for me; I guess the resolution drops significantly enough to keep up the frame rates I'm asking of my 4080. I've come to the conclusion that, when it's time to pass a final verdict on the experience, 60 real frames trump 120 with the assistance of the frame generator here. It's better to keep both dynamic resolution and DLSS-G disabled if it can be helped; if you need frame generation to work around performance issues, opt into a lower static input resolution, such as DLSS Ultra Performance.
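To put rough numbers on the 60-real versus 120-generated comparison, here is a simplified frame-time model. It ignores render queueing and driver overhead, so treat it as a sketch of why interpolated frames smooth motion without improving responsiveness:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds between frames at a given frame rate."""
    return 1000.0 / fps

base = frame_time_ms(60)         # ~16.7 ms between real, input-sampled frames
fg_display = frame_time_ms(120)  # ~8.3 ms between *displayed* frames with FG
# Interpolation-based frame generation must hold back the newest real frame
# until the in-between frame has been shown, so input-to-photon delay grows
# by roughly one base frame even though motion looks twice as smooth.
fg_extra_delay = base
```

That extra base-frame of buffered delay is why a steady 60 real fps can feel more responsive than a generated "120".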

AMD FSR upscaling is a complete waste of time in this game; XeSS should be compatible with the latest Radeon cards, and it looks better in literally every scene. While XeSS lacks frame generation support, FSR 3's frame generator is complete garbage anyway: it judders like crazy even if the source resolution is static and low, which defeats the point of a frame generator, since the output is simply not smooth. It might work better on AMD hardware, I dunno. Very poor experience with it on my GeForce.
 
Games are becoming ridiculously heavy these days,
and so badly coded that without crutch tech like upscaling and fake frames, they can't achieve decent framerates.
 
Upon just playing this, I couldn't help but think: why don't games use the 'Rockstar Advanced Game Engine' from Red Dead Redemption 2? Heck, that game came out back in October 2018, and the PC port landed in late 2019! It smoothly runs 4K maxed out on my RTX 4090 and LG C3 42" and looks absolutely fabulous. So, no immersion-breaking frame stutter or reduction in texture quality required.

It has been 6 years, and it seems weird that only a handful of games since run as well while looking that good. Yet, if the engine had been updated and improved over time, by now it would make this game look really outdated for the frames per second we get. Maybe I shouldn't have just played a few hours of it beforehand...

Heck, modern Cyberpunk 2077, with all the updates to REDengine, has basically every enhancement thrown at it, including all three current DLSS components: DLSS (super resolution) 3.7.20, DLSS Ray Reconstruction 3.7.10, and Frame Generation 3.7.10. (Just make sure you have the latest versions, or it is a mess.)

Since the last Cyberpunk update, the DLSS features were refreshed, yet for some reason the game updated only one of the DLSS files for many folks, including myself. If you manually drop the latest DLSS DLL files into the 'x64' folder inside the Cyberpunk install, it pretty much doubles your FPS and improves image quality a lot.

Well, it certainly does for RTX 4000 owners. :D

Scroll down the page at the link and download each of the three DLSS files: https://www.techpowerup.com/download/drivers/
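Once you've grabbed the DLLs, the manual swap can be scripted so the game's shipped files get backed up first. A minimal sketch — the nvngx_* file names are what recent DLSS packages typically use, and both paths are placeholders, so verify everything against your own install:

```python
import shutil
from pathlib import Path

# Assumed file names: super resolution, frame generation, ray reconstruction.
DLSS_DLLS = ("nvngx_dlss.dll", "nvngx_dlssg.dll", "nvngx_dlssd.dll")

def swap_dlss_dlls(game_x64: Path, downloaded: Path) -> list[str]:
    """Back up the game's DLSS DLLs, then copy the downloaded ones in."""
    backup = game_x64 / "dlss_backup"
    backup.mkdir(exist_ok=True)
    replaced = []
    for name in DLSS_DLLS:
        old, new = game_x64 / name, downloaded / name
        if not new.exists():
            continue  # skip components you didn't download
        if old.exists() and not (backup / name).exists():
            shutil.copy2(old, backup / name)  # one-time backup of the shipped DLL
        shutil.copy2(new, old)                # overwrite with the updated DLL
        replaced.append(name)
    return replaced
```

Keeping the backup folder means a game patch or a bad DLL version is a one-copy rollback instead of a file verify through the launcher.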

Anyway, like Starfield, maybe it will gain improved upscaling options over time. In its current form, wasting all those watts doesn't seem worth it.
 
Performs worse than Forspoken. This engine is bad...
 
It uses that much on a 4090 because there's 24 GB available. When there's a smaller pool, it runs memory management tighter. It runs fine on 8 GB cards too.

I have my doubts about that. Is 8-10 GB of VRAM fine/smooth, and for how long in real gameplay, not just in the benchmark test?
 
Well, given the bench is running on ultra and working within memory constraints, and you sure as hell won't be running the game on ultra on an 8 GB card unless you're at 1440p with heavy DLSS to get a playable fps, I think it's gonna be OK. Memory use is the least of this game's problems.
 
It's something people need to know: when you look at memory usage figures, they're almost always taken on a card with 20+ GB, so the game will happily use more than it strictly needs. Not every game is like the ones that use 16+ GB yet still run at the same fps on an 8 GB card, but games will still grab a little extra if it's available.
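One way to see what's actually allocated on your own card, rather than trusting a number measured on a 24 GB GPU, is to poll nvidia-smi while you play. The query flags below are real nvidia-smi options; the parsing helper is just a sketch:

```python
import subprocess

# Per-GPU used/total VRAM in MiB, as bare CSV values.
QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

def parse_usage(csv_line: str) -> tuple[int, int]:
    """Parse one 'used, total' line (values in MiB) from nvidia-smi."""
    used, total = (int(field.strip()) for field in csv_line.split(","))
    return used, total

def vram_usage() -> list[tuple[int, int]]:
    """(used MiB, total MiB) per GPU -- requires the NVIDIA driver installed."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True).stdout
    return [parse_usage(line) for line in out.strip().splitlines()]
```

Note this reports allocation, not the working set the game actually touches each frame, so a big number on a 24 GB card still doesn't prove an 8 GB card would struggle.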
 
Square was on the edge of bankruptcy when they released the first Final Fantasy in 1987, yet it saved the company.

And it's far from being the 16th one; there are way more spinoffs than mainline entries :laugh:
Aren't they all different stories with different characters each time? In a sense, each one would be their final fantasy...

If that fails to land, just ignore what I said. :toast:
 