
Metro Exodus Benchmark Performance, RTX & DLSS

DLSS obliterates texture detail and alpha effects, and people still don't want to accept that this is a terrible upscaling method. Might as well lower your resolution manually; hell, it may even look better.
 
My GPU is ready for Metro, but is my old CPU up to the task? Gonna try it out when it's released on Friday. So as of now, Metro is gonna be the first new game to try out for 2019, and we'll see if 2019 is the year X58 is finally gonna be seen as outdated for gaming :p.

All that ray-tracing and DLSS, you RTX guys can keep it. The images really did not impress. So I am happy with what I got now: GTX 1080 Ti :cool:
 
I think DLSS is a scam, looks blurry as crap to me. I prefer it off.
First game to give something to users with RTX (BF5 is just a joke...)

DLSS doesn't look like the solution at all; no good implementation I've seen so far. The ray tracing here is actually really good in the pictures I see, and the performance hit isn't too bad, so I can't blame it on the dev.

AMD GPU does poorly in a new game.
AMD fans:

A. It is the reviewer's fault! He/she is biased towards Nvidia (Nvidia shill theory)
B. RTX and DLSS are gimmicks!
C. Developer took Nvidia money (Developer is Nvidia shill theory)
D. But XXX sites showed XXX GPU is way better (Accusing TPU theory)
E. I won't play this game (Ostrich strategy)

So yeah, whenever AMD does poorly in something, it is ALWAYS someone else's fault. Not the saint, savior and holy underdog AMD's fault.

A, D, E = fanboy shit.
B = partially true. It's not worth buying an RTX 2xxx (emphasis on 2xxx) for its RTX. I believe some form of ray tracing or path tracing is the future, but for now, let devs play with it for a bit. This game is the first one where I see ray tracing used well; BF5 showed off nothing I haven't seen from rasterized game engines (which I can run on any GPU). So far it's been a gimmick, just like GameWorks.
I don't think many will buy an RTX 2080 for this game, so don't buy an RTX card for its RTX. I'm not saying the tech itself is a gimmick, but for the products out there right now it definitely is.
But I am saying buy an RTX card, since AMD can't compete.

DLSS = I've not seen anything positive. It's a performance gain for reduced quality; quality settings exist for a reason, and you don't need DLSS for that.

C. Here is my opinion: hardly ever, when I see an AMD-titled game, does an Nvidia card perform beyond horrible. It's usually just optimized for AMD's hardware, and benchmarks show AMD cards performing closer to their theoretical performance.
Nvidia titles, on the other hand, tank performance on AMD cards like hell. Often the game doesn't even look that good, yet its GameWorks stuff still tanks performance, and most of the time all I can say is: it looks different, not better and not worse, just plain pointless.
So even when I had my 980 Ti, I disabled GameWorks in every title I played, because I never saw any point in having it enabled.

The only reasons I have a Vega are FreeSync and Linux support, the latter being the one place AMD cards really are just superior.
 
Surprisingly the game doesn't let you select between full-screen, windowed or maximized window. Rather the game always runs in full screen mode at your current desktop resolution, and the "Resolution" setting here controls the rendering resolution. This causes some issues when your desktop is set to 1080p on a 4K monitor — Metro will only run in 1080p, even though you're selecting 4K in settings.
I'm fine :)
Motion blur is always enabled; the options are "low", "normal", and "high". "Off" is not available, which is super annoying.
:confused::banghead: Motion blur is something I disabled in BFV too; super annoying.
DLSS can be enabled or disabled separately from ray tracing, but DLSS availability depends on GPU, resolution, and raytracing setting. To enable DLSS at 4K you need an RTX 2070 or better; raytracing can be on or off. For DLSS at 1440p you must have raytracing enabled and an RTX 2060. At 1080p, DLSS can only be enabled on the RTX 2060 & RTX 2070, and only when raytracing is enabled. No idea why NVIDIA chose to limit it that way.
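To keep those combinations straight, here's a minimal sketch of the rules exactly as stated above (a hypothetical helper with simplified GPU tiers, not anything from the game's code):

```python
# Encodes the DLSS availability rules as described in the paragraph above.
# Purely illustrative; the tier numbers are a simplification.
def dlss_available(gpu: str, resolution: str, rt_enabled: bool) -> bool:
    tier = {"2060": 0, "2070": 1, "2080": 2, "2080 Ti": 3}[gpu]
    if resolution == "4K":
        return tier >= 1                     # RTX 2070 or better, RT on or off
    if resolution == "1440p":
        return gpu == "2060" and rt_enabled  # as stated above
    if resolution == "1080p":
        return gpu in ("2060", "2070") and rt_enabled
    return False

assert dlss_available("2080 Ti", "4K", rt_enabled=False)
assert not dlss_available("2060", "1440p", rt_enabled=False)
```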
What the hell :wtf:? If you have an RTX 2060, you have to enable RT before you can use DLSS? I guess those combos were too buggy, so the developers left them for later so they could fix the important issues/bugs first.
 
What the hell :wtf:? If you have an RTX 2060, you have to enable RT before you can use DLSS? I guess those combos were too buggy, so the developers left them for later so they could fix the important issues/bugs first.

I have a feeling these are restrictions imposed by Nvidia so they can have "tiers" of some sort. They don't want people to have a horrible experience by destroying their performance/image quality too much with RTX/DLSS, so they limit your choice.
 
WTH was DLSS supposed to do, anyway?

RTX clearly works, and nicely. I'm quite curious about the future and about nicely, realistically lit games...
 
When I first started looking at the pics, especially the first one, I was like... RTX is way too dark. It really started to show its stuff in the pic with the snow. Then I went back and looked at the first pic and it hit me: from a gameplay standpoint, I liked that I could see more (I used to turn up gamma in Diablo caves to "see more"), but it immediately became apparent that, given the observed light conditions, the darkness was "accurate".
This is going to spark quite a few discussions. DXR means light is more accurately rendered. Some will say the brighter images look better, some won't. It's just like some prefer Canon's more saturated colors, even if Nikon has more lifelike defaults.

Also, it's pretty clear the 2060 can't really do DXR ;)
 
AMD GPU does poorly in a new game.

AMD fans:

A. It is the reviewer's fault! He/she is biased towards Nvidia (Nvidia shill theory)
B. RTX and DLSS are gimmicks!
C. Developer took Nvidia money (Developer is Nvidia shill theory)
D. But XXX sites showed XXX GPU is way better (Accusing TPU theory)
E. I won't play this game (Ostrich strategy)

So yeah, whenever AMD does poorly in something, it is ALWAYS someone else's fault. Not the saint, savior and holy underdog AMD's fault.

Hey, please STOP! Don't fuel the Red/Green war. Even if someone started it, DON'T JOIN!
 
Hey, please STOP! Don't fuel the Red/Green war. Even if someone started it, DON'T JOIN!

He's just salty he bought into the Fury X hype. Let him be. His wounds will heal soon.
 
[attached image: 1080.png (1080p benchmark chart)]

3GB 1060 beating the 8GB 580 :(
Did you look at VRAM allocation? It's a bit over 3GB at 1440p.
 
Join? He is at the helm. :laugh:
:ohwell:
Did you look at VRAM allocation? It's a bit over 3GB at 1440p.

But the 1060 3GB is only meant for 1080p; not good if someone buys this card for 1440p. Btw, this is the Ultra setting, and almost no one plays at Ultra; probably 95% of people are on High, and I'm one of them.
 
Ray tracing and global illumination look awesome; night and day difference in some scenes. I think global illumination will be the best application of ray tracing in games. But DLSS was really disappointing in this game: strangely, even the 1440p image looks sharper than 4K with DLSS on.
 
RVII at 2070 performance, not good.
I'm glad that at 1440p with a 1080 Ti I can expect very good framerates at ultra. I'll wait for the game reviews before buying, though.
 
RVII at 2070 performance, not good.
I'm glad that at 1440p with a 1080 Ti I can expect very good framerates at ultra. I'll wait for the game reviews before buying, though.

 
Looks real nice, can't wait to try it.
 
Don't trust their performance numbers; they're using the built-in benchmark.
Steve from Gamers Nexus says the in-game benchmark is unrealistically heavy.
I think what is important here, actually, is the RELATIVE performance of the cards, not the actual in-game FPS, for most people. As a consumer, I would be happy to see BETTER performance in-game than in the included benchmark.

The good news is, the integrated benchmark is repeatable and allows for empirical testing. Manual run-throughs can be inconsistent due to all the variables; the longer the run, the better off you are, is what we found for manual runs.
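To illustrate the repeatability point, here's a quick sketch of comparing the spread of several manual run-throughs against a single built-in benchmark pass (all fps numbers are made up, purely illustrative):

```python
# Illustrative only: made-up fps numbers showing why averaging several
# manual run-throughs matters before comparing against the built-in benchmark.
from statistics import mean, stdev

fps_runs = [62.1, 58.4, 65.0, 60.7]   # four manual run-throughs (hypothetical)
bench_fps = 55.3                       # one built-in benchmark pass (hypothetical)

print(f"manual runs: {mean(fps_runs):.1f} +/- {stdev(fps_runs):.1f} fps")
print(f"built-in benchmark: {bench_fps:.1f} fps (repeatable, but heavier than gameplay)")
```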
 
AMD GPU does poorly in a new game.

AMD fans:

A. It is the reviewer's fault! He/she is biased towards Nvidia (Nvidia shill theory)
B. RTX and DLSS are gimmicks!
C. Developer took Nvidia money (Developer is Nvidia shill theory)
D. But XXX sites showed XXX GPU is way better (Accusing TPU theory)
E. I won't play this game (Ostrich strategy)

So yeah, whenever AMD does poorly in something, it is ALWAYS someone else's fault. Not the saint, savior and holy underdog AMD's fault.

Perhaps you should look at the facts rather than attacking people.
A. Reviewers make build choices that may negatively impact performance; most of this is not purposefully slanted. That said, a good read of the reviewer's guide should show which system setups a vendor prefers to make theirs shine... it might be worth having more than one platform to test on to make sure you are not accidentally biasing. It's a shitton of work and I wouldn't expect it of day-1 reviews. The performance engineer in me always wants to ensure the validity of test cases.

B. RTX is cool but early. DLSS is... fuck, I turn off motion blur; why would I want this shiite?

C. This is an RTX showcase title... They delayed the game to make it a GameWorks title; this game may not have made it to production without Nvidia money.
So yes, it is definitely going to be AMD-deoptimized... but we probably wouldn't get to play it without that money.

D. There is a reason why you shouldn't rely on a single site, but you should definitely take anomalies with a grain of salt and try to identify why they are different. Never start with the assumption of fudged numbers... reviewing is hard work. If I recall, there was a site giving AMD (CPU) much better numbers because they turned off the high-precision timer or something, which boosted AMD a touch and was very detrimental to Intel... so once again, configs are VERY important.

E. There are plenty of reasons to wait a year for this game... /thread.

I am happy to see my 1080 Ti will still give a great experience, though.
And the other rig with a Vega 64 on FreeSync will also be OK-ish.
 
From the conclusion: "What is worth mentioning though is that the game does look more blurry when DLSS is enabled, though 4K DLSS still looks much better than simply running at 1440p."

What? No it doesn't. From the last image comparing 1440p to 4K+DLSS, the overall 1440p image looks better to my eyes. The DLSS image is a blurry mess. It literally hurts my eyes. Though I'm the same as others here in that motion blur is one of the first things I turn off in any game.
 
What is really baffling is that if you have a 2080Ti, you can't use DLSS at 2560x1440. Only 2080s or below.

Huh?
 
What is really baffling is that if you have a 2080Ti, you can't use DLSS at 2560x1440. Only 2080s or below.

Huh?
My take on this is that it's to get the performance up where it's actually needed. The 2080 Ti can run 1440p with RT enabled at Ultra for 60 FPS.
 
Ouch, somewhat poor Radeon performance. :(

But by the time I buy this, maybe the drivers and/or patches will improve things. Either way, I have just bought a FreeSync 1080p monitor, so even if my 570 can only do ~40 fps it should still be playable and smooth...

What is the lower limit of your FreeSync monitor? Most FreeSync monitors don't kick in until 40 fps or above.
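For reference, a tiny sketch of how the FreeSync window works (the 40-75 Hz range is just an example; actual ranges and LFC support vary per monitor):

```python
# Hypothetical example: does a given framerate fall inside a FreeSync window?
# The 40-75 Hz range is illustrative; check your monitor's actual spec.
def freesync_active(fps: float, vrr_min: int = 40, vrr_max: int = 75) -> bool:
    if fps < vrr_min:
        # Below the window, LFC (frame doubling) can keep sync going,
        # but it generally requires vrr_max >= 2 * vrr_min.
        return vrr_max >= 2 * vrr_min
    return fps <= vrr_max

print(freesync_active(40.0))  # True: at the bottom of the window
print(freesync_active(35.0))  # False here: a 40-75 range has no LFC headroom
```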

My take on this is that it's to get the performance up where it's actually needed. The 2080 Ti can run 1440p with RT enabled at Ultra for 60 FPS.

Sorry, but 60 fps shouldn't be the limiting factor; 60 fps should be the minimum. People actually have 1440p monitors that go to 160 Hz.
 
"NVIDIA DLSS is a new form of anti-aliasing which renders the game at reduced resolution, upscales it, and then fills in the details using a pre-trained AI network... We are a bit puzzled though, why NVIDIA is limiting the use of DLSS in some ways that make little sense. For example, you can enable DLSS at 1080p..."

You answered your own question. It takes a lot of preemptive work to "pre-train" a given set of tensor cores to run DLSS correctly. It doesn't "just work."

Nvidia needs to code each DLSS option individually for each Turing Core Configuration they plan to support, per game. Hence Nvidia isn't going to waste a ton of time pre-training 1080p for the 2080 Ti. No one would use it, and it takes time/money.

Same reason why they force you to use Raytracing for DLSS in BFV - DLSS isn't needed unless you use Raytracing, and as such they aren't going to waste time training different paths for useless settings combinations. If you aren't using raytracing, you don't need to downgrade your graphics with DLSS.
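For anyone wondering what the quoted description ("renders the game at reduced resolution, upscales it, and then fills in the details using a pre-trained AI network") boils down to, here's a toy sketch of that pipeline. This is emphatically not NVIDIA's implementation; the "network" below is an empty placeholder:

```python
# Toy sketch of the DLSS idea as described in the review quote above.
# NOT NVIDIA's actual code; the detail "network" is a stand-in.
import numpy as np

def naive_upscale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbour upscale, standing in for the cheap upscaling step."""
    return np.kron(frame, np.ones((factor, factor, 1), dtype=frame.dtype))

def detail_network(frame: np.ndarray) -> np.ndarray:
    """Placeholder for the pre-trained, per-game AI pass that restores detail.
    Identity here, since we have no trained weights."""
    return frame

low_res = np.random.rand(1080 // 2, 1920 // 2, 3).astype(np.float32)  # half-res render
output = detail_network(naive_upscale(low_res, 2))
print(output.shape)  # (1080, 1920, 3): back at target resolution
```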
 