
Dying Light 2 Benchmark Test & Performance Analysis

An interesting way to appear to dismantle my point without really addressing it. So where do you believe I am coming from? I've extrapolated numbers from professional reviews; feel free to share others and I'll happily analyze them, where possible, to show the true cost and performance loss of enabling RT.
You're intent on taking away every smallest win AMD has with RT (probably in traditional rendering too), so I won't consider your arguments, as we all know what this behavior is. I have better things to do. If I want arguments like this I go to YouTube/reddit (which I don't).

PS: It's astonishing that people expect a debate or polite conversation after being toxic/aggressive elsewhere five minutes earlier. Won't happen; learn manners. Just because you sit anonymously in front of a keyboard somewhere doesn't give you the right to be an ***.

/thread
 
You're intent on taking away every smallest win AMD has with RT (probably in traditional rendering too), so I won't consider your arguments, as we all know what this behavior is.
Not at all, not even close. I don't want to take anything away from AMD, certainly not any of its wins, of which there are many with RDNA2.

I am concerned with understanding the performance impacts and architectural differences (among many other things) as best I can, partly to aid myself and satisfy my curiosity, but also to help others make informed decisions. It is in this spirit that I quoted what you originally said, as it is contrary to the evidence I have seen and presented.

If you don't want to have the discussion, consider my arguments and evidence, and either make a great point that teaches me something or be willing to learn yourself, then I guess we both have better things to do. This can certainly stay civil.
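To make that kind of analysis concrete, here is a minimal sketch of the RT-cost arithmetic I'm describing, with placeholder FPS values standing in for whatever a review actually reports (none of the numbers below are real measurements):

# Relative performance cost of enabling RT, per GPU, from raster vs. RT averages.
# All FPS values are hypothetical placeholders, not review data.
results = {
    "GPU A": {"raster_fps": 100.0, "rt_fps": 60.0},
    "GPU B": {"raster_fps": 90.0, "rt_fps": 70.0},
}

for gpu, fps in results.items():
    cost_pct = (1.0 - fps["rt_fps"] / fps["raster_fps"]) * 100.0
    print(f"{gpu}: enabling RT costs {cost_pct:.1f}% of the frame rate")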
 
This game is demanding on the new consoles. It's the only game I know that limits them to 1080p 60. The Series S only manages 30 fps, a first AFAIK, since 60 fps would likely force a sub-720p resolution.
 
Poor me, I'm not getting this game then. My AMD card will be struggling.
 
No HDR support in 2022. :banghead:

I played for two hours with DF's optimized settings (with RT GI and AO). The game barely holds 4K60 with DLSS Performance most of the time; I saw drops to the mid-40s in a couple of scenes.

I think draw distance is the main culprit of these high requirements. There is almost no pop-in and no visible LOD switching. The first game had a crazy draw distance at launch, which was killing CPUs. They drastically lowered it in a patch and the game ran much better.
The CPU side is not a problem this time, but the GPU definitely is. They should add more settings to adjust, and draw distance should be one of them.
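As a back-of-the-envelope illustration of why draw distance is such an expensive setting (this assumes roughly uniform object density and is not based on the game's actual engine): the amount of geometry in view grows roughly with the square of the view distance.

# Toy estimate: with roughly uniform object density, the object count inside the
# draw radius scales with the area of the visible disc, so doubling the draw
# distance roughly quadruples the draw-call / LOD workload. Density is arbitrary.
import math

DENSITY = 0.02  # hypothetical objects per square metre

def objects_in_view(draw_distance_m: float) -> int:
    return int(DENSITY * math.pi * draw_distance_m ** 2)

for dist in (250, 500, 1000):
    print(f"{dist:>5} m draw distance -> ~{objects_in_view(dist):,} objects")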
 
I'm playing it on my 980ti at 60fps with peasant settings and am perfectly happy. :laugh:
 

I tried force-enabling Resizable BAR using NV Inspector, since Dying Light 2 is not a whitelisted game in the latest driver, and gained about 6% higher FPS.
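A gain that small is easy to confuse with run-to-run variance, so here is a quick sketch of how such an uplift could be sanity-checked by averaging several runs (all FPS values below are placeholders, not measurements):

# Average several benchmark runs before and after forcing ReBAR on, then compute
# the relative uplift. All FPS values are hypothetical placeholders.
from statistics import mean

before = [96.1, 95.4, 96.8]      # average FPS per run, ReBAR off (hypothetical)
after = [101.9, 102.4, 101.2]    # average FPS per run, ReBAR forced on (hypothetical)

uplift_pct = (mean(after) / mean(before) - 1.0) * 100.0
print(f"Average uplift: {uplift_pct:.1f}%")  # ~6% with these placeholder numbers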
 
I guess the lack of dynamic resolution is what keeps the consoles' resolution down. The 3060 just misses 1440p 60 by a few frames, and the 1660 Ti is not far off from 1080p 60. Strange that they wouldn't make a few tweaks on the consoles to hit 1440p 60 on the SX/PS5 and 1080p 60 on the SS.
 

I'm on 1080p, monitor is 165hz too :laugh:
 
Good review once again, but could you add CPU core-count tests with games, i.e. test it running on 4 cores, then 6, 8, 12, etc., to see what you really need as a minimum for gaming at decent frame rates?

Thanks
 
Depends on your GPU, but normally a 4-core (8-thread) or 6-core CPU is enough. This is a typical single-player game; it will be more demanding on the GPU side.
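Short of disabling cores in the BIOS, one rough way to run such a core-count test is to pin the game process to a subset of logical CPUs. A minimal sketch using psutil; the executable name is an assumption, and SMT siblings aren't paired up, so this only approximates having fewer physical cores.

# Restrict a running game process to the first N logical CPUs for a rough
# core-scaling test. Requires: pip install psutil (run elevated on Windows).
import psutil

TARGET_EXE = "DyingLightGame_x64_rwdi.exe"  # assumption: adjust to the actual binary name
CPUS_TO_USE = list(range(8))                # e.g. logical CPUs 0-7, roughly "4c/8t"

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET_EXE:
        proc.cpu_affinity(CPUS_TO_USE)      # schedule the process only on these CPUs
        print(f"Pinned PID {proc.pid} to CPUs {CPUS_TO_USE}")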
 
A 5800X is a little low-end now versus a 12900K. I hope to see the test config updated soon.
Makes only a tiny difference. Maybe you can come here, bring the hardware and set it up? Very little time to mess around these days, gotta get work done first.
 
They didn't skip it; they needed time to develop it for RDNA 2. Nvidia developed this over 10 years. That's why Nvidia's is so much better, including a way bigger RT core, not comparable to AMD's, which is tiny.

Nvidia developed this in 10 years? Don't believe every marketing line you hear. The drive to push RT on consumer GPUs is purely financial and marketing-related. Every GPU eats rasterized workloads for breakfast right about now, even at very high resolutions. GPU makers have been hitting a wall in terms of potential sales since 2017. Realistically, RT has been technically 'feasible' since Volta. That's not 10 years old; the cores are another iteration of what they built for enterprise. Repurposed.

This echoes in the implementations of RT as of today. You barely get a noticeable difference on top of just solid rasterized lighting methods. It's grossly expensive to run RT in real time on current GPUs and the benefits are barely tangible. So not only do you get knockoff, bottom-of-the-barrel cut-die GPUs, you get them with lackluster VRAM, very high TDPs, and a technology that mostly just kills your FPS.

It also echoes in the amount of RT-capable content at launch. The whole industry was caught by surprise, and it really still is. Look at the implementation here and you have yet another confirmation. If they had really been at it for a decade, they sure did a shit job getting people involved.

10 years my ass ;)
 
I don't agree at all. First of all, if you tell me the "10 years" is wrong, where's your source? Then give me an exact amount of time they needed to do it; otherwise, I really don't see a point. :) The RT core works well, and it could very well have taken a long time to develop (including the tensor cores).

Secondly, RT is way more than just financially motivated; that's a highly cynical take from you. Nvidia isn't a perfect company and does a lot of things for "money", but they love graphics, and THIS is about more than money.

Thirdly, RT wasn't only technically feasible since Volta; if you want to be exact, it was "feasible" way sooner. But you're stretching the meaning of the word "feasible" here. We are talking about REAL-TIME ray tracing, not whatever you are thinking about, so your whole "point" really isn't a point.

Fourth, a source for an RT core that came before the 20 series? There was none; the 20 series has the first-gen RT core. That claim is also wrong.
This echoes in the implementations of RT as of today. You barely get a noticeable difference on top of just solid rasterized lighting methods
Let me stop you right there. You don't have an RT card, and I don't think you've ever really seen it in action. It looks great in a NUMBER of games, including ones I've played myself: CP77, Control, MW5. You're 100% wrong about this.
10 years my ass ;)
Sorry bro, you don't have a point. I really don't care about empty banter.
 

Where is your source that it's 10 years in the making? Nvidia documents things quite well; shouldn't be a problem, eh? Their GPUs and products show no evidence of a strong push for (RT)RT until Volta/Turing.

As for the 'Real time' part of ray-tracing, in fact... you'd be wrong. What carries RTX cards are the updated Quadros for content producers, which is where the real money is. *removed diff node - this was Turing (TSMC).

Otherwise, the best evidence we have is what we have on the market at moment X/Y. Which is what I posted about.

The only source I know for 10 years in the making is Jensen saying so at SIGGRAPH. He also said the industry was all in on this... meanwhile Nvidia is throwing money at devs left and right to get their pre-emptive nonsense in games.

I mean really, just look at what happens in the gaming market, and you can distill the evidence quite easily. AMD is confident enough to postpone this completely, and a full console gen is doing half-assed RT. That's not broad industry support in the slightest.
 
Where is your source that it's 10 years in the making? Nvidia documents things quite well; shouldn't be a problem, eh?
I asked first? :)
Otherwise, the best evidence we have is what we have on the market at moment X/Y. Which is what I posted about.
The best evidence for what?
The only source I know for 10 years in the making is Jensen saying so at SIGGRAPH. He also said the industry was all in on this... meanwhile Nvidia is throwing money at devs left and right to get their pre-emptive nonsense in games.
Well, they've done that for a long time; it has upsides and downsides. The upside is "sometimes" better graphics or new features being implemented; the downside is that it *can* make things worse in general if you don't have an Nvidia card. But to be honest, "TWIMTBP" is so old nobody talks about it anymore. Nvidia optimizations are still there, but they're mostly about RT/DLSS now; everything else seems to be mostly gone. And I really don't mind their RT/DLSS optimizations, since everyone who seriously wants RT buys an Nvidia card anyway.
I mean really, just look at what happens in the gaming market, and you can distill the evidence quite easily.
I don't see things that cynically. I'm trying to be positive.

As for the 'Real time' part of ray-tracing, in fact... you'd be wrong. What carries RTX cards are the updated Quadros for content producers, which is where the real money is. *removed diff node - this was Turing (TSMC).
Your ninja edit, my ninja edit here: you're saying TURING, and TURING IS the 20 series, so you're confirming what I said. Turing is a GAMING arch first and a workstation arch only after that. AMPERE is a mixed arch, more optimized for compute workloads; it's not that efficient for gaming. Look at RDNA 2.
 
The example I keep using is Hollywood effects. Those have all been ray traced since the original Jurassic Park (if not earlier). Without having to resort to as many tricks, ray tracing is simply capable of more realistic results. Of course, it can be argued that games don't always need to look realistic, but that's missing two important points. First, games are not the only intended target for video cards. And second, even if the baseline looks more realistic, there's no reason you can't have artistic effects while employing ray tracing.
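To illustrate the "fewer tricks" point, here is a minimal toy sketch (my own example, not any engine's code) of the core ray-tracing idea: a shadow ray answers a visibility question directly, where a rasterizer would approximate it with shadow maps or baked lighting.

# Toy hard-shadow test via ray tracing: trace a ray from a shaded point toward
# the light and see whether any geometry blocks it. Scene and numbers are made up.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance t > 0 to the nearest hit of a normalized ray with a sphere, or None."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c               # direction is unit length, so a = 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def in_shadow(point, light_pos, blockers):
    """Shadow ray: True if any sphere sits between the point and the light."""
    to_light = [light_pos[i] - point[i] for i in range(3)]
    dist = math.sqrt(sum(d * d for d in to_light))
    direction = [d / dist for d in to_light]
    for center, radius in blockers:
        t = ray_sphere_hit(point, direction, center, radius)
        if t is not None and t < dist:
            return True
    return False

# Hypothetical scene: one occluder sphere hovering between the ground and the light.
occluders = [((0.0, 1.0, 0.0), 0.5)]
print(in_shadow((0.0, 0.0, 0.0), (0.0, 3.0, 0.0), occluders))  # True: point is shadowed
print(in_shadow((2.0, 0.0, 0.0), (0.0, 3.0, 0.0), occluders))  # False: light is visible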

Also, leave Vayra alone. He decided years ago that RT is a gimmick; arguments won't change his mind. He's entitled to his opinion, you just have to acknowledge it's just his opinion and nothing more.
 
I've already told you what I base my statements on, didn't I?
How can I find evidence of something 'not being said'? Interesting paradox, that :)

We won't get a conclusive answer on this, and you know it. It's just a choice of what you want to believe.

I'm a realist. You're an optimist. NP.

As for my opinion... what is the major issue we have right now in GPU land? Now scroll back to early Turing days and see what I've said about this.
Cost for GPUs has exploded, the shortages happened, and RT is now available to a tiny subset of gamers. Node shrinks later, we're looking at 350W TDPs instead of 250W.

And we call it progress.
 
The example I keep using is Hollywood effects. Those have all been ray traced since the original Jurassic Park (if not earlier). Without having to resort to as many tricks, ray tracing is simply capable of more realistic results. Of course, it can be argued that games don't always need to look realistic, but that's missing two important points. First, games are not the only intended target for video cards. And second, even if the baseline looks more realistic, there's no reason you can't have artistic effects while employing ray tracing.
I agree that RTRT can be better implemented today; everyone knows this. It's only in its 2nd gen now, and AMD also slowed devs down with "anti-optimizations" tailored to their inferior "Ray Accelerators", which are too tiny to be really good.

However, that said, it is still great in some games and that shows that it CAN make a dent.
Also, leave Vayra alone. He decided years ago that RT is a gimmick; arguments won't change his mind. He's entitled to his opinion, you just have to acknowledge it's just his opinion and nothing more.
He should really move on from his outdated GPU and try it out for himself, my opinion. Otherwise, whatever.

How can I find evidence of something 'not being said'? Interesting paradox, that :)
Yeah well, then there is no point in saying I'm wrong, right? Just leave it be then.
I'm a realist. You're an optimist. NP.
Honestly, realists are mostly pessimists in disguise. I know it; I was like that too. It's hard to be a realist without being kind of negative. But I don't care about that: you have some cynical views, and before I came back I saw that in multiple other posts too. I think you can do better. Just my opinion; you can disregard it if you want.

And we call it progress.
I don't, but I can't comment on it further now. :)
 
The example I keep using is Hollywood effects. Those have all been ray traced since the original Jurassic Park (if not earlier). Without having to resort to as many tricks, ray tracing is simply capable of more realistic results. Of course, it can be argued that games don't always need to look realistic, but that's missing two important points. First, games are not the only intended target for video cards. And second, even if the baseline looks more realistic, there's no reason you can't have artistic effects while employing ray tracing.

Also, leave Vayra alone. He decided years ago that RT is a gimmick; arguments won't change his mind. He's entitled to his opinion, you just have to acknowledge it's just his opinion and nothing more.

See... thát part of RT I don't contest at all. Yes, it can and does improve visual fidelity. At the same time, gaming is not film, and we've seen RT implementations that forgot about that. Even Dying Light 2 shows this: RT makes stuff a lot darker, which, while realistic, is not always preferable in games.

Note: not always. I see how it works for some. I'm definitely open to arguments in that sense. I have also seen the tech live, contrary to what's been stated above by others.

I'm really still watching it evolve from the sidelines, because buying in today seems like a seriously bad move.

However, that said, it is still great in some games and that shows that it CAN make a dent.

He should really move on from his outdated GPU and try it out for himself, my opinion. Otherwise, whatever.
Absolutely and absolutely.

But the market right now is simply denying it.
 
Even Dying Light 2
is such a mess, this game. Without RT it looks like an old game; with it, it looks properly RT'd. Weird. The difference is "night" and day, pun intended.
 
As for my opinion... what is the major issue we have right now in GPU land? Now scroll back to early Turing days and see what I've said about this.
Cost for GPUs has exploded, the shortages happened, and RT is now available to a tiny subset of gamers. Node shrinks later, we're looking at 350W TDPs instead of 250W.

And we call it progress.
Honestly, the first 3D accelerators added a whole new card to your system. And they were progress.
Never judge new tech by its first iterations. Potential is a far better indicator ;)
 
Never judge new tech by its first iterations.
Turing was way ahead of its time and has aged admirably. It only loses about 5% more FPS than comparable newer GPUs like the 3080 when RT is activated. The tensor cores work well too.
 