Monday, December 4th 2023

Call of Duty and Cyberpunk 2077 Getting More NVIDIA RTX Love

NVIDIA is celebrating 500 games and applications being part of the RTX ecosystem, and Activision has a large set of feature announcements to mark the occasion. To begin with, Call of Duty: Warzone Season 1: Urzikstan debuts with DLSS 3 Frame Generation support on December 6th. Call of Duty: Modern Warfare III adds full ray tracing and DLSS 3.5 Ray Reconstruction to all online multiplayer lobbies on the same date.

Meanwhile, CD Projekt Red is launching Cyberpunk 2077 Ultimate Edition this week, which debuts alongside enhanced ray tracing and DLSS support in Update 2.1 (available to both Ultimate and Standard edition players). The update graduates the Ray Tracing: Overdrive mode out of preview-feature status and adds even more ray-traced surfaces; on NVIDIA RTX 40-series GPUs, these take advantage of the Shader Execution Reordering and Opacity Micromaps features. The game also gets a major global illumination upgrade with Reservoir-based Spatiotemporal Importance Resampling Global Illumination (ReSTIR GI). Lastly, Cyberpunk 2077 now fully implements DLSS 3.5 Ray Reconstruction on NVIDIA RTX 40-series "Ada" and 30-series "Ampere" GPUs, which vastly improves the quality of ray-traced elements, such as reflections, when supersampling is enabled.
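
For those curious about the underlying technique: ReSTIR GI builds on weighted reservoir sampling, where each pixel streams through many candidate indirect-lighting samples while storing only one, kept with probability proportional to its resampling weight; reservoirs are then merged across neighboring pixels and previous frames for spatiotemporal reuse. Below is a minimal C++ sketch of that core reservoir update, following the published ReSTIR papers; the names and structure are illustrative, not the actual Cyberpunk 2077 or NVIDIA implementation.

    // Weighted reservoir sampling: the core update behind ReSTIR-style
    // resampling. Illustrative sketch only -- not production code.
    #include <random>

    struct Sample {
        float radiance; // stand-in for a candidate indirect-light sample
    };

    struct Reservoir {
        Sample chosen{};    // the single sample currently kept
        float  wSum = 0.f;  // running sum of resampling weights
        int    m    = 0;    // number of candidates seen so far

        // Stream in one candidate with weight w (typically target pdf over
        // source pdf). Replacing the kept sample with probability w / wSum
        // yields a sample drawn weight-proportionally from the whole stream,
        // using O(1) memory per pixel.
        void update(const Sample& s, float w, std::mt19937& rng) {
            wSum += w;
            m += 1;
            std::uniform_real_distribution<float> u01(0.f, 1.f);
            if (wSum > 0.f && u01(rng) < w / wSum)
                chosen = s;
        }
    };

    int main() {
        std::mt19937 rng{42};
        Reservoir r;
        // Feed a stream of candidates; exactly one survives.
        for (int i = 0; i < 32; ++i) {
            Sample s{static_cast<float>(i)};
            float weight = 1.0f + static_cast<float>(i % 4); // made-up weights
            r.update(s, weight, rng);
        }
        return 0;
    }

The spatiotemporal part of ReSTIR comes from merging such reservoirs between neighboring pixels and across frames, which is what yields the large effective sample counts behind the global illumination upgrade described above.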

12 Comments on Call of Duty and Cyberpunk 2077 Getting More NVIDIA RTX Love

#1
Onasi
“full ray tracing”
“online multiplayer lobbies”
Bold strategy, let’s see if it pays off for ‘em.
#2
FreedomEclipse
~Technological Technocrat~
Going to be real honest here: I thought the article was going to be about Nvidia giving away copies of CoD with their new GPUs....

A free copy of that game isn't adding any extra value to the purchase. You probably wouldn't even be able to re-sell it because it's such a garbage title. Bundling a copy of the game with the card is like a double negative: not only are the GPUs stupidly priced, but they devalue them even further by bundling a garbage game with them.
#3
Vya Domus
They've redefined and revolutionized ray tracing, reconstruction, frame generation, whatever the hell, so many times that I've lost count by now.
#4
Double-Click
So "gifting" features that should be standard for AAA titles...
That's gotta be the most arrogant piece of PR trash I've seen in a while.
#5
QUANTUMPHYSICS
I was fortunate that I was able to get a 3090 during Cyberpunk's release week. I replaced my 2080Ti and saw immediate framerate improvements on "Psycho mode".

Now, many years later, this game is finally ready for mainstream and even the metro-rail works with the latest patches.

I played through the game twice experiencing minor bugs - but nothing game breaking.

Easily the best-looking game, with the most immersive music, I'd ever played.

Once the game is fully finished, I'll play through one more time on my 14900K / 4090 / 64GB DDR5.
#6
wolf
Performance Enthusiast
Gonna need to give CP2077 another boot-up to see these enhancements in play. Not holding my breath for anything insane; adding yet more reflective surfaces seems a bit lol, the game is crazy reflective as it is already.
#7
Crackong
Translation: Moar $$ pumped into the 2077 tech demo.

tbh,
I've played 2077 for one playthrough and never touched it again.
This game has gone too far toward being an Nvidia tech demo,
and doesn't even bother to put essential features like a proper New Game+ mode in the game.
I know 2.0 is like a whole new game and the DLC looks nice,
but looking at my 100 hours of playtime, I can't get anything out of it because there isn't a New Game+ mode?

Nope,
You can sit right there, bottom of my Steam library.
#8
Dr. Dro
Crackong said: "Translation: Moar $$ pumped into the 2077 tech demo. […]"
Cyberpunk isn't an Nvidia tech demo. At launch it was horribly broken, and arguably a game that missed its mark (and wasn't what was initially promised; I hated it for the longest time, although I'm beginning to forgive it as it really shows how much the devs have worked on it). But as it got fixed and optimized, all it does is badly expose how crude the Radeon cards are when it comes to latest-generation graphics techniques. AMD just doesn't compete here.
#9
Crackong
Dr. Dro said: "Cyberpunk isn't an Nvidia tech demo. […] AMD just doesn't compete here."
All it does is throw all the latest and horribly optimized visuals into one giant cappuccino.
They did it at launch, and they're still doing it today.

It isn't an AMD vs. Nvidia thing either.

As an Nvidia user, I feel deceived seeing them keep throwing all these artificial performance barriers at us,
just to keep our latest graphics cards at the lowest FPS possible.

At the same time, they're lacking basic features like a proper New Game+ mode.

And btw, about the tech demo thing, it's almost common sense by now that 2077 is THE Nvidia tech demo.
It's too obvious to be worth arguing.
#10
Dr. Dro
Crackong said: "All it does is throw all the latest and horribly optimized visuals into one giant cappuccino. […]"
Without getting into that, since I always considered Cyberpunk a bad game: there are no artificial barriers, just extremely costly graphical improvements that aren't worth it on current-generation hardware.

It's no tech demo, really. It's a game that makes extensive use of the very latest rendering techniques; that's why it performs so badly on AMD. It exemplifies the 95% rule right now.

We've reached the comfy point in photorealism vs. performance relatively recently, but further improvements will require exponentially more compute performance backed by ever more complex graphics drivers.
#11
the54thvoid
Intoxicated Moderator
Dr. Dro said: "We've reached the comfy point in photorealism vs. performance relatively recently, but further improvements will require exponentially more compute performance backed by ever more complex graphics drivers."
Hmm, that wasn't the Cyberpunk 2077 I played. I'd say there were better games out there, many using 'old-school' techniques to create immersive realism. Too many dull slab surfaces and low-res details in the game for it to be called a comfy point in photorealism versus performance, at least for me.

Games like RE Village are surprisingly impressive. I loved Days Gone. The major improvement in CP 2077, IMO, is, ironically, all the flashy neon lights. But overall, I don't think the game is visually any better than a lot of others from the past 5 years or so. Hell, the first game that made me scrutinise the screen was Doom 3, way back in 2004. It was visceral.

RT/PT will maybe one day be the norm, but I don't think test driving it in games is ideal. Once someone comes up with a proper and efficient hardware solution, then it would be better to see. Until then, I feel it's still very much like putting candy sprinkles on top of a cake. FWIW, I was an early defender of CP 2077 and played through with very few bugs, albeit at about 40fps on my GSync monitor.
#12
Dr. Dro
the54thvoid said: "Hmm, that wasn't the Cyberpunk 2077 I played. […]"
That's the whole idea behind the 95% rule I mentioned. Another game showing this is Alan Wake 2: it has such high system requirements, but it doesn't really look any more photorealistic than a lot of other games we've already played. That's the mark to overcome, but the challenges in doing so are immense and, more often than not, aren't worth the investment, or worth limiting one's audience to owners of the latest-generation hardware.

Developers got really good at traditional raster graphics, even doing atmospherics that way. Supposedly, the adoption of full-blown path-traced graphics is intended to simplify the development cycle while somewhat increasing realism, albeit at a massive compute performance cost.

I don't think pushing the boundaries makes a game a tech demo, but I also recognize it's a massive outlier, especially when it receives all of the very latest rendering techniques and supports them fairly well. Perhaps that's my point.