
NVIDIA Accelerates Neural Graphics PC Gaming Revolution at GDC With New DLSS 3 PC Games and Tools

GFreeman

News Editor
Staff member
Joined
Mar 6, 2023
Messages
1,138 (2.67/day)
Ahead of next week's Game Developers Conference (GDC), NVIDIA announced an expanded game roster and new developer plug-ins for NVIDIA DLSS 3. The latest version of NVIDIA's AI-powered Deep Learning Super Sampling (DLSS) technology is now supported in an assortment of blockbuster games and franchises, and being integrated into Unreal Engine, one of the world's most popular game engines. The company is also publicly releasing the DLSS Frame Generation plug-in to further ease developer adoption of the technology.

"Neural graphics has revolutionized gaming since its introduction with NVIDIA DLSS, and we're now taking it to new heights," said Matt Wuebbling, vice president of global GeForce marketing at NVIDIA. "PC gaming super-franchises such as Diablo and Forza Horizon and Bethesda's new Redfall are raising the bar for image quality with stunning graphics while using DLSS to keep gameplay smooth as silk." Since its launch in 2018, NVIDIA DLSS has driven a neural graphics revolution in PC gaming. Neural graphics intertwines AI and graphics to create an accelerated rendering pipeline that continuously learns and improves. Instead of natively rendering every pixel in a frame, DLSS allows the game to render 1/8th of the pixels, then uses AI and GeForce RTX Tensor Cores to reconstruct the rest, dramatically multiplying frame rates while delivering crisp, high-quality images that rival native resolution.
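The "1/8th of the pixels" figure can be sanity-checked with simple arithmetic. Below is a minimal sketch assuming DLSS Performance mode (render at half width and half height) combined with frame generation producing one AI-generated frame for every rendered one; the pairing of those two factors is our reading of the claim, not an NVIDIA statement:

```python
# Back-of-envelope check of the "render 1/8th of the pixels" claim,
# assuming: Performance mode renders at half width x half height,
# and frame generation doubles the number of displayed frames.

target_w, target_h = 3840, 2160                     # 4K output
render_w, render_h = target_w // 2, target_h // 2   # Performance mode: 1920x1080

rendered = render_w * render_h        # pixels rasterized per rendered frame
displayed = target_w * target_h * 2   # pixels displayed across two output frames

fraction = rendered / displayed
print(fraction)  # 0.125, i.e. 1/8 of the displayed pixels are rasterized
```

In other words, the 1/8 figure is the product of a 1/4 spatial factor and a 1/2 temporal factor, not a claim that a single frame is upscaled from 1/8 resolution.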



Diablo IV, Forza Horizon 5, Redfall - DLSS 3 is in the Biggest Games and Biggest Franchises
To date, over 270 games and applications use NVIDIA DLSS as an AI-powered performance accelerator. DLSS 3, the latest version of the technology, is available in 28 released games and has been adopted 7x faster than DLSS 2 in the first six months of their respective launches.

Among the highly anticipated games being added to the DLSS roster is Forza Horizon 5, named the best open-world racing game of all time by several media outlets and currently holding the highest rating of any racing game tracked by OpenCritic. Forza Horizon 5, which already supports ray tracing, will update to DLSS 3 on March 28.

Redfall, Bethesda's highly anticipated, open-world, co-op first-person shooter from Arkane Austin, the award-winning team behind Prey and Dishonored, is launching on May 2 with DLSS 3.

In addition, Diablo IV, the latest installment of the genre-defining Diablo franchise - multiple games of which are considered among the best of all time - will be launching on June 6 with DLSS 3.

"Supporting smooth gameplay in Diablo IV is a priority for Blizzard," said Michael Bukowski, Diablo IV technical director at Blizzard Entertainment. "We're excited by the high frame rate of Diablo IV running on NVIDIA GeForce RTX 40 Series hardware and DLSS 3."

Additional PC games announcing support for NVIDIA DLSS at GDC include Deceive Inc., Gripper, Smalland: Survive the Wilds and THE FINALS.

DLSS Frame Generation Publicly Available for Developers at GDC
NVIDIA will make DLSS Frame Generation plug-ins publicly available during GDC, allowing even more developers to integrate the framerate boosting technology into their games and applications.

DLSS Frame Generation will be available to access via NVIDIA Streamline, an open-source, cross-vendor framework that simplifies the integration of super-resolution technologies in 3D games and apps.

DLSS technology is always improving through ongoing training on NVIDIA's AI supercomputer. The public release will incorporate the latest DLSS enhancements made earlier this year, including:
  • DLSS Frame Generation takes better advantage of game engine data, improving user interface stability and image quality during fast movement.
  • DLSS Super Resolution improves Ultra Performance mode, with finer detail stability and overall better image quality.
  • DLAA improves image quality, reduces ghosting and improves edge smoothness in high-contrast scenarios.

Unreal Engine 5.2 Integration: Adding DLSS 3 to Unreal Engine Games Is Simpler Than Ever
NVIDIA and Epic announced the integration of NVIDIA DLSS 3 into the popular Unreal Engine (UE) game engine. Unreal Engine is an open and advanced real-time 3D creation tool that gives game developers and creators the freedom and control to deliver cutting-edge real-time 3D content, interactive experiences and immersive virtual worlds. The DLSS 3 plug-in will debut in UE 5.2, making it simpler for any developer to accelerate the performance of their game or application.

"NVIDIA DLSS 3 introduces truly impressive frame generation technology and the Unreal Engine 5.2 plug-in will offer developers a great choice for increased quality and performance of their games," said Nick Penwarden, vice president of engineering at Epic Games.

View at TechPowerUp Main Site | Source
 
Joined
Sep 6, 2013
Messages
3,008 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
So, DLSS 3 is an accelerator, not a software trick.

There you have it. You get a brand new AI accelerator, in case you were wondering why prices are so high.
 
Joined
Dec 31, 2020
Messages
777 (0.64/day)
Processor E5-2690 v4
Motherboard VEINEDA X99
Video Card(s) 2080 Ti WINDFORCE OC
Storage NE-512 KingSpec
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
1/8th of pixels means they're super scaling 800p input into 2160p final image. There is no way.
 
Joined
Sep 1, 2020
Messages
2,041 (1.52/day)
Location
Bulgaria
1/8th of pixels means they're super scaling 800p input into 2160p final image. There is no way.

How beautiful it is, isn't it? Now at a better resolution.
 
Joined
Dec 31, 2020
Messages
777 (0.64/day)
Processor E5-2690 v4
Motherboard VEINEDA X99
Video Card(s) 2080 Ti WINDFORCE OC
Storage NE-512 KingSpec
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
Oh, they mean the 1-to-8 ratio includes multiplying the frames. It's still garbage; just give me 10K CUDA cores stripped down to classic shaders. No nonsense.
 
Joined
Apr 14, 2018
Messages
464 (0.21/day)
“high-quality images that rival native resolution.”

Yikes. Considering the number of bugs and artifacts in the majority of games tested (ghosting, stretching, distorted/flickering UI and so on), this is far from an accurate statement.

So far, fixing these issues also looks like a per-game-basis sort of thing, which is bad for everyone.

At least DLSS 3 will remain a catchy marketing gimmick until it's forgotten about, just like most of Nvidia's proprietary solutions or "advancements".

There is way too much focus on upscaling software/hardware recently. While there is a limit to what can be brute-forced in traditional rasterization, having developers design games around hybrid rendering methods (both traditional and upscaling techniques) seems to end in buggy results more often than not.
 
Joined
May 3, 2018
Messages
2,312 (1.05/day)
How on earth can the AI ever know the motion vectors and small nuances of how things change between two frames if the motion is anything but simple? There will always be weird crap going on that it cannot predict.

It would be better if they stuck to improving rasterisation, RT and DLSS 2, which I have no problem with. I would also like user-selectable resolutions for scaling, e.g. 1800p for 4K rather than just 1440p, 1200p for 1440p, etc.

But it appears to me Nvidia is now totally obsessed with AI and will throw rasterisation under the bus and market BS fake-fps figures more and more.
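The user-selectable scaling resolution idea above is easy to sketch. A minimal Python illustration: the standard DLSS 2 scale factors (Quality 0.667, Balanced 0.58, Performance 0.5, Ultra Performance 0.333) are public figures, while the `custom_1800p` entry is the hypothetical "1800p for 4K" option the post asks for, not an existing DLSS mode:

```python
# Internal render resolution for a given output resolution and scale mode.
# Standard DLSS 2 factors are public; "custom_1800p" is hypothetical.
FACTORS = {
    "custom_1800p": 5 / 6,      # hypothetical: 1800p input for 2160p output
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 1 / 2,
    "ultra_performance": 1 / 3,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal render resolution for an output size and mode."""
    f = FACTORS[mode]
    return round(out_w * f), round(out_h * f)

print(render_resolution(3840, 2160, "custom_1800p"))  # (3200, 1800)
print(render_resolution(3840, 2160, "performance"))   # (1920, 1080)
```

Exposing the factor as a slider rather than fixed presets would give exactly the control the post describes.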
 
Joined
May 31, 2016
Messages
4,328 (1.49/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
How on earth can the AI ever know the motion vectors and small nuances of how things change between two frames if the motion is anything but simple? There will always be weird crap going on that it cannot predict.

It would be better if they stuck to improving rasterisation, RT and DLSS 2, which I have no problem with. I would also like user-selectable resolutions for scaling, e.g. 1800p for 4K rather than just 1440p, 1200p for 1440p, etc.

But it appears to me Nvidia is now totally obsessed with AI and will throw rasterisation under the bus and market BS fake-fps figures more and more.
It would seem it is easier to add DLSS 3 to games and make them appear to run faster than to invest in a new card with raw power and performance. I only hope this will not lead to 5000-series mid-range cards costing way more just to carry the DLSS 3 advertisement logo, boasting hundreds of FPS when the actual rendered FPS is mediocre, and still charging a lot for the FPS they can achieve.
 
Joined
Feb 1, 2013
Messages
1,248 (0.30/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
How on earth can the AI ever know the motion vectors and small nuances of how things change between two frames if the motion is anything but simple? There will always be weird crap going on that it cannot predict.

It would be better if they stuck to improving rasterisation, RT and DLSS 2, which I have no problem with. I would also like user-selectable resolutions for scaling, e.g. 1800p for 4K rather than just 1440p, 1200p for 1440p, etc.

But it appears to me Nvidia is now totally obsessed with AI and will throw rasterisation under the bus and market BS fake-fps figures more and more.
It was only a matter of time. Real-time graphics consumers got dumber, and the producers found their hammer. Prepare to see all graphics outside of the movie industry treated as nails for this accelerator.

Until the biggest consumers (competitive players) are adversely affected, with generated frames causing aim mispredictions, there will be no pushback. It took decades to address refresh rates, input latency and frame pacing. How long will it take players to say the generated frame isn't accurate?
 
Joined
Apr 14, 2018
Messages
464 (0.21/day)
It was only a matter of time. Real-time graphics consumers got dumber, and the producers found their hammer. Prepare to see all graphics outside of the movie industry treated as nails for this accelerator.

Until the biggest consumers (competitive players) are adversely affected, with generated frames causing aim mispredictions, there will be no pushback. It took decades to address refresh rates, input latency and frame pacing. How long will it take players to say the generated frame isn't accurate?

The proof is already there for anyone who cares to look, though. And for some odd reason, visual artifacts, ghosting and other problems don't matter at all when "LoOk At MaH fPs" is all that matters.

It's weird that people buy high-end parts (4070/4080/4090) that can run 1440p above 120 FPS and, for the most part, 4K above 60 FPS natively, then actively turn on upscaling to visually degrade the image, and then praise DLSS as some amazing thing.

On top of that, in the scenarios where DLSS 3 should theoretically have the most impact (mid-range parts, 4060-4070), there's potential to fall short of the FPS threshold, leaving you with an even worse experience: poorer frame pacing and more noticeable visual artifacts.

Very difficult to understand why the tech is so positively lauded.
 