
NVIDIA Releases GeForce 511.23 Game Ready Drivers

I do not get this. Why on earth would I want to scale down to a lower res for better graphics? I'm running 4K (2160p) anyway. Another marketing gimmick from Nvidia. Now with your super duper card you can run 4K and scale it down to 1440p for better graphics... um, WTF. Seriously getting sick of this. I want 8K gaming at 150 FPS. Hoping the GeForce 4000 series gets Hopper. https://wccftech.com/nvidia-h100-ho...-dies-43008-cuda-cores-and-48-gb-hbm4-memory/ That's right, people: 2 dies, 2 GPUs, one card. SLI but much better.
It's called supersampling, and it does increase the overall quality of the image significantly. Say you have a 1440p monitor and your card is capable of running a certain game at very high FPS in 4K. The supersampling stage can then render the image in 4K and downsample it to 1440p, which significantly improves image quality compared to running at native 1440p.
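For anyone wondering what the downsampling step actually does, here is a minimal sketch of the idea; this is toy code of mine, not NVIDIA's filter (DSR uses a smoothness-adjustable Gaussian, and non-integer ratios like 4K-to-1440p need a proper resampling kernel), but it shows why averaging many rendered samples per display pixel cleans up the image:

```python
import numpy as np

def box_downsample(frame: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor x factor block of pixels into one output pixel.

    frame: (H*factor, W*factor, 3) array holding the high-res render.
    Returns an (H, W, 3) array at the display resolution.
    """
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor,
                         w // factor, factor, c).mean(axis=(1, 3))

# 4x supersampling of a 1440p display: render 5120x2880, show 2560x1440
hi_res = np.random.rand(2880, 5120, 3)  # stand-in for a rendered frame
lo_res = box_downsample(hi_res, 2)
print(lo_res.shape)  # (1440, 2560, 3)
```

Every display pixel ends up backed by four rendered samples, which is what kills shimmer and crawling edges.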
 
Playing with the new DSR has set some stupid records for me:

5760x3240 @ 80 Hz in StarCraft II.
(3840x2160 @ 80 Hz reported by the panel; 2560x1440 @ 165 Hz native)


Let's just say... it was pretty.
And my 3090 chugged at times, lol.
 
Playing with the new DSR has set some stupid records for me
It's kind of incredible really: the 3080 powers through a lot of games at 2.25x and the IQ is superb. Bonus points if the game supports DLSS. 2560x1080 up to 5120x2160, down to 3440x1440, lol; the IQ is phenomenal and it still performs better than native in the titles I tested.
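One footnote on the arithmetic, since it trips people up: the DSR factors are pixel-count multipliers, so each axis scales by the square root of the factor. A quick helper to show the math (the function is mine, purely for illustration):

```python
import math

def dsr_resolution(width: int, height: int, factor: float) -> tuple[int, int]:
    """DSR factors multiply the pixel count, so each axis scales
    by sqrt(factor): 2.25x is 1.5x per axis, 4x is 2x per axis."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_resolution(2560, 1440, 2.25))  # (3840, 2160): 1440p renders at 4K
print(dsr_resolution(2560, 1080, 4.0))   # (5120, 2160), as in the post above
```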
 
Nvidia H100 Hopper will cost as much as a family car :laugh:
A 3090 Ti single card costs the same as a car in Canada.

It's called supersampling, and it does increase the overall quality of the image significantly. Say you have a 1440p monitor and your card is capable of running a certain game at very high FPS in 4K. The supersampling stage can then render the image in 4K and downsample it to 1440p, which significantly improves image quality compared to running at native 1440p.
You can take 1440p and shove it somewhere. Seriously, when the 10 series came out, 4K was the thing. Then Nvidia backtracked to 1440p. Totally sad; I jumped on 4K gaming and then got screwed in the end by Nvidia. Also, I get a good laugh when people say certain games do not support SLI. Maybe it's not supported in the game, but it is in the driver. I never set game settings for SLI; I do it on the driver side. I play about three games that claim no SLI support, yet if I check the SLI indicator and the GPUs in Windows 10, it shows both being used and both peaking at 100%, so guess what, SLI works. I know this because I have been doing SLI since my original Diamond Monster Voodoo 2s, and I remember playing Quake 3 Arena on those babies with better FPS than the newer single-card Voodoo 3s.

A 3090 Ti single card costs the same as a car in Canada.


You can take 1440p and shove it somewhere. Seriously, when the 10 series came out, 4K was the thing. Then Nvidia backtracked to 1440p. ...
Also, in Canada the 3090 Ti has a price tag of $3,261.99. You can buy a used card for less than that, easily.
 
These drivers seem to work quite a bit better than the 497.09s I had prior.
 
Any point in using this on my 980 Ti? Current driver is 497.29.
 
Still no optimal settings available in Halo Infinite with a 3070 Ti. Oh well, I just run it on Ultra at ~100 FPS.

You can take 1440p and shove it somewhere. Seriously, when the 10 series came out, 4K was the thing. Then Nvidia backtracked to 1440p. Totally sad; I jumped on 4K gaming and then got screwed in the end by Nvidia. ...

Also, in Canada the 3090 Ti has a price tag of $3,261.99. You can buy a used card for less than that, easily.
Nvidia did sway everyone to 4K back around the 900 series, because that's when 4K and HDMI 2.0 hit the market. Sorry you took the bait. 4K isn't hugely better for IQ compared to 1440p on a normal-sized monitor (or TV), yet it takes much more processing power and VRAM. They are still hosing the market this way via the VRAM they put on new cards. My brand new 3070 Ti only has 8 GB, which is barely enough for 1440p ultrawide; it would not be enough for 4K. My 1080 Ti had 11 GB but a lot less horsepower. They are skimping on VRAM to reduce their own costs and force 4Kers to buy top-of-the-line cards. It's true.
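Rough numbers behind the VRAM point: render-target memory scales linearly with pixel count, and a 4K frame has about 1.7x the pixels of 3440x1440. A toy estimate; the bytes-per-pixel figure is an assumption for illustration, since real engines vary enormously and textures don't scale with resolution at all:

```python
RESOLUTIONS = {
    "1080p":    (1920, 1080),
    "1440p UW": (3440, 1440),
    "4K":       (3840, 2160),
}

# Illustrative guess: ~32 bytes/pixel of render targets (color, depth,
# a few G-buffer layers). Textures and geometry come on top of this.
BYTES_PER_PIXEL = 32

for name, (w, h) in RESOLUTIONS.items():
    mb = w * h * BYTES_PER_PIXEL / 2**20
    print(f"{name:8s} {w * h:>9,} px  ~{mb:4.0f} MB of render targets")
```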
 
Any point in using this on my 980 Ti? Current driver is 497.29.

Not really, but I for one have grabbed them anyway. I'd like to get an RTX card at some point, but I'm not prepared to splash the sort of cash required to nab one at the moment.
 
Agreed. My experience with them has been good so far.
I haven't noticed any monitor flickering yet, even at the desktop, like I did with the 497s.
 
Aaand they broke compute.
Who hired AMD devs?
Remember, just because a new driver is released doesn't mean you have to use it. If the driver you're using provides the functionality you need and the new one breaks it, stick with the one that works.

Sarcasm or are you truly that clueless about the hardware needed for this?
That was very clearly sarcasm, hence the raised-eyebrow emoji.
 
I haven't noticed any monitor flickering yet, even at the desktop, like I did with the 497s.

Yeah, I had flickering before too with G-Sync on in windowed mode, but it seems gone now.
 
Remember, just because a new driver is released doesn't mean you have to use it. If the driver you're using provides the functionality you need and the new one breaks it, stick with the one that works.
Indeed. FWIW, my CUDA-based mining app is working, so CUDA seems OK.
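If anyone wants a quicker compute sanity check than firing up a miner, a few lines against the CUDA driver API will do. A sketch assuming pycuda is installed (pip install pycuda); any app that touches the driver API proves the same point:

```python
import pycuda.driver as drv

drv.init()  # raises immediately if the driver's compute stack is broken
print("CUDA version reported by driver:", drv.get_version())
for i in range(drv.Device.count()):
    dev = drv.Device(i)
    print(f"GPU {i}: {dev.name()}, CC {dev.compute_capability()}, "
          f"{dev.total_memory() // 2**20} MiB")
```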
 
Yeah, I had flickering before too with G-Sync on in windowed mode, but it seems gone now.

I would get it even with G-Sync off, and that's in-game and on the desktop.
 
Not really, but I for one have grabbed them anyway. I'd like to get an RTX card at some point, but I'm not prepared to splash the sort of cash required to nab one at the moment.

Me too. I intend to wait 3 or 4 months and see how much cash I have saved up. Till then the 980 Ti is still good, as I only game at 1080p at the moment anyway.
 
Me too. I intend to wait 3 or 4 months and see how much cash I have saved up. Till then the 980 Ti is still good, as I only game at 1080p at the moment anyway.

Definitely. I can still play at 1080p, 1440p and 4K depending on the game, at 120 Hz with the first two, and thanks to DSR, anywhere in between too. So I definitely don't feel like I'm missing out that much.

Happy to wait.
 
Remember, just because a new driver is released doesn't mean you have to use it. If the driver you're using provides the functionality you need and the new one breaks it, stick with the one that works.
While this is perfect from a user standpoint, telling your users to roll back when they also want to play God of War is an issue.
 
A 3090 Ti single card costs the same as a car in Canada.

Also, I get a good laugh when people say certain games do not support SLI. Maybe it's not supported in the game, but it is in the driver. I never set game settings for SLI; I do it on the driver side.

Get SLI working in StarCraft II: Legacy of the Void without the Protoss pylon rings being broken, and get it working in Killing Floor 2 without vanishing textures.
Some games are 100% totally and utterly broken with SLI. SC2 had SLI support, and when the expansions came out they still claimed it had SLI, when the official profile just disables the second GPU and nothing more (WITHOUT fixing the pylon bug, at least when I last tried).

Lucky you, playing some esports titles and benchmarks... but there are many, many games out there that not only lack SLI support, but outright break when it's enabled.
 
While this is perfect from a user standpoint, telling your users to roll back when they also want to play God of War is an issue.
I understand your perspective, but disagree. There is nothing wrong with skipping a driver point release. God of War still runs on the drivers released in December. We have to remember that drivers and game optimizations are a very complicated task.
 
I understand your perspective, but disagree. There is nothing wrong with skipping a driver point release. God of War still runs on the drivers released in December. We have to remember that drivers and game optimizations are a very complicated task.
Some games demand the latest drivers to play and check at the launcher. Rare, but some do. Personally I just say "screw 'em" to games that do that, much like I do with overbearing anti-cheat solutions...
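For reference, that kind of launcher check is easy to reproduce, because nvidia-smi (installed with the driver) reports the driver version directly. A sketch; the 511.23 threshold is just an illustrative stand-in for whatever a game demands:

```python
import subprocess

MIN_DRIVER = (511, 23)  # illustrative minimum, not any real game's rule

out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    text=True)
ver_str = out.strip().splitlines()[0]  # first GPU is enough
installed = tuple(int(p) for p in ver_str.split("."))

if installed < MIN_DRIVER:
    print(f"Driver {ver_str} is older than required; here a launcher would nag.")
else:
    print(f"Driver {ver_str} meets the requirement.")
```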
 
Some games demand the latest drivers to play and check at the launcher.
I have never encountered this with drivers less than two years old. I have seen it on older drivers though. Of course, I don't do much gaming on anything but titles from GOG, so that might be why.
Personally I just say "screw 'em" to games that do that, much like I do with overbearing anti-cheat solutions...
Right there with you!
 
I've had some clients play odd online games that required driver updates, but I'm totally with you: not in my library!
 
You can take 1440p and shove it somewhere. Seriously, when the 10 series came out, 4K was the thing. Then Nvidia backtracked to 1440p. Totally sad; I jumped on 4K gaming and then got screwed in the end by Nvidia.
Not just Nvidia. The entire industry is guilty of this. Case in point: the Xbox Series S is advertised as a 1440p 60 fps console, which is utter bullshit. It is a 1080p 60 fps console at best, with some titles managing only 1080p 30 fps, upscaled to 1440p of course. That upscaling is what was advertised.

Similarly, the Xbox Series X and PS5 are advertised as 4K 120 fps, but it is more like 4K 60 fps for modern titles. Can they run 4K 120 fps? Yes, there will be titles that support it, but those will be very few. You can't run something like Spider-Man: Miles Morales at 4K 120 fps while retaining its impressive visual fidelity.

And you can bet that the RTX 3080/3090 won't be 4K cards for very long either, as ray-traced global illumination starts becoming more commonplace.

Graphics card companies need bullshit to advertise other than "our cards are faster than before". This is what ends up creating the 3D glasses hype, the Nvidia PhysX hype, the tessellation hype, the 4K hype and, more recently, the RTX hype. DLSS is probably the only feature they have created in recent times that isn't complete smoke and mirrors. It's pretty good for resolution scaling and anti-aliasing.

I am not saying that tessellation and RTX aren't useful. They absolutely are. But they are artificially tacked onto existing games, and then a few years later it turns out each is useful for some things only and isn't the game changer that was advertised. Hell, tessellation completely stopped being advertised by 2012-13, and it was all the rage in 2009-10.

There are also features that are hard to advertise but that developers find very useful. Examples of this are DX11's compute shaders and DX12's mesh shaders.

This is why I think DLSS became useful so quickly. It doesn't need artistic supervision to be used and is relatively straightforward to implement in existing workflows.
 