
NVIDIA Blackwell RTX and AI Features Leaked by Inno3D

The blurring that comes from upscaling and TAA is abysmal. Stalker 2 looks like utter shit, and anyone who says otherwise needs their eyes checked. The smearing and ghosting are beyond ridiculous, and the game still runs like crap.

Silent Hill 2 Remake isn't nearly as bad, but it still isn't excusable. Fog now hurts performance, when in the past it was there to hide hardware limitations and created a great atmosphere in the process.

Isn't it the same with depth of field now, too?
It's supposed to blur things far away to make them less work to render,
yet everyone enables it and complains they can't read text far away.
Then they enable DLSS to make it readable again, instead of just turning off depth of field.
I feel like we're going backwards in graphics.
 
DOF aims to blur the background while the foreground remains sharp. It tries to replicate what a DSLR lens with a low aperture value naturally does. TAA and DLSS, on the other hand, affect everything on the screen.
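To illustrate the difference: a depth-of-field pass scales blur by each pixel's distance from the focal plane, so the in-focus subject stays sharp. Here's a minimal sketch of that idea in Python/NumPy (my own toy illustration, not any engine's actual implementation), assuming a depth buffer normalized to [0, 1]:

```python
import numpy as np

def dof_blur_radius(depth, focal_depth=0.3, focal_range=0.1, max_blur=8.0):
    """Per-pixel blur radius for a simple depth-of-field pass.

    depth       -- depth buffer values in [0, 1] (0 = near, 1 = far)
    focal_depth -- depth that stays perfectly sharp
    focal_range -- band around focal_depth that stays in focus
    max_blur    -- blur radius (in pixels) for fully out-of-focus areas
    """
    # How far outside the in-focus band each pixel sits (0 = in focus).
    out_of_focus = np.maximum(np.abs(depth - focal_depth) - focal_range, 0.0)
    # Scale into a blur radius; a renderer would feed this into a
    # variable-radius blur (e.g. a gathered disc blur) per pixel.
    return np.clip(out_of_focus / (1.0 - focal_range) * max_blur, 0.0, max_blur)

# Pixels at the focal plane get no blur; distant pixels get the most.
depth_buffer = np.array([0.05, 0.3, 0.6, 0.95])
print(dof_blur_radius(depth_buffer))  # ~[1.33, 0.0, 1.78, 4.89]
```

TAA and DLSS have no such depth gate, which is why their softening lands on the whole frame.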
 
For me, I got a 4090 (second-hand, though practically new) for $1K, and I still felt like I was getting ripped off. On the other hand, I really need this stuff to support my research project.
I am done spending more on a GPU for gaming and will put that spending toward a console instead (a PS4).
I plan to get a PS4 Pro in the near future (for me, that's way more logical than making a donation to Nvidia, lol).
 
Isn't it the same with depth of field now, too?
It's supposed to blur things far away to make them less work to render,
yet everyone enables it and complains they can't read text far away.
Then they enable DLSS to make it readable again, instead of just turning off depth of field.
I feel like we're going backwards in graphics.

Motion Blur and DoF are always the first settings I disable in any game. :D
 
Upscaling. DLSS is just the best implementation of it and it is hard to deny.
I personally don't care for upscaling, but if I had no choice but to use it, given the current options, I would always pick the one that works for everyone instead of the one that takes away my options, even if it isn't the absolute best.
Nvidia executes well on that front and even Intel seems to surpass AMD's efforts.
See above.
If anything, it is AMD that should do better here. They gambled on an approach and so far, they lost.
I don't use FSR and, obviously, can't use DLSS.

That said, I have been perplexed by the claims that FSR is absolute trash while DLSS is bigger than the second coming, so I have read and watched many reviews. The few unbiased reviewers left (very few these days, sadly) tend to conclude that FSR is good enough, and that depending on the game and the dev, the same flaws observed in one show up in the other.

So when I read comments like that (FSR is trash, AMD lost, etc.), it confuses me and makes me believe someone is simply repeating other non-AMD customers' baseless attacks to defend Ngreedia.

Same group that still claims that all AMD drivers are trash.
I really need this stuff to support my research project.
If they are the only ones providing the tool you need, I can understand and support the purchase of a 4090.
 
Motion Blur and DoF are always the first settings I disable in any game. :D
Yep, I always want to see what the card can do without software gimmicks. You know, people complained about PS3 graphics back in the day, but when a game was programmed correctly, it looked much better than some PS4 games. Take FF13 on the PS3: I could see anything in the distance, clear as a bell. Then I played FF15 on the PS4 Pro, and everything in the distance was blurry. When I saw that, I knew right away they had to cut corners to make the game playable. Even most of the newer games on PS4 and PS5 blur distant objects. Sad, really sad.
 
Any chance developers start using the AI portion to make CPU characters not so dumb? (bots in Call of Duty for example)

PhysX was cool when it came out and ragdoll physics took over; now it's kind of a default, working behind the scenes in all our games.
 
Any chance developers start using the AI portion to make CPU characters not so dumb? (bots in Call of Duty for example)
Aren't CPU characters already powered by "AI", as they have been since their inception? Yeah, I wouldn't call that AI intelligent, but the same could be argued about today's.

Also, the AI most people know about is the LLM, which is text-based (a chat-bot, in a way) and wouldn't really work that well in a 3D movement-type area. At least, I assume it wouldn't.
 
Aren't CPU characters already powered by "AI", as they have been since their inception? Yeah, I wouldn't call that AI intelligent, but the same could be argued about today's.

Also, the AI most people know about is the LLM, which is text-based (a chat-bot, in a way) and wouldn't really work that well in a 3D movement-type area. At least, I assume it wouldn't.
LLM-based AI will only get you enemies that can insult you like a 5th grader. They'll need a model trained on the tactical experience of SWAT teams and special forces to get what you want.
 
LLM-based AI will only get you enemies that can insult you like a 5th grader. They'll need a model trained on the tactical experience of SWAT teams and special forces to get what you want.
Not to mention the performance cost per bot in a match. Imagine someone leaving your online match and your game starting to lag; that wouldn't be great.
 
For me, I got a 4090 (second-hand, though practically new) for $1K, and I still felt like I was getting ripped off. On the other hand, I really need this stuff to support my research project.
I am done spending more on a GPU for gaming and will put that spending toward a console instead (a PS4).
I plan to get a PS4 Pro in the near future (for me, that's way more logical than making a donation to Nvidia, lol).
If you need it for research purposes, then I would say $1K is not bad, because if it's work related, you could probably claim it on your taxes or something.

But I feel ya, everything is becoming a waste of money. Heck, the PS5 Pro is a waste of money over the normal PS5. I used to buy brand new many, many years ago, but these days I mostly won't. I just can't really comprehend most prices anymore, neither CPU nor GPU, though the CPU side isn't as bad, since I remember the Intel Extreme processors going for over a thousand.
 
I personally don't care for upscaling, but if I had no choice but to use it, given the current options, I would always pick the one that works for everyone instead of the one that takes away my options, even if it isn't the absolute best.

See above.

I don't use FSR and, obviously, can't use DLSS.

That said, I have been perplexed by the claims that FSR is absolute trash while DLSS is bigger than the second coming, so I have read and watched many reviews. The few unbiased reviewers left (very few these days, sadly) tend to conclude that FSR is good enough, and that depending on the game and the dev, the same flaws observed in one show up in the other.

So when I read comments like that (FSR is trash, AMD lost, etc.), it confuses me and makes me believe someone is simply repeating other non-AMD customers' baseless attacks to defend Ngreedia.

Same group that still claims that all AMD drivers are trash.

If they are the only ones providing the tool you need, I can understand and support the purchase of a 4090.
But that's the thing: FSR support isn't better than DLSS support. Each upscaling method has its own approach to the market; developers need to implement it, and they need to implement the best version of it. The DLSS push is better in that respect. FSR's open nature does not make it appear in more games, and it does not improve the solution itself by a meaningful margin.

AMD's FSR strategy, therefore, while I applaud the approach in a general sense, is not as effective as Nvidia's.
With FreeSync you saw a different result, for example. And why? Because the technology just works, and works everywhere. Support.

I'm just observing the market. I don't have any favoritism towards any brand. I view AMD's open approach as marketing, as much as I view Nvidia's ecosystem approach as marketing. In the end it doesn't matter much: what matters is what technologies survive and deliver the best results. And then, we hope the industry embraces them ubiquitously.
 
Aren't CPU Characters already powered by "AI" since their inception. Yeah i wouldn't call that AI intelligent, but same could be argued about today's.

Also the AI most people know about is LLM which is for text-based (chat-bots in a way) which wouldn't really work that well in a 3D Movement-type Area. At least I assume it wouldn't.

Traditional game AI just uses a set of branching conditions to determine behavior. You can make it somewhat decent by writing a ton of those branching conditions, but because it relies on branching logic, the complexity of coding it increases exponentially with size. It's extremely tedious and absolutely not a good fit for character AI.
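To make that concrete, here's a toy sketch of the branching-condition style (names and thresholds are my own invention, not from any shipping game). Every behavior is a hand-written case, and each new behavior has to be threaded through the whole tree:

```python
def choose_action(npc: dict, player: dict) -> str:
    """Classic branching game 'AI': hand-written conditions, no learning.

    Adding one new behavior means revisiting every branch it interacts
    with, which is why the authoring effort balloons as designs grow."""
    distance = abs(npc["x"] - player["x"])
    if npc["health"] < 20:
        # Low health: retreat unless the player is already on top of us.
        return "flee" if distance > 2 else "melee_attack"
    if distance > 30:
        return "patrol"  # player not detected yet
    if distance > 10:
        # Spotted at range: trade cover for ground depending on exposure.
        return "take_cover" if player["has_line_of_sight"] else "advance"
    return "shoot" if npc["ammo"] > 0 else "melee_attack"

npc = {"x": 0, "health": 80, "ammo": 5}
player = {"x": 15, "has_line_of_sight": True}
print(choose_action(npc, player))  # take_cover
```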

On the flip side, LLM-based AI can consider billions of parameters right now, and that number will only increase. I'm not sure if you've ever modded Bethesda games, but the number of AI parameters there is in the tens, and I'd expect the more "advanced" traditional AI in games like Elden Ring to use 150 or fewer. They really are worlds apart, but that makes sense: LLMs are neural networks, loosely modeled on the neurons in your brain.

I suspect that once tools become available for devs to add LLM-based AI, we might start seeing it. The problem right now is that there is no pre-made infrastructure for devs to build on, so you either have to write a bespoke implementation or wait. A bespoke version might look something like the sketch below.
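Purely as an illustration of what such a bespoke implementation might involve, here's a sketch; query_llm is a hypothetical stand-in for whatever inference backend a dev would wire up, and none of these names come from a real engine or API:

```python
import json

def query_llm(prompt: str) -> str:
    """Hypothetical stand-in for an inference call. A real game would
    hit an on-device model or a server, and would have to budget for
    the latency (hence the per-bot performance cost mentioned above)."""
    raise NotImplementedError("wire this up to your inference backend")

def npc_decide(npc_state: dict, allowed_actions: list[str]) -> str:
    # Constrain the model to a fixed action vocabulary so its output
    # can be validated before it drives the character.
    prompt = (
        "You control a game NPC. State: " + json.dumps(npc_state)
        + ". Reply with exactly one of: " + ", ".join(allowed_actions) + "."
    )
    try:
        answer = query_llm(prompt).strip().lower()
    except Exception:
        answer = ""
    # Fall back to scripted behavior if the model's output is unusable.
    return answer if answer in allowed_actions else "patrol"
```

Constraining the model to a fixed action vocabulary and falling back to scripted behavior keeps a slow or nonsensical response from breaking the game loop.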

There's also a tendency in the video game industry to put most of its funding toward graphics. Take a look around: most of the improvements have been to how good a game looks, with little to none going to other systems like audio, physics, etc.
 
There's also a tendency in the video game industry to put most of its funding toward graphics. Take a look around: most of the improvements have been to how good a game looks, with little to none going to other systems like audio, physics, etc.

That's what got me thinking about it. I threw a grenade near a pile of tires in Warzone the other day and they all...MOVED! OMG. Then I accidentally shot a traffic cone while in a gunfight and it tipped over!

Everything in the environment was "dead" before Black Ops 6 Warzone dropped, for me anyway. In games of yesteryear, physics was a big deal; the sandbox was destructible in quite a few games. It all went away, to the point that tipping over a traffic cone with a 5.56 round surprised me!

It'd be nice if, with all this horsepower, they could make bots that don't stand in front of you and plate up instead of finishing you off when they have you dead to rights.

A fully armored APC shouldn't get stopped dead in its tracks by a stick-built wall lined with gypsum, let alone by a lazy 3D-sprite plant in an open field.
 
There's also a tendency in the video game industry to put most of its funding toward graphics. Take a look around: most of the improvements have been to how good a game looks, with little to none going to other systems like audio, physics, etc.
That feeling when Valve creates a drop-in solution for actual simulated, realistic 3D sound with proper HRTF, makes it open and available for everyone to use, and then nobody uses it.
 
That's what got me thinking about it. I threw a grenade near a pile of tires in Warzone the other day and they all...MOVED! OMG. Then I accidentally shot a traffic cone while in a gunfight and it tipped over!

Everything in the environment was "dead" before Black Ops 6 Warzone dropped, for me anyway. In games of yesteryear, physics was a big deal; the sandbox was destructible in quite a few games. It all went away, to the point that tipping over a traffic cone with a 5.56 round surprised me!

It'd be nice if, with all this horsepower, they could make bots that don't stand in front of you and plate up instead of finishing you off when they have you dead to rights.

A fully armored APC shouldn't get stopped dead in its tracks by a stick-built wall lined with gypsum, let alone by a lazy 3D-sprite plant in an open field.

Yep, some games back in the late 2000s had decent physics systems, but then game devs stopped caring, and we saw regression on that front for a while.

A lot of the physics in online games still isn't synchronized to this day, either. In other words, the physics is just for show: it looks different depending on the client, so at the end of the day, like most things in games, there's no depth to it; it's only done for looks.

That feeling when Valve creates a drop-in solution for actual simulated, realistic 3D sound with proper HRTF, makes it open and available for everyone to use, and then nobody uses it.

It's crazy how much of an improvement it is to audio, and pretty much no one outside of VR uses it.
 
@evernessince
I may not be much of a CS guy myself, but the HRTF implementation there is amazing, and the fact that you can customize it, and that there is even an in-game EQ… yeah. Meanwhile, AAA games release with the same mediocre, "Hollywood-ish" home theater mix with barely any channel separation; it sounds so flat on headphones that you wonder why we bother spending millions on "muh graphics" when EAX-enabled games from the late 90s had a better soundscape.
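For anyone wondering what HRTF actually does under the hood, here's a minimal sketch of the core operation (toy impulse responses of my own making, not Valve's implementation): each ear gets the mono source convolved with a direction-dependent impulse response, and those small interaural timing and level differences are what create the sense of position on headphones.

```python
import numpy as np

def binauralize(mono, hrir_left, hrir_right):
    """Minimal HRTF spatialization: convolve a mono source with the
    head-related impulse response (HRIR) of each ear. Real engines
    interpolate HRIRs per source direction and update them as the
    listener turns; this is only the core math."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right], axis=-1)  # (samples, 2) stereo buffer

# Toy HRIRs: a source on the listener's right reaches the right ear
# slightly earlier and louder than the head-shadowed left ear.
hrir_right = np.array([0.0, 1.0, 0.2, 0.0, 0.0])  # earlier, louder arrival
hrir_left = np.array([0.0, 0.0, 0.0, 0.6, 0.1])   # delayed + attenuated
mono = np.sin(2 * np.pi * 440 * np.arange(4800) / 48000)  # 0.1 s at 48 kHz
stereo = binauralize(mono, hrir_left, hrir_right)
print(stereo.shape)  # (4804, 2)
```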
 
@evernessince
I may not be much of a CS guy myself, but the HRTF implementation there is amazing, and the fact that you can customize it, and that there is even an in-game EQ… yeah. Meanwhile, AAA games release with the same mediocre, "Hollywood-ish" home theater mix with barely any channel separation; it sounds so flat on headphones that you wonder why we bother spending millions on "muh graphics" when EAX-enabled games from the late 90s had a better soundscape.

The first time anyone played Unreal with a Sound Blaster is a core memory. Absolutely haunting.
 
You mean ATI? Yeah, me too, and earlier. What's your point? That isn't even an argument.

Enjoy being ripped off. As someone else said, if AMD or Intel came out with a better GPU with similar performance, you wouldn't buy it.
Can cows fly, or can AMD/Intel deliver similar performance? No, not at the moment.

Ripped off? Let's say something costs $300 more than you wanted to pay, and you use it for 24 months. That's only $12.50/month; I'd say that's cheap.

GPU cost is very low compared to anything else we need for daily life: food, a car, gasoline...

And still we cry here like it costs a kidney or two.
 
Can cows fly, or can AMD/Intel deliver similar performance? No, not at the moment.

Ripped off? Let's say something costs $300 more than you wanted to pay, and you use it for 24 months. That's only $12.50/month; I'd say that's cheap.

GPU cost is very low compared to anything else we need for daily life: food, a car, gasoline...

And still we cry here like it costs a kidney or two.
What?

Huh, judging by what other senior techs here are saying too, it does seem to be a rip-off.

But believe whatever you like. Just because you like getting fleeced on prices doesn't mean the rest of us do.

But that's the thing: FSR support isn't better than DLSS support. Each upscaling method has its own approach to the market; developers need to implement it, and they need to implement the best version of it. The DLSS push is better in that respect. FSR's open nature does not make it appear in more games, and it does not improve the solution itself by a meaningful margin.

AMD's FSR strategy, therefore, while I applaud the approach in a general sense, is not as effective as Nvidia's.
With FreeSync you saw a different result, for example. And why? Because the technology just works, and works everywhere. Support.

I'm just observing the market. I don't have any favoritism towards any brand. I view AMD's open approach as marketing, as much as I view Nvidia's ecosystem approach as marketing. In the end it doesn't matter much: what matters is what technologies survive and deliver the best results. And then, we hope the industry embraces them ubiquitously.

Well, I agree to a degree, but if you just play STALKER 2 and see how bad DLSS and everything else has been, I would wager it is also trash. Just because FSR is worse doesn't mean DLSS is much better.


 
AI this, AI that... all marketing BS with little to no truth to it whatsoever. These are features that are most likely driver-based and artificially limited to the 50x0 series. At best, I would assume NV has a new RT denoiser algorithm, most likely with a perf hit. NV can see that their "amazing" RT features are beginning to be caught up with. They are nowhere near as good as they thought, and now they have to market fake reasons why their RT is better than AMD's RDNA4 or Intel's B580, an almost bottom-of-the-range chip that does great RT now!

I throw up a little in my mouth when I see companies plastering AI all over their marketing, when 9.9 times out of ten the only actual AI involved is the text used in the marketing blurb. NV promised years ago that they would use AI to remake their drivers because it was so much better than "human" code. What has it been, 3 years now? Yep, more BS.
 
"NVIDIA today reported revenue for the third quarter ended October 27, 2024, of $35.1 billion, up 17% from the previous quarter and up 94% from a year ago. "

All of this is due to AI, so you can bet they're going to shove it into everything, usable or not.
 
"NVIDIA today reported revenue for the third quarter ended October 27, 2024, of $35.1 billion, up 17% from the previous quarter and up 94% from a year ago. "

All of this is due to AI, so you can bet they're going to shove it into everything, usable or not.
More like real or not.
 