
Design Issues May Postpone Launch of NVIDIA's Advanced Blackwell AI Chips

Do you expect the world to give you luxury goods for free too?
Who said GPUs were luxury goods? No, what I mean is, if a GPU that costs orders of magnitude more than what I consider a sensible price can't give me smooth gameplay with maxed-out graphics, then what's the point?
 
Stop fighting. Points given. Posts deleted.

No reply bans - but if you keep this crap up, there will be more points, and reply bans.
 
I agree with him: the 4080 is the absolute minimum for any RT or UE5 game running at 60 fps or higher without DLSS and frame-rate trickery on a 1440p 240 Hz or 4K 120+ Hz monitor. And for some, high-resolution, high-refresh-rate gaming is a thing. The 40x0 series being limited to DisplayPort 1.4 is pretty bad considering there are now monitors that support full-bandwidth DisplayPort 2.1 without compression; it's undeniable that nGreedia has held monitor manufacturers back from offering faster, higher-resolution monitors. The media also raised an eyebrow when nGreedia said the 40x0 series was DisplayPort 1.4 only.
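To put the DP 1.4 vs. DP 2.1 gap in rough numbers, here is a back-of-the-envelope sketch; it ignores blanking intervals and FEC overhead, so the figures are approximate:

```python
def required_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed video bandwidth in Gbit/s (blanking ignored)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Effective payload rates: raw lane rate x line-code efficiency.
DP_1_4_HBR3   = 4 * 8.1 * (8 / 10)      # ~25.9 Gbit/s (8b/10b)
DP_2_1_UHBR20 = 4 * 20.0 * (128 / 132)  # ~77.6 Gbit/s (128b/132b)

need = required_gbps(3840, 2160, 240, 30)  # 4K 240 Hz, 10-bit RGB
print(f"4K 240 Hz 10-bit needs ~{need:.1f} Gbit/s")  # ~59.7
print(f"fits in DP 1.4:  {need <= DP_1_4_HBR3}")     # False -> needs DSC
print(f"fits in DP 2.1:  {need <= DP_2_1_UHBR20}")   # True
```

Even before blanking overhead, 4K 240 Hz at 10-bit needs more than double what HBR3 can carry, which is why DP 1.4 cards have to lean on DSC for those monitors.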

Another point he makes is the cost. When you buy a $1,200+ card, you expect to max out the settings in a game; you shouldn't expect 35 fps in return. The 4080 is too slow for the money. On top of that, some people save for a long time to afford such a card and expect it to stay in their system for 3-5 years. It's an investment, and then to be told your $1,200, one-year-old card is not only slow in 2025 games, but that the amazing new monitor you've been waiting for won't work with your card because it's only DP 1.4...
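For what it's worth, the "investment" framing is easy to put in numbers, using the $1,200 price and the 3-5 year lifespans from the post above:

```python
# Cost of ownership for the $1,200 card cited above, spread over
# the 3-5 year lifespans people plan around.
price = 1200.0
for years in (3, 4, 5):
    per_year = price / years
    per_month = price / (years * 12)
    print(f"{years} years: ${per_year:,.0f}/year (${per_month:.2f}/month)")
```

A connector the new monitors can't fully use cuts that planned lifespan short regardless of how fast the silicon still is, which is the sting in the DP 1.4 complaint.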

Not everyone uses their computer the same way you or I do; you have to respect that.
There are currently two main problems with games: lack of optimisation, and developers adding settings that offer no real benefit to visual quality yet tank performance regardless. For a large number of games, ultra offers the same visual fidelity as high but worse performance, with nothing in return (you may see a slight improvement if pixel-peeping, but nobody does that on a daily basis). Nobody should expect to run "maxed out" at this point in time, as it offers nothing but bragging rights. Would I prefer seeing DP 2.1 on GPUs? Of course. Would I like higher VRAM capacities? Naturally. But a lot of our current problems stem from the software side (and the mentality of users) rather than from hardware being too slow.
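The ultra-versus-high cost is easier to see in frame times than in fps. A minimal sketch with hypothetical numbers (90 fps on high and 60 fps on ultra are assumed for illustration, not measurements):

```python
# Hypothetical example: "high" runs at 90 fps and "ultra" at 60 fps
# with no visible difference in image quality.
def frame_time_ms(fps: float) -> float:
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

high_fps, ultra_fps = 90.0, 60.0  # assumed values, not benchmarks
extra = frame_time_ms(ultra_fps) - frame_time_ms(high_fps)
print(f"high:  {frame_time_ms(high_fps):.1f} ms/frame")   # ~11.1 ms
print(f"ultra: {frame_time_ms(ultra_fps):.1f} ms/frame")  # ~16.7 ms
print(f"ultra spends {extra:.1f} ms more per frame")      # ~5.6 ms
```

Under those assumptions, ultra burns a third of an entire 60 Hz frame budget on a difference you would need a side-by-side screenshot to spot.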
 
Totally agree! But it's odd that some of these games run very well on anaemic consoles with no real issues. Still, yes, in some ways PC hardware is so powerful now that sloppy programming has become acceptable to developers. When computer hardware was slow, developers went to great lengths to wring out as much performance as possible.

I mostly blame Microsoft for their awful DirectX. I will never get over them admitting they didn't know how it worked because the team responsible for it had been fired or had left. A new HAL, written by AMD, Intel, and nVidia, built from the ground up to directly access the hardware of modern GPUs, would be amazing. Microsoft forgot how to code years ago.
 
People are still pushing ancient hardware to the limits, like with this. There's really no excuse for the mess we're experiencing with current game releases.
 
I think it's hard to write well-optimized code, and it's also time-consuming, thus expensive. It's easier (for the sake of profit) for the end user to buy a faster graphics card than for the developer to spend six more months writing better, more performant code.
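A tiny, hypothetical illustration of what that developer time buys: both functions below do the same range check, but the second avoids a square root per object. Individually the change is trivial; finding and applying thousands of changes like it across a codebase is the months of work being described.

```python
import math

def in_range_naive(px, py, objects, radius):
    """Range check the obvious way: one sqrt per object."""
    return [o for o in objects
            if math.sqrt((o[0] - px) ** 2 + (o[1] - py) ** 2) <= radius]

def in_range_fast(px, py, objects, radius):
    """Same result: compare squared distances, no sqrt needed."""
    r2 = radius * radius
    return [o for o in objects
            if (o[0] - px) ** 2 + (o[1] - py) ** 2 <= r2]

objects = [(3.0, 4.0), (10.0, 0.0), (1.0, 1.0)]
assert in_range_naive(0, 0, objects, 5.0) == in_range_fast(0, 0, objects, 5.0)
```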
 