Friday, February 9th 2024

NVIDIA GeForce RTX 4070 Ti Drops Down to $699, Matches Radeon RX 7900 XT Price

The NVIDIA GeForce RTX 4070 Ti can now be found for as low as $699, which means it is now selling at the same price as the AMD Radeon RX 7900 XT graphics card. The GeForce RTX 4070 Ti lags behind the Radeon RX 7900 XT in performance and packs less VRAM (12 GB vs. 20 GB), while the faster GeForce RTX 4070 Ti SUPER sells for around $100 more. The Radeon RX 7900 XT is around 6 to 11 percent faster, depending on the game and the resolution.

The GeForce RTX 4070 Ti card in question comes from MSI; it is the Ventus 2X OC model, listed over at Newegg.com for $749.99 with a $50-off promotion code. Bear in mind that this is a dual-fan version from MSI, and we are quite sure we'll see similar promotions from other NVIDIA AIC partners.
Sources: Newegg.com, via Videocardz.com

122 Comments on NVIDIA GeForce RTX 4070 Ti Drops Down to $699, Matches Radeon RX 7900 XT Price

#76
freeagent
3valatzyI haven't seen anyone talking about the RX 7900 XT.
It’s the 4070 Ti that draws so much hate.

Edit:

That is with V-Sync enabled... the only game I have to cheat with is CP2077.
Posted on Reply
#77
kapone32
freeagentMy tv is 60Hz so I aim low :)

Still, everyone talks shit about the card and it’s actually not too bad.
At 4K 60 Hz I am sure that is fine. My 6800 XT handled 4K 60 Hz fine too (at the time).
bonehead123Still too damned expensive....regardless of mfgr, model, version etc....

What we need ATM are well-rounded, well-spec'd cards that can do 99% of what we need them to, AND are affordable for the average everyday user, including gamers, CAD folk, the Blender crowd etc....
Blame Nvidia. If AMD had kept their pricing, you would not be able to buy any GPUs from them. The narrative is not that powerful.
Posted on Reply
#78
MarsM4N
Over here only one 4070 Ti has dropped down to 759€ (from an eBay dealer). The rest all start at 800€ and up. :laugh: The 7900 XT, on the other hand, starts at 759€ (from credible sellers).
AssimilatorConsidering Gamers Nexus includes Blender as one of their tests? A lot more than you believe, again facts trump your feelings.
It's not like you can't run Blender with an AMD card, it just takes a little longer. ;) Which only matters if you have to render regularly; if you render a video from time to time it shouldn't be a big deal. Blender is also a pretty cherry-picked example, since AMD does pretty well in other productivity tasks.

Also you can't blame AMD entirely for the bad Blender performance, as the following article shows:
Quote: "For years, AMD users have eagerly waited for Blender to tap into their hardware’s ray-tracing capabilities fully. Blender officially added AMD HIP support with the 3.0 release in December 2021. However, this did not take advantage of the dedicated ray tracing cores available in the Radeon 6000 and 7000 series GPUs. The wait is finally over, as the latest Blender 3.6 update officially enables support for AMD ray tracing cores. This enhancement promises to significantly accelerate the rendering process, showcasing the potent synergy between AMD hardware and Blender’s advanced rendering algorithms. We’ll delve into the impact of this update and how it promises to improve rendering workflows.

Blender’s decision to enable AMD ray tracing cores marks a pivotal moment in the world of 3D rendering. This follows Maxon’s recent inclusion of HIP in their Redshift renderer. We are increasingly seeing AMD looking to professional workflows with their video cards. They still aren’t entirely competitive with NVIDIA, but this comes as a warning shot. AMD is taking GPU rendering seriously, and if they are able to make the same sort of improvements as they did with CPUs when they introduced the Ryzen line, 3D artists stand to win. We are excited to see what the future holds for GPU rendering."
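For reference, and purely as a rough sketch (untested, assuming Blender 3.6+ with a supported Radeon card; property names can shift between versions), switching Cycles over to HIP from Blender's Python console looks roughly like this:

import bpy

# Point Cycles at the HIP backend (the AMD counterpart to CUDA/OptiX on NVIDIA).
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "HIP"
prefs.get_devices()                     # refresh the detected device list

# Enable only the HIP-capable GPUs that were found.
for device in prefs.devices:
    device.use = (device.type == "HIP")

# Tell the current scene to render on the GPU rather than the CPU.
bpy.context.scene.cycles.device = "GPU"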
Posted on Reply
#79
Vayra86
Vya DomusOn low end parts, you're not gonna see them doubling the VRAM on a 4090 unless they slap a Quadro sticker on it and charge some 2000$ more.
Are you trying your hardest to avoid the point I'm making or are you reading past it?

There is a 4060ti with 16GB.
There are midrange Ampere cards with 16GB.
These cards have no business in gaming whatsoever. There's nil advantage to them over their half VRAM counterparts (perhaps situationally, but that won't last), especially the 4060ti.

That's them catering to that exact demand right there but on a much more 'democratic' price level. Still overpriced for what it really is. But. An Nvidia card with 16GB on the newest architecture. It can do RT. It has AI. It has creator tools. It has everything your little gamur heart wants. Yadayada. You get the gist?

I'm not telling you this makes sense in any kind of realistic economy or for real pro markets. But it makes sense in the hearts and minds of prospective buyers. Young people with little knowledge of what they might do or can do with that GPU perhaps. Somewhat more knowledgeable people that know how to put VRAM to use. Etc. There's a market here. Niche? I'm not so sure. I think a lot of people are sensitive to this.

I can't even claim I was totally insensitive to this myself; take a feature like Ansel, for example. It's not like I would have picked Pascal over any other GPU at the time for it. But still. It's yet another neat little tool you can use, and I've pulled some pretty nifty screens for my desktop from it. All these little things really do matter. If you buy an Nvidia GPU, you get a package of added value that AMD simply cannot match. I'm now past the point of caring too much about all of that, but it's a package nonetheless.
Posted on Reply
#80
Unregistered
It's pretty simple: Nvidia is ripping you off enormously.

AMD tried to as well, but they got their shit together.

500 for the 7800 XT
700 for the 7900 XT

is almost a normal price. It should have been the release price, but we can't get everything.

Whoever thinks any 70 or 70 Ti card is worth €700-900 has lost their damn mind, especially with that VRAM greed and the ridiculous 60-class bus width,

while a 7900 XT gives you 4070 Ti SUPER to 4080 performance depending on the game.
#81
Vayra86
Nosferatu666It's pretty simple: Nvidia is ripping you off enormously.

AMD tried to as well, but they got their shit together.

500 for the 7800 XT
700 for the 7900 XT

is almost a normal price. It should have been the release price, but we can't get everything.

Whoever thinks any 70 or 70 Ti card is worth €700-900 has lost their damn mind, especially with that VRAM greed and the ridiculous 60-class bus width,

while a 7900 XT gives you 4070 Ti SUPER to 4080 performance depending on the game.
I'll be honest, all of the choices we've really had this gen are still less than ideal. Palatable, at this point, is the furthest I would go, and that only counts for a small selection of GPUs on either side of the line. And you're right... almost a normal price. Almost. But it's also nearly Q2 2024.
Posted on Reply
#82
Unregistered
True that, but it's getting better by the month. We are still feeling the aftermath of mining and scalping; this will take years. But consumers are also stupid as shit: accepting a gaming card for €1,600 is insanity, and calling it a good deal while it objectively has the worst value.

The funny thing is the 4090 is so cut down it would barely make an actual 80 Ti card.

But hey, people bought the freaking 3090 for double the price of a 3080 while it was only 10% faster. Stupidity has no bounds, especially among gamers, as you can see from the gaming industry.

Nvidia lied when they said the 90 card would replace the Titans. The Titan RTX burns the 3090 in some productivity tasks because it was a real Titan, not a wannabe made just so they could double the prices. And people eat it up.
#83
freeagent
I paid 679USD for my Ti last July, good deal :)
Posted on Reply
#84
Random_User
Sweet. It seems AMD is too slow and lags behind in the game NVIDIA is running. How long will it take until NVIDIA lands another punch? Note that NVIDIA didn't even have to lower prices; they could have simply kept gouging the market, or even raised MSRPs. AMD is being beaten with its own methods.

However, the availability of the 4070 Ti and other NVIDIA cards is another question.
OnasiThat’s a pretty big “unless”. I love how a lot of people just downplay that Radeon are absolute cheeks in anything apart from gaming. If you do, for example, any amount of work in Blender you basically have no actual choice. NV is the only game in town. The fact that AMD still hasn’t even tried to compete with OptiX is pathetic, to be blunt. How they think that they can just keep their back turned while the competition reaps the rewards is baffling to me.
You're right here. However, some time ago the same things were said about the GCN and Vega architectures, which, even while being low-end for gaming, were compute monsters. Contrary to their NVIDIA counterparts, which, with the exception of the very high end, were completely anemic for those tasks. Now the tables have turned, but the narrative has stayed the same.
Vya DomusYou don't understand how this works, neither Nvidia nor AMD cares about regular consumers and professional workflows, it's not relevant for that segment. Nvidia cards perform better because of Quadros, since they share the architecture, but Nvidia wants people to buy Quadro, not GeForce; that's why they more than once went out of their way to cripple productivity performance on those cards and limit the VRAM. Those market segments are distinct, regular consumers do not care about professional workflows, you are just wrong.

Not to mention Nvidia is using CUDA to gatekeep this market; AMD could make a GPU a billion times faster and it still wouldn't matter. They'd be absolute idiots to try and focus on this one thing that simply does not even matter, it's not going to get them more market share.
I'm not trying to attack you, just some points to note.

Everyone knows that NVIDIA gatekeeps the market with CUDA and does really dirty things to their fans, consumers, heck, even to their precious clients and partners. They shit on absolutely everyone. That's a fact. NVIDIA is an anti-consumer, pro-investor corporation worth a trillion bucks, and it grows like mushrooms after the rain.
But at the same time, what prevents AMD from taking control of the situation and providing a "morally" correct, open-source alternative to CUDA, along with countless other comfortable tool sets? To stop this vicious circle, to disrupt the monopoly? But AMD doesn't fight it; instead it joins the same game, and might even be in collusion with NVIDIA. Anybody can point out the disgusting tactics NVIDIA wages and how locked down their proprietary ecosystem is. But credit is due for their many endeavours and the many SDKs they open up to developers, direct "incentives" aside. There's no need to bribe game developers, as most already build games around consoles, which carry Zen 2 and RDNA 2. What is needed is to help and support developers and make the process as easy as possible, so the devs won't even care about NVIDIA's fat suitcases.

Again, why can't AMD invest in its own viable, effective, comfortable, high-quality ecosystem, proprietary or not? What prevents AMD from doing so, other than greed? At this point AMD looks like the laziest company: they rest on the laurels of EPYC/Ryzen, milk them as much as possible, and only occasionally respond to rivals. And they fly the open-source banner over their stuff just to offload development onto the shoulders of clients and the community.

Every release since Zen 2 has spent its first couple of months milking trusting consumers at almost double MSRP, and only when Intel or NVIDIA try to undercut them do they bring prices down to a more sane level (still not sane). And that's despite AMD's chiplet approach being miles cheaper than Intel's and NVIDIA's big monolithic dies; their products still cost the same or more regardless. Even accounting for NVIDIA's premium tax, this looks like a scam. For years AMD was pouring into everyone's ears that the chiplet strategy would be both energy efficient and cost effective and would bring their product prices down by a lot.

Also, it took AMD almost two decades to roll out ROCm, and only to accompany the MI200/MI300. This shows AMD invested in it only to rival NVIDIA in the race for AI and to take a piece of that profitable pie. And it still isn't a complete alternative to CUDA.

And make no mistake, the consumer cards that NVIDIA sells in far greater numbers to their "dumb" fans are still backed with both gimmicks and genuinely strong features. The gimmicks are impotent real-time ray tracing that doesn't work without "deblurrers" and fake frames, plus PhysX, HairWorks, G-SYNC, etc. The strong point is their encoding. As was mentioned, everyone can become a streamer and YouTuber now, and even mid-range cards from NVIDIA and Intel provide vastly better encoding performance than even top AMD cards can reach. AMD simply has no alternative. CUDA is just a small bonus from a regular consumer's perspective.

Now again, why has AMD made no moves toward raising the market share and market penetration of its own products? Why doesn't AMD fight for its share the way NVIDIA does with its somewhat overrated and, to some extent, BS products? AMD is not the underdog it was years ago. They have very big profit margins and tons of cash to fund any software and hardware development; no one can say AMD is a poor company. Yet at the same time they behave as if they were the market leader or a monopoly on every front and don't have to do anything anymore.

Look, NVIDIA once had less than 50% market share while having inferior products. They invested in marketing and R&D, even while using anti-consumer tactics. Why does AMD just sit and wait for the market to fall into its hands without even trying?

Yes, APUs are great. This is the way the absolute majority of desktops should be, and that power efficient. The exceptions are demanding CAD/rendering and scientific use cases. However, as you've said yourself, those are not tasks for ordinary users, who don't need an ultra-high-end dGPU to run games. The majority of gamers already use mid-range or even lower-end GPUs. What is needed is a more powerful iGPU capable of High settings at 1440p, and at this pace that's not far from reality.
But that merit isn't down to AMD's sheer generosity; it's the result of AMD sitting on a stock of unsold, poorly binned mobile chips that aren't fit for mobile use.

The same goes for Ryzen. Despite how amazing it is, it's literally the bottom of the binning: the absolute leftovers that weren't suitable first for EPYC and then for Threadripper. And even then AMD went further and cut many features for desktop users that came with the Zen chips for free, since they were already there (people mocked Intel for the same stinky tricks for years).
Even worse, they pulled a "worst of Intel" and started to artificially limit and fragment chipset and motherboard capabilities. They let partners go rogue with their BIOS settings and damage Ryzen's image. They had RAM problems at the AM5 launch, and recently the STAPM failure. QC is nonexistent. Their motherboards cost more than Intel ones while missing many absolutely necessary, basic features, again because partners are kept on a loose leash. All of this within one socket/platform launch. This is a disaster. Intel was burned for this crap for decades; now the tables have turned, but it seems AMD isn't drawing any conclusions.

Why this matters: such incompetent behavior is dangerous not only for AMD itself but for the entire market. Lose one participant to its own reckless moves and the market collapses; the next RTX xx50 would cost a grand, if it exists at all. Every consumer and buyer needs competition, and that's impossible when one of the participants has already given up.
AssimilatorAnd all it took for price competition to restart was for NVIDIA to release products that compete with its own products, while AMD sits in the corner and sucks its thumb.
AssimilatorUh yeah, do you know why companies optimise their software for NVIDIA and not AMD? Because NVIDIA pays them to and AMD does not. Because NVIDIA understands that the ROI on that tiny expense is going to be many times greater. This is business 101, yet AMD perpetually fails to understand this.
Indeed. This is almost like the Bulldozer vs. Sandy Bridge drama all over again, when Intel competed with itself for almost eight years. AMD needs to roll out the "Zen" of GPUs, or they will lose the consumer gaming market completely. Intel has already reached the market share that took AMD a decade to gain, with only a couple of years on the market and even a failed Xe launch. What is AMD going to do when Battlemage arrives? I bet Intel isn't sitting idle on its arse.
Posted on Reply
#85
Minus Infinity
Who cares about run-out pricing? It's being discontinued and is a crap card even at $699. The Ti SUPER is only worth $599 at most.
Posted on Reply
#86
Random_User
The problem is that even the very limited data from the Steam hardware survey shows the 4060 still has about the same share as the HD 5450. What that means is that the GPU pricing and positioning is utter sh*t. Were it priced reasonably, its share would surpass the 1060 6 GB in no time, even though it isn't really a worthwhile upgrade over the 3060 Ti.
john_What they did with PhysX, where they offered software support to non-Nvidia systems that was hilariously slow, was meant to force people to buy Nvidia GPUs. Hardware PhysX died in the end, but CUDA is a different beast.
Nvidia knows how to create the illusion of being open while driving consumers to its proprietary options.
AFAIK, PhysX still relied on CUDA, and it still used the CPU heavily, much like encoding/decoding does due to VRAM compression operations. It was still CPU tech, but artificially locked behind a proprietary GPU.

Another question: was it possible to trick games into using Radeons, since there's no way they couldn't handle such a basic task?
kapone32I have a question on the Steam Hardware chart. In the GPU section it shows that the 3060 laptop has increased in January. The only issue with that is that 3060 laptop GPU has not been available since 2022. How could that be?
Those might be the poor laptops that sweatshops and cafés were running for Ethereum and other crypto garbage. What nobody tells you is: where has all the storage used for Chia mining gone? :rolleyes:
Vayra86Are you trying your hardest to avoid the point I'm making or are you reading past it?

There is a 4060ti with 16GB.
There are midrange Ampere cards with 16GB.
These cards have no business in gaming whatsoever. There's nil advantage to them over their half VRAM counterparts (perhaps situationally, but that won't last), especially the 4060ti.

That's them catering to that exact demand right there but on a much more 'democratic' price level. Still overpriced for what it really is. But. An Nvidia card with 16GB on the newest architecture. It can do RT. It has AI. It has creator tools. It has everything your little gamur heart wants. Yadayada. You get the gist?

I'm not telling you this makes sense in any kind of realistic economy or for real pro markets. But it makes sense in the hearts and minds of prospective buyers. Young people with little knowledge of what they might do or can do with that GPU perhaps. Somewhat more knowledgeable people that know how to put VRAM to use. Etc. There's a market here. Niche? I'm not so sure. I think a lot of people are sensitive to this.

I can't even claim I was totally insensitive to this myself; take a feature like Ansel, for example. It's not like I would have picked Pascal over any other GPU at the time for it. But still. It's yet another neat little tool you can use, and I've pulled some pretty nifty screens for my desktop from it. All these little things really do matter. If you buy an Nvidia GPU, you get a package of added value that AMD simply cannot match. I'm now past the point of caring too much about all of that, but it's a package nonetheless.
Even if the 4060 Ti had a wider bus, the GPU is still incapable of using all that VRAM fast enough. Maybe 10-12 GB would be better, but it's still doubtful.
Posted on Reply
#87
Chrispy_
Random_UserWhat nobody tells you is: where has all the storage used for Chia mining gone? :rolleyes:
Landfill!

Chia mining didn't use lots of storage, it used up lots of storage. The expected lifespan of a TLC SSD was about 90 days of Chia mining per TB of capacity. I assume QLC drives didn't even last long enough to be worth bothering with. For a mechanical drive of any capacity to survive more than 6 months was also an outlier, apparently, with death usually at the 3-5 month mark.

Even as someone who mined and holds crypto, I couldn't see the point of Chia, and I'm not really sure I see the point of Bitcoin mining. Digital, nation-independent DeFi is the future, and Bitcoin started that, but we don't need to mine it wastefully. A successful independent DeFi doesn't have to generate 90 Mt of CO2 a year for no justifiable reason.
Random_UserEven if the 4060 Ti had a wider bus, the GPU is still incapable of using all that VRAM fast enough. Maybe 10-12 GB would be better, but it's still doubtful.
The narrow bus is exactly what holds the 4060 Ti back. My own personal 4060 Ti is undervolted and underclocked to a 125 W power draw, but even hamstrung like that and rendering at just 1080p I'll run into situations where the overlay says it's not fully loaded and neither is any single CPU core. That's either memory bandwidth or game-engine bottlenecking, and I know it's not the game engine because the same scene runs at 100% GPU usage on the 4070 or 7800 XT.

It's also ROP-limited, so resolution scaling on the 4060 Ti is pathetic compared to the 3060 Ti, but since the bandwidth bottleneck is so great, we don't really get to see the ROP limitation. For the 4060 Ti to be a better 1440p card, it would have mostly needed more bandwidth, but that would also have just revealed the ROP deficiency, which is more situational but still an issue holding it back.

Sadly, if you head to Wikipedia and look at the one-spec-sheet-to-rule-them-all, you can see how the 4060 Ti is really a successor to the 3060 8GB in terms of bandwidth, SM+GPC counts, and the relative position of that silicon in Nvidia's range of GPU dies. It's a long way off the 3060 Ti, and the only reason it gets close is because TSMC 4N lets Nvidia clock it 55% higher than the 3060 Ti on Samsung's underwhelming 8 nm node.
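A quick back-of-the-envelope check of that bandwidth gap, using the commonly published bus widths and memory speeds (treat the exact figures as approximate):

# GDDR6 bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 -> GB/s
# Published specs, roughly: 3060 Ti = 256-bit @ 14 Gbps, 4060 Ti = 128-bit @ 18 Gbps
cards = {
    "RTX 3060 Ti": (256, 14),
    "RTX 4060 Ti": (128, 18),
}
for name, (bus_bits, gbps_per_pin) in cards.items():
    print(f"{name}: {bus_bits * gbps_per_pin / 8:.0f} GB/s")
# Prints roughly 448 GB/s vs 288 GB/s; Ada's much larger L2 cache
# claws some of that deficit back, but not all of it.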
freeagentThat’s all I play at, usually at maxed settings, RT on. 4K/60 @ 55”
And you've enjoyed 15 months of 2022 and 2023 titles at 4K, I presume?

TLoU-P1, CP2077, Hogwarts, MS Flight Sim all exceed 12 GB at 4K on max settings. You'll notice it uncapped because it manifests initially as microstuttering and frame-pacing issues, but realistically at those settings (especially Overdrive in CP2077) you're unlikely to be getting much more than 60 fps in heavy scenes anyway, so hiding the stuttering/pacing issues behind a 60 Hz cap means it's less of an issue in those older 2022/2023 titles. Realistically, the issue with the 4070 Ti isn't its performance over the past 15 months, it's how it's going to perform in the next 15 months now that so many more games in the development pipeline are moving to UE5 and ditching any semblance of PS4 and XB1 compatibility now that those consoles have been dropped for good.

The 4070 Ti isn't a bad card. It's objectively better than the 4070 and 4070S, both of which are considered "decent" cards. Everyone talks shit about the 4070Ti because of the asking price, and the sheer hubris/cheek/greed of Nvidia in trying to launch it at $900 as the 4080 12GB.

If you fall into the trap of comparing it only to other 40-series cards on pricing, you'll end up drinking the Nvidia kool-aid and justifying the cost relative to the 4080, which was just ridiculously poor value. The reality at launch was that the $800 4070 Ti was bringing 3080 12GB / 3080 Ti levels of performance and VRAM to the table at the exact same performance/$ point as new retail cards at that time. It wasn't dramatically faster than the 6950 XT (another chart in the same HUB Jan '23 update), which was selling for $649, not that it had the same feature set.



Hardware Unboxed has been doing these monthly GPU pricing updates for a few years now, and around the 4070 Ti's launch it's clear to see why there was so much hate for the card and it's all because of the price. The only reason you can say it's a decent card is because you bought it at a deep discount which means you weren't actually price-scalped by Nvidia.
Posted on Reply
#88
john_
Random_UserAnother question: was it possible to trick games into using Radeons, since there's no way they couldn't handle such a basic task?
There wasn't. What people were doing was running patches that defeated Nvidia's lock. There was also an Nvidia driver (I think 256.something) where they had "forgotten" to implement the lock; that driver worked fine with Radeons as primary cards without needing any patches. Nvidia's excuse was that they could not guarantee CUDA and PhysX stability and performance if the main GPU was not an Nvidia one, which was a HUGE pile of BS. So people had to use software PhysX that was totally unoptimised. Somewhere around 2014, I think, when hardware PhysX was already dead, they removed that lock. In my second system I have a GT 710 as a second card for PhysX and CUDA (more accurately, "for wasting power and occupying one PCIe slot") that runs CUDA and PhysX fine without needing any kind of patch to activate them.

Nvidia's lock was so anti-consumer that even someone who had paid full price for an Nvidia card to use it for CUDA or PhysX, while also using a higher-end/newer AMD card as the main GPU, couldn't, because Nvidia forced the Nvidia card to be primary. So if I had, for example, a GeForce 9800 GT and later bought an HD 4870 to use as my primary 3D card, Nvidia punished me for not being loyal by disabling CUDA and PhysX. That was a very sh__y business practice from a company that back then had 60% of the market, less than a billion in income per quarter, and less support from the public and press. Imagine how they move in the background today, with 80%+ of the market, billions in income every quarter, and total acceptance from the public and support from the tech press. And people expect AMD to offer super-competitive options and not lose money in the process, all for no results.
Random_UserAFAIK, PhysX still relied on CUDA, and it still used the CPU heavily, much like encoding/decoding does due to VRAM compression operations. It was still CPU tech, but artificially locked behind a proprietary GPU.
When it ran on the GPU, the CPU wasn't doing much. When it ran on the CPU, everything bad happened, like the CPU maxing out and the FPS becoming a slideshow.
Posted on Reply
#89
AusWolf
Still more than what I'd consider reasonable for a 12 GB card, but it's good to see that the price wars have started. :)
Posted on Reply
#90
3valatzy
AusWolfStill more than what I'd consider reasonable for a 12 GB card, but it's good to see that the price wars have started. :)
There are at least three games which need more than 12 GB in order to avoid VRAM-related performance drops and stutter.

How does Nvidia even decide how much VRAM to put on their cards in order to meet the games' hardware requirements? :rolleyes:

Game                        | Average VRAM use (GB) | Peak VRAM use (GB)
The Last of Us Part 1       | 11.8                  | 12.4
Cyberpunk 2077 (Overdrive)  | 12.0                  | 13.6
Hogwarts Legacy             | 12.1                  | 13.9



www.techpowerup.com/review/nvidia-geforce-rtx-4080-super-founders-edition/29.html

www.techspot.com/article/2670-vram-use-games/
Posted on Reply
#91
AusWolf
3valatzyThere are at least three games which need more than 12 GB in order to avoid VRAM-related performance drops and stutter.

How does Nvidia even decide how much VRAM to put on their cards in order to meet the games' hardware requirements? :rolleyes:

Game                        | Average VRAM use (GB) | Peak VRAM use (GB)
The Last of Us Part 1       | 11.8                  | 12.4
Cyberpunk 2077 (Overdrive)  | 12.0                  | 13.6
Hogwarts Legacy             | 12.1                  | 13.9



www.techpowerup.com/review/nvidia-geforce-rtx-4080-super-founders-edition/29.html

www.techspot.com/article/2670-vram-use-games/
1. VRAM usage and VRAM allocation aren't the same thing. A lot of games allocate more than 12 GB VRAM if available, but run fine on an 8 GB card. Some of them have texture and asset loading issues if there isn't enough VRAM, but the FPS looks fine. There is no blanket statement here, unfortunately.

2. Not everyone plays in 4K. Up to 1440p, 12 GB is still fine, in my opinion. How fine it will be in the near future when the PS5 Pro is out and new games get developed for it, we'll see.
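For what it's worth, the monitoring tools themselves mostly report allocation too: anything built on NVIDIA's NVML (for example the pynvml Python bindings, assuming that's what sits under your overlay) only sees memory that has been allocated on the card, not what the game is actively touching. A minimal sketch of such a monitor:

# Minimal VRAM monitor sketch using the NVML bindings (pip install pynvml).
# NVML reports memory *allocated* on the device, not memory actively in use,
# which is why overlay numbers tend to overstate what a game really needs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"allocated: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()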
Posted on Reply
#92
Assimilator
3valatzyThere are at least three games which need more than 12 GB in order to avoid VRAM-related performance drops and stutter.

How does Nvidia even decide how much VRAM to put on their cards in order to meet the games' hardware requirements? :rolleyes:

Game                        | Average VRAM use (GB) | Peak VRAM use (GB)
The Last of Us Part 1       | 11.8                  | 12.4
Cyberpunk 2077 (Overdrive)  | 12.0                  | 13.6
Hogwarts Legacy             | 12.1                  | 13.9



www.techpowerup.com/review/nvidia-geforce-rtx-4080-super-founders-edition/29.html

www.techspot.com/article/2670-vram-use-games/
  1. Last of Us and Hogwarts are shitty console ports that make no effort to manage resources. That's not NVIDIA's fault, yet you're blaming NVIDIA. No logic.
  2. NVIDIA has never positioned the 4070 series or lower as 4K cards. Therefore, if you run 4K on anything lower than a 4080 (which has perfectly sufficient memory for that resolution because it's designed for it) and complain about the experience, that's on you. It's really simple: if you want to run games at 4K, buy the GPU designed for 4K. Can't believe I have to explain this, but here we are.
AusWolfVRAM usage and VRAM allocation aren't the same thing. A lot of games allocate more than 12 GB VRAM if available, but run fine on an 8 GB card. Some of them have texture and asset loading issues if there isn't enough VRAM, but the FPS looks fine. There is no blanket statement here, unfortunately.
Don't waste your time explaining that; people whose only agenda is to parrot "NVIDIA doesn't have enough VRAM REEEEEEEEE" aren't going to care about facts.
Posted on Reply
#93
kapone32
john_There wasn't. What people were doing was running patches that defeated Nvidia's lock. There was also an Nvidia driver (I think 256.something) where they had "forgotten" to implement the lock; that driver worked fine with Radeons as primary cards without needing any patches. Nvidia's excuse was that they could not guarantee CUDA and PhysX stability and performance if the main GPU was not an Nvidia one, which was a HUGE pile of BS. So people had to use software PhysX that was totally unoptimised. Somewhere around 2014, I think, when hardware PhysX was already dead, they removed that lock. In my second system I have a GT 710 as a second card for PhysX and CUDA (more accurately, "for wasting power and occupying one PCIe slot") that runs CUDA and PhysX fine without needing any kind of patch to activate them.

Nvidia's lock was so anti-consumer that even someone who had paid full price for an Nvidia card to use it for CUDA or PhysX, while also using a higher-end/newer AMD card as the main GPU, couldn't, because Nvidia forced the Nvidia card to be primary. So if I had, for example, a GeForce 9800 GT and later bought an HD 4870 to use as my primary 3D card, Nvidia punished me for not being loyal by disabling CUDA and PhysX. That was a very sh__y business practice from a company that back then had 60% of the market, less than a billion in income per quarter, and less support from the public and press. Imagine how they move in the background today, with 80%+ of the market, billions in income every quarter, and total acceptance from the public and support from the tech press. And people expect AMD to offer super-competitive options and not lose money in the process, all for no results.


When it ran on the GPU, the CPU wasn't doing much. When it ran on the CPU, everything bad happened, like the CPU maxing out and the FPS becoming a slideshow.
I was one of the "fools" who bought an Nvidia card for PhysX, only to have the card not even be detected by Windows, of all things. It was dastardly, because the card would work fine without an AMD card in the system.
Posted on Reply
#94
3valatzy
AusWolfNot everyone plays in 4K. Up to 1440p, 12 GB is still fine, in my opinion.
You don't buy an $800-850 card only to be told "no 4K for you, you'll be stuck at 1K and 2K only"! :roll:
Posted on Reply
#95
AusWolf
3valatzyYou don't buy an $800-850 card only to be told "no 4K for you, you'll be stuck at 1K and 2K only"! :roll:
As someone who played at 1080p with a 2070, then a 6750 XT, then a 7800 XT for a while before I upgraded to UW 1440p, I strongly disagree.

If you only want to play current games and swap out your GPU with every new generation, then go ahead, but I do think that having some reserve potential in your system for future games isn't a bad idea.
Posted on Reply
#96
3valatzy
AusWolfStill more than what I'd consider reasonable for a 12 GB card, but it's good to see that the price wars have started. :)
Especially when you said that earlier ^^^^
So how much would you be willing to spend on that 12 GB card, and what would you use it for if you did?
AusWolfAs someone who played at 1080p with a 2070, then a 6750 XT, then a 7800 XT for a while before I upgraded to UW 1440p, I strongly disagree.
But that is a 4070 Ti, a tier higher. And the 2070 was a 2018 thing; it's 2024 now, and you'll be using this one for at least a few more years.
Posted on Reply
#97
AusWolf
3valatzyEspecially when you said that earlier ^^^^
So how much would you be willing to spend on that 12 GB card, and what would you use it for if you did?
400-450 GBP max.
3valatzyBut that is a 4070 Ti, a tier higher. And the 2070 was a 2018 thing; it's 2024 now, and you'll be using this one for at least a few more years.
What's your point? All I'm saying is: if someone (me, for instance) was happy with mid-range GPUs at 1080p for a long time, then I don't see why buying a high-end one for anything less than 4K would be a bad idea.
Posted on Reply
#98
Dawora
Nosferatu666It's pretty simple: Nvidia is ripping you off enormously.

AMD tried to as well, but they got their shit together.

500 for the 7800 XT
700 for the 7900 XT

is almost a normal price. It should have been the release price, but we can't get everything.

Whoever thinks any 70 or 70 Ti card is worth €700-900 has lost their damn mind, especially with that VRAM greed and the ridiculous 60-class bus width,

while a 7900 XT gives you 4070 Ti SUPER to 4080 performance depending on the game.
Bit bus... hmm.

Who cares about the memory bus if the GPU is fast?
My last car was a 3.7L V6 Mustang.
Now I've got a Mercedes-AMG CLA 2.0 and that thing is fast, with only a 2.0L 4-cylinder engine.

So I don't care how they build a GPU if it's fast and runs all my games.
Nosferatu666True that, but it's getting better by the month. We are still feeling the aftermath of mining and scalping; this will take years. But consumers are also stupid as shit: accepting a gaming card for €1,600 is insanity, and calling it a good deal while it objectively has the worst value.

The funny thing is the 4090 is so cut down it would barely make an actual 80 Ti card.

But hey, people bought the freaking 3090 for double the price of a 3080 while it was only 10% faster. Stupidity has no bounds, especially among gamers, as you can see from the gaming industry.

Nvidia lied when they said the 90 card would replace the Titans. The Titan RTX burns the 3090 in some productivity tasks because it was a real Titan, not a wannabe made just so they could double the prices. And people eat it up.
But maybe those of us who bought a 3090 are smart enough to make money almost out of thin air... or lucky enough to have the money to spend on a 3090.

You can buy a Lambo or a Toyota,
but even Toyota owners can buy a 3090.
I just want to say the 3090 is not a smart buy, but many can buy one without selling a kidney.

I wish I could buy better English writing talent... but I can't, so I buy the best GPU money can buy.
Posted on Reply
#99
Unregistered
I could buy 1,000 4090s instantly; it's not about that, but you don't get it.
Posted on Edit | Reply
#100
Dr. Dro
Vya DomusAnd they cost 1.5K$ and up, out of reach of vast majority of consumers. People forget that Quadros used to be in that price range, Nvidia aren't idiots, they just quietly moved some of their consumer products to overlap Quadros.
The RTX 4090 (excluding 4090 D) has likely outsold the entire RDNA 3 stack plus RTX 4070 Ti and RTX 4080 combined. I wish I was making it up, but I was fortunate enough to see the GPU-Z numbers. It's crazy skewed towards Nvidia.
Posted on Reply