
ASUS GeForce RTX 4070 Ti TUF

RTX 4070 Ti with its "measly" 12 GB has already put others with much more VRAM to shame, like RX 6800/6900 XT with 16 GB or RTX 3090 with 24 GB. None of these will all of a sudden start to scale better again in the future.
Right, because Radeon cards don't get better with age.... You're either joking, lying or you're new. :laugh:
As I've said before, I don't care how much VRAM people "feel" a card needs; the facts matter, and they're evident in the benchmarks.
Benchmarking is only relevant for TODAY, not for tomorrow. Did anyone foresee that Far Cry 6 would need 11GB of VRAM to use the high-quality textures? I sure as hell didn't. I've been building PCs since 1988, and the biggest lesson I've learnt is that the future comes a lot faster than you expect it to. There's no way to make anything last forever, but that doesn't mean longevity is irrelevant.

Hell, my old R9 Fury would still be able to handle modern titles if it weren't hamstrung by the fact that it only has 4GB of VRAM. Sure, it's HBM and it can do some things that no other card with only 4GB can do, but not much. AMD would've done much better with it if it had 6 or 8GB of GDDR5, and it would've been cheaper too. The RX 580, a card that is actually slower than the R9 Fury, can be had with 8GB of VRAM, so more would've been a real boon for the R9 Fury.

That taught me to never disregard the size of the VRAM buffer, because you never know what's coming. Games might want to use more and more VRAM (in fact, might isn't the word because they WILL) and having not enough VRAM with a fast enough GPU might send a card to the boneyard before its time.

When the RX 5700 XT has 8GB of VRAM, you can't tell me that the RTX 3080, a card that's WAY more powerful, is good enough with 10. Hell, even the RTX 3060 could be had with 12GB.
The only measure that makes sense across countries and over time is US MSRP; other countries' local MSRPs are pretty much proportional to it once their respective VAT and import duties are added. Prices in stores will vary over time, so this would affect the conclusions of the reviews.
Yes, that's true. This is why we tend to use American pricing when we discuss. I'm not an American but I still use American pricing because it's all relative anyway. If an RTX 4080 costs more in the USA, it will cost more in Canada, the UK, Australia, etc.
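As a rough illustration of that proportionality, here's a minimal Python sketch. The helper name, the exchange rate and the 19% VAT are placeholder assumptions for the example, not actual market figures; only the $799 US MSRP is the real launch price.

[CODE]
# Rough local-price estimate from a pre-tax US MSRP (illustrative only).
def local_price(us_msrp_usd: float, fx_rate: float, vat: float) -> float:
    """Approximate local shelf price: convert currency, then add VAT."""
    return us_msrp_usd * fx_rate * (1.0 + vat)

# Example: RTX 4070 Ti at $799 US MSRP, assumed 0.93 USD->EUR rate, 19% VAT.
print(round(local_price(799, 0.93, 0.19)))  # ~884 EUR, before any retailer markup
[/CODE]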

Where can I find a 3080 for $550? Old cards are very relevant indeed, but it all comes down to pricing. The RX 6800 non-XT is a great option with fantastic price/performance if you can find it at $500 or below.
I still remember seeing the ASRock Radeon RX 6800 XT Phantom Gaming D on newegg for $515USD. Considering what I paid for mine, I wanted to kick myself in the balls! :roll:
 
I still remember seeing the ASRock Radeon RX 6800 XT Phantom Gaming D on newegg for $515USD. Considering what I paid for mine, I wanted to kick myself in the balls! :roll:
Yeah, that was when AMD AIBs were freaking out because they had tons of stock and AMD had told them what the new generation would bring.
 
Yeah, that was when AMD AIBs were freaking out because they had tons of stock and AMD had told them what the new generation would bring.
It would appear that ASRock was freaking out the most, because they had the lowest prices. Although they were also the "new kids on the block" when it comes to video cards, so they might have overreacted, much to the delight of consumers. For me though, I paid way too much for my RX 6800 XT because I wanted a reference model. When you've been around as long as I have, the first gaming Radeon reference model without a blower cooler feels like a momentous occasion! I'm sure you understand what I mean. :roll:
 
Right, because Radeon cards don't get better with age....
Well, they don't. Not relative to the competition at least.
This "fine wine" BS is a total myth that has existed at least since the Radeon 200/300 generation.
Remember the Fury cards? They had less memory, but fanboys insisted the HBM was so fast that it would make up for it.
Soon after, with the Radeon RX 480/580, it was supposed to crush its counterpart (the GTX 1060) and be so much better in the long run, but that never materialized.
Then with Vega we kept hearing the same claims again, more power to be unleashed "soon", thanks to mystical drivers and somehow being "better designed for DirectX 12" (a claim which circulated for all of the above, without any evidence to support it).

So no, I'm not new to this dance.
I've actually been following it for a couple of decades, and I've seen the BS-circle from fanboys so many times. And, like anyone with deeper knowledge of computer graphics, I see through this BS instantly.

So in all honesty, would you rather have an RX 6900 XT (16 GB) over an RTX 4070 Ti (12 GB), assuming both were free? Thinking the older GPU with more VRAM would be more "future proof" would be foolish.

Benchmarking is only relevant for TODAY, not for tomorrow.
The fundamental characteristics of a GPU will not change with new games. Rendering a mesh of x polygons will still need x performance; rendering a texture of size x will still require x amount of bandwidth. What changes with new games is how they utilize the hardware, not the hardware characteristics themselves.
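For a rough sense of what texture size means in practice, here's a minimal back-of-the-envelope sketch in Python. The helper name and the 4096x4096 texture are generic illustrations, not figures from any particular game.

[CODE]
# Uncompressed RGBA8 texture footprint, including a full mipmap chain (~4/3 overhead).
def texture_bytes(width: int, height: int, bytes_per_pixel: int = 4, mipmaps: bool = True) -> float:
    base = width * height * bytes_per_pixel
    return base * 4 / 3 if mipmaps else base

# Example: a 4096x4096 RGBA8 texture is ~85 MiB with mips.
# Block compression (BC1/BC7) cuts that by 4-8x before any GPU-side lossless compression.
print(texture_bytes(4096, 4096) / 2**20)  # ~85.3 MiB
[/CODE]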

Games might want to use more and more VRAM (in fact, might isn't the word because they WILL) and having not enough VRAM with a fast enough GPU might send a card to the boneyard before its time.
By the time a well-balanced GPU runs out of VRAM, it's no longer capable of rendering 60+ FPS at that detail level anyway (and it has possibly lost driver support too).

When the RX 5700 XT has 8GB of VRAM, you can't tell me that the RTX 3080, a card that's WAY more powerful, is good enough with 10. Hell, even the RTX 3060 could be had with 12GB.
Completely different architectures.
If you knew how GPUs managed memory, you wouldn't ask the question this way to begin with. GPUs access memory in large blocks, many of which can be losslessly compressed on the fly, especially temporary buffers, which are large but mostly "empty" (or are partially emptied thanks to tiled rendering, etc.). All modern GPUs do compression, but Nvidia has had the upper hand in this effort and has improved the technology with every generation. So you can't do an apples-to-apples comparison like this. And allocated VRAM isn't the same as needed VRAM.
As I always say when people bring up the VRAM concern: the truth is in the benchmarks; that's the only way you can discern whether a GPU has enough VRAM or not. The actual VRAM management happens way too quickly to even observe in real time. The only thing we can observe (without debug hardware) is the symptom of too little VRAM, which is a total collapse in frame rate. So whenever you see a GPU still scale fine in 4K, and even scale further with an OC, it's not out of VRAM.
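On the allocated-vs-needed point: about the only number you can observe from outside is the current allocation, e.g. via NVML. A minimal sketch using the pynvml bindings, assuming the nvidia-ml-py package is installed; even this reports allocation, not actual need, so the frame-rate collapse in benchmarks remains the real tell.

[CODE]
import pynvml  # pip install nvidia-ml-py (exposes the pynvml NVML bindings)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)       # first GPU
name = pynvml.nvmlDeviceGetName(handle)             # str or bytes depending on version
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)        # total / used / free, in bytes

print(name if isinstance(name, str) else name.decode())
print(f"VRAM allocated: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
# Note: "used" is what is allocated right now, not what the workload actually needs.
pynvml.nvmlShutdown()
[/CODE]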

The RTX 3060 has "too much" VRAM; this is a result of the memory configuration of the die, as cutting it in half would have been too little. E.g. the RTX 3070 Ti scales fine with just 8 GB.
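The arithmetic behind that memory configuration is simple enough to sketch: capacity is the number of 32-bit memory channels times the per-chip density. A minimal Python sketch; the helper name is made up for illustration, while the bus widths are the commonly known ones for these cards.

[CODE]
# VRAM capacity = (bus width / 32 bits per GDDR6 chip) * per-chip capacity.
def vram_gb(bus_width_bits: int, chip_gb: int) -> int:
    chips = bus_width_bits // 32   # each GDDR6/6X device has a 32-bit interface
    return chips * chip_gb

print(vram_gb(192, 1))   # 6  -> RTX 3060 with 1 GB chips (too little)
print(vram_gb(192, 2))   # 12 -> RTX 3060 / RTX 4070 Ti as shipped
print(vram_gb(256, 1))   # 8  -> RTX 3070 Ti
[/CODE]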
 
Well, they don't. Not relative to the competition at least. [...]
Wanna bet this is all going to fly right over his head?
 
A lot of new build threads across the net these past few days include an RTX 4070 Ti. So much for the youtubers who were bashing these cards.
 
A lot of new build threads across the net these past few days include an RTX 4070 Ti. So much for the youtubers who were bashing these cards.
Goes to show people will still buy them despite the bad price.
 
So the reviews are supposed to include VAT? Not all countries have a VAT. I live in the US and I pay no sales tax.
The only measure that makes sense across countries and over time is US MSRP; other countries' local MSRPs are pretty much proportional to it once their respective VAT and import duties are added. Prices in stores will vary over time, so this would affect the conclusions of the reviews.
Here's some visual aid:
 
Goes to show people will still buy them despite the bad price.
They're already creeping up to the $900 price point. I expect the average price will be $950 by next month. They'll still sell though, so it doesn't matter, and Nvidia knows this. Regardless, it's as fast (except in 4k) as a 3090ti (an atrocious card that was marginally faster than the 3080 at 1080/1440. The 3080 came out two years ago), so it's a good deal. At least that's what I've been reading. Didn't the 3090ti come last year, too? I feel sorry for anyone who paid that initial $2000.
 
I guess by that logic I should use RMB, since that's the currency used by the most people, or INR?

Strictly speaking, if you're applying logic, where is the largest share of registered IPs on TPU from? Is it the US ($), Europe (euro), or elsewhere?

Sticking with the dollar value is sensible as that is used globally for trade reasons.
 
They're already creeping up to the $900 price point. I expect the average price will be $950 by next month. They'll still sell though, so it doesn't matter, and Nvidia knows this. Regardless, it's as fast (except in 4k) as a 3090ti (an atrocious card that was marginally faster than the 3080 at 1080/1440. The 3080 came out two years ago), so it's a good deal. At least that's what I've been reading. Didn't the 3090ti come last year, too? I feel sorry for anyone who paid that initial $2000.

 
I guess by that logic I should use RMB, since that's the currency used by the most people, or INR?
The logic goes like this: each country has its own pricing.

Throwing US MSRP around is misleading; a better way would be to mention a price range (currency doesn't matter much, maybe both $ and €) within which the card is interesting, say 500~700 €/$ (final price). Another way is to just compare it to other products, and then potential buyers can choose depending on their local pricing.

The irony is, you already did all the hard work (and you're among the best, if not the best, out there) and gave us all the data points we need to get a really good picture of the 4070 Ti, but the conclusion was disconnected from reality.
 
I'm glad your links link to newegg, cause that's what I was looking at:

Like I said, prices are slowly creeping up. Both the 4080 and 4090 have jumped way past their MSRPs. Am I supposed to think the 4070 Ti is going to be different?
 
@W1zzard - you love charts. Here's a simple one you could use to register the price. A linear point graph with model (or closest equivalent) and year. That way, you can plot the price variations of the model equivalent. Steve at GN has this on his 4080 review:

GN price chart.png


It doesn't even need to be inflation-adjusted, as year-on-year figures reduce the impact.
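A chart like that is trivial to generate; here's a minimal matplotlib sketch. The models and launch MSRPs below are included purely as illustrative data points for the shape of the plot, not verified figures.

[CODE]
import matplotlib.pyplot as plt

# xx80-class launch MSRPs by year (treat these as illustrative data points).
models = ["GTX 1080\n(2016)", "RTX 2080\n(2018)", "RTX 3080\n(2020)", "RTX 4080\n(2022)"]
msrp_usd = [599, 699, 699, 1199]

plt.plot(models, msrp_usd, marker="o")
plt.ylabel("Launch MSRP (USD)")
plt.title("80-class launch pricing by generation")
plt.tight_layout()
plt.show()
[/CODE]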
 
@W1zzard - you love charts. Here's a simple one you could use to register the price. A linear point graph with model (or closest equivalent) and year. That way, you can plot the price variations of the model equivalent. Steve at GN has this on his 4080 review:

View attachment 278143

It doesn't even need to be inflation-adjusted, as year-on-year figures reduce the impact.
MSRP is meaningless, but good chart to make people angry about MSRP price changes

Strictly speaking, if you're applying logic, where is the largest share of registered IPs on TPU from? Is it the US ($), Europe (euro), or elsewhere?

Sticking with the dollar value is sensible as that is used globally for trade reasons.
It is US by a small margin over Europe, 10% from China, then rest of the world

Throwing US MSRP around is misleading
I avoid using MSRPs where possible. I look up every price point for every card before the reviews (Newegg US, Amazon US, eBay US / buy now / reputable seller). But for cards that go on sale the day after the reviews, there is no real pricing data (how convenient).
 
How about a calculator where visitors can enter actual prices and it outputs FPS per currency unit?
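That's essentially a one-liner; a minimal Python sketch of the idea, where the helper name is made up and the FPS and price values are just whatever the visitor would enter.

[CODE]
# FPS-per-currency-unit calculator: enter the price you actually see in your store.
def fps_per_unit(avg_fps: float, local_price: float) -> float:
    return avg_fps / local_price

# Example: a card averaging 120 FPS at 1440p, listed locally at 899 (any currency).
print(f"{fps_per_unit(120, 899):.3f} FPS per currency unit")  # ~0.133
[/CODE]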
 
@W1zzard - you love charts. Here's a simple one you could use to register the price. A linear point graph with model (or closest equivalent) and year. That way, you can plot the price variations of the model equivalent. Steve at GN has this on his 4080 review:

View attachment 278143

It doesn't even need to be inflation-adjusted, as year-on-year figures reduce the impact.
Great chart. Best of any reviewer. That alone should piss off every buyer.

If all reviews showed that and emphasized how bad pricing was and reflected that in their conclusion, Nvidia might get a clue. I mean Nvidia did kind of listen to the bad press by unlaunching the 4080 12gb, so it does prove bad PR matters.
 
Fixed for accuracy.

You people need to start reading the fucking room.
Huh? These companies deceive you with fake MSRPs and sell their products at higher prices, and they want the press to still report MSRPs. And I use actual market prices instead of MSRPs, and you say I'm doing the wrong thing?
 
Huh? These companies deceive you with fake MSRPs and sell their products at higher prices, and they want the press to still report MSRPs. And I use actual market prices instead of MSRPs, and you say I'm doing the wrong thing?
The ratio of people disliking the price of this new gen and complaining about it vs ones like you that just say EDITORS CHOICE is wayyyyy wayyyyy in favor of the former. Yeah, ignorant/rich people will still buy, which is just sad, but that's the way it is. If enough people buy at these prices, AMD and Nvidia will keep upping them. That said, none of that is an excuse or justification for those price hikes or for your award; if you put an award there, it MUST MEAN SOMETHING.

What if you for example say:

EDITORS CHOICE IN:
-Noise and thermals
-Power consumption
-Overclocking potential
-Features
-Performance over last generation

GARBAGE CHOICE IN:
-Price

I assure you no one would complain about your award.
 
The ratio of people disliking the price of this new gen and complaining about it vs ones like you that just say EDITORS CHOICE is wayyyyy wayyyyy in favor of the former.
I don't "just say Editor's Choice". I spend hours writing a long conclusion that covers all aspects of the product, and I think everyone agrees that the 4070 Ti is a really good product? All the drama is about the price, which is very high indeed. But the fact is that there is no alternative right now. If I give you 800 bucks today, what's the best card that you can buy?
 
I don't "just say Editor's Choice". I spend hours writing a long conclusion that covers all aspects of the product, and I think everyone agrees that the 4070 Ti is a really good product? All the drama is about the price, which is very high indeed. But the fact is that there is no alternative right now. If I give you 800 bucks today, what's the best card that you can buy?
You meant a new card, before someone jumps in and says to buy whatever GPU used.
 