
NVIDIA France Accidentally Reveals GeForce RTX 4070 Ti Specs

Seems pretty tight on VRAM for the amount of processing power it has. I wonder if that's intentional, to keep people dishing out the big bucks for 4K cards.

I'm 100% sure it is; it's planned obsolescence. AMD is dishing out large amounts of VRAM so cards can remain viable, but big N (remember, over 80% market share) is now planning for even more profits.

All up to you guys to decide what you want to support...
 
I will wait for the RTX 50xx with Windows 7 driver support.
 
Hot turd about to drop.
So, you've tested it then? :P
I don't believe it will be hot, and whether it's a "turd" remains to be seen.
This card is likely to perform comparably to or better than the RTX 4080 per GFLOP, so if it's priced at $800 it might turn out to be a better deal than most seem to think.
So, if this turns out to be right, will you eat your words? (not in a literal sense)

I'm 100% sure it is; it's planned obsolescence. AMD is dishing out large amounts of VRAM so cards can remain viable, but big N (remember, over 80% market share) is now planning for even more profits.

All up to you guys to decide what you want to support...
This nonsense still lives on, unfortunately.
If the reviews show the card scales just fine at 4K, then there is no reason to worry, neither now nor in the future.
 
So, you've tested it then? :p
I don't believe it will be hot, and whether it's a "turd" remains to be seen.
This card is likely to perform comparably to or better than the RTX 4080 per GFLOP, so if it's priced at $800 it might turn out to be a better deal than most seem to think.
So, if this turns out to be right, will you eat your words? (not in a literal sense)
I didn't mean hot in the sense that it's going to be 110 °C under load like a certain competitor, haha! But I also won't eat my words based on the criterion of it just being better than the 4080; the 4080 is, in my book, egregiously priced. I just don't buy the "wafers and components are too expensive" story to that big of a degree. All hardware got more expensive, but not by as much as GPUs. GN's Steve had a good graph, imo. I could accept a 20% hike as reasonable, hell, even 30, but not 50. So being better than the worst value we've ever seen outside of mining shortages is not enough.

All of these products are terribly priced and I just refuse to pay that much. If that's the new norm, well, they can have it; I'm out. Once my rig is obsolete I'll just buy a PlayStation again for the AAA stuff and keep the rig for work and indies + emulation. And if others are willing to pay, I'm not going to criticize their decisions; it's not my money to spend.
 

So, you've tested it then? :p
I don't believe it will be hot, and whether it's a "turd" remains to be seen.
This card is likely to perform comparably to or better than the RTX 4080 per GFLOP, so if it's priced at $800 it might turn out to be a better deal than most seem to think.
So, if this turns out to be right, will you eat your words? (not in a literal sense)

An $800 MSRP 4070 Ti with a street price of $900-$1,200 that's guaranteed to be slower than a 4080 is indeed a turd. "Deal"? If you think this is a deal, I have an extended car warranty to sell you.
 
Kind of interesting. The 3070 Ti was comparable to, if not better than, the 2080 Ti. Nvidia doesn't appear keen to compare the 4070 Ti to the 3080 Ti.

Hmmm....
But... but... it's 3.5x faster than the non-Ti... if you turn on frame generation, decrease some settings, play a different game, and buy a 4090. :roll:
 
The 3080 Ti and the 3080 12G OC are very similar (within 3%). If we exclude the gimmicks, the 4070 Ti probably even slightly exceeds that goal. But the vanilla 4070 is better suited to be a 60 Ti.

 
This nonsense still lives on, unfortunately.
If the reviews show the card scales just fine at 4K, then there is no reason to worry, neither now nor in the future.

VRAM is needed for RT, for texture resolution, and for the resolution it's played at, so yeah, all of those just become more demanding in the future, so VRAM will be a problem.

Portal RTX uses 16 GB of VRAM at 4K RIGHT NOW, sooo yeah.
 
I didn't mean hot in the sense that it's going to be 110 °C under load like a certain competitor, haha! But I also won't eat my words based on the criterion of it just being better than the 4080; the 4080 is, in my book, egregiously priced. I just don't buy the "wafers and components are too expensive" story to that big of a degree. All hardware got more expensive, but not by as much as GPUs. GN's Steve had a good graph, imo. I could accept a 20% hike as reasonable, hell, even 30, but not 50. So being better than the worst value we've ever seen outside of mining shortages is not enough.

All of these products are terribly priced and I just refuse to pay that much. If that's the new norm, well, they can have it; I'm out. Once my rig is obsolete I'll just buy a PlayStation again for the AAA stuff and keep the rig for work and indies + emulation. And if others are willing to pay, I'm not going to criticize their decisions; it's not my money to spend.
Exactly. I don't mind if Nvidia wants to charge $800 for the 4080. But $1,200 is just criminal.

The 3080 Ti and the 3080 12G OC are very similar (within 3%). If we exclude the gimmicks, the 4070 Ti probably even slightly exceeds that goal. But the vanilla 4070 is better suited to be a 60 Ti.


The RTX 3070 is $500 and beats the $1,200 2080 Ti. This new 4070 Ti is really the 4070; it probably just beats the 3080 Ti if lucky (with RT on, not with RT off). It should also be $500. I could stomach $600. I'll probably go buy one of those $325 Arc A770s just for the giggles in January.
 
This nonsense still lives on, unfortunately.
If the reviews show the card scales just fine at 4K, then there is no reason to worry, neither now nor in the future.

This isn't really true. VRAM requirements gradually drift up over time. I've actually hit the 8 GB VRAM limit on my GTX 1070 Ti, and that's with a lot less processing power. I think it's reasonable to assume that, in 5 years (my typical GPU lifecycle) and at a higher settings target, 12 GB could be a problem. I'm not saying this is the end of the world or anything, but personally, if I'm spending $800 on a GPU, I don't want to have doubts. That said, it looks like I'm going to be facing either unacceptably high idle power draw (AMD) or questionable VRAM (Nvidia) this generation, so in my opinion we've got no winners yet... They've got like 2 months to get their shit together before I go buy a used last-gen card and put the saved cash toward other hobbies lol
 
This isn't really true. VRAM requirements gradually drift up over time. I've actually hit the 8 GB VRAM limit on my GTX 1070 Ti, and that's with a lot less processing power. I think it's reasonable to assume that, in 5 years (my typical GPU lifecycle) and at a higher settings target, 12 GB could be a problem. I'm not saying this is the end of the world or anything, but personally, if I'm spending $800 on a GPU, I don't want to have doubts. That said, it looks like I'm going to be facing either unacceptably high idle power draw (AMD) or questionable VRAM (Nvidia) this generation, so in my opinion we've got no winners yet... They've got like 2 months to get their shit together before I go buy a used last-gen card and put the saved cash toward other hobbies lol

What I wonder about that power draw: if you have a CPU with integrated graphics, what if you just set your normal, ermm, desktop usage to use the integrated graphics instead? Would that not solve the power draw, since it's not the AMD GPU actually doing the video playback or multi-monitor stuff?
 
I'm 100% sure it is; it's planned obsolescence. AMD is dishing out large amounts of VRAM so cards can remain viable, but big N (remember, over 80% market share) is now planning for even more profits.

All up to you guys to decide what you want to support...
NV has done this since the GF2, their cards suddenly dying. Fucking bullshit.

Exactly. I don't mind if Nvidia wants to charge $800 for the 4080. But $1,200 is just criminal.



The RTX 3070 is $500 and beats the $1,200 2080 Ti. This new 4070 Ti is really the 4070; it probably just beats the 3080 Ti if lucky (with RT on, not with RT off). It should also be $500. I could stomach $600. I'll probably go buy one of those $325 Arc A770s just for the giggles in January.
Anything over $500 for the top end is criminal.

People can get an Xbox or PS5 for that price and be on their way playing.
 
I didn't mean hot in the sense that it's going to be 110 °C under load like a certain competitor, haha! But I also won't eat my words based on the criterion of it just being better than the 4080; the 4080 is, in my book, egregiously priced. I just don't buy the "wafers and components are too expensive" story to that big of a degree. <snip>
My objection is to prejudging a product before we know its performance and price.
This card may very well end up in a similar performance-per-dollar range as AMD's latest cards, so will you criticize them as harshly as you criticize this product then?

VRAM is needed for RT, for texture resolution, and for the resolution it's played at, so yeah, all of those just become more demanding in the future, so VRAM will be a problem.

Portal RTX uses 16 GB of VRAM at 4K RIGHT NOW, sooo yeah.
VRAM allocated isn't the same as VRAM needed. Many buffers and textures are heavily compressed on the fly. The true judge of VRAM requirement is benchmarking the performance: if the card runs out of VRAM, performance will drop sharply. If, on the other hand, performance keeps scaling, then there is no issue.
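For what it's worth, here's a minimal Python sketch (using the pynvml bindings; the GPU index is an assumption for illustration) that reads those counters. Note that what it reports is allocated VRAM, exactly the number that overstates actual need:

```python
# Minimal sketch: read the VRAM counters via pynvml (pip install nvidia-ml-py).
# Assumes an NVIDIA GPU at index 0. "used" here is *allocated* VRAM, the
# very number monitoring tools show and that overstates what a game needs.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)   # total/free/used, in bytes
print(f"used {info.used / 2**30:.1f} GiB of {info.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```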

This isn't really true. VRAM requirements gradually drift up over time.
VRAM requirements may increase, but the bandwidth available to move data in and out of a given amount of VRAM is fixed for any piece of hardware. As games become more demanding, the bandwidth required to utilize the desired amount of VRAM will inevitably become the bottleneck long before the VRAM itself. By the time a game actually utilizes that much VRAM, the performance of a fairly well-balanced title will be approaching "slideshow territory" (way below 60 FPS).
The only exception to this would be a game which manages VRAM extremely poorly, meaning a game which allocates a lot more VRAM than it should, like a modded game with a texture pack. This is really an edge case, and for most buyers it's silly to buy cards with extra VRAM for this purpose. (If you're the exception, then that's fine for you, but don't assume normal gamers need it.)
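To put rough numbers on that bandwidth argument, a back-of-the-envelope sketch; all figures below are illustrative assumptions, not any real card's specs:

```python
# Back-of-the-envelope check of the bandwidth argument above.
# All numbers are illustrative assumptions, not measured specs.
vram_gb = 12              # hypothetical card: 12 GB of VRAM
bandwidth_gbs = 504       # hypothetical memory bandwidth in GB/s
touches_per_frame = 2     # assume each resident byte is read ~2x per frame

fps_ceiling = bandwidth_gbs / (vram_gb * touches_per_frame)
print(f"~{fps_ceiling:.0f} FPS ceiling if all {vram_gb} GB is touched "
      f"{touches_per_frame}x every frame")   # -> ~21 FPS
```

Under those assumptions, actually working through the full 12 GB every frame caps you near 21 FPS, i.e. slideshow territory, before capacity itself runs out.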
 
VRAM allocated isn't the same as VRAM needed. Many buffers and textures are heavily compressed on the fly. The true judge of VRAM requirement is benchmarking the performance: if the card runs out of VRAM, performance will drop sharply. If, on the other hand, performance keeps scaling, then there is no issue.

I know it works like that for normal RAM, but I don't think it works like that for VRAM. I also remember Digital Foundry running into a VRAM bottleneck on an Nvidia card, resulting in the game's textures never loading in fully, just the early low-res initial texture; I think it was Far Cry 6.

And sure, but again, I'm saying that a current game, a game today (and perhaps more than one), can already demand close to the max VRAM of current cards. Spending 1,000 bucks on a GPU while running the risk that it will hit basic issues like VRAM shortages within a year or 2 is not a risk I think many would be willing to take. Of course, you are free to believe it's a non-issue.
 
VRAM allocated isn't the same as VRAM needed. Many buffers and textures are heavily compressed on the fly. The true judge of VRAM requirement is benchmarking the performance: if the card runs out of VRAM, performance will drop sharply. If, on the other hand, performance keeps scaling, then there is no issue.

16 GB is the amount Portal RTX uses at 4K, not just allocates: https://www.techpowerup.com/review/portal-with-rtx/3.html

Performance doesn't immediately drop when you run out of VRAM. It depends on the game, but usually you can go 30% above available VRAM and the GPU will do a decent job of swapping between VRAM and main system memory. The problem is, the instant something that needs to be fetched often is sent to main system memory while VRAM is full, performance tanks.

It's not just an annoyance; it renders the game unplayable. The 3070 gets 1 FPS at 4K, but even in less extreme scenarios where you "just" get poor frame timing or stuttering, it's easy to see why people want more VRAM. There's really no excuse other than forced obsolescence, either, because it would not have been expensive for Nvidia to add more.
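To illustrate why the spill is so punishing, a hypothetical sketch assuming ~500 GB/s on-card bandwidth versus ~32 GB/s over PCIe 4.0 x16 (both figures are assumptions for illustration):

```python
# Rough illustration of why spilling past VRAM tanks performance.
# Assumed figures: ~500 GB/s on-card vs ~32 GB/s over PCIe 4.0 x16.
vram_bw = 500.0        # GB/s, assumed on-card (GDDR6X-class) bandwidth
pcie_bw = 32.0         # GB/s, assumed PCIe 4.0 x16 ceiling
spill = 0.10           # assume 10% of per-frame traffic goes over PCIe

# Weighted-harmonic mix: time per byte averaged across both paths
effective_bw = 1.0 / ((1.0 - spill) / vram_bw + spill / pcie_bw)
print(f"effective bandwidth: ~{effective_bw:.0f} GB/s")  # -> ~203 GB/s
# A mere 10% spill already costs more than half the card's bandwidth.
```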

VRAM requirements may increase, but the bandwidth available to move data in and out of a given amount of VRAM is fixed for any piece of hardware. As games become more demanding, the bandwidth required to utilize the desired amount of VRAM will inevitably become the bottleneck long before the VRAM itself. By the time a game actually utilizes that much VRAM, the performance of a fairly well-balanced title will be approaching "slideshow territory" (way below 60 FPS).
The only exception to this would be a game which manages VRAM extremely poorly, meaning a game which allocates a lot more VRAM than it should, like a modded game with a texture pack. This is really an edge case, and for most buyers it's silly to buy cards with extra VRAM for this purpose. (If you're the exception, then that's fine for you, but don't assume normal gamers need it.)

Games must work on a variety of cards with vastly different memory subsystems, and they address this through asset streaming. The engine loads objects in based on priority and uses multiple LODs to ensure the game plays smoothly across that whole range of hardware.
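As a toy illustration of that streaming idea (every function name, cutoff, and asset size below is made up for the sketch; real engines are far more involved):

```python
# Toy sketch of priority-based asset streaming with LODs, as described
# above. All names, cutoffs, and sizes are invented for illustration.
def pick_lod(distance, cutoffs=(10.0, 40.0, 120.0)):
    """Nearer objects get lower LOD indices (more detail)."""
    for lod, cutoff in enumerate(cutoffs):
        if distance < cutoff:
            return lod
    return len(cutoffs)  # beyond the last cutoff: lowest detail

def stream_assets(objects, vram_budget_mb):
    """Load nearest-first until the VRAM budget is exhausted."""
    used_mb = 0.0
    for obj in sorted(objects, key=lambda o: o["distance"]):
        lod = pick_lod(obj["distance"])
        size = obj["lod_sizes_mb"][min(lod, len(obj["lod_sizes_mb"]) - 1)]
        if used_mb + size > vram_budget_mb:
            continue  # over budget: this object stays at a lower LOD
        obj["resident_lod"] = lod
        used_mb += size
    return used_mb

# Example scene: per-object sizes from most to least detailed LOD
scene = [
    {"distance": 5.0,   "lod_sizes_mb": [256, 64, 16, 4]},
    {"distance": 50.0,  "lod_sizes_mb": [512, 128, 32, 8]},
    {"distance": 200.0, "lod_sizes_mb": [128, 32, 8, 2]},
]
print(stream_assets(scene, vram_budget_mb=300))  # 256 + 32 + 2 = 290.0
```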

Both VRAM and bandwidth are equally important: you need more VRAM to store the assets and graphics data of increasingly complex games, and you also need more bandwidth to move those into said memory within a reasonable time.

The above example of the 3070 getting 1 FPS disproves the idea that bandwidth will always be the limiting factor before VRAM size. It may appear that bandwidth is often the limiting factor, but that's down to the fact that game developers won't design games that run like crap on newer video cards. Using too much VRAM makes a game unplayable, so you rarely see it; limited VRAM size restricts what devs are able to do with their games.
 
The performance difference between the "4070 Ti" and the "3080" should be less than 10% without DLSS 3. And it will be sold at a higher price.
 
There is nothing wrong with these 40-series cards except the price; it's the only issue. The performance is there, and so are the power consumption, overclocking, cooling, etc. If Nvidia had released them at $500, $700, and $1K or something, it would have been a completely different discussion. But of course, why would they, when the sheeple are willing to pay much more?
 
There is nothing wrong with these 40-series cards except the price; it's the only issue. The performance is there, and so are the power consumption, overclocking, cooling, etc. If Nvidia had released them at $500, $700, and $1K or something, it would have been a completely different discussion. But of course, why would they, when the sheeple are willing to pay much more?
Till the GDDR6X starts to fail like it did on the RTX 2000/3000 series...
 
The specs and performance are not new. Nvidia actually showed the specs and performance when it was going to launch the RTX 4080 16 GB and 12 GB. Nothing has really changed here other than the model number.
 
The performance difference between the "4070 Ti" and the "3080" should be less than 10% without DLSS 3. And it will be sold at a higher price.
"4070Ti" 12GB < 3080 12GB if not use dlls3 fake frames and on similarity resolution and game settings. Also when tested on same PC configuration. Yes both cards are limited by VRAM size but 3080 12GB must using it's VRAM better because much bandwidth.
 
User error. Don't try to stick an RTX 4090 in a small, cramped case, and make sure it's plugged in all the way.

Any guesses on where the RTX 4070 Ti will place on this list after W1zzard's review of that card?

https://pcpartpicker.com/search/?q=RTX+3080+Ti <--- current US prices for the RTX 3080 Ti 12GB via PCPartPicker

[Chart: average FPS at 2560x1440]
 