
NVIDIA GeForce RTX 5070 Ti Leak Tips More VRAM, Cores, and Power Draw

Lots of guys hate Nvidia, and that's ok :)

And many of the comments in this thread are from guys running AMD GPUs lol...

You guys just need to stop with this red vs green nonsense.

I think some people are just way too sensitive.

You guys that are jumping on me are like children when it comes to hardware.
You guys, lots of guys, sensitive guys, guys that are targeting you. Maybe we are jealous of you? Just a thought.
The worst thing is that you are a staff member and I can't put you on my ignore list (yes, I checked that). But you did start following me just 35 minutes ago. I guess you are preparing your BAN hammer for any future posts by me, because, well, you are a staff member, so I guess you can do that.

How well do AMD GPUs run F@H?

Like shit, that is one of the reasons I am not running one.
Yep, it's a wonderful game. I do agree that I am tempted to buy an Nvidia card to play it. But don't tell anyone.
 
Maybe we are jealous of you? Just a thought.

Very doubtful, we are all big boys here.

The worst thing is that you are a staff member and I can't put you on my ignore list (yes, I checked that). But you did start following me just 35 minutes ago. I guess you are preparing your BAN hammer for any future posts by me, because, well, you are a staff member, so I guess you can do that.

I recently started following people who have appeared on my radar, just using tools that are provided to me.
 
I see that this thread is going about as well as such things usually do.

Anyway, rumors are all well and good, but what will matter is performance and price. I am not hugely optimistic, NV essentially has a captive market and can price at whatever the hell they think said market will bear, but we’ll see. Not too enthused about a potential TDP jump. I do realize that this is inevitable nowadays as a means to scrape every little bit of performance, but it’s not to my preference. Probably will end up being that you can limit the power significantly without losing much, but still.
 
It means that 5070Ti will smoke 4070Ti.
It will be faster, but the gain won't match the jump from the RTX 3070 Ti to the RTX 4070 Ti. You will pay more for less gain! This is currently Nvidia's signature and pride. :)
 
It will be faster, but the gain won't match the jump from the RTX 3070 Ti to the RTX 4070 Ti. You will pay more for less gain! This is currently Nvidia's signature and pride. :)
So, I have 2 kids and 3 computers; I am not upgrading just to blow money.
 
More than enough for 1440p120, which this card is aimed at.
 
RTX 5070 Ti is very special and brings 4090 performance down to $800. At least in 1440p, it should land much closer to the 4090 than to the 4080.

I hope this is sarcasm. Nvidia will never give that sort of performance at such a price when they can ream their customers, who have no regard for themselves and keep coming back for more.
 
My 4070Ti smokes my 3070Ti in every possible way. Lots of guys hate Nvidia, and that's ok :)
It means that 5070Ti will smoke 4070Ti.

And many of the comments in this thread are from guys running AMD GPUs lol...

P.S. A kink in your theory: I'm on the same page with the "guys running AMD GPUs lol", except not running AMD. Also, no idea how F@H runs these days on anything; I haven't tried it in over 20 years, and power consumption isn't a selling point for me.

Nvidia is the new Intel, with price and power envelope progress that greatly outpaces the performance progress.
 
More than enough for 1440p120, which this card is aimed at.
If that is the case, then this card isn't much of an upgrade over a 4070 Ti Super; anything over $500 should be capable of 4K, because good 4K monitors are very affordable.
 
Nvidia is the new Intel, with price and power envelope progress that greatly outpaces the performance progress.
I'm not sure what that means. Intel's prices have been pretty good in the past 2-3 years compared to their all-around performance (especially for the i5), and NVIDIA has the best power efficiency.
 
350 W of power draw is a lot for a mid-range GPU; it could also mean the actual architectural updates may not be so great.
 
P.S. A kink in your theory: I'm on the same page with the "guys running AMD GPUs lol", except not running AMD. Also, no idea how F@H runs these days on anything; I haven't tried it in over 20 years, and power consumption isn't a selling point for me.

Nvidia is the new Intel, with price and power envelope progress that greatly outpaces the performance progress.
I could be a guy running an Nvidia GPU and absolutely hate it because of the things Nvidia has been pulling since the RTX 2000 series; not everyone has to be fawning over everything the leather jacket man is selling. Nvidia is very much like Apple in how its marketing works, and Nvidia has been the new Intel since the 3000 series: pricing and power consumption went too high, and performance slowed down because Nvidia wants to force ray tracing on everyone even though the tech still isn't ready after 3 generations.
I have no "power" here.

I have no power at all; I volunteer for free, try to keep threads tidy in certain sections, etc.

Just a regular guy.

Many childish people in this thread, a little disappointed to be honest, but not at all surprising.

Keep up the good work fellas :)
It depends on what you mean by power then, I guess, since you put power in quotation marks.
To a normal pleb like me, you have plenty of "tools" I don't. I might be misreading your post, but if you want to insist those who disagree with you are childish, then cool, good for you.
I think a lot of the hostility in GPU threads comes from just how bad the GPU market has gotten; maybe people are taking it too seriously, myself included. I shouldn't be taking a GPU topic here seriously, not after I saw a review listing the lack of DLSS as a con. I didn't expect to see that again, but there it was in the B580 review.
 
(...) 10GB on a 3080 (the most interesting card in the line-up; everything else was overpriced or halfway obsolete on release, see x60/x70 performance in stuff like Hogwarts for example) was a fucking joke, and the lower tiers were nothing better. Double-VRAM cards came too late and looked strange relative to core power, so basically one half of the stack was underspecced and the replacements were overkill on VRAM. The whole release of Ampere was a horrible mess; the failure rate wasn't exactly the lowest either. I think Ampere may go down in history as the generation with the lowest useable lifetime of Nvidia's GPU stacks, all things considered. (...)

I'm not sure about that; the 2 GiB GTX 960 might beg to differ. Also, the GTX 700 gen I think aged quite ungracefully, even if you got the higher-vRAM variants. Hopefully Intel can help make 10-12 GiB of vRAM the expected minimum for an affordable mid-range GPU with their new GPU lineup.
 
The comparisons in this article are messed up. The 4070 Ti was superseded by the 4070 Ti Super and has been discontinued, so there's no point mentioning or comparing to the 4070 Ti non-Super.
Yet it just mentions the 4070 Ti, while mixing specs from the non-Super (12 GB VRAM) and the Super (8448 CUDA cores; the non-Super had 7680).
So in fact, it does not have more VRAM, because the 4070 Ti Super already had 16 GB.
I think the reasoning in the article for using the 4070 Ti non-S is that we are again at the same moment in the launch cycle - we're not in Super release time, but in vanilla version release time. Super is a refresh, much like the one Blackwell might get a year or two later.

I'm not sure about that; the 2 GiB GTX 960 might beg to differ. Also, the GTX 700 gen I think aged quite ungracefully, even if you got the higher-vRAM variants. Hopefully Intel can help make 10-12 GiB of vRAM the expected minimum for an affordable mid-range GPU with their new GPU lineup.
The 960 did come to mind after posting indeed lol, nice one. 700 series kinda had the same issue that Ampere has in terms of timing of its release, prior to a move forward in console gaming and increasing demands on... VRAM! But the gen itself was quite good, it was what Kepler should have been right away - that stack only went up to x104, 700 added the big chip and Nvidia dragged it out for quite a while. Maxwell, though, was also just an extremely good gen, rivalling Pascal - the first iteration of delta compression came with it, alongside a much leaner core.
 
I think the reasoning in the article for using the 4070 Ti non-S is that we are again at the same moment in the launch cycle - we're not in Super release time, but in vanilla version release time. Super is a refresh, much like the one Blackwell might get a year or two later.
I don't think that reasoning is useful. It would make much more sense to compare what's currently on the market with what's going to replace it. But it seems not much thought was put into it anyway, considering that the CUDA core count mentioned for the 4070 Ti is actually that of the 4070 Ti Super.
 
Quite happy to sacrifice my newborn atm…
Haha, my sister has two kids under 5.

Those first 5 years are tough.

16 GB on a 350 W card. Am I supposed to be impressed or something? :wtf:
NV's power consumption was very good in the 4000 series after they ditched that Samsung node for TSMC.

Even with a TBP of 350 watts, it will most likely see lower power while gaming.
 
350 W of power draw is a lot for a mid-range GPU; it could also mean the actual architectural updates may not be so great.
My 4070 Ti runs at 200 W (via nvidia-smi) and loses only 5-6% performance across various games.

No doubt the 5070 Ti can run at 275 W just fine.
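For anyone who wants to try the same cap, here's a minimal sketch using stock nvidia-smi flags; the 200 W figure is just this post's example, and setting the limit needs admin/root rights:

```python
import subprocess

# Read the current board power draw and limit (documented nvidia-smi query fields).
print(subprocess.run(
    ["nvidia-smi", "--query-gpu=name,power.draw,power.limit",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip())

# Cap board power at 200 W, as in the post above (requires admin/root;
# the setting does not survive a driver reload).
subprocess.run(["nvidia-smi", "-pl", "200"], check=True)
```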
 
The 960 did come to mind after posting indeed lol, nice one. 700 series kinda had the same issue that Ampere has in terms of timing of its release, prior to a move forward in console gaming and increasing demands on... VRAM! But the gen itself was quite good, it was what Kepler should have been right away - that stack only went up to x104, 700 added the big chip and Nvidia dragged it out for quite a while. Maxwell, though, was also just an extremely good gen, rivalling Pascal - the first iteration of delta compression came with it, alongside a much leaner core.
I think I might have the definitive answer to the worst Nvidia GPUs... the FX 5000 series. Those may be the worst of them. I had an FX 5500, but for me it seemed OK, since it was an upgrade from a Voodoo 3, along with upgrading the system RAM to a "whopping" 192 MiB, although this is already ancient history relatively speaking and maybe a tad too off-topic, so I'll leave it there.
 
My 4070 Ti runs at 200 W (via nvidia-smi) and loses only 5-6% performance across various games.

No doubt the 5070 Ti can run at 275 W just fine.
With OC I can see ~305 W on the core, and board power at ~400 W, running certain workloads in F@H.

MSFS is a pretty good load: ~300 W on the core, ~350-380 W board power at times.
 
First, beating a dead horse: I predict there won't be any benefit from the node process change, at all.
  1. 40 series is N4, 50 series is N4P.
  2. TSMC says N4->N4P is +6% perf, N4P->N4X is +4% perf. (source: Wikipedia)
  3. Zen 4 is N4P, Zen 5 is N4X.
  4. There was no clock speed improvement for Zen 5. Advertised boost clocks barely changed, and measured boost clocks were either the same or worse than Zen 4. (source: TPU, 9900x clocks vs 7900x clocks and 9700x clocks vs 7700x clocks)
  5. TSMC's perf claims are at 1.2 V, and GPUs run at lower voltages than CPUs, so any hypothetical benefit will shrink further.
Next, the TDP boost won't improve max clocks by more than 10%. I don't have a solid source for this, since TPU OC tests are run at stock board power instead of max (in which case I could point you to the review of the 4070 TiS Strix, which has +28% max board power). But my general impression from undervolting tests, for example this 4080S test on Reddit, is that a 50% change in power results in a 15% change in clocks and a 10% change in performance. You can also eyeball the voltage/frequency plots in a TPU review, extrapolate to ~1.2 V (rule of thumb: power scales with the square of voltage, so this would be ~25% more power than the 40 series), and see that there's barely 200 MHz gained on the projected curve.
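To put rough numbers on that eyeballing, a back-of-the-envelope sketch; the ~1.07 V stock operating point and ~2.6 GHz boost clock are illustrative assumptions on my part, not measured values:

```python
# P ~ V^2 at a fixed clock, so extrapolating the V/F curve to 1.2 V:
stock_v, max_v = 1.07, 1.20    # assumed stock and extrapolated voltages
power_ratio = (max_v / stock_v) ** 2
print(f"power at 1.2 V: {power_ratio:.0%} of stock")   # ~126%, i.e. ~25% more

# Undervolting tests suggest ~15% clock change per ~50% power change,
# i.e. roughly 0.3% clocks per 1% power:
clock_gain = (power_ratio - 1) * (0.15 / 0.50)
boost_mhz = 2610               # assumed 40-series-class boost clock
print(f"projected clock gain: ~{clock_gain * boost_mhz:.0f} MHz")  # ~200 MHz
```

Which lands right on the "barely 200 MHz" read off the projected curve.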

Finally, the higher memory bandwidth will help slightly. Promisingly, a 4090 with memory OC'd from 21Gbps to 26Gbps supposedly achieved 13% more perf for that 24% clock boost. But the 5070 Ti has half as many cores and probably won't see as much benefit. As I mentioned upthread, the 4070 TiS has a 33% wider bus than the 4070 Ti, 10% more cores, and 3% lower core clocks. Actual performance gain was about 10% at 4K, less at lower resolutions. I'll guesstimate 15% better perf at most from the upgrade of 21Gbps GDDR6 to 28Gbps GDDR7.
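Same exercise for the memory side; the bandwidth-to-performance scaling factor is taken from the 4090 OC datapoint above, and the rest is arithmetic:

```python
# How much of a raw bandwidth increase shows up as performance?
scaling = 0.13 / 0.24      # 4090 datapoint: +24% bandwidth -> +13% perf (~0.54)
gddr7_gain = 28 / 21 - 1   # 21 -> 28 Gbps is +33% more bandwidth
print(f"naive perf gain: ~{scaling * gddr7_gain:.0%}")  # ~18%

# The 5070 Ti has roughly half the 4090's cores, so it should be less
# bandwidth-bound; discounting for that lands near the 15% guess above.
```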

Overall I predict the 5070 Ti will perform 15-20% better than the 4070 TiS, which will put it slightly above the 4080S. It will probably be priced below the 4080S's $1000 (my bet: $849 MSRP, $975 street price) and have similar perf/W. I'm also expecting DisplayPort UHBR20 and PCIE 5.0 support, which will improve these cards' longevity.

Re: PCIE 5.0, I would love to see the cards make use of PCIE bifurcation, because I'd rather have 8 more lanes of PCIE connectivity for NVMe than the <1% performance benefit of extra graphics bandwidth, but I'm not hopeful that card or motherboard manufacturers will make this possible for 5070/5080 series cards.
 
I hope this is sarcasm. Nvidia will never give that sort of performance at such a price when they can ream their customers, who have no regard for themselves and keep coming back for more.

Except they do, every single time. Isn't the 3070 Ti as good as the 2080 Ti, and the 4070 Ti as good as the 3080 Ti? Same thing. It's entirely possible that the 5070 Ti is in close range of the 4090 at 1440p, while the 4090 remains better at 4K - the saving grace for 4090 owners who are coming back for a 5090 this time and are now calculating the best time to sell. It really is a 3-4 year investment. Asking $100 more isn't making or breaking the deal.
 
Except they do, every single time. Isn't the 3070 Ti as good as the 2080 Ti, and the 4070 Ti as good as the 3080 Ti? Same thing. It's entirely possible that the 5070 Ti is in close range of the 4090 at 1440p, while the 4090 remains better at 4K - the saving grace for 4090 owners who are coming back for a 5090 this time and are now calculating the best time to sell. It really is a 3-4 year investment. Asking $100 more isn't making or breaking the deal.
But that doesn't make any sense. That prediction would be in line with 3070Ti ~ Titan RTX, 4070Ti ~ 3090 Ti... but they're not. With the holding pattern you describe, the 5070Ti would be in place to touch tips with... the 4080 SUPER.

And I counter that the pattern holds in name only. GeForce model numbers and their expected configs saw a backslide for the 40 series; the 4080 SUPER is closer to what a launch-model 4080 should have been, with an uncomfortable gap in shaders between the 4080 and 4090 where a 4080 Ti would be expected to fit. Never mind that we never got a full-fat Ada flagship in GeForce, no 4090 Ti with a fully intact AD102... nothing.

I project the 50 series to be maybe, MAYBE, a 15-20% uplift across the board between a new node/arch and higher power limits. And mind you, that would still see the 5070 Ti neck and neck with the launch 4080, not bodyshotting a theoretical 4080 Ti. It's GB203 up against AD103. It'd be an upset if Blackwell lost.
 