
NVIDIA GeForce RTX 5070 Ti Specs Leak: Same Die as RTX 5080, 300 W TDP

Seems really meh; there won't be much of a generational increase. Pretty convinced the next-gen cards are going to flop hard.

There just isn't the demand for ultra expensive cards anymore.

Compute is what's been driving Nvidia's sales, even for the gaming division. People are trying to use consumer cards for their LLMs.

Without a massive breakthrough on the front end, it's going to end very badly.
 
So you are saying there will be a 100% increase in IPC going from Ada to Blackwell? That would mean the 5090 would be almost three times the performance of the 4090.

By the way, the 4070 Ti has an over 50% clock increase over the 3090, which was made possible by going from Samsung 8LPP to TSMC 4N. Blackwell is on the same node as Ada.
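For anyone who wants to check that clock claim, here's a quick sketch using the reference boost clocks (roughly 1.70 GHz for the 3090 and 2.61 GHz for the 4070 Ti, taken as the official spec values):

```python
# Boost-clock jump from Samsung 8LPP (3090) to TSMC 4N (4070 Ti),
# using the reference boost clocks (1695 MHz and 2610 MHz assumed here).
rtx_3090_boost_ghz = 1.695
rtx_4070ti_boost_ghz = 2.610

increase = rtx_4070ti_boost_ghz / rtx_3090_boost_ghz - 1
print(f"Boost clock increase: {increase:.0%}")  # prints ~54%
```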

The 2070 Super has fewer CUDA cores and less bandwidth and is based on TSMC 12nm, which is a tweaked TSMC 16nm, yet it still performs the same as the 1080 Ti.

Well, if the 5090 were given a 1000 W+ TDP and GDDR8 then it could be 3x as fast as the 4090 LOL. Obviously perf doesn't scale with CUDA cores and bandwidth past a certain point, e.g. the 4090 is not 60% faster than the 4080.
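As a rough illustration of that scaling point, core counts alone would predict a far bigger gap than the measured one (the ~26% 4090-vs-4080 figure quoted later in this thread):

```python
# Core-count ratio vs. the measured 4090-vs-4080 gap.
# Core counts are the official specs; the ~26% figure is the TechPowerUp
# result quoted later in this thread, used here only for comparison.
cores_4090, cores_4080 = 16384, 9728

extra_cores = cores_4090 / cores_4080 - 1
print(f"4090 has {extra_cores:.0%} more CUDA cores than the 4080")  # ~68%
print("...yet the measured gap is only about 26%")                  # scaling loss
```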
 
The 2070 Super has fewer CUDA cores and less bandwidth and is based on TSMC 12nm, which is a tweaked TSMC 16nm, yet it still performs the same as the 1080 Ti.

Well, if the 5090 were given a 1000 W+ TDP and GDDR8 then it could be 3x as fast as the 4090 LOL. Obviously perf doesn't scale with CUDA cores and bandwidth past a certain point, e.g. the 4090 is not 60% faster than the 4080.
It really comes down to boost clocks and cores since the IPC across all three RTX series is similar. Just multiply the two and you get the ranking order in the charts.

It’s that simple. And here’s the prediction:

5070Ti: 8.9K cores times 2.7 GHz = 24
4090: 16.4K cores times 2.5 GHz = 41

Not even close. The 5070Ti will be about 20% faster than the 4070Ti (24/20).
 
The 4090 has 16.4K cores, more than 2x the 4070 Ti's, yet it's only ~50% faster than a 4070 Ti; that's all the 5070 Ti needs to make up. I believe it can get pretty close with a ~20% uArch gain.
 
It really comes down to boost clocks and cores since the IPC across all three RTX series is similar. Just multiply the two and you get the ranking order in the charts.

It’s that simple. And here’s the prediction:

5070Ti: 8.9K cores times 2.7 GHz = 24
4090: 16.4K cores times 2.5 GHz = 41

Not even close.

Man you are cracking me up, so the 5070Ti will be slower than 4070?
 
Man you are cracking me up, so the 5070Ti will be slower than 4070?
Ummm…no. Again the math is simple.

4070: 5.9K cores times 2.5 GHz = 14.75
5070Ti: 8.9K cores times 2.7 GHz = 24

Btw, this is how single-precision (FP32) throughput is calculated: cores times clocks. While that throughput scales, games tend not to at higher resolutions and higher core counts, so the calculation starts to break down at 4K and at 4090 levels. I don't expect the 5090 to be 33% faster than the 4090 at 4K at the same clocks, but it could be close.
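For reference, here is that cores-times-clocks estimate written out, using the figures from this thread: the 5070 Ti core count is a leak and its clock an assumption, and the 4070 Ti entry is the one implied by the "24/20" comparison above. FP32 throughput would actually be 2 ops per core per clock, but the constant factor cancels in the ratios:

```python
# Cores-times-clocks estimate used in the posts above.
# 5070 Ti figures are leaked/assumed; the 4070 Ti entry is implied by the
# "24/20" comparison. The constant FP32 factor (2 ops/core/clock) cancels
# in the ratios, so it is omitted.
cards = {
    "RTX 4070":    (5_900,  2.5),   # CUDA cores, boost clock (GHz)
    "RTX 4070 Ti": (7_700,  2.6),
    "RTX 5070 Ti": (8_900,  2.7),   # rumoured cores, assumed clock
    "RTX 4090":    (16_400, 2.5),
}

scores = {name: cores * ghz / 1000 for name, (cores, ghz) in cards.items()}
for name, score in sorted(scores.items(), key=lambda kv: kv[1]):
    ratio = score / scores["RTX 4070 Ti"]
    print(f"{name:12s} {score:5.1f}  ({ratio:.2f}x vs 4070 Ti)")
```

By this yardstick the 5070 Ti lands about 20% above the 4070 Ti and well short of the 4090, which is exactly the prediction above.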
 
4070 Ti: 7680 CUDA cores, 192-bit bus
3090: 10496 CUDA cores, 384-bit bus

Pretty much every xx70 GPU ties with the previous gen xx80 Ti/xx90 (1070 = 980 Ti, 2070 Super = 1080 Ti, 3070 = 2080 Ti, 4070 Ti = 3090).
Right but Ada's a different case. The 3090, like most flagship-tier GPUs before it, didn't have a huge performance uplift over the next-best product in the stack. Techpowerup has the 3090 at +14% over the 3080, and 39% over the 3070 Ti. (And it's worth pointing out the 3070 Ti was widely considered a turd.)

By contrast, the 4090 shows a 26% advantage over the 4080, and a 60% advantage over the 4070 Ti. This change in the product stack's composition was exacerbated by the huge increase to the 4080's price over its previous generation analogues, which explains why so many people complained: Nvidia concentrated most of Ada's performance gains at the tippy top end. There's no obvious reason to expect that they'll change that approach.

I think it's unrealistic to expect the 5070/Ti to meet or exceed the 4090; the fact that it appears to carry many fewer CUDA cores only strengthens the case.
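To put a rough number on that: using the ~60% measured gap between the 4090 and the 4070 Ti cited above, plus the leaked ~8.9K core count (8,960 assumed here) and an assumed ~2.7 GHz boost, the per-core uplift Blackwell would need just to tie the 4090 comes out to roughly a third:

```python
# Back-of-the-envelope check: extra per-core/clock throughput the rumoured
# 5070 Ti would need over Ada to tie the 4090, taking the ~60% measured
# 4090-vs-4070 Ti gap cited above as the target. Leaked/assumed figures noted.
cores_4070ti, clock_4070ti = 7680, 2.61   # official 4070 Ti spec
cores_5070ti, clock_5070ti = 8960, 2.70   # leaked cores, assumed boost (GHz)

raw_gain = (cores_5070ti * clock_5070ti) / (cores_4070ti * clock_4070ti)
target = 1.60  # 4090 relative to 4070 Ti (TechPowerUp figure quoted above)

print(f"Raw cores*clock gain over the 4070 Ti: {raw_gain:.2f}x")                          # ~1.21x
print(f"Architectural uplift still needed to tie the 4090: {target / raw_gain - 1:.0%}")  # ~33%
```

Even allowing for smaller GPUs converting raw throughput into frames more efficiently than the 4090 does, that is still a large jump to ask of the same node.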
 
Right but Ada's a different case. The 3090, like most flagship-tier GPUs before it, didn't have a huge performance uplift over the next-best product in the stack. Techpowerup has the 3090 at +14% over the 3080, and 39% over the 3070 Ti. (And it's worth pointing out the 3070 Ti was widely considered a turd.)

By contrast, the 4090 shows a 26% advantage over the 4080, and a 60% advantage over the 4070 Ti. This change in the product stack's composition was exacerbated by the huge increase to the 4080's price over its previous generation analogues, which explains why so many people complained: Nvidia concentrated most of Ada's performance gains at the tippy top end. There's no obvious reason to expect that they'll change that approach.

I think it's unrealistic to expect the 5070/Ti to meet or exceed the 4090; the fact that it appears to carry many fewer CUDA cores only strengthens the case.

Pretty much every 104 die (xx70/xx70 Ti) matches the previous 102 die (xx80 Ti/xx90) going way, way back.

So many people complained, yet Nvidia gained market share with Ada? :kookoo:

Edit: even that turd of a 3070 Ti is faster than the 2080 Ti by 10%.
 
Eyeing the 5080, but its specs seem identical to the 4080 Super's with just GDDR7 soldered on there... hmmm, the future is indeed crazy/lazy.
 
But... Nvidia was always better. Simple as that. Even during GCN, AMD drew more power for a slightly better bang for buck, offered more VRAM for a slightly better bang for buck. And that's all AMD wrote. Not ONCE did they take the leading position, either in featureset or in software altogether. The driver regime has been spotty. GPU time to market has no real fixed cadence; it's 'whatever happens with AMD' every single time, and it never happens to be just a smooth launch. The list of issues goes on and on and on.

The only thing to applaud is AMD brought RDNA2/3 to a good, stable situation. Too bad the products don't sell. Because AMD chose to price them in parity with Nvidia... So at that point, they didn't have the consistency, nor the trust factor or brand image, nor the bang for buck price win.... and guess what. RDNA4 is a bugfix and then they're going back to the drawing board. Again: no consistency, they even admitted themselves that they failed now.
Don't get me wrong, I never denied Nvidia's performance was better. It's just that almost every reviewer only ever cared about that for years. Ran the benchmarks, wrote down the FPS, declared Nvidia the best because FPS(Nvidia) > FPS(AMD). Not to mention the ray tracing hype that never turned into anything, but AMD sucked at it. Reviewers fueled the "go for FPS and RT" craze even though they had seen where that attitude led in the CPU world: years of overpriced, low-core-count CPUs that were rehashes of the previous gen. Most reviews were just basic bar charts of built-in game benchmarks anyway, but enough of those created a general consensus that Nvidia is the only good choice.

AMD took a beating from all sides and (maybe smartly?) chose to focus on the CPU business. Not sure how many companies could have successfully taken on Intel (back when they were not the shadow that they are today), but certainly none would be able to take on both Intel and Nvidia simultaneously. Thankfully, more reviewers are now noticing, after so many years, that Nvidia's bang per buck and per watt is constantly going down, and are starting to wise up to the fact that "better" and "faster" might be different things. Sometimes "faster" doesn't justify recommending a card.

Long story short, we get exactly what our "expert" reviewers sold. They sold Nvidia as the best, users bought Nvidia, now we all get Nvidia. But any "surprise" that a GPU is just a tweak of last generation, or consumes way more power, or costs too much is pointless now or in the near future. You still get more FPS so it must be just as much a win as ever. Let's see if everyone still gives the usual "Editor's pick" for these GPUs, they're the fastest after all.
 
Just skimming through..

Is 300 W really that bad for the performance gains? I mean, my 4070 Ti has been doing 300 W for 2 years.

Also, my GTX 580 has a 384-bit bus, but it doesn't mean that it's better than my 3070 Ti, or even my 4070 Ti..

No point in arguing about leaks and speculation..
 
Don't get me wrong, I never denied Nvidia's performance was better. It's just that almost every reviewer only ever cared about that for years. Ran the benchmarks, wrote down the FPS, declared Nvidia the best because FPS(Nvidia) > FPS(AMD). Not to mention the ray tracing hype that never turned into anything, but AMD sucked at it. Reviewers fueled the "go for FPS and RT" craze even though they had seen where that attitude led in the CPU world: years of overpriced, low-core-count CPUs that were rehashes of the previous gen. Most reviews were just basic bar charts of built-in game benchmarks anyway, but enough of those created a general consensus that Nvidia is the only good choice.

There is no such consensus. There is too much Nvidia-paid sponsorship in reviews, which are misleading and lack real data.

AMD took a beating from all sides and (maybe smartly?) chose to focus on the CPU business.

It was more a matter of luck that they got the right man in the right place: Jim Keller, who designed the Zen architecture that is saving AMD for now.
The thing is that Nvidia is focused on a market which made them a multi-trillion-dollar company, while AMD is nowhere near that, especially with today's threat of exiting GPU competition altogether. One or two weak Radeon GPU generations and AMD will be out of the business.
Which would be fatal for the company.
 
Why is a Ti card coming at the start of a generation? Aren't they usually later cards?
 
Why is a Ti card coming at the start of a generation? Aren't they usually later cards?
Usually there is quite a performance (and price, etc.) gap between an xx70 and an xx80 card, and when AMD launches a card with somewhat better performance than the current xx70, Nvidia launches the xx70 Ti, which is just a little bit better than the newly released AMD counterpart. But this time it's very likely that AMD's top-of-the-line card will be weaker than the 5070, so Nvidia won't have to keep a stronger card in hand.
 
Why is a Ti card coming at the start of a generation? Aren't they usually later cards?

Yeah, that is unusual, but then again Ti is no longer the top performer in any segment, not since NVIDIA started including "Ti Super" in their lineup. It's just another example of NVIDIA's skill in exploiting sub-tiered product segmentation within already segmented product categories. Perhaps the Ti model comes with 16GB while the non-Ti has 12GB, and maybe a TIS gets introduced later to bridge the supposedly wide performance gap between the 70-class mid-range and the high-end 80-class giants.

The 5080 could be anywhere between $1200-$1500. Below $1200, that's a lot of price points to cover for mid-tier cards; Nvidia's gonna have a field day with the 70s, and who knows, maybe a TISD (~DELUXE) variant to comfortably fill that revenue-tasty ~$1100 gap.

King of marketing semantics!
 
RTX 5090 will cost $2500, RTX 5080 will cost $1400, RTX 5070 Ti will cost $1000, RTX 5070 will cost $800, RTX 5060 Ti will cost $500, RTX 5060 will cost $400.

AMD's lineup will be: RX 8800 XT will cost $600, RX 8700 XT will cost $500, RX 8600 XT will cost $400, RX 8600 will cost $300, RX 8500 XT will cost $200.
 
Below $1200, that's a lot of price points to cover for mid-tier cards
This statement says it all: below $1200, mid-tier :rolleyes:. Honestly, this whole shit show of charging $800+ for what is essentially mid-range-performance-tier GPUs can fuck right off. I am not playing their games; I will sit on my current HW until it's obsolete or dead, and maybe just stick to retro HW and gaming. It's not like the AAA games that are bringing these $1k+ cards to their knees even look any better than top titles from almost 10 years ago to justify it, either; no optimisation from game devs. NV couldn't give a fk about budget gamers, who make up 90% of the PC gaming market, and I really hope their AI bubble bursts big time and they fall flat on their face. A nice big helping of humble pie is needed in the PC gaming market space; just ask Intel how it tastes.
 
But... Nvidia was always better. Simple as that. Even during GCN, AMD drew more power for a slightly better bang for buck, offered more VRAM for a slightly better bang for buck. And that's all AMD wrote. Not ONCE did they take the leading position, either in featureset or in software altogether. The driver regime has been spotty. GPU time to market has no real fixed cadence; it's 'whatever happens with AMD' every single time, and it never happens to be just a smooth launch. The list of issues goes on and on and on.

The only thing to applaud is AMD brought RDNA2/3 to a good, stable situation. Too bad the products don't sell. Because AMD chose to price them in parity with Nvidia... So at that point, they didn't have the consistency, nor the trust factor or brand image, nor the bang for buck price win.... and guess what. RDNA4 is a bugfix and then they're going back to the drawing board. Again: no consistency, they even admitted themselves that they failed now.
Nah, there was an era when AMD (or was it ATI? frankly, I don't remember) had the overall better product. During the HD 4770 - HD 5850 - HD 6850 - HD 7850 era they had either more performance, better efficiency, or just came first to market. And first to market is a huge thing in my book, especially at the high end. I don't like buying the top-end card 6 or 12 months in, since the later you buy it, the closer you are to the next big launch, and suddenly your flagship GPU is, well, not so flagship.

Maybe you are too young to remember, but Nvidia definitely wasn't always better.
Just for your homework, search for the AMD Radeon HD 5870. It was so good that it was almost beating the dual-GPU card from Nvidia, while wiping the floor with Nvidia's whole generation of cards. The 5850 was a monster too, and could work in a pair with the 5870 in CrossFire. I remember that was my last multi-GPU setup ever, but it was a blast. Good ol' times.
My bro had the 5850s in CrossFire. Good times
 
This statement says it all: below $1200, mid-tier :rolleyes:. Honestly, this whole shit show of charging $800+ for what is essentially mid-range-performance-tier GPUs can fuck right off. I am not playing their games; I will sit on my current HW until it's obsolete or dead, and maybe just stick to retro HW and gaming. It's not like the AAA games that are bringing these $1k+ cards to their knees even look any better than top titles from almost 10 years ago to justify it, either; no optimisation from game devs. NV couldn't give a fk about budget gamers, who make up 90% of the PC gaming market, and I really hope their AI bubble bursts big time and they fall flat on their face. A nice big helping of humble pie is needed in the PC gaming market space; just ask Intel how it tastes.

Yep, it's a fully lit-up shit show, like a burning dumpster rolling downhill into a fireworks factory (inevitable). Unfortunately, I'm shackled to this thing because I need an upgrade to "desirably" hit my consistent 120 fps+ performance goals at 1440p. I've always been an nV 80-class subscriber, although this time around I ain't bending over for any card above £800. I might settle for a 16GB+ 70-class card from the 50-series at this price limit, or, if that's not possible and AMD's 8000-series offers something compelling, I'm open to switching.

I couldn't care less about some of the flashy features everyone keeps banging on about, but if Nvidia continues to deliver better power efficiency than AMD, that's a big win in my book - definitely something I wouldn't mind paying a bit extra for. I just can't deal with GPUs that double up as mini-heaters during the summer - nothing annoys me more than sweaty pants during a 2-hour+ gaming session.
 