
NVIDIA GeForce RTX 4070 isn't a Rebadged RTX 4080 12GB, To Be Cut Down

This time around NV will go 4070 non-Ti, Ti and Super, and will create perfect chaos with chips and skyrocketing prices.
 
This time around NV will go 4070 non-Ti, Ti and Super, and will create perfect chaos with chips and skyrocketing prices.
They need to sell defective chips with high model numbers first to maintain high pricing on the fully enabled ones. It worked on Turing refresh *cough* Ampere, it'll sure work on Ampere ref... *cough-cough* Ada, too.
 
Crypto is definitely not dead - where there is money, people will always find a way. Mining, on the other hand, is a different story when the major well-established coins are all PoS.

I think you're wrong here. Ethereum had a virtual monopoly as a GPU mineable coin, and it remained as such through the crypto crash. It changed to proof of stake only after the crash, and with no current interest or sense in GPU mining no other mineable coin profited from that. But that's only until crypto starts rising again - and it doesn't really matter if it will be replaced by a single coin or a wider group - these currently "PoS" coins will rise very quickly, since everyone will want to be on the bandwagon from the start.

As I said, Nvidia had a perfect opportunity now to state clearly that from now on they will oppose any GPU use in mining. And what did they do? They removed the "hardware" LHR limitation simply through a driver update.

"The golden goose is on the loose and never out if season!"
 
I think you're wrong here. Ethereum had a virtual monopoly as a GPU mineable coin, and it remained as such through the crypto crash. It changed to proof of stake only after the crash, and with no current interest or sense in GPU mining no other mineable coin profited from that. But that's only until crypto starts rising again - and it doesn't really matter if it will be replaced by a single coin or a wider group - these currently "PoS" coins will rise very quickly, since everyone will want to be on the bandwagon from the start.

As I said, Nvidia had a perfect opportunity now to state clearly that from now on they will oppose any GPU use in mining. And what did they do? They removed the "hardware" LHR limitation simply through a driver update.

"The golden goose is on the loose and never out if season!"
PoS coins can rise quickly, slowly, or in whichever manner - that's fine. It doesn't add any value to graphics cards anymore. Crypto will probably boom again, but I think mining is dead, unless some millionaire finds some never-heard-of mineable coin to invest his mining farm in. But in that case, I can only hope that the rest of the world has learned from the mistake and never follows suit.
 
... I think mining is dead, unless some millionaire finds some never-heard-of mineable coin to invest his mining farm in. But in that case, I can only hope that the rest of the world has learned from the mistake and never follows suit.

I don't think we need any such thing for exactly the same outcome as in the 2017 and 2020-21 cryptomining booms, to name the latest. The same behaviour as before will lead to the same result - even if at first there won't be large crypto farms, and people will only mine at home with what they have.
 
A very bad move, if confirmed, that says a lot about Nvidia lately... They really don't care about customers.
This time I really want AMD to succeed with RDNA3, because I'd much prefer not to buy a new Nvidia card this generation.
 
I don't think we need any such thing for exactly the same outcome as in the 2017 and 2020-21 cryptomining booms, to name the latest. The same behaviour as before will lead to the same result - even if at first there won't be large crypto farms, and people will only mine at home with what they have.
Maybe. I'm just thinking that the major coins weren't so big, and were still PoW back then, which always left a back door open for mining to return. Now that they're all PoS and a lot bigger than before, a lot more effort is needed to give mining another chance - which I seriously hope it will never get.
 
I can smell a "cheap" 4070 coming with a severely cut-down chip for $700, then a 4070 Ti with the same config as the 4080 12 GB would have been for $850. And then, Jensen won't understand why isn't everybody happy.

It really puzzles me how Nvidia doesn't have a well-thought-out plan for the whole product stack before the launch of the flagship, the way AMD and Intel do.
No one will ever be happy; the world today cannot be pleased anymore.
 
The 4090 is what should have been the 4080
The 4080 16GB is what should have been the 4070Ti
The 4080 12GB is what should have been the 4070

The upcoming 4090Ti is what should have been the 4090

Obvious when you think critically.
 
Maybe. I'm just thinking that the major coins weren't so big, and were still PoW back then, which always left a back door open for mining to return. Now that they're all PoS and a lot bigger than before, a lot more effort is needed to give mining another chance - which I seriously hope it will never get.

Ah, I misread that as the "major well-established coins are all Pieces of Shit".


Which they are.


And it's also true that the established ones are all Proof of Stake. And I still think the non-established ones that are mineable will gather momentum from zero. Just as Dogecoin gathered momentum simply because some rich guy tweeted that it's "going to the moon".
 

"NVIDIA GeForce RTX 4070 isn't a Rebadged RTX 4080 12GB, To Be Cut Down"​


Well, technically, that will be correct.

The 4070 FE will be a cut-down 4070 Ti.

Because the AIB partner '4080 12GB' cards are the 4070 Tis.
 
To be fair, Nvidia has such a solid buyer base of people who would never ever consider having another brand in their PCs, and always buy the latest, highest-end model they can afford, that they (Nvidia) really don't need to be cheap. They can keep on selling their BS at whatever price they demand because someone will always buy it. Just like Apple.

we should only be supporters of our own wallets...
Even if I only have Nvidia cards in my house (3080, 3070 and 3060 Ti), I won't have any problem returning to AMD if RDNA 3 is worth it.
My last AMD card was a 5700XT, which gave me some headaches, especially in the first year.

AMD on the other hand...
that's for entirely different reasons... AMD completely messed up the pricing scheme if you consider the platform (CPU + mobo + RAM).
You cannot ask that amount of money for a 6-core!

AM4 platform is still selling very well for this reason.

I strongly suspect AMD will, officially or unofficially, reduce prices quite soon.
 
So it would be:
4080 12GB -> 4070ti ?

The performance delta between each GPU is gigantic this generation, but their prices aren't really set that far apart, favouring the ultra-enthusiast market.
In France:
RTX 4090 (24GB) 1.929 €
RTX 4080 (16GB) 1.469 €
RTX 4080 (12GB) 1.099 €

I don't understand the 4080. For someone willing to shell out 1,469 EUR, another 480 EUR buys approx. 40-50% extra performance; it would be crazy not to, IMO. Sadly, performance per EUR gets better the more you pay this time. It's just silly: the 4080 16GB is priced absurdly, slightly too high, probably so that it does not eat into the RTX 3000 market. And no comment on the "unlaunched" 4080 12GB at 1,100 EUR...
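A rough back-of-the-envelope sketch of that comparison, in Python. The relative performance figures are my own assumptions for illustration (the 4090 taken as ~45% faster than the 4080 16GB per the 40-50% estimate, the 4080 12GB as ~20% slower), not measured data:

```python
# Price vs. assumed relative performance for the French prices listed above.
cards = {
    "RTX 4090 (24GB)": {"price_eur": 1929, "rel_perf": 1.45},  # assumed perf
    "RTX 4080 (16GB)": {"price_eur": 1469, "rel_perf": 1.00},  # baseline
    "RTX 4080 (12GB)": {"price_eur": 1099, "rel_perf": 0.80},  # assumed perf
}

for name, c in cards.items():
    perf_per_100_eur = c["rel_perf"] / c["price_eur"] * 100
    print(f"{name}: {c['price_eur']} EUR -> {perf_per_100_eur:.3f} rel. perf per 100 EUR")
```

Under those assumptions the 4090 ends up with the best performance per EUR of the three, which is exactly the backwards situation described above.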
 
I can smell a "cheap" 4070 coming with a severely cut-down chip for $700, then a 4070 Ti with the same config as the 4080 12 GB would have been for $850. And then, Jensen won't understand why isn't everybody happy.

It really puzzles me how Nvidia doesn't have a well-thought-out plan for the whole product stack before the launch of the flagship, the way AMD and Intel do.
that's my idea too.
Just rebranding the "old" 4080 12 GB as the 4070 and lowering the price to $700 would have been a smart move, but that's not today's Nvidia.
They have been playing dirty games in the distribution channel for a while now in order to keep prices as high as they can. They are just milking customers at this point.
See what's happening with DLSS 3.0, artificially limited to the 40 series in order to keep a good distance between the 4080 16GB and the 3080/3090 Ti in benchmarks and somehow justify the insanely high launch price.

So it would be:
4080 12GB -> 4070ti ?

most probably 4080 12 GB = 4070 Ti > 4070

The performance delta between each GPU is gigantic this generation, but their prices aren't really set that far apart, favouring the ultra-enthusiast market.
In France:
RTX 4090 (24GB) 1.929 €
RTX 4080 (16GB) 1.469 €
RTX 4080 (12GB) 1.099 €

I don't understand the 4080. For someone willing to shell out 1,469 EUR, another 480 EUR buys approx. 40-50% extra performance; it would be crazy not to, IMO. Sadly, performance per EUR gets better the more you pay this time. It's just silly: the 4080 16GB is priced absurdly, slightly too high, probably so that it does not eat into the RTX 3000 market. And no comment on the "unlaunched" 4080 12GB at 1,100 EUR...
480 EUR more is not a small amount of money, and even if you are right about diminishing returns, not everyone wants a card like a 4090 in their case.
 
I see a lot of people trying to figure out whether it is a cut-down 4070 or not. Or maybe it is a cut-down 4070 Ti? It does not matter. NV has a 4080 12GB and it was priced at $900, and that is way, way too much. The faster 4080 is $1,200, and that is an even more ridiculous price. $900 is what you would pay for a top SKU back in the day, like the 2000 series ($700 for the 2080), and less if you go back to the 1000 series (1080 at $600).
The 1070 Ti was $499, and now people are debating whether the 4080 12GB is a 4070 or a 4070 Ti. It does not matter; the prices are crazy and will remain crazy because everyone is looking at the subject from the wrong perspective, arguing about the degree to which this launch sucks. Bad is bad and the prices are horrible.
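Just putting the launch prices quoted above side by side as simple ratios, nothing here is new data:

```python
# x80-class launch MSRPs mentioned in the post above.
msrp = {"GTX 1080": 600, "RTX 2080": 700, "RTX 4080 12GB (unlaunched)": 900}
base = msrp["GTX 1080"]

for card, price in msrp.items():
    print(f"{card}: ${price} ({price / base:.0%} of the 1080's launch price)")
```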
 
I really hope we get to buy a fully activated AD104 as the 4070 or 4070 Ti. It is sad if a chip is never allowed to show its true potential; it was sad with GA106 and even more so with GA103, which we only saw on the 3080 Ti mobile and allegedly now on some obscure derivatives of the desktop 3070 Ti.

As much as it is regrettable that a chip in this price range only offers a 192-bit bus and 12 GB, AMD has shown that enough cache can compensate for that. If not clocked too far above the sweet spot, a fully activated chip will always be more efficient than a cut-down chip at higher clocks. The 3070 Ti was slammed for offering only slightly more performance than the 3070 because GDDR6X used too much power, but at the same power target it will always offer more performance. Same with the 3090 Ti vs 3090 vs 3080 Ti vs 3080 12GB vs 10GB. It just isn't worth the price premium most of the time, but that is just marketing; a 3080 12GB doesn't have to differ at all in manufacturing from a 3080 Ti apart from the chip.
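The "fully enabled and modestly clocked beats cut-down and pushed harder" point can be sketched with a toy model. The SM counts, clocks, and the power-scales-roughly-with-clock-cubed rule of thumb below are assumptions for illustration, not figures for any real SKU:

```python
# Toy model: perf scales ~linearly with SM count and clock; dynamic power rises
# roughly with the cube of clock near the top of the voltage/frequency curve.
# All numbers are hypothetical, chosen only to illustrate the argument.
def perf(sms, clock_ghz):
    return sms * clock_ghz

def power(sms, clock_ghz):
    return sms * clock_ghz ** 3

full = {"sms": 48, "clock": 1.80}  # fully enabled chip at a modest clock
cut  = {"sms": 40, "clock": 2.05}  # cut-down chip clocked higher

for name, c in (("full chip", full), ("cut-down chip", cut)):
    print(f"{name}: perf={perf(c['sms'], c['clock']):.1f}, "
          f"power={power(c['sms'], c['clock']):.1f} (arbitrary units)")
```

In this toy example the full chip delivers more performance at lower power than the cut-down one clocked higher, which is the efficiency argument above.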

There is one additional problem with the 4080 12GB returning as a 4070 Ti instead of a 4070: there would be no space for a cut-down, cheaper AD103. Because of the big difference in memory interface, I would much prefer a 4070 Ti based on a cut-down AD103 with a 256-bit bus and 16 GB over a fully activated AD104 with a 192-bit bus and 12 GB.
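The raw bandwidth difference behind that preference is simple arithmetic; 21 Gbps GDDR6X is assumed for both configurations purely for illustration:

```python
# Bandwidth (GB/s) = bus width (bits) / 8 * data rate (Gbps).
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print("192-bit AD104:     ", bandwidth_gb_s(192, 21), "GB/s")  # 504.0
print("256-bit cut AD103: ", bandwidth_gb_s(256, 21), "GB/s")  # 672.0
```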
 
So now nGreedia are going to rebadge 60-class chips as the 70 series and sell them upwards of $600.

Making the recent years' prices official!
 
No one will ever be happy; the world today cannot be pleased anymore.
Well, not by $900 graphics cards that used to cost $300 a few years ago.

Ah, I misread that as the "major well-established coins are all Pieces of Shit".


Which they are.
I easily could have (considering how much I hate the whole concept of crypto), but this time I actually didn't. :laugh:

that's for entirely different reasons... AMD completely messed up the pricing scheme if you consider the platform (CPU + mobo + RAM).
You cannot ask that amount of money for a 6-core!

AM4 platform is still selling very well for this reason.

I strongly suspect AMD will, officially or unofficially, reduce prices quite soon.
Didn't Nvidia mess up this launch with the $1,600 flagship that is on average 60% faster than the $6-700-something 6900 XT? 160% performance for 240% price. Um... awesome? :wtf:
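Just to show where those percentages come from; the ~$670 figure is an assumption within the "$6-700-something" range quoted above:

```python
# Rough ratio check of the "160% performance for 240% price" claim.
flagship_price, flagship_perf = 1600, 1.60  # RTX 4090 vs 6900 XT (per the post)
amd_price, amd_perf = 670, 1.00             # assumed 6900 XT street price

print(f"price ratio: {flagship_price / amd_price:.0%}")  # ~239%
print(f"perf ratio:  {flagship_perf / amd_perf:.0%}")    # 160%
```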

The point is that people will buy Nvidia even if it's unjustifiably expensive, but they won't buy AMD if it's ever so slightly out of its ideal price/performance range.
 
If it's still 12GB of GDDR6X, this means 5 GPCs active, so 80 ROPs and at least 208 TC and 6656 CUDA cores.
Being a 5-GPC design, its scaling at lower resolutions will be closer to GA104 than GA102, meaning that even if it drops from 2610 MHz and 21 Gbps GDDR6X to 2510 MHz and 20.5 Gbps GDDR6X, it will probably match the RTX 3090 in QHD, forcing the RTX 3080 Ti and 6950 XT to drop at least 10% or more below their SRP. That's why it seems difficult (if the launch is this year) for it to be less than $799 when there is so much Ampere stock, according to reports.
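A quick sanity check of those numbers using Ada's usual per-unit ratios (128 CUDA cores and 4 Tensor Cores per SM, 16 ROPs per GPC); the 52-SM count itself is my speculation, not a confirmed spec:

```python
# Speculative cut-down AD104 configuration consistent with the figures above.
sms_active = 52   # assumed SM count
gpcs_active = 5

cuda_cores   = sms_active * 128   # -> 6656
tensor_cores = sms_active * 4     # -> 208
rops         = gpcs_active * 16   # -> 80

print(cuda_cores, tensor_cores, rops)
```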
Nvidia will probably wait for AMD's announcement two weeks from now and then respond based on RDNA3 pricing.
 
They need to sell defective chips with high model numbers first to maintain high pricing on the fully enabled ones. It worked on Turing refresh *cough* Ampere, it'll sure work on Ampere ref... *cough-cough* Ada, too.
I'm sure they want to do that; I hope it won't happen. If you consider the price hike from three generations ago, you will notice it was not a coincidence. They literally want the mid-tier cards to cost twice as much as they used to.
 
I'm sure they want to do that; I hope it won't happen. If you consider the price hike from three generations ago, you will notice it was not a coincidence. They literally want the mid-tier cards to cost twice as much as they used to.
They've already done that, I'm afraid. I used to buy £2-300 graphics cards only a few years ago, and now here I am, eyeing the £470 6750XT (which is not even Nvidia).
 
No one will ever be happy; the world today cannot be pleased anymore.
Nobody is saying anything about pleasing others, just some sort of decency in pricing. I don't want to be pleased with low prices, but I don't want to be ripped off either. $900 for what was a $300 card is too much. Some people have a problem with it, some don't. If you don't have a problem now, you will next year when the 5070 costs $1,200 or whatever the price hike will be then.
 
Let's just wait and see what AMD comes out with; I believe nvidiots will be more than happy to sell the full AD104 as the 4070. I mean, what do they have to sell with the 4xxx generation besides Fake_frames? Not much, that's for sure.
 
For me, the 5700 XT was always the better buy, to be fair.
Sure, after both products launched, and if we totally ignore the roughly 2 years and 3 months the 1080 Ti was out with no 5700 XT in existence yet, when it was impossible to make that particular "better buy" choice.

Let's just wait and see what AMD comes out with,
More raster perf, ever so slightly better RT perf than RDNA2 (in terms of the relative performance hit), and the same playing catch-up with other features as usual; that's my bet.

What might actually be exciting is price and availability.

EDIT: typo
 
If it's still 12GB of GDDR6X, this means 5 GPCs active, so 80 ROPs and at least 208 TC and 6656 CUDA cores.
The amount of RAM, or rather the number of 32-bit memory controllers, hasn't been coupled to the number of active GPCs and ROPs since Ampere, if I remember correctly. Furthermore, there is a lot of freedom in how many SMs in a GPC can be deactivated without losing the whole GPC and its ROPs. The RTX 3070 Laptop has the same 96 ROPs from 6 GPCs even though it only has 40 SMs active, which would fit perfectly in 5 fully activated GPCs with 8 SMs each, while the RTX 3080 Ti Laptop based on GA103 still has 96 ROPs in the same 6 GPCs, just with 58 SMs at either 10 (like GA106) or 12 SMs (like GA102) per GPC.

However, on Ada, Nvidia seems to have foregone the concept of many different configurations of how many SMs each GPC contains. Either way, the amount of VRAM and the width of the memory controller give us no clue as to how many GPCs are active.
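A small illustration of that decoupling, using the laptop figures quoted above and the usual 16 ROPs per GPC on Ampere:

```python
# ROP count follows active GPCs, not active SMs, so very different SM counts
# can share the same ROP count. Figures are the ones quoted in the post above.
configs = {
    "RTX 3070 Laptop (GA104)":    {"gpcs": 6, "sms": 40},
    "RTX 3080 Ti Laptop (GA103)": {"gpcs": 6, "sms": 58},
}

for name, c in configs.items():
    print(f"{name}: {c['sms']} SMs, {c['gpcs'] * 16} ROPs")
```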
 