Tuesday, November 8th 2022

Cancelled NVIDIA GeForce RTX 4080 12GB Rebadged as RTX 4070 Ti, Bound for January?

NVIDIA had originally planned to launch the GeForce RTX 4080 16 GB and the now-cancelled RTX 4080 12 GB in mid-November. Facing strong backlash from the press and social media over the confusion the "RTX 4080 12 GB" branding would cause, given its vastly different hardware specification from that of the RTX 4080 16 GB (the differences are not limited to memory size), the company cancelled the launch of the RTX 4080 12 GB. We're hearing that with NVIDIA's board partners having already manufactured a large inventory of RTX 4080 12 GB cards, something had to be done. The partners could be undertaking a rebranding exercise, and the new brand is "GeForce RTX 4070 Ti."

According to kopite7kimi, a reliable source of NVIDIA leaks, the RTX 4080 12 GB will be rebranded as the RTX 4070 Ti. VideoCardz reports that the card is probably bound for a January 2023 launch. Based on the 4 nm "AD104" silicon, the RTX 4080 12 GB was supposed to max out the chip, featuring 7,680 CUDA cores, 60 RT cores, 240 Tensor cores, 240 TMUs, 80 ROPs, and a 192-bit wide memory interface running 12 GB of 21 Gbps GDDR6X memory (504 GB/s bandwidth). The RTX 4080 12 GB was originally slated to launch at a USD $900 price point. It remains to be seen whether NVIDIA is bold enough to sell a xx70-class product at $900. AMD is launching the Radeon RX 7900 XT at this price.
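For reference, the quoted 504 GB/s follows directly from the memory configuration: peak bandwidth is the bus width in bytes multiplied by the per-pin data rate, and 504 GB/s corresponds to 21 Gbps GDDR6X on a 192-bit bus. A minimal sketch of the arithmetic (the helper name is ours, not from the article):

```python
def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (per-pin data rate)."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 4080 12 GB: 192-bit bus, 21 Gbps GDDR6X
print(memory_bandwidth_gbs(192, 21))  # 504.0
```

The same formula recovers other cards' figures, e.g. a 256-bit bus at 23 Gbps gives 736 GB/s.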
Sources: kopite7kimi (Twitter), VideoCardz

81 Comments on Cancelled NVIDIA GeForce RTX 4080 12GB Rebadged as RTX 4070 Ti, Bound for January?

#1
Chaitanya
So now instead of the 60 series being rebadged as the 80, it's the 60 series being rebadged to 70 Ti, and sheep will be lining up in hordes to buy them.
#2
wolf
Performance Enthusiast
Don't think they can get away with $900 vs the 7900 series, but I don't see Nvidia wanting much under $800 for it. I'm not overly fussed on the eventual name that gets stickered on it, I am far more interested in price/performance/features
#3
Vader
Chaitanya: So now instead of the 60 series being rebadged as the 80, it's the 60 series being rebadged to 70 Ti, and sheep will be lining up in hordes to buy them.
My advice? Vote with your wallet and don't get worked up over what companies or consumers do
#4
Prima.Vera
Vote with your wallet people.
Unfortunately, human stupidity is too vast out there...
#5
Calenhad
Chaitanya: So now instead of the 60 series being rebadged as the 80, it's the 60 series being rebadged to 70 Ti, and sheep will be lining up in hordes to buy them.
Why? Because it is a *04 chip? If so, they have used *04 in 80-series before. See GTX980.

I am glad that they are at least, finally, making some sense out of this product by rebranding it as the 4070ti. I have said this since they announced the 4080 duo.
#6
evernessince
Chaitanya: So now instead of the 60 series being rebadged as the 80, it's the 60 series being rebadged to 70 Ti, and sheep will be lining up in hordes to buy them.
Heck, the 4080 is a rebadged 4070, being an AD103 die. Either Nvidia thought they'd have no competition this generation, or that people will buy Nvidia regardless of the price. AMD ain't no saint either; they jacked up the price of their 2nd-fastest card by $150 USD.

At these prices I'm likely to just keep my 1080 Ti another generation. I don't enjoy being given the middle finger now as much as I did during the pandemic.
Calenhad: Why? Because it is a *04 chip? If so, they have used *04 in 80-series before. See GTX980.

I am glad that they are at least, finally, making some sense out of this product by rebranding it as the 4070ti. I have said this since they announced the 4080 duo.
That's if you ignore multiple other factors. The GTX 980 was 398 mm² while the 4080 12 GB is a mere 295 mm², over 100 mm² smaller. In addition, the 980 was $550 USD (or $689 today), whereas the 4080 12 GB was listed for $1,200.

The 980 isn't really a great example to compare to either. That's when people first started complaining about Nvidia staggering its product releases to get people on the hook for more money. The fact that the 4080 12 GB looks absolutely horrible in comparison just says how bad things have gotten.
#7
Dux
Don't worry, everyone. I am sure that Nvidia will still charge the same price despite the name change. Soon we'll have midrange cards costing $1K+
#8
Bwaze
So how will Nvidia fanboys explain now why there is absolutely nothing wrong with paying $900 for a card that is barely faster than the one released two years ago for $700?

Why should there be a price / performance increase, right?
#9
Hyderz
The price will suck big time, but I'm curious how this GPU will perform.
Would it be roughly equivalent to a 3080 Ti/3090, or maybe in between, like the 3080 10/12 GB?
I wonder where this GPU will slot in.
#10
nguyen
Well, either fork out $1,600 and buy a 4090 now, or wait a year for a 4080 Super or 4070 Super with much better pricing.
#11
ZoneDymo
Vader: My advice? Vote with your wallet and don't get worked up over what companies or consumers do
Well, it's those two that make the whole "vote with your wallet" ordeal, so they are connected.
#12
Guwapo77
When they said this card wouldn't resurface again, I knew that was a lie. It made no sense to get rid of a chip that took a couple of years to design.
#13
ratirt
4070 Ti for $900. The world has gone mad for good. I admire NV's determination for cash grabs. They would literally bend over backwards to get more money.
I bet the price will stay as it is. NV is pretty confident in ADA for some reason.
#14
mb194dc
Continues to look like this gen will be a commercial flop. The cards are way too expensive from both teams.

They haven't adjusted to the end of Covid and mining at all. Or the much tougher economic environment.
#15
the54thvoid
Intoxicated Moderator
There are plentiful 4090s in the UK now. Stock isn't flying off the shelves as it did initially, which could point to a limited initial release. If AMD's cards are equivalent to the 4080 16 GB, this 4070 Ti would need to be a lot cheaper.
#16
Unregistered
Chaitanya: So now instead of the 60 series being rebadged as the 80, it's the 60 series being rebadged to 70 Ti, and sheep will be lining up in hordes to buy them.
Just look at the analysis of some so-called experts falling for nVidia's marketing. A normal person has no chance.
#17
Chaitanya
Vader: My advice? Vote with your wallet and don't get worked up over what companies or consumers do
I will be buying either used (there are some insane deals these days from trusted local sellers here in India, on GPUs less than 6 months old, i.e. from just before the crypto crash; I saw a 3080 barely 6 months old listed for a mere INR 34,000, or ~$400), or last-gen AMD, which finally seems to be selling at sane pricing.
mb194dc: Continues to look like this gen will be a commercial flop. The cards are way too expensive from both teams.

They haven't adjusted to the end of Covid and mining at all. Or the much tougher economic environment.
Most hardware launches this year might fail thanks to hubris regarding pricing (or stupid naming schemes), or due to the impending recession and inflation.
#18
Bwaze
mb194dc: Continues to look like this gen will be a commercial flop. The cards are way too expensive from both teams.

They haven't adjusted to the end of Covid and mining at all. Or the much tougher economic environment.
I bet their reasoning is "the tougher economic environment goes both ways" - they have to pay higher energy bills and higher shipping, and there are uncertainties regarding ordering components from China, and even Taiwan...

And if there are fewer buyers due to high inflation and recession, the remaining buyers will have to cover for the rest of us. Makes sense, no?
#19
Keullo-e
S.T.A.R.S.
$900 for a card with a 192-bit bus? This is insane.
#20
john_
DLSS 3.0 will sell this card even at its original price. And people will be laughing at RX 7900 XTX owners because
- They don't have DLSS 3.0 and when they get FSR 3.0, it will be obviously inferior
- DLSS 2.x is superior to FSR 2.x
- GSync is superior to FreeSync
- They don't have CUDA
- They don't have Optix
- They don't have Tensor cores
- Their drivers are bad
- AMD's logo is not as shiny as Nvidia's logo
- More reasons or excuses....

In fact, when I started to write this post, I was only thinking of DLSS 3.0 as the perfect marketing material for Nvidia to sell this card to consumers. But someone can build a fairly long list of reasons, or excuses, depending on how someone sees it, to go with an RTX 4070 Ti at $900 instead of the RX 7900 XT at the same price, or the XTX at $1,000.
#21
Ravenlord
For the first time, I'm going to buy an AMD GPU instead of Nvidia (after over 20 years of Nvidia cards). The huge price, size, and TDP/TGP increases in the last few Nvidia generations aren't reasons to pick those GPUs. Unfortunately, some people justify any negative aspect of those cards ( ͡° ͜ʖ ͡°) - "because they must do it". I will pick something with a TDP between 150 and 250 W, like the 6700 XT, but I'll probably wait for something like a 7700 XT with improved ray-tracing performance, which is the only real con of the current 6000 gen.
#22
nguyen
Ravenlord: For the first time, I'm going to buy an AMD GPU instead of Nvidia (after over 20 years of Nvidia cards). The huge price, size, and TDP/TGP increases in the last few Nvidia generations aren't reasons to pick those GPUs. Unfortunately, some people justify any negative aspect of those cards ( ͡° ͜ʖ ͡°) - "because they must do it". I will pick something with a TDP between 150 and 250 W, like the 6700 XT, but I'll probably wait for something like a 7700 XT with improved ray-tracing performance, which is the only real con of the current 6000 gen.
It's cheaper just to buy RX 6000; RX 7000 won't bring any fruitful increase in RT performance (you will need to use aggressive FSR settings like Balanced or Performance just to enable RT, destroying the image quality in the process), so you might as well just stick to native without RT.
#23
beedoo
nguyen: It's cheaper just to buy RX 6000; RX 7000 won't bring any fruitful increase in RT performance (you will need to use aggressive FSR settings like Balanced or Performance just to enable RT, destroying the image quality in the process), so you might as well just stick to native without RT.
Whilst I'm not expecting miracles, I really have to believe you're talking out of your arse with the entirety of this comment.
#24
nguyen
beedoo: Whilst I'm not expecting miracles, I really have to believe you're talking out of your arse with the entirety of this comment.
Maybe AMD is talking out of their ass too, using FSR Performance just to bring FPS above 60 with RT :laugh:, meanwhile without FSR it looks like FPS are around 20-25.
#25
Broken Processor
Vader: My advice? Vote with your wallet and don't get worked up over what companies or consumers do
Peeps will still line up to buy it because it's Nvidia, unfortunately.