
NVIDIA GeForce RTX 4070 Ti Launched at $799 with Performance Matching RTX 3090 Ti

nVidia is and will be doing an Intel-style tick-tock thing from now on.
Turing expensive, Ampere affordable, Ada expensive, whatever comes next affordable...

AMD follows.
 
Crutches that are better IQ than TSAA?
That's subjective rather than objective, far from a unanimous opinion, and I for one hate DLSS because it's a smeary, incoherent blur in motion on a fast enough display like an OLED or a good high-refresh-rate screen.

In some games, blur doesn't matter, but I have always turned off motion blur no matter what. I loathe it in games.
Some games implement DLSS better than others, and a good DLSS implementation can be good enough that I will actually tolerate it for the performance gain.

There will always be artifacts, and it's always easy for me to spot the low-resolution jaggies of the actual render resolution; you can't see them easily in static side-by-side comparisons, and you can't really see them in YouTube videos because of the compression.

Raytracing makes the artifacts even more obvious, because raytracing generates a noisy image and the noise gets much coarser the lower the sampling resolution gets. You only see the noise in motion, because the denoiser can hide the worst of it within 3-4 frames; only a moving image is constantly generating fresh reflection/shadow/ambient information for which the denoiser has no prior frames to draw on.
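To make the "prior frames" point concrete, here's a minimal sketch of the temporal-accumulation idea that RT denoisers generally build on (a generic illustration, not any specific vendor's denoiser; the function and parameter names are made up):

```python
import numpy as np

def temporal_accumulate(history, current, has_history, alpha=0.2):
    """Blend this frame's noisy RT samples into reprojected history.

    history     : result carried over (reprojected) from previous frames
    current     : this frame's noisy ray-traced samples
    has_history : bool mask, False where motion just exposed new surfaces
    alpha       : blend weight; ~0.2 means noise settles over roughly 4-5 frames
    """
    blended = (1.0 - alpha) * history + alpha * current
    # Where no valid history exists (disocclusion, brand-new reflection),
    # the denoiser is stuck with the raw noisy sample for a few frames.
    return np.where(has_history[..., None], blended, current)
```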

One thing we can always agree on is that when DLSS 'Quality' is used for 4K, it's being rendered at 1440p because that's an objective fact. IMO that makes it a 1440p card because you can upscale or downscale that to literally any arbitrary resolution you want, using any technology you want. 4K, 5K, 8K? Who cares? It's still a 1440p render and if you want DLSS's temporal AA, then use DLAA.
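For reference, the per-axis render scale behind that: DLSS "Quality" renders at roughly two thirds of the output resolution on each axis (the other mode factors below are the commonly quoted approximate values, so treat them as ballpark):

```python
# Approximate per-axis render scale for each DLSS mode (ballpark values)
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def render_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(render_resolution(3840, 2160, "Quality"))  # (2560, 1440) -> a 1440p render
```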
 
You have to laugh so hard at these new chips.

For the first time ever, we get TWO node improvements. To 7nm, and to 5nm, at the same time. Massive transistor count increases. Free clock speed increases. Just shrink the RTX 3080 to 5nm, make it $600, give it 3 GHz - jesus christ, these new products suck so bad.

RTX 2080 to RTX 3080: 60 percent improvement in perf/dollar.
RTX 3070 Ti to RTX 4070 Ti: -5 percent improvement (at actual pricing - there is no $800 4070 Ti; in Canada most of them cost slightly over $1,000 USD, more than the 7900 XTX).

Unbelievable. 2 node improvements and zero or negative improvement depending on your country.

Total garbage.
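If you want to redo the perf-per-dollar numbers for your own market, the calculation is just the performance ratio divided by the price ratio. A small sketch with placeholder figures (the 1.55x gain and the prices below are examples, not benchmark data):

```python
def perf_per_dollar_change(perf_gain, old_price, new_price):
    """Percent change in performance per dollar between two cards.

    perf_gain : new_perf / old_perf, e.g. 1.5 for a card that is 50% faster
    """
    return (perf_gain / (new_price / old_price) - 1.0) * 100

# Placeholder example: a ~55% faster card at a ~67% higher street price
print(f"{perf_per_dollar_change(1.55, 600, 1000):+.0f}%")  # about -7%
```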
 
Quick math using die per wafer estimator calc and known numbers:
RTX 2080 Ti: 754 mm² die -> 12nm 300 mm wafer at $4,000 (80% yield) = 70 dies per wafer, 56 fully functional = ~$71 per die -> this die probably cost between $200 and $300 to produce at the start of N12 production
RTX 4080: 379 mm² die -> 5nm wafer at $17,000 (~90% yield) = 147 dies per wafer, 132 fully functional = ~$128 per die

As you can see, new dies are more expensive to produce, but not groundbreakingly so, and prices will only get better once TSMC moves Apple's products to a smaller node.

You got everything wrong. The $4,000 wafer price is the 2020 price, not the 2018 price.
Also, the $17,000 is a 2020 price, not a 2023 price :D

Don't troll, get your facts right. You have to normalise to the launch date of the corresponding product: RTX 2080 Ti - 2018, RTX 4080 - 2023.
 
$950 in Europe:

Cartes graphiques GeForce RTX 4070 Ti | NVIDIA
 
So, Nvidia launched a card that's slightly cheaper than RX 7900 XT and presumably performs a little lower - what makes Nvidia evil this time?
Is it like in the Fury X days, when Nvidia was evil for making AMD look bad?
 

Fixed your wrong maths :D

RTX 2080 Ti (2018): 754 mm² die -> 12nm 300 mm wafer at $15,000 (60% yield) = 70 dies per wafer, 42 fully functional = ~$357 per die
RTX 4080 (2023): 379 mm² die -> 5nm 300 mm wafer at $15,000 (80% yield) = 147 dies per wafer, 117 fully functional = ~$128 per die
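For anyone who wants to plug in their own wafer price and yield, both sets of numbers above boil down to the same calculation: estimate gross dies per wafer, apply yield, divide the wafer price by the good dies. A rough sketch (the wafer prices and yields are the claims from these posts, not confirmed figures, and the simple area formula lands close to, but not exactly on, the die counts quoted):

```python
import math

def gross_dies(die_area_mm2, wafer_diameter_mm=300):
    """Crude dies-per-wafer estimate: wafer area / die area minus an edge-loss term."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def cost_per_good_die(die_area_mm2, wafer_price, yield_rate):
    dies = gross_dies(die_area_mm2)
    good = int(dies * yield_rate)
    return dies, good, wafer_price / good

# Claimed inputs from the posts above (not verified)
for name, area, price, y in [("RTX 2080 Ti (754 mm², 12nm)", 754, 15_000, 0.60),
                             ("RTX 4080 (379 mm², 5nm)",     379, 15_000, 0.80)]:
    dies, good, cost = cost_per_good_die(area, price, y)
    print(f"{name}: {dies} gross dies, {good} good, ${cost:.0f} per good die")
```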
 
So, Nvidia launched a card that's slightly cheaper than RX 7900 XT and presumably performs a little lower - what makes Nvidia evil this time?
Is it like in the Fury X days, when Nvidia was evil for making AMD look bad?
That price is gonna be $900 after a while.
 
So, Nvidia launched a card that's slightly cheaper than RX 7900 XT and presumably performs a little lower - what makes Nvidia evil this time?
Is it like in the Fury X days, when Nvidia was evil for making AMD look bad?

1. 12 GB VRAM
2. User performance expectations for this price tier were somewhat higher, so the product is overpriced and underperforms. AMD doesn't matter.
 
Each country has (wildly) different prices. In Portugal at the moment the cheapest (without shipping) for each model are:
  • 3070TI: 691,14€
  • 3080: 839,9€ (the 2nd cheapest is 1092,92€)
  • 3080TI: 1559,9€
  • 3090: 1999,9€
  • 3090TI: 1939,98€ (not a typo, the cheapest 3090TI is actually cheaper than the cheapest 3090)
  • 4080: 1405,99€
Let's wait for the reviews, but around these parts the perf/€ should be nearly double that of the past gen.
 

AMD's pricing is better:

Radeon RX 6400 - 140.90
Radeon RX 6500 XT - 181.90
Radeon RX 6600 - 269.00
Radeon RX 6600 XT - 387.16
Radeon RX 6650 XT - 319.00
Radeon RX 6700 XT - 399.00
Radeon RX 6750 XT - 455.00
Radeon RX 6800 - 569.00
Radeon RX 6800 XT - 659.00
Radeon RX 6900 XT - 846.46
Radeon RX 6950 XT - 879.00
Radeon RX 7900 XT - 999.00
Radeon RX 7900 XTX - 1389.00
 
There were very nice discounts a little while ago on a lot of models, for example ~340€ for a 6700 (non-XT), but right now prices have gone up quite a bit across the board.
 
I knew I couldn't beat a 6800xt for $500. Good luck getting one at that price now. Lol.
 

I don't think I'm getting one. I was expecting something like $400 at EOL... but it's not happening.
Now it's better to get something from the new generation - better AV1 video support, improved media engine, updated architecture, longer driver support, etc.
Even if it's a 7700 or 7800.
 
Whoa there, I'm certain the production cost is not the same between 12nm in 2018 and 5nm in 2023.

And failure rates are debatable, variable, and they improve over time.
 
Blessed are the ones keeping their distance from 4K gaming.
This is the only way to fight back against unreasonable pricing policy.

At the end of the day there is only one question: how much is gaming really worth to us regular people?
 
Instead of chewing on lollipops, we game and thus keep our teeth stronger.
A dental implant costs nearly as much as a 4090, and an all-porcelain dental crown restoration is at least $500.

 
The actual wafer costs are not the only issue. The problem is nVidia's designs suck FOR GAMING. They are blowing their transistor budget with bad designs. Doesn't matter if wafers are double the price, you could still make a much better GPU for a lot less money. One GPU design is for 5 different use cases. Bad idea.

GTX 1080 Ti: 12 billion transistors.
RTX 3080: 28 billion.
RTX 4090: 76 billion.

We are talking 7x the old density here. We could be buying a GTX 1080 Ti at 3 GHz, with double-speed VRAM, in less than 100 mm². A $200-$300 product, the same speed as the RTX 4070 Ti in normal rastered games thanks to that doubled performance. A die half the size of the RTX 4050 die. LOL. There's a lot of crap thrown into the die design to add those "features" that I don't want.

As for the RTX 3080, we have 2.7x that density. We could be buying the RTX 3080 with a 30 percent speed boost, under 250 mm². $400. Faster than the 4070 Ti for raster, just as good for RT. $400. Same profit margins as the old 3080. We know the RTX 3080 design works. A shrunk design at higher frequencies will also work. This isn't "pie in the sky" thinking.

TSMC handed nVidia the victory. NVidia gave TSMC terrible designs for gaming. It's all ethereum mining, application performance, RT this DLSS that. I just want the 1080 Ti for $200, running with double clock speeds. So would everyone else.

They call it vision. I call it stupidity. AMD had a chance to go a different way, and they messed up too. This isn't an nVidia versus AMD thing. This is a "our video card companies suck and they are ruining gaming" thing. Intel had such an opportunity here to target gamers instead of the datacenter, they messed up also.

Even the best of the bunch, the RTX 4090, doesn't look so hot. 7x the transistors for 3x the performance, but that's running at 2x the clock speed. Basically 14x the transistors×frequency for 3x the gaming FPS. Bad. Those transistors are not being used effectively for the games you play.
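Putting that complaint in one number: how much of the raw transistor-times-clock scaling actually shows up as fps. The transistor counts are the public figures above; the 2x clock and 3x fps multipliers are the rough numbers from this post, not benchmarks:

```python
def scaling_efficiency(transistor_ratio, clock_ratio, fps_ratio):
    """Fraction of the raw transistor*clock scaling that shows up as gaming fps."""
    return fps_ratio / (transistor_ratio * clock_ratio)

# RTX 4090 vs GTX 1080 Ti, rough thread numbers
print(f"{scaling_efficiency(76 / 12, 2.0, 3.0):.0%}")  # roughly 24%
```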
 
Instead of chewing on lollipops, we game and thus keep our teeth stronger.
A dental implant costs nearly as much as a 4090, and an all-porcelain dental crown restoration is at least $500.

So you are not going to be with us at CES, giving NVIDIA's pavilion and sales people a hard time, something to make them feel the gamers' answer to their unreasonable plans?
Because at the end of the day, with forum topics like these, the consumers' message never reaches its destination.
Anonymous people's words cannot harm any brand.
 
I mean, this is so funny. Stick modern VRAM into an old GTX 1080 Ti design and you get DOUBLE the VRAM bandwidth compared to the RTX 4070 Ti. That's how crap the 4070 Ti is. Terrible design. It's '60 class.
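Quick sanity check on that claim, assuming (purely hypothetically) you could hang the 4070 Ti's 21 Gbps GDDR6X off the 1080 Ti's 352-bit bus:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s = bus width in bytes * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

hypothetical_1080_ti = bandwidth_gb_s(352, 21)  # 352-bit bus + modern GDDR6X -> 924 GB/s
actual_4070_ti = bandwidth_gb_s(192, 21)        # 192-bit bus -> 504 GB/s
print(hypothetical_1080_ti / actual_4070_ti)    # ~1.83x, close to "double"
```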
 
This isn't an nVidia versus AMD thing. This is a "our video card companies suck and they are ruining gaming" thing. Intel had such an opportunity here to target gamers instead of the datacenter, they messed up also.

Leave INTEL out of it, we need someone dedicated to building data-center hardware without severe technical issues. :)
We need some new brains to stand up and create fresh competition.
 
GTX 1080 Ti: 12 billion transistors.
RTX 3080: 28 billion.
RTX 4090: 76 billion.
I just want the 1080 Ti for $200, running with double clock speeds. So would everyone else.
I see: 6.3 times more transistors, 2.75 times more fps.

But 2.75 GTX 1080 Tis would take about 300 W more than a single 4090.
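Rough power math behind that (the board-power figures below are ballpark values I'm assuming, not measurements):

```python
# Ballpark board power under load, in watts (assumed, not measured)
GTX_1080_TI_W = 270   # reference TDP is 250 W; typical gaming draw runs a bit higher
RTX_4090_W = 450

cards_needed = 2.75   # 1080 Tis needed to match the fps ratio quoted above
extra_watts = cards_needed * GTX_1080_TI_W - RTX_4090_W
print(f"about {extra_watts:.0f} W more than a single 4090")  # ~293 W with these figures
```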

 