
NVIDIA GeForce RTX 4060 Ti Possible Specs Surface—160 W Power, Debuts AD106 Silicon

Joined
Apr 30, 2020
Messages
848 (0.59/day)
System Name S.L.I + RTX research rig
Processor Ryzen 7 5800X 3D.
Motherboard MSI MEG ACE X570
Cooling Corsair H150i Capellix
Memory Corsair Vengeance Pro RGB 3200 MHz 16 GB
Video Card(s) 2x Dell RTX 2080 Ti in SLI
Storage Western Digital SATA 6.0 SSD 500 GB + fanxiang S660 4TB PCIe 4.0 NVMe M.2
Display(s) HP X24i
Case Corsair 7000D Airflow
Power Supply EVGA G+1600watts
Mouse Corsair Scimitar
Keyboard Corsair K55 Pro RGB
I like the "value" of the 4070 Ti, sad as that is, since the 3080 is really hard to find, especially at a decent price.

The truth is all that back stock & overstock was a lie. It was probably stock from people who cancelled their orders for an RTX 3080 after waiting a year or two for their card. They stopped making Ampere a long time ago.
 
Joined
Aug 23, 2013
Messages
543 (0.14/day)
I'm guessing $750 for the 4060 Ti, $700 for the 4060, $650 for the 4050 Ti, and $600 for the 4050. Then they'll introduce a x40 series that will slot in $50 less with a Ti and so on and so forth. They'll keep the 40 Series above $500 because Jensen heard somewhere that $500 is the new $100.

That's what he heard!
 

ARF

Joined
Jan 28, 2020
Messages
3,892 (2.56/day)
Location
Ex-usa
I think there is one thing that is constant across generations: power consumption. AIBs buy graphics cards based on price and TDP, not codenames, memory bandwidth, and such.
Based on this metric you can compare the cards and tell which one is which one's successor in Nvidia's mind, so to speak. Whoever said the GTX 680 was supposed to be a midrange card (at least based on the TDP of previous generations) is IMO right, but market circumstances "promoted" it into the 80 class (AMD was slower).
As it happens, I made an Excel table based on TDP. I have another one with AMD cards if you're interested.
I think this table shows that the cards above 250 W are more expensive now exactly because they perform relatively better than old high-end SKUs (those had 250 W TDPs).
As you can see, Nvidia moved the numbers up, so at the same TDP you either get a "lower class" GPU for the same price, or you have to pay more for the "same" class with the same naming. People usually upgrade within the same class, so when they see the lower-class card for the same money, they'd rather buy the more expensive card. Brilliant and very simple marketing by Nvidia.
But this is just my thinking.

View attachment 279924
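The TDP-tier bucketing described above can be sketched in a few lines. The card names and wattages below are illustrative assumptions (approximate public TDP figures), not the contents of the attached spreadsheet:

```python
# Group GPUs into 50 W-wide TDP tiers to compare "successors" across
# generations, per the argument above. Values are illustrative only.
from collections import defaultdict

cards = {
    # (generation, model): TDP in watts -- approximate public figures
    ("Pascal", "GTX 1080"): 180,
    ("Turing", "RTX 2070"): 175,
    ("Ampere", "RTX 3060"): 170,
    ("Pascal", "GTX 1080 Ti"): 250,
    ("Turing", "RTX 2080 Ti"): 250,
    ("Ampere", "RTX 3070"): 220,
}

def tdp_tier(watts, width=50):
    """Bucket a card into a 50 W-wide TDP tier, e.g. 150-199 W."""
    low = (watts // width) * width
    return f"{low}-{low + width - 1} W"

tiers = defaultdict(list)
for (gen, model), watts in cards.items():
    tiers[tdp_tier(watts)].append(f"{gen} {model} ({watts} W)")

for tier in sorted(tiers):
    print(tier, "->", ", ".join(sorted(tiers[tier])))
```

Cards landing in the same bucket are the ones the post treats as same-tier successors, regardless of what number Nvidia printed on the box.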

I think this is a natural trend, because we are fast reaching the limits of semiconductor manufacturing node progress.
Today there is news that Apple, the first and exclusive customer on the optical-shrink TSMC N3, will begin shipping its first N3 products as late as the second half of 2023.
3nm M3 Chips To Launch With MacBook Air In 2H23, Will Bring Major Performance And Battery Life Gains (wccftech.com)

This means that AMD will move to N3 in 2025 or 2026.

Goodbye, PC business! You won't be missed.
 
Joined
Sep 1, 2020
Messages
2,015 (1.55/day)
Location
Bulgaria
H2 starts on 1st July. Relax; Zen 5 and the RX 8000 series will be on 3 nm from around the end of 2024.
 
Joined
Nov 27, 2022
Messages
38 (0.08/day)
I think this is a natural trend, because we are fast reaching the limits of semiconductor manufacturing node progress.
Today there is news that Apple, the first and exclusive customer on the optical-shrink TSMC N3, will begin shipping its first N3 products as late as the second half of 2023.
3nm M3 Chips To Launch With MacBook Air In 2H23, Will Bring Major Performance And Battery Life Gains (wccftech.com)

This means that AMD will move to N3 in 2025 or 2026.

Goodbye, PC business! You won't be missed.
What is natural? That we have 300+ W cards that didn't exist before Turing (besides the dual-GPU cards)?
In normal times the 3070 Ti (not even that, because the TDP of the 3070 Ti is 290 W) would've been the top card (I mean that kind of performance, not the name) in the Ampere generation.
I think two things happened.
Firstly, NV saw the first big crypto-mining craze in 2016-17 (they started working on Ampere around that time) and realized people want more compute power and more cards.
Secondly, since they were under contract with Samsung on a crappy node with high power consumption, a new top card with a 250 W TDP would've been at most 10% faster than the previous flagship (2080 Super/Ti), which is a very bad generational uplift, worse than Pascal -> Turing.
They were cocky going with Samsung because AMD was nowhere to be found (in 2016 AMD had the RX 480 and Fury X, and that's it).
So they went all-in, not caring about TDP, and with Ampere we got these 300 W+ cards. They're much faster than Turing, but they also eat much more than Turing. Or should I say they're faster only because they eat much more.
Look at the TPU database: the 3060 is barely faster than the 2060/Super or the 2070, the 3060 Ti is barely faster than the 2070 Super/2080, and the same is true for the 1660 Ti vs the 3050. These are the same TDP tiers. At least they lowered the prices by moving these performance levels down a tier.
They got lucky with the second big crypto-mining boom, and the Ampere cards sold out easily despite the high TDPs.
Since people want these cards, they're gonna make them, but I think without crypto mining history would've been completely different, with an Ampere generation barely faster than Turing, showing NV's dominance and monopolistic behavior in the GPU space.
 
Joined
May 3, 2018
Messages
2,232 (1.03/day)
Your 6800 XT is not even close. The 4070 Ti is ~20% faster than the 6800 XT and the 3080 overall at 1440p.

In newer games it's more like 25-30%.

Also, the 4070 Ti does that with 50-75 W less, plus DLSS, DLDSR, NVENC, and way more features in general, plus better RT.

Yes, you can probably cherry-pick a game or two where the 6800 XT wins slightly, but overall it does not, and in some other games the 4070 Ti is over 30% faster than the 6800 XT as well.

In most new games the 4070 Ti is 25% faster than the 6800 XT, stock vs stock. Old games drag down the performance of the 4000 series in comparisons because of missing optimizations. That's why performance is better in newer/popular titles in general.


The 4070 Ti is 30-35% faster than the 6800 XT in these newer games.


So unless you have overclocked your 6800 XT to the absolute limit (and are looking at 400-500 W usage), your 6800 XT is not even close.

The 4070 Ti beats even the 6900 XT by about 15%...

You sound like people with a 3080 who think their 3080 also performs just like a 4070 Ti. It doesn't.

Both the 3080 and the 6800 XT are 20-30% behind the 4070 Ti, depending on the title, in GPU-limited games.

Wow, 20% faster, and I get to pay AU$1,500, or I get a near-new second-hand 6800 XT for AU$800. Your argument only makes sense if the price/performance ratio is similar. It's pathetic value. At $599 it would make sense.
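That price/performance point can be checked with simple arithmetic, using only the figures quoted in this exchange (4070 Ti ~20% faster; AU$1,500 new vs AU$800 used):

```python
# Worked perf-per-dollar comparison from the figures in the post above.
def perf_per_dollar(relative_perf, price):
    return relative_perf / price

p_6800xt = perf_per_dollar(1.00, 800)   # baseline performance, used price
p_4070ti = perf_per_dollar(1.20, 1500)  # ~20% faster, per the quoted claim

advantage = p_6800xt / p_4070ti
print(f"Used 6800 XT delivers {advantage:.2f}x the perf per dollar")

# Price at which the 4070 Ti would merely match the used card's perf/$:
break_even_price = 1.20 * 800 / 1.00
print(f"Break-even 4070 Ti price: AU${break_even_price:.0f}")
```

At those numbers the used card comes out roughly 1.56x ahead on perf per dollar, which is the whole point: the ~20% performance lead doesn't survive an ~88% price premium.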
 
Last edited:

johnspack

Here For Good!
Joined
Oct 6, 2007
Messages
5,979 (0.99/day)
Location
Nelson B.C. Canada
System Name System2 Blacknet , System1 Blacknet2
Processor System2 Threadripper 1920x, System1 2699 v3
Motherboard System2 Asrock Fatality x399 Professional Gaming, System1 Asus X99-A
Cooling System2 Noctua NH-U14 TR4-SP3 Dual 140mm fans, System1 AIO
Memory System2 64GBS DDR4 3000, System1 32gbs DDR4 2400
Video Card(s) System2 GTX 980Ti System1 GTX 970
Storage System2 4x SSDs + NVme= 2.250TB 2xStorage Drives=8TB System1 3x SSDs=2TB
Display(s) 2x 24" 1080 displays
Case System2 Some Nzxt case with soundproofing...
Audio Device(s) Asus Xonar U7 MKII
Power Supply System2 EVGA 750 Watt, System1 XFX XTR 750 Watt
Mouse Logitech G900 Chaos Spectrum
Keyboard Ducky
Software Manjaro, Windows 10, Kubuntu 23.10
Benchmark Scores It's linux baby!
Thank god I've given up on gaming on a PC. My 980 Ti will last me for years more now. Entry-level cards at $1k CAD... nope, they can keep them.
So many other things you can do on a PC besides game.....
 
Joined
Oct 18, 2013
Messages
5,415 (1.42/day)
Location
Everywhere all the time all at once
System Name The Little One
Processor i5-11320H @4.4GHZ
Motherboard AZW SEI
Cooling Fan w/heat pipes + side & rear vents
Memory 64GB Crucial DDR4-3200 (2x 32GB)
Video Card(s) Iris XE
Storage WD Black SN850X 4TB m.2, Seagate 2TB SSD + SN850 4TB x2 in an external enclosure
Display(s) 2x Samsung 43" & 2x 32"
Case Practically identical to a mac mini, just purrtier in slate blue, & with 3x usb ports on the front !
Audio Device(s) Yamaha ATS-1060 Bluetooth Soundbar & Subwoofer
Power Supply 65w brick
Mouse Logitech MX Master 2
Keyboard Logitech G613 mechanical wireless
Software Windows 10 pro 64 bit, with all the unnecessary background shitzu turned OFF !
Benchmark Scores PDQ
Can we getz some cookies with this (lukewarm) milk please ?

Figures that after they've completely milked the masses with their overpriced power sukin crap, they would bring out this low-end thing that probably can't even run the "can it run crysis" utility without stuttering or overheating......
 
Joined
Feb 20, 2019
Messages
7,194 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Over a decade and 6 consecutive architectures with almost the same naming schemes with a few exceptions here and there seems pretty consistent to me.

What's your timeframe at which point you'd consider this consistent? 20, 30 years? Centuries?
Your argument was that xx60 Ti models should use 104 dies:

There is no consistency over the last decade with Ti models. I went back that many generations solely to give your argument the benefit of the doubt, because prior to Ampere, the last xx60 Ti model using a 104 die was the 660 Ti - and that was over a decade ago!

The 4060 Ti isn't using a 104 die.
The Ampere 3060 Ti did use a 104 die. We've established that - but this ONE-generation occurrence isn't enough to call it the concrete pattern you say it should be.
A Turing 2060 Ti didn't exist at all, and if you say "Super = Ti" then it still didn't use a 104 die.
The Turing 1660 Ti didn't use a 104 die.
A Pascal 1060 Ti didn't exist.
A Maxwell 960 Ti didn't exist.
A Kepler 760 Ti didn't exist.
The Kepler 660 Ti (finally, we get there) used a 104 die.

Please, explain to me why you're so adamant that the 4060 Ti should be a 104 die again. I honestly don't understand the point you're trying to make.
 
Last edited:
Joined
Dec 31, 2020
Messages
766 (0.65/day)
Processor E5-2690 v4
Motherboard VEINEDA X99
Video Card(s) 2080 Ti WINDFORCE OC
Storage NE-512 KingSpec
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
GeForce GTX 960 = GM204.
This is GM206.

This time around the 4070 is a rebranded 60 Ti. Just look at the 4070 Ti vs the 4070: they cut more than 20%, which is precisely what the 3060 Ti was compared to the 3070 Ti.
And the 60 Ti falls into the abyss of a 128-bit bus.
 
Joined
Feb 18, 2013
Messages
2,178 (0.54/day)
Location
Deez Nutz, bozo!
System Name Rainbow Puke Machine :D
Processor Intel Core i5-11400 (MCE enabled, PL removed)
Motherboard ASUS STRIX B560-G GAMING WIFI mATX
Cooling Corsair H60i RGB PRO XT AIO + HD120 RGB (x3) + SP120 RGB PRO (x3) + Commander PRO
Memory Corsair Vengeance RGB RT 2 x 8GB 3200MHz DDR4 C16
Video Card(s) Zotac RTX2060 Twin Fan 6GB GDDR6 (Stock)
Storage Corsair MP600 PRO 1TB M.2 PCIe Gen4 x4 SSD
Display(s) LG 29WK600-W Ultrawide 1080p IPS Monitor (primary display)
Case Corsair iCUE 220T RGB Airflow (White) w/Lighting Node CORE + Lighting Node PRO RGB LED Strips (x4).
Audio Device(s) ASUS ROG Supreme FX S1220A w/ Savitech SV3H712 AMP + Sonic Studio 3 suite
Power Supply Corsair RM750x 80 Plus Gold Fully Modular
Mouse Corsair M65 RGB FPS Gaming (White)
Keyboard Corsair K60 PRO RGB Mechanical w/ Cherry VIOLA Switches
VR HMD Pico 4 128GB AIO VR Headset* (* = in consideration of getting one)
Software Windows 10 Professional x64 (Update 22H2)
Y'all do know the Ada Lovelace core has a larger L2 cache than Ampere, right? What if this 4060 Ti keeps up with or even beats a 3080 despite having the "inferior" AD106 silicon, a smaller bus, a lower CUDA count, and whatnot? If it does end up priced at $600 or more, then go find a known-good used 3080 (or a Ti variant) and be happy with it. No one is forcing you to buy or even consider this not-releasing-so-soon GPU.
 
Joined
Jul 15, 2020
Messages
976 (0.72/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.025mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F26 with "Instant 6 GHz" on
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 85%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
Die names are not just random worthless information, Nvidia uses the same nomenclature every time for a reason. If a certain card now uses a chip that's a lower tier according to the silicon naming scheme that means they've just upsold their customers a product that is inferior according to their own internal classification compared to the previous generation. In other words they're giving you less for more money.

If you think that doesn't matter you're being oblivious because it's those things that actually dictate price and performance.
Right, but what matters is price to performance, not the naming, which can change according to each generation's positioning and the circumstances of the time.

Many get stuck on names, endlessly comparing tiers from ten generations back instead of just going by price/perf at a given time.

Things change rapidly; adapt or be left behind. The never-ending rant about memory bus width, die size/name, and product name is futile as dust.

No one is disputing the fact that today you pay more and get less for that extra, but clinging to what was in the (not so distant) past, and not accepting the global economic changes and NV's hunger for more $$$, is somewhat naive.

The best thing you can do is delay your GPU purchase as much as you can (I advise 5 years, no less), and if you really need/want to buy, go for the best cost/perf on the day of purchase. All else is just confusing spec info that might deviate you toward a less-than-optimal purchase decision.
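The "buy on cost/perf on the day of purchase" rule above boils down to one comparison. A minimal sketch, where the card names, prices, and performance indices are made-up placeholders rather than real benchmark data:

```python
# Pick the best value card on purchase day: highest performance per
# currency unit wins, ignoring die names, bus widths, and tier labels.
candidates = [
    # (name, price, relative performance index) -- placeholder values
    ("Card A", 600, 1.00),
    ("Card B", 800, 1.20),
    ("Card C", 450, 0.80),
]

best = max(candidates, key=lambda c: c[2] / c[1])
print("Best value today:", best[0])
```

Here the cheapest card wins despite the lowest absolute performance, which is exactly the poster's point: the spec sheet tier means nothing; the ratio on the day you buy does.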
 
Joined
Jan 8, 2017
Messages
8,861 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Your argument was that xx60 Ti models should use 104 dies:

No it wasn't. I simply pointed out that there is going to be a downgrade, the silicon is going to be a tier lower.
 
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Yep. And they can't because...............
R
T

How are we liking that price of RT yet? Glorious innit


And faster VRAM, which is what held the 680 back. I owned a 770, was a pretty neat card, basically GK104 in optimal form. Experience was better than the SLI GTX 660's I had prior to it, and that was a free 'upgrade', so to speak :D Good times... You actually had choices. Now its 'okay, where does my budget land in the stack, and how much crap am I getting along with it that I really don't want to pay for but still do?'
Tbf, if someone doesn't care about RT, which I hear a lot, they shouldn't complain about the prices either. A 350-400 euro card kills everything with RT off at 1440p (let's say a 6700 XT or a 3060 Ti). That kind of performance used to be ultra high end a few years ago. I have a 3060 Ti in my GF's PC with a 1440p ultrawide monitor; she doesn't really care about RT, the framerate is great, and there is also the DLSS option as backup.

I really don't think we ever got more performance in the midrange than we do today. And since you mentioned the GTX 770: I got that card almost day one for 370 euros if I remember right, and on day one it was easily dropping below 60 and even 30 FPS in plenty of games. So sure, we might think the prices have gone up, but the fact of the matter is today's midrange gives you way more performance in modern games than the high end gave you 10 years ago.
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
It is disturbing watching the "just accept it and buy the GPU anyway, no matter how much Nvidia is ripping gamers off" mentality concerning Nvidia's GPUs. Don't you realize that once Nvidia gets away with this it becomes precedent, just like the GTX 680 almost 11 years ago? They have never retreated from their over-pricing schemes since then. It only continues to get worse, and now with Ada it is utterly ridiculous.

What's next with the RTX 50 series? Higher prices still, most likely.

Some people have asked me why I care. I care because I see PC gaming as a hobby, and I always have. I see PC gamers as fellow hobbyists, and I don't like seeing my fellow hobbyists being abused and ripped off by Nvidia. $550, if not more, for a lower-end Ada RTX 4060 Ti that is really an entry-level RTX 4050 Ti is ridiculous. Don't be silent and just take it, as a vocal minority here advises. Fight back.
 
Last edited:
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
What is natural? That we have 300+ Watts cards that didn't exist before Turing (besides the dual GPU cards)?
In normal times the 3070 Ti (not even that, because the TDP of 3070 Ti is 290W) would've been the top card (I mean that kind of performance, not the name) in the Ampere generation.
I think two things happened.
Firstly NV saw in 2016-17 the first bigger crypto mining craze (they started working on Ampere around that time) and realized people want more calculation powa and more cards.
Secondly since they were in contract with Samsung on a crappy node with high power consumption this new top card with 250W TDP would've been only maximum 10% faster than the previous flagship (2080 Super/Ti) which is a very bad generational uplift, worse than Pascal->Turing.
They were cocky by going with Samsung because AMD was nowhere to be found (in 2016 AMD had RX 480 and Fury X and that's it).
So they went allin not caring about TDP with Ampere so we got these 300W+ cards. They're much faster than Turing but they also eat much more than Turing. Or should I say they're faster only because they eat much more.
Look at the TPU database: the 3060 is barely faster than the 2060/Super or the 2070, the 3060 Ti barely faster than the 2070 Super/2080 and the same is true 1660TI vs 3050. These are same TDP tiers. At least they lowered the prices by moving down a tier these performance levels.
They got lucky with the second big crypto mining boom and the Ampere cards sold out easily despite the high TDPs.
Since people want these cards they gonna make them but I think without crypto mining the history would've completely different with an Ampere generation barely faster than Turing showing NV dominance and monopolistic behavior in the GPU space.
Isn't it crazy blaming everything on Nvidia when in fact it's AMD that messed up? Here you go: Turing came after Vega, and look at that, AMD topping the power draw graphs.
But let's blame Nvidia regardless.


It is disturbing watching the "just accept it and buy the GPU anyway, no matter how much Nvidia is ripping gamers off" mentality concerning Nvidia's GPUs. Don't you realize that once Nvidia gets away with this it becomes precedent, just like the GTX 680 almost 11 years ago? They have never retreated from their over-pricing schemes since then. It only continues to get worse, and now with Ada it is utterly ridiculous.

What's next with the RTX 50 series? Higher prices still, most likely.

Some people have asked me why I care. I care because I see PC gaming as a hobby, and I always have. I see PC gamers as fellow hobbyists, and I don't like seeing my fellow hobbyists being abused and ripped off by Nvidia. $550, if not more, for a lower-end Ada RTX 4060 Ti that is really an entry-level RTX 4050 Ti is ridiculous. Don't be silent and just take it, as a vocal minority here advises. Fight back.
But Nvidia's GPUs are CHEAPER than AMD's, what are you talking about?? What Nvidia pricing schemes? Don't you see that AMD is the one fleecing us? Like, seriously, what the heck is wrong with people and their Nvidia hating?

The XT is priced 14% higher across the EU while being a worse product than the 4070 Ti. Lol.
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
But Nvidia's GPUs are CHEAPER than AMD's, what are you talking about?? What Nvidia pricing schemes? Don't you see that AMD is the one fleecing us? Like, seriously, what the heck is wrong with people and their Nvidia hating?

If I'm an Nvidia hater then I'm doing it wrong. I have been using exclusively Nvidia GPUs in all of my builds, starting with an 8800 GT over 15 years ago. Speaking out against shitty practices by a company isn't hating that company. It's my right as a paying customer.
 
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
If I'm an Nvidia hater then I'm doing it wrong. I have been using exclusively Nvidia GPUs in all of my builds, starting with an 8800 GT over 15 years ago. Speaking out against shitty practices by a company isn't hating that company. It's my right as a paying customer.
But you are speaking about the wrong company. Nvidia has the cheapest and best-value GPU out of all the new cards. That's just a fact: the 4070 Ti offers both better raster per dollar and way better RT per dollar, while consuming less power, having more features, and costing less than the competition. Why the heck would you expect Nvidia to lower its price? It already is the best-value GPU; what you are saying doesn't make much sense to me.
 
Joined
Nov 27, 2022
Messages
38 (0.08/day)
Isn't it crazy blaming everything on Nvidia when in fact it's AMD that messed up? Here you go: Turing came after Vega, and look at that, AMD topping the power draw graphs.
But let's blame Nvidia regardless.

I didn't blame them for anything except making very power-hungry cards out of necessity (because of the Samsung node) and partly market demand (sprinkled with some deception).
Vega was below 300 W, while the Ampere generation has five cards above 300 W. AMD made power-hungry cards earlier (HD 7970 GHz, R9 290/X, R9 390/X, R9 Fury/X, all below 300 W), but they had to in order to stay somewhat competitive. In the RDNA2 lineup only the 6950 XT (an afterthought card) consumes more than 300 W.
NV is the market leader, so they should set an example.
 
Joined
Sep 17, 2014
Messages
20,777 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Tbf, if someone doesn't care about RT, which I hear a lot, they shouldn't complain about the prices either. A 350-400 euro card kills everything with RT off at 1440p (let's say a 6700 XT or a 3060 Ti). That kind of performance used to be ultra high end a few years ago. I have a 3060 Ti in my GF's PC with a 1440p ultrawide monitor; she doesn't really care about RT, the framerate is great, and there is also the DLSS option as backup.

I really don't think we ever got more performance in the midrange than we do today. And since you mentioned the GTX 770: I got that card almost day one for 370 euros if I remember right, and on day one it was easily dropping below 60 and even 30 FPS in plenty of games. So sure, we might think the prices have gone up, but the fact of the matter is today's midrange gives you way more performance in modern games than the high end gave you 10 years ago.
Part of this is true... but then you have the beautiful situation where that 3060 Ti is endowed with a measly 8 GB, and you still find yourself cutting back on textures and whatnot to not stutter through your content. Like you say, there is ample core power in current and last-gen cards. Turing much the same, really.

You could indeed also get an RDNA2 midrange card to have the VRAM. But the 3060 Ti is a horrible example of midrange progress; it's a card that effectively regressed in how long it'll last in the midrange compared to, say, a Pascal or Turing alternative with similar VRAM but a much less powerful core. So even in the midrange, on Nvidia, you do have adverse effects besides price, which result in a reduced usable lifespan before you're cutting down settings that noticeably hit IQ.

The bottom line remains: on Nvidia, you are paying the RT price on the entire stack. The last cards avoiding that issue are the 16 series - not exactly midrange by today's standards.

IMHO the midrange has definitely not improved in any way, shape, or form past the 1060 6GB. It has regressed. Perf/$ gen to gen barely moves forward or is in fact worse; VRAM relative to core power got far worse, and cache doesn't alleviate that in most cases; and absolute pricing also just went up.

I'm honestly trying to find a nice purchase now in the midrange of the last several generations. It's just not there. Every single Nvidia card is handicapped in some way or another at its given price point and stack position. Coming from a 1080, it seems the only cards I can look at are the 6800 (XT) or the 3080 10GB, the latter being suboptimal just the same - and a whopping 320 W TDP. And then I'm still looking not at 450-500 (a reasonable range IMHO) but 600+ EUR.

Seriously, man, I need a full bucket of cognitive dissonance to see improvement here over the last two generations...

I also dare your GF to run Darktide @ 1440p on a 3060 Ti. It's gonna be a fun sub-60 FPS experience... You're going to have to use FSR/DLSS as well to get decent frames. Please stop selling nonsense. And this isn't anything special either; it's just a UE4 game.

(attachments: two Darktide 1440p benchmark charts)
 
Last edited:
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Part of this is true... but then you have the beautiful situation where that 3060 Ti is endowed with a measly 8 GB, and you still find yourself cutting back on textures and whatnot to not stutter through your content. Like you say, there is ample core power in current and last-gen cards. Turing much the same, really.

You could indeed also get an RDNA2 midrange card to have the VRAM. But the 3060 Ti is a horrible example of midrange progress; it's a card that effectively regressed in how long it'll last in the midrange compared to, say, a Pascal or Turing alternative with similar VRAM but a much less powerful core. So even in the midrange, on Nvidia, you do have adverse effects besides price, which result in a reduced usable lifespan before you're cutting down settings that noticeably hit IQ.

The bottom line remains: on Nvidia, you are paying the RT price on the entire stack. The last cards avoiding that issue are the 16 series - not exactly midrange by today's standards.

IMHO the midrange has definitely not improved in any way, shape, or form past the 1060 6GB. It has regressed. Perf/$ gen to gen barely moves forward or is in fact worse; VRAM relative to core power got far worse, and cache doesn't alleviate that in most cases; and absolute pricing also just went up.

I'm honestly trying to find a nice purchase now in the midrange of the last several generations. It's just not there. Every single Nvidia card is handicapped in some way or another at its given price point and stack position. Coming from a 1080, it seems the only cards I can look at are the 6800 (XT) or the 3080 10GB, the latter being suboptimal just the same - and a whopping 320 W TDP. And then I'm still looking not at 450-500 (a reasonable range IMHO) but 600+ EUR.

Seriously, man, I need a full bucket of cognitive dissonance to see improvement here over the last two generations...

I also dare your GF to run Darktide @ 1440p on a 3060 Ti. It's gonna be a fun sub-60 FPS experience... You're going to have to use FSR/DLSS as well to get decent frames. Please stop selling nonsense. And this isn't anything special either; it's just a UE4 game.

View attachment 280041

View attachment 280040
Is Darktide really worth it in terms of visuals? I don't know the game. Because sure, I can easily find plenty of games that won't cut it, but usually those games look awful on top of the horrible performance. Case in point: Ark Survival was struggling on a 3090 (lol), and it looked AWFUL. Like one of the worst games I've ever laid my eyes upon.

Watch Dogs 2 was another one. Although it didn't look bad, setting shadows to the PCSS option means your framerate will drop to the low 30s. It's a game that came out alongside your 1080; try it, you'll probably struggle to go above 20 FPS at 1080p maxed out.

Some games have stupid max settings (GTA V had 8x MSAA, lol). You can't blame the cards if they can't handle that.

But anyway, DLSS exists for a reason, and it's absolutely amazing. I know the usual crowd is going to boo, but in fact DLSS is as good as, sometimes even better than, native (CoD Black Ops, for example, looks way better with DLSS Q than native). So I don't see the problem with activating it. In fact I have it permanently on, even on my 4090, because there is no reason not to.
 
Joined
Sep 17, 2014
Messages
20,777 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Is Darktide really worth it in terms of visuals? I don't know the game. Sure, I can easily find plenty of games that won't cut it, but usually those games look awful on top of the horrible performance. Case in point: Ark Survival was struggling on a 3090 (lol), and it looked AWFUL. Like one of the worst games I've ever laid my eyes upon.

Watch Dogs 2 was another one. Although it didn't look bad, setting shadows to the PCSS option means your framerate will drop to the low 30s. It's a game that came out alongside your 1080; try it, you'll probably struggle to go above 20 FPS at 1080p maxed out.

Some games have stupid max settings (GTA V had 8x MSAA, lol). You can't blame the cards if they can't handle that.

But anyway, DLSS exists for a reason, and it's absolutely amazing. I know the usual crowd is going to boo, but in fact DLSS is as good as, sometimes even better than, native (CoD Black Ops, for example, looks way better with DLSS Quality than native). So I don't see the problem with activating it. In fact, I have it permanently on even on my 4090, because there is no reason not to.
Darktide doesn't look bad. I'm running it on a 1080 @ 3440x1440, and that's sub-60 FPS with lows at 40, settings reduced left and right, plus FSR at native res. If you look at those charts, you see the cards have plateaued from the 2080 Ti onwards; look at the massive jump between it and the 980 Ti. The 1080 Ti slots somewhere in the middle between them. After that? 10-15% at best per gen. At 1080p...

It's easy to spot this trend here.
 
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Darktide doesn't look bad. I'm running it on a 1080 @ 3440x1440, and that's sub-60 FPS with lows at 40, settings reduced left and right, plus FSR at native res. If you look at those charts, you see the cards have plateaued from the 2080 Ti onwards; look at the massive jump between it and the 980 Ti. The 1080 Ti slots somewhere in the middle between them. After that? 10-15% at best per gen. At 1080p...

It's easy to spot this trend here.
Then there is a problem with the game, because a 4090 is nowhere near only 22% faster than the 3080.
 
Joined
Sep 17, 2014
Messages
20,777 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Then there is a problem with the game, because a 4090 is nowhere near only 22% faster than the 3080.
There are lots of examples like it... If the better half of what you're playing 'is a problem with the game', that is one strange situation we're in. In a practical sense, you're then still not getting your money's worth in that midrange, which was the point. Perhaps that new box of hardware tricks doesn't really prove its worth in all the content ;)

You can keep moving the goalposts to deny an obvious trend; you do you. Let's see where a 160W 4060 Ti goes with 228GB/s... at a supposed 350-400 bucks like you say.
 
Joined
Jun 14, 2020
Messages
2,678 (1.93/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
There are lots of examples like it... If the better half of what you're playing 'is a problem with the game', that is one strange situation we're in. In a practical sense, you're then still not getting your money's worth in that midrange, which was the point. Perhaps that new box of hardware tricks doesn't really prove its worth in all the content ;)

You can keep moving the goalposts to deny an obvious trend; you do you. Let's see where a 160W 4060 Ti goes with 228GB/s...
I don't understand what goalposts I'm moving. I said that a 400-euro card gives you insane 1440p performance, with DLSS or FSR to boot. That's already a high-end resolution for me. If you want to go 4K maxed out with RT, you have to pay, but that's luxury.
 