
NVIDIA GeForce RTX 4090/4080 to Feature up to 24 GB of GDDR6X Memory and 600 Watt Board Power

Joined
Sep 17, 2014
Messages
20,782 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Just stick to a TDP figure you are comfortable with and let others enjoy their 500 W GPUs :D (not me, I would just undervolt it down to ~300 W). I'm sure the AD104/106/107 chips will be ultra efficient and more suitable for your gaming needs.
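(As an aside, a minimal sketch of scripting the ~300 W cap mentioned in the quote above. Note this sets a hard power limit via nvidia-smi rather than a true undervolt, which would be done with a voltage/frequency curve editor such as MSI Afterburner; the 300 W target is just the figure from the quote.)

# Sketch: cap the board power limit with nvidia-smi (requires admin/root).
# A power cap is not a real undervolt, but it bounds the card's draw similarly.
import subprocess

def set_power_limit(watts: int) -> None:
    # "-pl" is nvidia-smi's shorthand for --power-limit, in watts.
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

set_power_limit(300)   # the ~300 W figure quoted above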

Ada is just a die shrink of Ampere, so RT/Tensor performance should scale linearly with raster compared to Ampere.

There are AI supercomputers doing all the planet-saving stuff, so in the end capitalism saves itself from the problem it created :D. This might be a bit surprising to you, but capitalist countries have healthier air quality than non-capitalist countries.

Yes, because we export our industry to cheap-labor countries. China...

You are missing quite a bit of context to justify the cognitive dissonance. Another glaring one is the idea that processing power somehow helps the climate. All it really serves is understanding it, while slowly ruining it.

Also, the lower SKUs aren't all that efficient; the 104 also got its share of TDP bumps. It's a strange reading of the facts you have here, buddy...

You make a great point when you say Ada is just shrunk Ampere; that does explain quite well why we get what we seem to be getting. Their tech isn't advancing a whole lot anymore; we are in a 'moar corezzz' situation here, like I alluded to earlier as well. We'll see where that goes... it's going to be interesting what AMD will do with RDNA, but they then have the opportunity to gain feature parity with NV.
 
Joined
Nov 11, 2016
Messages
3,045 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Yes, because we export our industry to cheap-labor countries. China...

You are missing quite a bit of context to justify the cognitive dissonance. Another glaring one is the idea that processing power somehow helps the climate. All it really serves is understanding it, while slowly ruining it.

Also, the lower SKUs aren't all that efficient; the 104 also got its share of TDP bumps. It's a strange reading of the facts you have here, buddy...

Doesn't your 180 W GP104 get whooped by a 75 W GA106 A2000?
 
Joined
Sep 17, 2014
Messages
20,782 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Doesn't your 180 W GP104 get whooped by a 75 W GA106 A2000?

A2000?

GA106 is a 170 W SKU, not 75 ;)
So really, we moved the TDP of the 104 to a lower point in the stack. Again, you need to twist reality to get to favorable outcomes ;) you really just provided your own counter-argument.

Last I checked, we were talking about the gaming stack...
 
Joined
Nov 11, 2016
Messages
3,045 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
A2000?

GA106 is a 170 W SKU, not 75 ;)
So really, we moved the TDP of the 104 to a lower point in the stack. Again, you need to twist reality to get to favorable outcomes ;) you really just provided your own counter-argument.

Last I checked, we were talking about the gaming stack...

Oh, so you can't buy an A2000 to play games? That's just you cutting yourself off from options that are not obvious to you, whether through lack of intellect or through ignorance.
I didn't have to twist anything; I just provided information that you clearly lack.
 
Joined
Jun 21, 2013
Messages
535 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
The latest rumor says the 4090 can be 2-2.5x faster than the 3090 at 4K; that would be insane for a one-generation leap.
Must be comparing a native-res 3090 vs a 4090 running with DLSS in Performance mode.
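For anyone wondering why that comparison flatters the new card so much: DLSS Performance mode renders internally at half the output resolution per axis, so here is a quick pixel-count sketch:

# 4K native vs. the internal render resolution of DLSS Performance mode,
# which upscales from half the output resolution on each axis.
out_w, out_h = 3840, 2160
in_w, in_h = out_w // 2, out_h // 2            # 1920 x 1080 internally

native = out_w * out_h
internal = in_w * in_h
print(f"Native 4K: {native:,} px")                                  # 8,294,400
print(f"DLSS Performance: {internal:,} px, {native // internal}x fewer")   # 2,073,600 px, 4x fewer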
 
Joined
Nov 11, 2016
Messages
3,045 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Must be comparing a native-res 3090 vs a 4090 running with DLSS in Performance mode.

Have you tried the new DLSS v2.3.9 (download from TPU)? It looks amazing, much better than any other 2.3.x version (sharper, with better anti-ghosting).
 
Joined
Oct 10, 2018
Messages
943 (0.47/day)
Nice to see new tech releasing soon, but I just cannot justify these consumption rates. So the 3060 Ti successor is probably around 300 W? Makes no sense...
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.96/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Oh, so you can't buy an A2000 to play games?
Uhhh, the A2000 is a workstation GPU. Anybody buying that with the primary intent to play games needs to have their head examined.
 
Joined
Nov 11, 2016
Messages
3,045 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Uhhh, the A2000 is a workstation GPU. Anybody buying that with the primary intent to play games needs to have their head examined.

You can read in the A2000 review threads that there are TPU members who want to use the A2000 as a mini-ITX gaming GPU.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.96/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
You can read in the A2000 review threads that there are TPU members who want to use the A2000 as a mini-ITX gaming GPU.
They too should have their heads examined, because you're paying a premium for that workstation GPU. :laugh: I mean, if they want to blow their money on something like that, it's their decision; however, it's the wrong tool for the job, and that's one hell of a price premium just to get a particular form factor.
 
Joined
Feb 18, 2005
Messages
5,239 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
CHILDREN.

Do none of you understand how any of this works?

The reason that CPUs and GPUs are drawing more and more energy for apparently smaller performance gains is a simple function of physics.

Until even a few years ago, a node shrink generally meant that feature sizes would halve (180 nm to 90 nm to 45 nm / 28 nm to 14 nm to 7 nm), which implied that the same chip would occupy only about a quarter of the physical area after a shrink, roughly quadrupling transistor density. That gave you a fuckton of room to add more transistors for more functionality and performance.

But that's no longer possible, because node sizes aren't halving anymore; they're dropping by a nanometer or two at a time, because we're reaching the literal physical limits of what silicon can do (the end of Moore's law). So extra die area isn't freed up as easily, hence die sizes have to increase.
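To put numbers on that area scaling, a quick back-of-the-envelope sketch (idealised geometric scaling only; modern node names no longer map cleanly onto real dimensions, so treat the last pair as purely illustrative):

# Idealised node-shrink arithmetic: halving the linear feature size
# quarters the die area of the same design, i.e. roughly 4x the density.
def shrink(old_nm: float, new_nm: float) -> None:
    linear = new_nm / old_nm          # linear scale factor
    area = linear ** 2                # area scales with the square of it
    print(f"{old_nm} nm -> {new_nm} nm: area x{area:.2f}, density x{1 / area:.1f}")

for old, new in [(180, 90), (28, 14), (7, 5)]:   # last pair: a modern-style small step
    shrink(old, new)
# The full halvings give area x0.25 (density x4); 7 nm -> 5 nm gives only ~x0.51,
# i.e. barely 2x density even on paper, and real-world gains are smaller still.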

Then there's the unavoidable consequence of smaller and smaller nodes: less physical space between the individual transistors. That leads to electromigration and, crucially, current leakage, which means that not only do you need more energy input to overcome that leakage, you also need to dedicate more of your die to current-monitoring and current-controlling circuitry. Again, that means you have less die area for transistors that increase functionality and performance.

And because consumers don't care about any of the above and demand more functionality and performance generation-on-generation, the problem is going to continue to get worse until the semiconductor industry transitions to a successor to silicon.

Ada is just a die shrink of Ampere
No it's not.
 
Joined
Sep 17, 2014
Messages
20,782 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
RDNA 1 and 2 have been power-efficient; RDNA 2 is more efficient than Nvidia's 3000 series, so what are you referring to, exactly?


Why do so many people have it out for MLID on here? His leaks have been more correct than not.

He has a history, and it's a total nobody. Take a look at 'Forum Cop' ;) it's a no-lifing troll, nothing more, like a vast portion of the currently popular internet that makes a living out of fishing for subs and clicks.

Plus, anyone with half a brain can put historical specs side by side and take a wild guess at future core counts. It's not interesting, and we shouldn't care so much about it. 'First!' Well, yay, want a cookie? None of the info really matters until stuff is on shelves and seriously reviewed anyway.

CHILDREN.

Do none of you understand how any of this works?

The reason that CPUs and GPUs are drawing more and more energy for apparently smaller performance gains is a simple function of physics.

Until even a few years ago, a node shrink generally meant that feature sizes would halve (180 nm to 90 nm to 45 nm / 28 nm to 14 nm to 7 nm), which implied that the same chip would occupy only about a quarter of the physical area after a shrink, roughly quadrupling transistor density. That gave you a fuckton of room to add more transistors for more functionality and performance.

But that's no longer possible, because node sizes aren't halving anymore; they're dropping by a nanometer or two at a time, because we're reaching the literal physical limits of what silicon can do (the end of Moore's law). So extra die area isn't freed up as easily, hence die sizes have to increase.

Then there's the unavoidable consequence of smaller and smaller nodes: less physical space between the individual transistors. That leads to electromigration and, crucially, current leakage, which means that not only do you need more energy input to overcome that leakage, you also need to dedicate more of your die to current-monitoring and current-controlling circuitry. Again, that means you have less die area for transistors that increase functionality and performance.

And because consumers don't care about any of the above and demand more functionality and performance generation-on-generation, the problem is going to continue to get worse until the semiconductor industry transitions to a successor to silicon.


No it's not.

There are many ways to seek efficiency, and the chips we are getting now are a mix of bad things. GPUs are brute-forcing expensive ray tracing because they eat anything without RT alive... so what we have there is just a saturated market searching for new money-makers, in a day and age of diminishing returns and a major climate problem... plus a market where chips are scarce.

Fundamentally, that is a display of human greed at the expense of a lot of sense that unfortunately is not common.

Adults look at the horizon and see a problem; I'd say it's in fact the children who are still yearning for good old progress like we always had it. 'Must have new toys' for... what, exactly? Progress in the eternal shape of more, more, more simply cannot keep lasting.

You can read in the A2000 review threads that there are TPU members who want to use the A2000 as a mini-ITX gaming GPU.

You must be joking. Power to them, I guess? The fact remains it's a workstation GPU, it's not even a full 106, and it's part of a stack that isn't the subject of this topic. I'm not even considering price, lol...

Twisting reality to make your point. The 106 for gaming is a 170 W GPU that you falsely present as a 70 W one, to say it beats a GP104 1080 at 180 W. Glad we have that sorted out. The reality is that the 106 now uses the same power as a 2016 104, and the current 104 is in the old 102 TDP range.
 
Joined
Jan 31, 2010
Messages
5,363 (1.04/day)
Location
Gougeland (NZ)
System Name Cumquat 2021
Processor AMD RyZen R7 7800X3D
Motherboard Asus Strix X670E - E Gaming WIFI
Cooling Deep Cool LT720 + CM MasterGel Pro TP + Lian Li Uni Fan V2
Memory 32GB GSkill Trident Z5 Neo 6000
Video Card(s) Sapphire Nitro+ OC RX6800 16GB DDR6 2270Cclk / 2010Mclk
Storage 1x Adata SX8200PRO NVMe 1TB gen3 x4 1X Samsung 980 Pro NVMe Gen 4 x4 1TB, 12TB of HDD Storage
Display(s) AOC 24G2 IPS 144Hz FreeSync Premium 1920x1080p
Case Lian Li O11D XL ROG edition
Audio Device(s) RX6800 via HDMI + Pioneer VSX-531 amp Technics 100W 5.1 Speaker set
Power Supply EVGA 1000W G5 Gold
Mouse Logitech G502 Proteus Core Wired
Keyboard Logitech G915 Wireless
Software Windows 11 X64 PRO (build 23H2)
Benchmark Scores it sucks even more less now ;)
I'm curious as to how much of this will hold true and, if it does, what kind of heat it generates, how it's properly cooled, and finally what kind of performance it gives... oh, and also what kind of outlandish price tag they slap on it.

It'll use a 360 mm rad water cooler and probably cost around $5K USD.
 
Joined
Dec 14, 2011
Messages
941 (0.21/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Corsair iCUE H115i Elite Capellix 280mm
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS RTX 3070 Ti TUF Gaming OC Edition
Storage Sabrent Rocket 1TB M.2
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 MK.2 Low-Profile Rapidfire
Software Microsoft Windows 11 Pro (64-bit)
Have you tried the new DLSS v2.3.9 (download from TPU)? It looks amazing, much better than any other 2.3.x version (sharper, with better anti-ghosting).

I am still waiting for some nice guru to make a utility for us so we can easily swap between DLSS versions in the games that support them. :)
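In the meantime the manual swap is easy to script; here is a minimal sketch, assuming the usual layout where the game keeps nvngx_dlss.dll next to its executable (the paths below are hypothetical examples, not real installs):

# Minimal DLSS DLL swapper sketch: back up the game's nvngx_dlss.dll once,
# then copy in the version you downloaded (e.g. from TPU's DLSS database).
import shutil
from pathlib import Path

def swap_dlss(game_dir: str, new_dll: str) -> None:
    target = Path(game_dir) / "nvngx_dlss.dll"
    backup = target.with_name("nvngx_dlss.dll.bak")
    if not backup.exists():                    # keep the original the first time
        shutil.copy2(target, backup)
    shutil.copy2(new_dll, target)

# Hypothetical example paths; point these at your own game folder and download.
swap_dlss(r"C:\Games\SomeGame", r"C:\Downloads\nvngx_dlss_2.3.9.dll")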
 
Joined
Dec 26, 2006
Messages
3,471 (0.55/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
Does America have the same energy crisis as the UK? A guy who was mining stopped when he got a £500/month electric bill (~$660).

If I ran the GPU in the UK without an undervolt and with no FPS cap, I think I would pay more in electricity than the card cost within the first year. :)
I don't live in the USA, but from the people I know who do live there, it varies state to state, just like Canada varies province to province. I wouldn't say there are any shortages or a crisis, but electricity isn't free, lol. In Ontario, after 'shipping', fees, taxes, etc., I would guess we are around 20-25 c/kWh on average.
Base electricity is tiered by time of use.

Most people heat with natural gas, since electricity would be 6x-10x the cost, while Manitoba typically heats with electricity because it's cheaper than natural gas there.
 
Joined
Oct 13, 2021
Messages
45 (0.05/day)
(24) 1 GB chips... part of the power consumption figure.

Twenty-four

What we need is more cooking videos. I'm all-in on this one.

Seriously though, when are they going to stop that nonsense? I surmise a lot of down-the-road card failures will be memory-related.
 
Joined
Jan 27, 2015
Messages
1,065 (0.32/day)
System Name loon v4.0
Processor i7-11700K
Motherboard asus Z590TUF+wifi
Cooling Custom Loop
Memory ballistix 3600 cl16
Video Card(s) eVga 3060 xc
Storage WD sn570 1tb(nvme) SanDisk ultra 2tb(sata)
Display(s) cheap 1080&4K 60hz
Case Roswell Stryker
Power Supply eVGA supernova 750 G6
Mouse eats cheese
Keyboard warrior!
Benchmark Scores https://www.3dmark.com/spy/21765182 https://www.3dmark.com/pr/1114767
I don't live in the USA, but from the people I know who do live there, it varies state to state, just like Canada varies province to province. I wouldn't say there are any shortages or a crisis, but electricity isn't free, lol. In Ontario, after 'shipping', fees, taxes, etc., I would guess we are around 20-25 c/kWh on average.
Base electricity is tiered by time of use.

Most people heat with natural gas, since electricity would be 6x-10x the cost, while Manitoba typically heats with electricity because it's cheaper than natural gas there.
It can and does change from city/county to city/county, depending on how deregulated the state is (deregulation that started in 1999 to inhibit the natural monopolies of public utilities).

Seeing 25¢ for a kilowatt-hour up north... wow. It's under 8¢ on this side of the lake.


Granted, I still pay Toledo Edison ~$60 a month for distribution/equipment fees.
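For a sense of scale on those two rates, here's a rough sketch of what a high-power card adds to a monthly bill (the 600 W draw and 4 hours/day of gaming are placeholder assumptions, not measured figures):

# Rough monthly electricity cost for a GPU at a given board power and usage.
# Board power (600 W) and hours per day (4) are illustrative assumptions only.
def monthly_cost(watts: float, hours_per_day: float, rate_per_kwh: float) -> float:
    kwh = watts / 1000 * hours_per_day * 30
    return kwh * rate_per_kwh

for label, rate in [("Ontario ~25 c/kWh", 0.25), ("Ohio ~8 c/kWh", 0.08)]:
    print(f"{label}: ${monthly_cost(600, 4, rate):.2f}/month")
# About $18/month at 25 c/kWh vs ~$5.76/month at 8 c/kWh for casual gaming hours;
# a multi-GPU rig mining 24/7 at UK prices is how you end up with a £500 bill.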
 
Joined
May 30, 2018
Messages
1,890 (0.89/day)
Location
Cusp Of Mania, FL
Processor Ryzen 9 3900X
Motherboard Asus ROG Strix X370-F
Cooling Dark Rock 4, 3x Corsair ML140 front intake, 1x rear exhaust
Memory 2x8GB TridentZ RGB [3600Mhz CL16]
Video Card(s) EVGA 3060ti FTW3 Ultra Gaming
Storage 970 EVO 500GB nvme, 860 EVO 250GB SATA, Seagate Barracuda 1TB + 4TB HDDs
Display(s) 27" MSI G27C4 FHD 165hz
Case NZXT H710
Audio Device(s) Modi Multibit, Vali 2, Shortest Way 51+ - LSR 305's, Focal Clear, HD6xx, HE5xx, LCD-2 Classic
Power Supply Corsair RM650x v2
Mouse iunno whatever cheap crap logitech *clutches Xbox 360 controller security blanket*
Keyboard HyperX Alloy Pro
Software Windows 10 Pro
Benchmark Scores ask your mother
CHILDREN.

Do none of you understand how any of this works?

The reason that CPUs and GPUs are drawing more and more energy for apparently smaller performance gains is a simple function of physics.

Until even a few years ago, a node shrink generally meant that feature sizes would halve (180 nm to 90 nm to 45 nm / 28 nm to 14 nm to 7 nm), which implied that the same chip would occupy only about a quarter of the physical area after a shrink, roughly quadrupling transistor density. That gave you a fuckton of room to add more transistors for more functionality and performance.

But that's no longer possible, because node sizes aren't halving anymore; they're dropping by a nanometer or two at a time, because we're reaching the literal physical limits of what silicon can do (the end of Moore's law). So extra die area isn't freed up as easily, hence die sizes have to increase.

Then there's the unavoidable consequence of smaller and smaller nodes: less physical space between the individual transistors. That leads to electromigration and, crucially, current leakage, which means that not only do you need more energy input to overcome that leakage, you also need to dedicate more of your die to current-monitoring and current-controlling circuitry. Again, that means you have less die area for transistors that increase functionality and performance.

And because consumers don't care about any of the above and demand more functionality and performance generation-on-generation, the problem is going to continue to get worse until the semiconductor industry transitions to a successor to silicon.


No it's not.
Hmmm... I'm with you on a lot of this. I've been quietly wondering where the train is going for years. Short of major breakthroughs in our understanding of applied physics and materials, the curve can only continue to bend tighter.

Maybe we shouldn't be buying GPUs every year or two to begin with, though. From a consumer standpoint, I personally don't mind if there isn't something 'latest and greatest' constantly. Can't buy it all anyway. But I do wonder what happens to all of these industries dependent on excruciatingly fast product cycles, with just slightly better stuff in them each time. It's always been kind of nuts to me how many microchips we churn out just to toss in a few years. I never saw how releasing products this way could be sustainable. It's a lot of waste, and people spend a lot of their finite financial resources on it. It doesn't seem necessary to reap the benefits of technology. To me, that's more marketing that makes people feel like they always need more, and on the other end are games promising more and more, utilizing the extra headroom. Pretty much every year, there will be some new upgrade to tempt your wallet.

That's the thing... the pace of actual performance advancements slowed, right? But the products didn't. They came out at the same rate as always, with smaller gains, and prices that really only got higher year after year, like most other things. The more logical and (I think) fairer way is to hold a higher standard for what's considered a real gain and develop products for as long as that takes. Less waste, better products, less oversaturation in the markets and product stacks... possibly better delineation between them too. The thing is, it's impossible under the current economic model. No tech business could survive as it does today if it waited until it had serious gains for each new product. Nobody is footing the bill for the added operational costs. We are talking about entire industries that are almost predicated on breakneck advancement... that can plan for a decade's worth of new shit, a whole roadmap of relatively small incremental improvements. What happens when we can no longer do that with the knowledge and tools that we have? What does stagnating development in silicon transistor tech look like when it reaches its final stage? People have their pet materials, but those are more challenging to work with than silicon. Those challenges could reap huge dividends in the end, but they could also just ensure that it's all ridiculously expensive for a long time. Just getting one of them to replace silicon will be expensive for everyone.

Something kind of has to give there though. Somewhere, somehow.

I kind of wonder how many things about how we operate will change in my lifetime. I don't really see technology, or especially stuff in the realm of consumer technology, ever being like it was before.
 
Joined
Feb 18, 2005
Messages
5,239 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
You must be joking. Power to them, I guess? The fact remains it's a workstation GPU, it's not even a full 106, and it's part of a stack that isn't the subject of this topic. I'm not even considering price, lol...
It fills an extremely specific niche: people who want to build really small mini-ITX systems. We're talking so mini that they don't even have an extra 150 W available from their PSU for a PCIe 8-pin connector, which is entirely possible with picoPSUs.
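For context, here is the slot-plus-connector arithmetic behind that budget, using the nominal PCIe limits (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin):

# Nominal PCIe board power budget: 75 W from the x16 slot, plus 75 W per
# 6-pin and 150 W per 8-pin auxiliary connector.
def power_budget(six_pin: int = 0, eight_pin: int = 0) -> int:
    return 75 + 75 * six_pin + 150 * eight_pin

print(power_budget())                  # 75 W: slot-only cards like the A2000
print(power_budget(eight_pin=1))       # 225 W: a typical midrange card
print(power_budget(eight_pin=4))       # 675 W: the rumored 600 W class boards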

I certainly understand the desire to have a decently powerful GPU that doesn't require an external power connector, and I also understand why so many (including myself) are fascinated with the A2000. It demonstrates that the efficiency Maxwell brought hasn't gone anywhere; it's just that, because everything has become about performance and beating the competition (and I blame consumers for this as much as anyone), the manufacturers are clocking their silicon to the limit to try to get any win. And of course that's a zero-sum game, because as soon as one manufacturer does it, all of them will.
 
Joined
May 3, 2018
Messages
2,235 (1.04/day)
The latest rumor says the 4090 can be 2-2.5x faster than the 3090 at 4K; that would be insane for a one-generation leap.
This is why the price will be much higher, but you could get a 4070 that will still beat a 3090 at 4K in pure rasterisation and easily beat it for RT. It's said the 4060 Ti will match the 3090 for RT.

People will need to drop down a tier this gen if they don't want sticker shock, but you'll still get much stronger performance.
 
Joined
Mar 21, 2016
Messages
2,195 (0.75/day)
You won't be able to get them either way. Dropping down a tier to offset the extreme price shock and power consumption rise is pretty solid advice, though. If they just introduced a replacement for the RTX 3050, that would be doing the world a huge favor. The re-released RTX 2060 is a joke with its bus width for 12 GB of VRAM; the core will sh*t a brick before it ever makes use of that much VRAM. You might as well call it a GTX 970, because the core isn't going to cope well with 12 GB of VRAM usage.

In summary, it's more expensive to buy if you aren't using it for mining, like Nvidia intends it to be. It's basically within the margin of error of the 6 GB version at or below 1440p, within +/- 5%, which is pathetic in context. Also, there's no SLI, so there is no salvaging how bad it is for gamers; plus they clamped down on BIOS modding as well, like it's not at all aimed at gaming except for uninformed gamers. The GTX 960 4GB had more upside: you could SLI it and could BIOS-mod the VRAM to run faster. This card is just a joke outright.
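The bus-width complaint comes down to simple bandwidth arithmetic; here is a quick sketch using the 2060 12 GB's published figures (192-bit bus, 14 Gbps GDDR6), with the 3090 for contrast:

# Memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(bandwidth_gbs(192, 14.0))   # RTX 2060 12 GB: 336 GB/s
print(bandwidth_gbs(384, 19.5))   # RTX 3090 24 GB: 936 GB/s
# Doubling the VRAM to 12 GB changed neither the bus nor the core, which is the point above.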
 
Joined
Dec 26, 2012
Messages
1,039 (0.25/day)
Location
Babylon 5
System Name DaBeast! DaBeast2!
Processor AMD AM4 Ryzen 9 5900X 12C24T/AMD AM4 RYZEN 9 3900X 12C/24T
Motherboard Gigabyte X570 Aorus Xtreme/Gigabyte X570S Aorus Elite AX
Cooling Thermaltake Water 3.0 360/Thermalright PA 120 SE
Memory 2x 16GB Corsair Vengeance RGB RT DDR4 3600C16/2x 16GB Patriot Elite II DDR4 4000MHz
Video Card(s) XFX MERC 310 RX 7900 XTX 24GB/Sapphire Nitro+ RX 6900 XT 16GB
Storage 500GB Crucial P3 Plus NVMe PCIe 4x4 + 4TB Lexar NM790 NVMe PCIe 4x4 + TG Cardea Zero Z NVMe PCIe 4x4
Display(s) Samsung LC49HG90DMEX 32:9 144Hz Freesync 2/Acer XR341CK 75Hz 21:9 Freesync
Case CoolerMaster H500M/SOLDAM XR-1
Audio Device(s) iFi Micro iDSD BL + Philips Fidelio B97/FostexHP-A4 + LG SP8YA
Power Supply Corsair HX1000 Platinum/Enermax MAXREVO 1500
Mouse Logitech G703/Logitech G603 WL
Keyboard Logitech G613/Keychron K2
Software Win11 Pro/Win11 Pro
600 watts = 4x 8-pin connectors from a PSU, so you probably need a PSU upgrade just to run one.
Bah, I have an Enermax MAX REVO 1500 W PSU that should be able to handle this, though I ain't looking at any RTX 4000 series cards. I'd be looking at an RX 7000 series card; we'll see the difference in performance and price between the RX 7800 XT and the RX 7900 XT.
 