
NVIDIA GeForce RTX 40 Series "AD104" Could Match RTX 3090 Ti Performance

AleksandarK

News Editor
Staff member
Joined
Aug 19, 2017
Messages
2,223 (0.91/day)
NVIDIA's upcoming GeForce RTX 40 series "Ada Lovelace" graphics card lineup is slowly shaping up to deliver a significant performance uplift over the previous generation. Today, well-known hardware leaker kopite7kimi speculates that a mid-range AD104 SKU could match the performance of the last-generation flagship, the GeForce RTX 3090 Ti. The full AD104 SKU is set to feature 7,680 FP32 CUDA cores, paired with 12 GB of 21 Gbps GDDR6X memory on a 192-bit bus. With a large TGP of 400 Watts, it should match the performance of the GA102-350-A1 SKU found in the GeForce RTX 3090 Ti.

As for naming, this full AD104 SKU should end up as a GeForce RTX 4070 Ti model. Of course, we must wait and see what NVIDIA decides to do with the lineup and what the final models will look like.
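For a quick sanity check of the leaked numbers, here is a back-of-the-envelope comparison in Python. The AD104 figures come from the rumour and may change; the GA102 figures are the RTX 3090 Ti's published specifications.

```python
# Back-of-the-envelope comparison of the leaked AD104 SKU against the
# RTX 3090 Ti (GA102-350). AD104 figures are rumoured, not confirmed.

def bandwidth_gb_s(bus_width_bits: int, speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin Gbps."""
    return bus_width_bits / 8 * speed_gbps

ad104 = {"cores": 7680, "bus_bits": 192, "mem_gbps": 21.0, "tgp_w": 400}   # leak
ga102 = {"cores": 10752, "bus_bits": 384, "mem_gbps": 21.0, "tgp_w": 450}  # 3090 Ti

print(f"AD104 bandwidth: {bandwidth_gb_s(ad104['bus_bits'], ad104['mem_gbps']):.0f} GB/s")  # 504
print(f"GA102 bandwidth: {bandwidth_gb_s(ga102['bus_bits'], ga102['mem_gbps']):.0f} GB/s")  # 1008
print(f"Core count ratio: {ad104['cores'] / ga102['cores']:.0%}")  # 71%
print(f"TGP ratio:        {ad104['tgp_w'] / ga102['tgp_w']:.0%}")  # 89%
```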


 
Joined
Nov 11, 2016
Messages
3,062 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11
Disappointed if true; AD104 should beat GA102 by 30%, like what happened with Pascal.
[Attached: relative performance chart at 3840×2160]
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
So, the new 400W model matches the old 450W model? Considering they moved from the "bad" 8nm Samsung node to the "great" 5nm TSMC node, it's not exactly a breathtaking result.
I believe either the performance will be higher or the wattage much lower. 50W less for the same performance looks too little to me.
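If the rumour holds, the implied efficiency gain is just the wattage ratio; a minimal illustration of that arithmetic, using only the leaked figures:

```python
# If the rumour holds (same performance at 400 W instead of 450 W),
# the implied generational perf/W gain is only the wattage ratio:
old_tgp_w, new_tgp_w = 450, 400
perf_per_watt_gain = old_tgp_w / new_tgp_w - 1
print(f"{perf_per_watt_gain:.1%}")  # 12.5% -- modest for a full node jump
```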
 
Joined
Aug 20, 2007
Messages
20,759 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
So, the new 400W model matches the old 450W model? Considering they moved from the "bad" 8nm Samsung node to the "great" 5nm TSMC node, it's not exactly a breathtaking result.
I believe either the performance will be higher or the wattage much lower. 50W less for the same performance looks too little to me.
Also, less VRAM.

Pretty underwhelming. Maybe the Samsung node was not as bad as we had believed.
 
Joined
Feb 11, 2009
Messages
5,393 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> ... nope still the same :'(
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case Antec 600 -> Thermaltake Tenor HTPC case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
yeah that is preeetty weak if true
 
Joined
Dec 5, 2020
Messages
159 (0.13/day)
Disappointed if true; AD104 should beat GA102 by 30%, like what happened with Pascal.
AD104 is a very small chip.

So, the new 400W model matches the old 450W model? Considering they moved from the "bad" 8nm Samsung node to the "great" 5nm TSMC node, it's not exactly a breathtaking result.
I believe either the performance will be higher or the wattage much lower. 50W less for the same performance looks too little to me.
Or, more likely, the chip is pushed way beyond the optimal efficiency curve. From the same leaker, AD102 is supposed to bring 70-80% more performance at 450W, and that's a big jump in efficiency.
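Putting the two rumours side by side makes the efficiency-curve argument concrete; a rough comparison, taking both leaked figures at face value:

```python
# Both figures come from the same leaker and are unconfirmed.
ad102_gain_low, ad102_gain_high = 0.70, 0.80  # claimed uplift over GA102 at the same 450 W
ad104_perf_per_watt = 450 / 400 - 1           # equal performance, 50 W less

print(f"AD102 implied perf/W uplift: {ad102_gain_low:.0%}-{ad102_gain_high:.0%}")
print(f"AD104 implied perf/W uplift: {ad104_perf_per_watt:.1%}")
# 70-80% vs ~12.5%: consistent with AD104 being clocked past its sweet spot.
```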
 
Joined
Oct 25, 2019
Messages
203 (0.12/day)
Nvidia will still be the more compelling option (a more comprehensive, forward-looking package with RTX + DLSS). Updating your power supply is a small price to pay for greater quality. The gimmicky chiplet package on Navi 31 will only introduce latency and frame-rate consistency issues.
 
Joined
Aug 21, 2013
Messages
1,677 (0.43/day)
Disappointed if true; AD104 should beat GA102 by 30%, like what happened with Pascal.
I argued with people at Pascal's launch who said the 1080 was not enough of a step forward from their 980 Tis, that the 980 Ti overclocks to 1500 MHz easily, etc. As if the 1080 could not OC to 2000 MHz+.
Nvidia will still be the more compelling option (a more comprehensive, forward-looking package with RTX + DLSS). Updating your power supply is a small price to pay for greater quality. The gimmicky chiplet package on Navi 31 will only introduce latency and frame-rate consistency issues.
"Comprehensive package" & "greater quality" :wtf:
Also, I'm glad you already know how N31 will perform and what issues (if any) it will have.

We normal people will wait for reviews and prices before deciding, not blindly buying from company N.
 
Joined
Jun 5, 2021
Messages
284 (0.27/day)
Also, less VRAM.

Pretty underwhelming. Maybe the Samsung node was not as bad as we had believed.
The Samsung node was good; it's the power-hungry GDDR6X that hogged power... plus, if Nvidia had used Samsung 7LPE, they would have reached 2500 MHz on a 3090 Ti...
 
Joined
Dec 5, 2020
Messages
159 (0.13/day)
I argued with people at Pascal's launch who said the 1080 was not enough of a step forward from their 980 Tis, that the 980 Ti overclocks to 1500 MHz easily, etc. As if the 1080 could not OC to 2000 MHz+.

"Comprehensive package" & "greater quality" :wtf:
Also, I'm glad you already know how N31 will perform and what issues (if any) it will have.

We normal people will wait for reviews and prices before deciding, not blindly buying from company N.
The 980 Ti had way more OC potential, though. At stock the 1070 was 10-15% faster, but a 980 Ti at 1500 MHz was on par with a 2100 MHz 1070.
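A rough way to check that claim, assuming performance scales approximately linearly with core clock (a simplification) and using the reference boost clocks from the spec sheets:

```python
# Reference boost clocks (MHz) from the spec sheets; OC targets from the post.
stock_980ti, oc_980ti = 1075, 1500
stock_1070, oc_1070 = 1683, 2100

print(f"980 Ti OC headroom: {oc_980ti / stock_980ti - 1:.0%}")  # ~40%
print(f"1070 OC headroom:   {oc_1070 / stock_1070 - 1:.0%}")    # ~25%
# The 980 Ti's ~15 extra points of clock headroom roughly cancel the 1070's
# 10-15% stock lead, which is why the two land on par when both are maxed.
```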
 
Joined
Jan 30, 2018
Messages
216 (0.10/day)
System Name Dreamstation2
Processor Ryzen 7 3700X
Motherboard MSI X470 Gaming Plus
Cooling Hyper 212 Black Edition
Memory Kingston HyperX 32GB DDR4 3200 CL16
Video Card(s) Aorus 2080 Ti Turbo (sounds like a vacuum cleaner at full load)
Storage 2 x 1TB M.2 NVME + 1TB 2.5" SSD
Display(s) Samsung Odyssey G7 32" 4k
Case NZXT H500i
Audio Device(s) Asus Xonar U3 / Audio-Technica ATH-M50x / Edifier R1855DB
Power Supply Corsair TX650M
Mouse Corsair Scimitar Pro RGB
Keyboard Cooler Master Masterkeys Lite L
With electricity costs going up all over the world, how come the 4070 Ti draws more than double the power of a 1080 (180 W)? New generations should bring more performance at the same power; instead, we are getting expensive space heaters.
Add the cost of air conditioning to that 400 W heating your room, and it becomes quite an expensive hobby.
On second thought, the RTX 3070 Ti has awful performance per watt, especially compared to the RTX 3070 and RX 6800, so there's hope my intended upgrade, the RTX 4070, will perform close to the Ti version but at a much lower power draw.
 
Joined
Jan 31, 2011
Messages
26 (0.01/day)
I'm not even looking at the performance. If that mid-range card uses between 400-500 W, I will not be touching this generation even if the flagship is like 200% faster than the 3090 Ti. I will not survive the summer with a 500 W or higher power-consuming card in my system, as my 350 W 3080 Ti is already pumping a crap ton of heat into my room.
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Nvidia will still be the more compelling option (a more comprehensive, forward-looking package with RTX + DLSS). Updating your power supply is a small price to pay for greater quality. The gimmicky chiplet package on Navi 31 will only introduce latency and frame-rate consistency issues.
As always, pricing and performance will be the deciding factors. DLSS is less of a differentiator today with FSR 1.0 and 2.0. While DLSS is better in most cases, having a better implementation and having been on the market much longer (meaning it is more tweaked and in more games), FSR 2.0 does the job nicely in most cases, and even FSR 1.0 will be enough for people just trying to get above a certain framerate. RTX is something AMD also offers, and while Nvidia is still ahead there too, ray tracing is not essential to enjoy a game. I understand that pointing at Nvidia's exclusives does show them to have an advantage, but it's not as if the others don't offer those features today. We will also have to wait and see how much AMD has improved its ray tracing performance in RDNA3. Nvidia will offer higher ray tracing performance, but again, at what price points? If the lineup starts with the 4080 and 4090, we are talking about four-digit pricing; I doubt Nvidia will sell even the 4080 for less than $1,000.
As for Navi 31 and latency and frame-rate consistency and such, this is NOT CrossFire. Don't get your hopes up that Navi will be a disaster; just wait and see.
And no, upgrading the power supply is NOT a small price to pay, because it's not just the power supply, which could be an over-$100 expense. It's also the power consumption, a cost that keeps adding up for every hour of gaming.
 
Joined
Nov 11, 2016
Messages
3,062 (1.13/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Power Supply Corsair HX850
Mouse Razer Viper Ultimate
Keyboard Corsair K75
Software win11
I argued with people at Pascal's launch who said the 1080 was not enough of a step forward from their 980 Tis, that the 980 Ti overclocks to 1500 MHz easily, etc. As if the 1080 could not OC to 2000 MHz+.

Well, I had a Titan X Maxwell too and I didn't buy the GTX 1080, simply because everyone knew the 1080 Ti would be much faster.
 
Joined
Feb 20, 2019
Messages
7,278 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
This rumour is in line with all the rumours and leaks so far, but the real question is "at what cost?"

It's going to have as many transistors as GA102, it's going to use as much power as GA102, and (knowing Nvidia) it's not going to be cheap either. The only plus side is that TSMC 5 nm is a denser process node, which should reduce manufacturing cost, but we all know Nvidia chose Samsung 8 nm for Ampere because TSMC wouldn't budge on 7 nm pricing.
 
Joined
Jun 27, 2019
Messages
1,848 (1.05/day)
Location
Hungary
System Name I don't name my systems.
Processor i3-12100F 'power limit removed/-130mV undervolt'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75 Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
I'm not even looking at the performance. If that mid-range card uses between 400-500 W, I will not be touching this generation even if the flagship is like 200% faster than the 3090 Ti. I will not survive the summer with a 500 W or higher power-consuming card in my system, as my 350 W 3080 Ti is already pumping a crap ton of heat into my room.

Same here; I draw the line around 220-230 W for any card I'm willing to buy/put in my PC.
Starting this August, our electricity bill will cost double what it used to; even my low-power 12100F + undervolted GTX 1070 system will cost me around $10/month with my casual use (barely 1-3 hours/day of gaming, the rest light use).
So yeah, this kind of power draw is a big nope for me; most likely I will just upgrade to a 3060 Ti/6700 XT and be done with it.
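For anyone wanting to run the same estimate against their own tariff, the arithmetic is simple; the wattage, hours, and $/kWh rate below are illustrative placeholders, not the poster's actual numbers:

```python
# Monthly electricity cost attributable to a GPU's power draw.
def monthly_cost(watts: float, hours_per_day: float, price_per_kwh: float,
                 days: int = 30) -> float:
    """kWh consumed per month multiplied by the tariff."""
    return watts / 1000 * hours_per_day * days * price_per_kwh

# Example: a 400 W card gamed 2 h/day at an assumed $0.30/kWh tariff.
print(f"${monthly_cost(400, 2, 0.30):.2f}/month")  # $7.20 for the GPU alone
```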
 
Joined
Apr 10, 2020
Messages
480 (0.33/day)
An xx70-class card (104 die) using 400 W? That's pure insanity. Does Nvidia think all its users live in Iceland, where the weather is cold and electricity bills nearly non-existent? Here in mainland Europe, electricity expenses went up 200-300% since Putin's adventure in Ukraine, plus record-breaking temperatures are scorching us outside. I still remember the heated debates over the 1080 Ti's (102 die) power consumption (267 W under stress testing) five years back; now we're talking about double the wattage for a 104 die. Anything above 250 W for a 70-class card is way too much. Nvidia has lost its way, imho. I wouldn't touch this thing even if Nvidia paid me to use it. It's already too hot in here with a 200 W GPU.

 
Joined
May 17, 2021
Messages
3,005 (2.82/day)
Processor Ryzen 7 5700X
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
There is clearly a barrier neither Nvidia nor AMD can pass: they cannot increase performance in a meaningful way on a two-year cycle without going crazy on power draw, so there's no point in beating the dead horse. Skip this generation if you don't agree with the way things are going, or deal with it.
People who want big performance leaps and 4K 200 Hz on a two-year cycle at the same power draw are being as unrealistic as Nvidia and AMD.

Set a wattage limit, stay within that limit no matter what they release, or shut up about it.
 
Joined
Nov 6, 2016
Messages
1,572 (0.58/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
I argued with people at Pascal's launch who said the 1080 was not enough of a step forward from their 980 Tis, that the 980 Ti overclocks to 1500 MHz easily, etc. As if the 1080 could not OC to 2000 MHz+.

"Comprehensive package" & "greater quality" :wtf:
Also, I'm glad you already know how N31 will perform and what issues (if any) it will have.

We normal people will wait for reviews and prices before deciding, not blindly buying from company N.

I like how he declares chiplets to be "gimmicky". How can you call something a gimmick when literally the entire industry is moving in that direction?

I seriously do not understand those who cheer for Nvidia or Intel... In terms of pure self-interest and what's most advantageous for the consumer, everyone should be cheering for AMD. The better AMD does against Intel and Nvidia, the more likely we get larger performance increases between generations, the more likely prices go down, and the more likely innovation is pushed further, faster.

We all remember what the CPU market was like prior to Ryzen, right? 4% generational increases, four-core stagnation, and all at a high price... Alder Lake and Raptor Lake would not exist without Ryzen.

And look at the GPU market: without RDNA2 mounting such fierce competition, there's no doubt Nvidia's cards would be even more expensive than they already are. (BTW, AMD manages to compete with Nvidia on less than half the R&D budget, $5.26 billion vs. $2 billion, and AMD has to divide that $2 billion between graphics and x86, with x86, the larger and more lucrative market, getting the majority of those funds.) And look at the latest Nvidia generation about to be released: all the rumors of huge power-consumption increases are evidence that Nvidia is doing everything in its power to push performance, and all because of RDNA3.

I'm not saying everyone should be an AMD fanboy, but don't the people who cheer on Intel and Nvidia realize that, at least until AMD gets closer to 50% market share in dGPUs and x86 (especially enterprise and mobility, the two most lucrative x86 segments), victories for Intel and Nvidia inherently equate to losses for consumers? The people who wish for AMD's failure would have us plunged into a dark age even worse than the pre-Ryzen days in both x86 and graphics... Sorry for the off-topic rant, but I just don't get it when people cheer for Nvidia before the products are even released, and by extension cheer for reduced competition in the market... I guess the desire for a parasocial relationship with whichever brand they deem most likely to win is stronger than supporting their own material self-interest.
 
Joined
Apr 10, 2020
Messages
480 (0.33/day)
There is clearly a barrier neither Nvidia nor AMD can pass: they cannot increase performance in a meaningful way on a two-year cycle without going crazy on power draw, so there's no point in beating the dead horse. Skip this generation if you don't agree with the way things are going, or deal with it.
People who want big performance leaps and 4K 200 Hz on a two-year cycle at the same power draw are being as unrealistic as Nvidia and AMD.

Set a wattage limit, stay within that limit no matter what they release, or shut up about it.
Why the F should we shut up about it? It's not like we end consumers have much say in this monopoly/duopoly market anyway. The only thing we can do, besides not buying, is to scream our opinion, pointing out that development is moving in the wrong direction. Maybe, just maybe, someone at Radeon and Nvidia will hear us.
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
development is moving in the wrong direction
Development is moving in the expected direction when people cheer for the model A that beats model B while consuming 40, 50, even 100% more power. AMD was targeting efficiency, and people kept cheering for the 250W Alder Lake and the 450W RTX 3090 Ti.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.39/day)


The GTX 1080 was 25% faster when comparing OC versions, and 20% for the OC-vs-OC comparison. And it did that with 10% fewer transistors (7.2 vs. 8.0 billion), a die shrinking from ~600 mm² to ~300 mm², and an impressive 59% improvement between the FE and the OC-vs-OC comparison.

Now the 4070 Ti has more transistors, L2 cache, and ROPs, but it is cut to a 192-bit bus using the same-speed G6X memory, so no improvement there. Bandwidth is cut to less than half.
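The Pascal half of that comparison checks out against the public die specs; a quick verification, with the 4070 Ti bandwidth computed from the rumoured 192-bit/21 Gbps configuration:

```python
# Public die specs: GM200 (GTX 980 Ti) vs GP104 (GTX 1080).
gm200 = {"transistors_b": 8.0, "die_mm2": 601}
gp104 = {"transistors_b": 7.2, "die_mm2": 314}
print(f"Transistors: {gp104['transistors_b'] / gm200['transistors_b'] - 1:+.0%}")  # -10%
print(f"Die area:    {gp104['die_mm2'] / gm200['die_mm2'] - 1:+.0%}")              # -48%

# Rumoured 4070 Ti bandwidth vs GA102's 384-bit bus at the same 21 Gbps:
print(f"4070 Ti: {192 / 8 * 21:.0f} GB/s vs 3090 Ti: {384 / 8 * 21:.0f} GB/s")  # 504 vs 1008
```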
 
Joined
Dec 12, 2016
Messages
1,227 (0.46/day)
I could be wrong, but isn't the CUDA core count calculated by adding the FP32 cores (7,680) to the FP64 cores (3,840), for a total of 11,520 cores? That's about 7% more cores than the 3090 Ti at roughly 11% lower power. I mean, it's not great, but it's not bad either.

I remember the rumors not knowing how to count CUDA cores before the 3000-series launch. Looks like that might be the case again.
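Taking the post's numbers at face value (the RTX 3090 Ti has 10,752 CUDA cores), the arithmetic works out as follows:

```python
# Poster's counting scheme: FP32 cores plus the second datapath.
fp32 = 7680
combined = fp32 + 3840          # 11,520 if both halves are counted
cores_3090ti = 10752
print(f"Core advantage: {combined / cores_3090ti - 1:+.1%}")  # +7.1%
print(f"TGP difference: {400 / 450 - 1:+.1%}")                # -11.1%
```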
 
Joined
Oct 6, 2021
Messages
1,422 (1.54/day)
Disappointed if true; AD104 should beat GA102 by 30%, like what happened with Pascal.
Well, is this a rule just because it happened once?

The only thing I know is that the next generation will be more expensive, and it will take a long time to get entry-level and mid-range cards; stores are suffering from an overstock of GPUs...
 