
AMD Radeon RX 7600 Early Sample Offers RX 6750 XT Performance at 175W: Rumor

Joined
Jun 2, 2017
Messages
8,220 (3.20/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64, Steam, GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I consoom so much that I skipped a GPU generation for the first time in years. I just don't hold allegiances. I don't care if AMD is in dire straits or if Jensen Huang has a multi-million-dollar leather jacket collection; I just want products that deliver, and this gen was the first in my living memory that didn't. I'll happily point out my grievances with both of them, though. This thread should have made that clear.
This generation did not deliver? My 7900XT is in every way faster than my 6800XT. Correspondingly, my 7900X3D also feels much faster than the 5800X3D. The narrative against this generation is strong, but once you cut through the noise, the truth is that with the 6000 and 7000 series AMD will have a full stack for people to get into. I see the 6700XT going to $299, and that would be perfect for people like me. If the 7600XT can match a 6750XT, that is good news. These chips are not one super huge die but chiplets, so something like that monster enterprise card with 32GB of VRAM could easily be configured for desktop use.
 
Joined
Dec 25, 2020
Messages
5,054 (3.98/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
This generation did not deliver? My 7900XT is in every way faster than my 6800XT. Correspondingly, my 7900X3D also feels much faster than the 5800X3D. The narrative against this generation is strong, but once you cut through the noise, the truth is that with the 6000 and 7000 series AMD will have a full stack for people to get into. I see the 6700XT going to $299, and that would be perfect for people like me. If the 7600XT can match a 6750XT, that is good news. These chips are not one super huge die but chiplets, so something like that monster enterprise card with 32GB of VRAM could easily be configured for desktop use.

It's not enough, especially when you factor in the cost. Also, the only reason I could possibly want more GPU horsepower is for raytracing. AMD doesn't deliver that to me. It doesn't impress me because I already had what you're currently experiencing over 2 years ago. Almost 3, at this point.

 
Joined
Jun 2, 2017
Messages
8,220 (3.20/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64, Steam, GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
It's not enough, especially when you factor in the cost. Also, the only reason I could possibly want more GPU horsepower is for raytracing. AMD doesn't deliver that to me. It doesn't impress me because I already had what you're currently experiencing over 2 years ago. Almost 3, at this point.

My experience was nuanced. I originally got a 7900XTX and it died; in the meantime, I had sold my 6800XT in a system. I got a refund for the 7900XTX and bought a 7900XT for $400 less. I am in no way interested in ray tracing, so that didn't sway me, and if I don't use FSR, why would I care about DLSS? That's just me, though. It's not like I didn't thoroughly enjoy my 6800XT; it's just that the 7900XT actually drives my 4K 144Hz panel exactly how I thought it would, and that is more than enough for me. I am also a huge fan of AMD's driver/software support, which is not circa 2012 anymore and is actually great: the RX 580 8GB is still a viable card so many years later. I know Nvidia's drivers are perceived as more stable, but when I got a 3060 laptop, I shook my head when I opened the software package and saw the exact same interface as my GTS 450 from 2010.
 
Joined
Dec 25, 2020
Messages
5,054 (3.98/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
My experience was nuanced. I originally got a 7900XTX and it died; in the meantime, I had sold my 6800XT in a system. I got a refund for the 7900XTX and bought a 7900XT for $400 less. I am in no way interested in ray tracing, so that didn't sway me, and if I don't use FSR, why would I care about DLSS? That's just me, though. It's not like I didn't thoroughly enjoy my 6800XT; it's just that the 7900XT actually drives my 4K 144Hz panel exactly how I thought it would, and that is more than enough for me. I am also a huge fan of AMD's driver/software support, which is not circa 2012 anymore and is actually great: the RX 580 8GB is still a viable card so many years later. I know Nvidia's drivers are perceived as more stable, but when I got a 3060 laptop, I shook my head when I opened the software package and saw the exact same interface as my GTS 450 from 2010.

Perhaps you're much more easily impressed than I am, but then again, the baseline I am comparing against is on another level altogether. Ray tracing performance is the only thing I really care about at this point, because there are no raster-only games that an RTX 3090 won't comfortably run at ultra settings. The other auxiliary improvements that RDNA 3 may bring, such as a higher-quality encoder, are all things Nvidia had already given me years ago. AMD is barely catching up here.

Changing the control panel's design often is by no means a sign of quality driver support, btw. And especially not of driver stability. ;)
 
Joined
Sep 17, 2014
Messages
21,342 (5.99/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
It's not enough, especially when you factor in the cost. Also, the only reason I could possibly want more GPU horsepower is for raytracing. AMD doesn't deliver that to me. It doesn't impress me because I already had what you're currently experiencing over 2 years ago. Almost 3, at this point.

You bought into the top of the stack; that means any further upgrades, especially with just one gen between them, are going to be hyper costly for minimal gain.

The longer you wait, the more you save. I gained almost 300% from my last card to this one. Even if the price was too high, the gain made it totally worthwhile.

You expect too much from one gen to the next. You could move to a 4090, btw; that's +80%. ;) Seems substantial, but any other option was off the table for you regardless.
 
Joined
Dec 25, 2020
Messages
5,054 (3.98/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
You bought into the top of the stack; that means any further upgrades, especially with just one gen between them, are going to be hyper costly for minimal gain.

The longer you wait, the more you save. I gained almost 300% from my last card to this one. Even if the price was too high, the gain made it totally worthwhile.

You expect too much from one gen to the next. You could move to a 4090, btw; that's +80%. ;) Seems substantial, but any other option was off the table for you regardless.

I thought it was "the more you buy, the more you save" :laugh:

But yeah, I agree. The low-VRAM curse of the 3080 doesn't affect me, so I'm just going to wait for the RDNA 4 and Blackwell GPUs before I make a decision, unless a miracle happens and GPU prices drop quite significantly. The next thing I'll be purchasing is an OLED TV; the current display I have is alright but doesn't do my PC justice.
 
Joined
Feb 3, 2017
Messages
3,567 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
2nd-generation G6X (introduced with the 3090 Ti) greatly alleviated power consumption, and the 4070, which is the GPU in the same power consumption tier, uses standard G6, doesn't it? Because if not... the performance-per-watt disparity is even more extreme.
The RTX 4070 is using GDDR6X.

Do you have a reference, maybe a link, about the 2nd-generation GDDR6X in the RTX 3090 Ti? I do not remember anything resembling this from any coverage.
The RTX 3090 Ti did get a more efficient VRAM subsystem, but that was simply because it got 2GB memory chips instead of double the number of 1GB chips mounted on both the front and back of the card. Halving the chip count should bring a nice 30% or so of power saving by itself.
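As a rough sanity check on that 30% figure, here is a minimal back-of-the-envelope sketch in Python. The ~54 W I/O term comes from Micron's published 7.25 pJ/bit figure at the 3090's 936 GB/s; the 3 W of background power per chip is purely an illustrative assumption, not a datasheet value:

```python
# Rough model of the 3090 -> 3090 Ti VRAM power change: the I/O energy
# scales with bits moved (unchanged between the two cards), while per-chip
# background power (refresh, periphery, clocking) halves with chip count.
IO_POWER_W = 54.0            # 7.25 pJ/bit at 936 GB/s (derived later in the thread)
BACKGROUND_W_PER_CHIP = 3.0  # assumed for illustration only

def vram_power_w(num_chips: int) -> float:
    """Total VRAM power: shared I/O term plus per-chip background draw."""
    return IO_POWER_W + BACKGROUND_W_PER_CHIP * num_chips

p24, p12 = vram_power_w(24), vram_power_w(12)
print(f"24x 1GB: {p24:.0f} W, 12x 2GB: {p12:.0f} W, saving {(p24 - p12) / p24:.0%}")
# -> 24x 1GB: 126 W, 12x 2GB: 90 W, saving 29%
```

With those assumptions, the saving lands right around the 30% mark.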
 
Joined
Dec 25, 2020
Messages
5,054 (3.98/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
The RTX 4070 is using GDDR6X.

Do you have a reference, maybe a link, about the 2nd-generation GDDR6X in the RTX 3090 Ti? I do not remember anything resembling this from any coverage.
The RTX 3090 Ti did get a more efficient VRAM subsystem, but that was simply because it got 2GB memory chips instead of double the number of 1GB chips mounted on both the front and back of the card. Halving the chip count should bring a nice 30% or so of power saving by itself.

Yeah, I saw, I looked it up after I posted that. That makes the 4070 even more remarkable to me.

The original 3090 also received 21 Gbps chips, specifically Micron MT61K256M32JE-21 (D8BGX); the reason the 3090 ships at 19.5 Gbps is to save power (around 40% of this GPU's power budget is chugged by the G6X alone). That, and they don't clock much above that, so there's no illusion of headroom: my personal card does *exactly* 21 Gbps and not an inch more. Well, maybe just a tiny bit, 1319 MHz according to GPU-Z instead of 1313:

[attached screenshot: GPU-Z memory clock reading]


The 3090 Ti has the updated Micron MT61K512M32KPA-21:U (D8BZC) chip, same as the 4090.
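For anyone mapping those GPU-Z clocks to data rates: GDDR6X signals with PAM4 (2 bits per symbol), so the effective per-pin data rate works out to 16x the memory clock GPU-Z reports. A quick sketch of the conversion:

```python
# Convert GPU-Z-reported GDDR6X memory clocks to effective data rates.
# For GDDR6X (PAM4), effective Gbps per pin = reported MHz * 16 / 1000.
for clock_mhz in (1313, 1319):
    gbps = clock_mhz * 16 / 1000
    print(f"{clock_mhz} MHz -> {gbps:.1f} Gbps")
# 1313 MHz -> 21.0 Gbps
# 1319 MHz -> 21.1 Gbps
```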

 
Joined
Nov 26, 2021
Messages
1,401 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
The original 3090 also received 21 Gbps chips, specifically Micron MT61K256M32JE-21 (D8BGX); the reason the 3090 ships at 19.5 Gbps is to save power (around 40% of this GPU's power budget is chugged by the G6X alone). That, and they don't clock much above that, so there's no illusion of headroom: my personal card does *exactly* 21 Gbps and not an inch more. Well, maybe just a tiny bit, 1319 MHz according to GPU-Z instead of 1313:
I think you're mistaken. Micron claims 7.25 pJ per bit for GDDR6X. That works out to about 54 W for the 3090. The 24 memory chips should add 24 to 48 W. That is at most 30% of a 3090's power budget.
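For reference, that 54 W figure is just the per-bit energy multiplied by sustained bandwidth; a minimal sketch assuming the 3090's stock 936 GB/s (384-bit bus at 19.5 Gbps):

```python
# Convert Micron's per-bit I/O energy into sustained interface power.
ENERGY_J_PER_BIT = 7.25e-12   # 7.25 pJ/bit for GDDR6X
BANDWIDTH_BYTES_S = 936e9     # RTX 3090: 384-bit bus @ 19.5 Gbps

io_power_w = ENERGY_J_PER_BIT * BANDWIDTH_BYTES_S * 8  # bytes/s -> bits/s
print(f"GDDR6X interface power: {io_power_w:.1f} W")   # -> 54.3 W
```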
 
Joined
Dec 25, 2020
Messages
5,054 (3.98/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
I think you're mistaken. Micron claims 7.25 pJ per bit for GDDR6X. That works out to about 54 W for the 3090. The 24 memory chips should add 24 to 48 W. That is at most 30% of a 3090's power budget.

Trust me, I make that claim from experience. GPU-Z is capable of measuring and reporting the MVDDC rail wattage on these cards. It can easily push north of 100 W, and the memory controller load isn't even maxed out. Driving the clamshell G6X on the 3090 at high resolutions such as 4K is absurdly power demanding. Using 3DMark Speed Way (heavy raytracing workload) as an example, it will average 120 watts here.

[attached screenshot: GPU-Z sensor readout showing MVDDC power draw]


That reminds me, I think it's about time to repaste this card. Three years of ownership without opening it, and the hotspot temps are getting a bit high for my taste :oops:
 
Joined
Nov 15, 2021
Messages
2,759 (2.92/day)
Location
Knoxville, TN, USA
System Name Work Computer | Unfinished Computer
Processor Core i7-6700 | Ryzen 5 5600X
Motherboard Dell Q170 | Gigabyte Aorus Elite Wi-Fi
Cooling A fan? | Truly Custom Loop
Memory 4x4GB Crucial 2133 C17 | 4x8GB Corsair Vengeance RGB 3600 C26
Video Card(s) Dell Radeon R7 450 | RTX 2080 Ti FE
Storage Crucial BX500 2TB | TBD
Display(s) 3x LG QHD 32" GSM5B96 | TBD
Case Dell | Heavily Modified Phanteks P400
Power Supply Dell TFX Non-standard | EVGA BQ 650W
Mouse Monster No-Name $7 Gaming Mouse | TBD
Trust me, I make that claim from experience. GPU-Z is capable of measuring and reporting the MVDDC rail wattage on these cards. It can easily push north of 100 W,
Jeez, that is a lot more inefficient than I had thought it would be.

I guess that is why VRMs need at least as much cooling as the VRAM chips themselves.

I would assume the MVDDC power draw reported is measured on the incoming side of the VRM.
 
Joined
Dec 25, 2020
Messages
5,054 (3.98/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Jeez, that is a lot more inefficient than I had thought it would be.

I guess that is why VRMs need at least as much cooling as the VRAM chips themselves.

I would assume MVDDC power draw reported would be the incoming side of the VRM.

Yeah, you begin to understand why NVIDIA opted to install only 10 GB on the 3080 when you see the original 3090 at work. It made sense for most gamers: once you account for the lower shader count and the fact that the GPU core itself is afforded a lot more power, it should have been a no-brainer at the resolutions gamers commonly use. Except VRAM usage began to balloon, and there are already a few situations where those 10 GB can be a bit uncomfortable. The 3090 relies on its extra shaders to do that same work, which is why the two are a lot closer than they should be. The 3090 Ti is faster because it solved the memory power consumption problem, fully enabled the GPU, and raised the power limit at the same time; that's where its 20% performance uplift comes from, despite the 3090 technically being 98% enabled.

As for which side of the VRM it's measured on, I don't know exactly, but that would make sense to me.
 

Sherhi

New Member
Joined
Apr 30, 2023
Messages
6 (0.01/day)
I hope it's a good card; shame it's only 8 GB though... I don't know about the technical shenanigans, but consoles have 10 GB? So honestly I would like that number as the bare minimum for a 60-class card that is considered mid-tier, because many studios are limited (or unchained) by current consoles' hardware. I am still using a GTX 760, and even though I play older games (mostly grand strategies like EU4), it's starting to show its age. At this point almost any new card will give me something like 450-500% better performance, but I see the modern mid-tier standard as a 60-class card with enough memory to run new games at 1440p on medium settings without any issues for the next 3-4 years, which... doesn't seem to be the case.
 
Joined
Dec 25, 2020
Messages
5,054 (3.98/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
I hope it's a good card; shame it's only 8 GB though... I don't know about the technical shenanigans, but consoles have 10 GB? So honestly I would like that number as the bare minimum for a 60-class card that is considered mid-tier, because many studios are limited (or unchained) by current consoles' hardware. I am still using a GTX 760, and even though I play older games (mostly grand strategies like EU4), it's starting to show its age. At this point almost any new card will give me something like 450-500% better performance, but I see the modern mid-tier standard as a 60-class card with enough memory to run new games at 1440p on medium settings without any issues for the next 3-4 years, which... doesn't seem to be the case.

"Good card" and "8 GB" are mutually exclusive these days, and have been for some time, since well before the trend caught on. People would have been mad at me for saying this not 6 months ago.

Anyway, consoles have a unified 16 GB pool and a custom OS that doesn't consume as many resources as Windows, nor the applications you'd usually have chugging your RAM. Games also ship with settings customized for the console's capabilities, so their assets are optimized for that format, unlike on PC, where assets tend to emphasize either quality or performance instead of a tailored mix of both. Fortunately, a 32 GB RAM kit is affordable nowadays unless you go for high-bin, exotic performance kits with select ICs, so you should buy that instead.
 
Joined
Nov 26, 2021
Messages
1,401 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Trust me, I make that claim from experience. GPU-Z is capable of measuring and reporting the MVDDC rail wattage on these cards. It can easily push north of 100 W, and the memory controller load isn't even maxed out. Driving the clamshell G6X on the 3090 at high resolutions such as 4K is absurdly power demanding. Using 3DMark Speed Way (heavy raytracing workload) as an example, it will average 120 watts here.
This suggests that the early GDDR6X devices had high power consumption, i.e. the power consumed by a single chip is probably around 3 W instead of the 1 to 2 W that has been the norm for a while. It would be nice if we could get a screenshot from a 3090 Ti in the same benchmark. I also noticed that the combined GPU and VRAM power draw is barely 75% of the board draw; it seems the VRMs are rather inefficient.
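As a sketch of where that ~3 W estimate can land: subtract the ~54 W pJ/bit I/O term from the ~120 W MVDDC reading quoted above and divide by the chip count. This assumes the rail reading covers only the memory devices themselves, which, as discussed, is uncertain:

```python
# Infer per-chip background power from the measured memory rail.
MEASURED_RAIL_W = 120.0  # GPU-Z MVDDC reading reported above (Speed Way)
IO_POWER_W = 54.0        # 7.25 pJ/bit at 936 GB/s
NUM_CHIPS = 24           # original RTX 3090, clamshell 1GB chips

per_chip_w = (MEASURED_RAIL_W - IO_POWER_W) / NUM_CHIPS
print(f"~{per_chip_w:.2f} W of background draw per chip")  # -> ~2.75 W
```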

Honestly, given the high prices of the flagships, HBM is beginning to look better for them; an additional 500 to 600 dollars won't bother the buyers of these cards. Also, for laptop GPUs, LPDDR5 would be better than GDDR6 and the like: widen the interface by 2x and you would still save power.
 
Joined
Dec 25, 2020
Messages
5,054 (3.98/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
This suggests that the early GDDR6X devices had high power consumption, i.e. the power consumed by a single chip is probably around 3 W instead of the 1 to 2 W that has been the norm for a while. It would be nice if we could get a screenshot from a 3090 Ti in the same benchmark. I also noticed that the combined GPU and VRAM power draw is barely 75% of the board draw; it seems the VRMs are rather inefficient.

Honestly, given the high prices of the flagships, HBM is beginning to look better for them; an additional 500 to 600 dollars won't bother the buyers of these cards. Also, for laptop GPUs, LPDDR5 would be better than GDDR6 and the like: widen the interface by 2x and you would still save power.

Apparently it's a lot lower; the way the 3090 Ti was re-engineered improves the memory subsystem in many ways. The 3080 Ti does not have this improved memory; it uses a lower-bin 19 Gbps G6X chip that is also used in the RTX 3070 Ti, so it's not a valid comparison. However, that chip is also a newer revision than the original 3080 10 GB's. Basically:

3080 (original 10 GB model): 10x 8Gbit Micron MT61K256M32JE-19:T (D8BGW), rated 19 Gbps
3070 Ti (8x), 3080 12 GB, 3080 Ti (12x): 8Gbit Micron MT61K256M32JE-19G:T (D8BWW), rated 19 Gbps
3090: 24x 8Gbit Micron MT61K256M32JE-21 (D8BGX), rated 21 Gbps
3090 Ti and 4090: 12x 16Gbit Micron MT61K512M32KPA-21:U (D8BZC), rated 21 Gbps
As of now, other Ada cards use the same chips as the 3090 Ti and 4090, in lower quantities appropriate to their bus widths

Which makes the RTX 3090 unique in its extreme memory power consumption: it has the first generation and first revision of the chips, at their highest speed bin, and you actually need to feed 24 of them. It's the worst-case scenario.
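As a quick sanity check on the list above: capacity and bus width follow directly from chip count and per-chip density, since each GDDR6X device has a 32-bit interface (clamshell pairs on the 3090 share a channel). A minimal sketch:

```python
# Derive VRAM capacity and bus width from the chip configurations above.
configs = {
    "RTX 3080 10GB": (10, 8, False),  # (chips, Gbit per chip, clamshell)
    "RTX 3090":      (24, 8, True),   # clamshell: 2 chips per 32-bit channel
    "RTX 3090 Ti":   (12, 16, False),
    "RTX 4090":      (12, 16, False),
}
for card, (chips, gbit, clamshell) in configs.items():
    capacity_gb = chips * gbit // 8
    bus_bits = (chips // 2 if clamshell else chips) * 32
    print(f"{card}: {capacity_gb} GB on a {bus_bits}-bit bus")
# RTX 3080 10GB: 10 GB on a 320-bit bus
# RTX 3090: 24 GB on a 384-bit bus
# RTX 3090 Ti: 24 GB on a 384-bit bus
# RTX 4090: 24 GB on a 384-bit bus
```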

From my understanding, the problem with HBM is that the silicon and the memory must be flawless and cannot be tested until packaged; if there are problems with the substrate, the GPU ASIC, or any of the active HBM stacks, the entire package has to be discarded. This greatly reduces yield and was a cause for concern for AMD with Fiji and the two Vega generations, and for the Titan V as well, which had a bad/disabled HBM stack (3072 of its 4096-bit bus enabled). It might not be feasible, especially considering that the more affordable products tend to use harvested versions of the higher-end chips, or chips deliberately cut down to maximize yield and profit, as Nvidia has done with the 4090.
 
Joined
Nov 20, 2015
Messages
18 (0.01/day)
Location
Italy
If the performance numbers are true, then this is another DOA card! Hey, I'm saying this about both GPU makers; what a shock, we can be objective and not hold ANY company in our hearts!

For this to actually be good, it would need to be at least 6-7% faster than the report suggested, not cost a penny over $300, and also be available with 16GB of VRAM for $40 more!

It needs to be on par with the RX 6800, draw less power, and cost $300 or less to actually be good value! If AMD are smart, they will go with this strategy and offer a 16GB model as well for $40 or $50 more!
Maybe in 2010 you could have expected that; nowadays a two-class jump is a dream, and a sub-$300 price is even more of a dream. The 7600 will be the replacement for the 6600; it will probably perform like a 6700 with a little less energy drain, hardly more. A 6800 is in another league, and the XT version is still one of the best price-to-performance cards you could buy today for 1440p.
 