
Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Ocean Plastic Mouse
Keyboard Galax Stealth
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
No. Simply, no. GPUs aren't so complicated that you can't make a reasonable performance estimate.

The 6900 XT and 3090 were roughly equal (SKU positioning aside; the 6900 XT looks like AMD's reaction to the 3090):
- 80 CUs vs 82 SMs, roughly the same transistor and shader-unit counts. Nvidia had a slight disadvantage from being half a node behind.
- AMD bet on a large LLC to make up for its 256-bit memory bus vs the 3090's 384-bit. A successful bet, in hindsight.

This is simply not the case for the 4090 vs the 7900 XTX: 128 SMs vs 96 CUs on the same process node, the same memory bus width, and similar enough LLC.
There are definitely cases where the 7900 XTX can get close, mostly when power or memory becomes the limiting factor.
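The kind of estimate being described can be sketched as theoretical FP32 throughput from shader count and clock. A minimal sketch; the boost clocks below are assumed round figures, RDNA3 dual-issue is optimistically counted at full rate, and real-game performance obviously diverges from peak TFLOPS:

```python
# Peak FP32 throughput: each shader lane can retire one FMA (2 ops) per clock.
def fp32_tflops(shader_units: int, boost_ghz: float) -> float:
    return 2 * shader_units * boost_ghz / 1000  # GFLOPS -> TFLOPS

# RTX 4090: 128 SMs x 128 FP32 lanes.
# 7900 XTX: 96 CUs x 64 SPs, dual-issue counted as 2x (optimistic for RDNA3).
print(fp32_tflops(128 * 128, 2.52))   # ~82.6 TFLOPS
print(fp32_tflops(96 * 64 * 2, 2.5))  # ~61.4 TFLOPS
```

Even with the dual-issue benefit of the doubt, the peak-rate gap is large; without it, the gap roughly doubles.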

It's an invalid comparison because they aren't the same architecture and don't work in a similar way. Remember back in the Fermi vs. TeraScale days: the GF100/GTX 480 GPU had 480 shaders (512 really, but that config never shipped) while a Cypress XT/HD 5870 had 1600. Nor can you go by the transistor-count estimate, because the Nvidia chip has several features that consume die area, such as tensor cores, an integrated memory controller, and on-die cache, that the Navi 31 design does not (the L3 and IMCs are offloaded onto the MCDs, with the GCD focusing strictly on graphics and the other SIP blocks). Each company has taken a radically different approach to GPU design this time around, so I don't think the Radeon needs to be "excused" for having fewer compute units, because that's an arbitrary number (to some extent).

If you ask me, I would make the case that the N31 GCD is technically a more complex design than the portion of AD102 responsible for graphics. And of course, the 7900 XTX can never get close unless you pump double the wattage into it.
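The chiplet point can be put in rough numbers with the commonly reported die sizes. Treat these figures as approximate public-reporting values, not something from the post itself:

```python
# Navi 31: one graphics die (GCD, N5 node) plus six memory/cache dies (MCDs, N6).
gcd_mm2 = 304.35   # reported GCD area
mcd_mm2 = 37.5     # reported area per MCD
n31_total = gcd_mm2 + 6 * mcd_mm2

ad102_mm2 = 608.5  # reported monolithic AD102 area (graphics, cache, IMCs, all on one die)

print(f"N31 total silicon: {n31_total:.1f} mm^2")                  # ~529.4 mm^2
print(f"share devoted to the GCD: {gcd_mm2 / n31_total:.0%}")      # ~57%
print(f"AD102: {ad102_mm2} mm^2, everything monolithic")
```

The point being: comparing the GCD alone against all of AD102 compares very different slices of silicon.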
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
It's an invalid comparison because they aren't the same architecture and don't work in a similar way. Remember back in the Fermi vs. TeraScale days: the GF100/GTX 480 GPU had 480 shaders (512 really, but that config never shipped) while a Cypress XT/HD 5870 had 1600. Nor can you go by the transistor-count estimate, because the Nvidia chip has several features that consume die area, such as tensor cores, an integrated memory controller, and on-die cache, that the Navi 31 design does not (the L3 and IMCs are offloaded onto the MCDs, with the GCD focusing strictly on graphics and the other SIP blocks). Each company has taken a radically different approach to GPU design this time around, so I don't think the Radeon needs to be "excused" for having fewer compute units, because that's an arbitrary number (to some extent).

If you ask me, I would make the case that the N31 GCD is technically a more complex design than the portion of AD102 responsible for graphics. And of course, the 7900 XTX can never get close unless you pump double the wattage into it.
Thing is, my card is just as fast as yours at RT, and I bet I paid less for my card, too.
 
Joined
Nov 11, 2016
Messages
3,142 (1.14/day)
System Name The de-ploughminator Mk-II
Processor i7 13700KF
Motherboard MSI Z790 Carbon
Cooling ID-Cooling SE-226-XT + Phanteks T30
Memory 2x16GB G.Skill DDR5 7200Cas34
Video Card(s) Asus RTX4090 TUF
Storage Kingston KC3000 2TB NVME
Display(s) LG OLED CX48"
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Viper Ultimate
Keyboard Corsair K75
Software win11
Yep, this is definitely an AMD-tilted GPU post. You would think these cards are an absolute failure. There is one thing the 7900 series cards can do that no 6000 card can. Both cards can run 4K 144 Hz panels no problem. There is no game, not even TWWH3 (not the benchmark), that cannot be enjoyed at 4K with those cards. Idle monitor usage and video playback issues are so small that people forget the message before the cards launched was to get a nice beefy PSU to avoid any power issues, and I hope none of the people whining about power draw own an OLED panel. Nobody has foreknowledge of this card, so saying it will be X, Y or Z is just conjecture. I will say that the last time AMD showed confidence, the Internet barbecued them, and some of the negative hype comes directly from the propaganda campaign that is Nvidia's "advantage". This card should absolutely kick butt at 1440p regardless. Let's keep in mind that the 6700 XT is not far from a 6800 at 1440p.

4K 144hz LMAO
Take another guess maybe, barely 60FPS in 2023 games :roll:
[attached 4K benchmark charts: Hogwarts Legacy, Dead Space, Forspoken, plus TPU's 3840x2160 performance summary]


Without relying on upscaling, the 7900 XT/X won't have the grunt going forward for 4K 60 FPS (much less 4K 144 FPS), and that's where DLSS vs FSR comes into the equation.
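Since upscaling comes into the equation: the standard DLSS / FSR 2 quality modes render internally at a fraction of the output resolution and reconstruct the rest. A small sketch using the commonly published per-axis scale factors (the exact "balanced" factor differs slightly between the two upscalers; 1.7 is used here as an approximation):

```python
# Per-axis render-scale factors for the standard DLSS / FSR 2 modes.
MODES = {"quality": 1.5, "balanced": 1.7, "performance": 2.0, "ultra_performance": 3.0}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and mode."""
    s = MODES[mode]
    return round(out_w / s), round(out_h / s)

print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

So "4K with upscaling" in Quality mode is really 1440p-class shading load, which is why upscaler quality matters so much at that target.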

Edit: forgot Jedi [attached benchmark chart]
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
Thing is, my card is just as fast as yours at RT, and I bet I paid less for my card, too.

Yes, you paid less for your card, but I bought mine before the 6900 XT even released (it was the last model to come out). That was Sept. 24, 2020, almost 3 years ago. See the issue at hand? ;)
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
4K 144hz LMAO
Take another guess maybe, barely 60FPS in 2023 games :roll:
[attached benchmark charts]

Without relying on upscaling, the 7900 XT/X won't have the grunt going forward for 4K 60 FPS (much less 4K 144 FPS), and that's where DLSS vs FSR comes into the equation.
Yep, well, I guess there have been improvements since launch. It never ceases to amaze me how people will argue with owners about how their cards are supposed to perform.
[attached AMD Adrenalin screenshot]


Yes, you paid less for your card, but I bought mine before the 6900 XT even released (it was the last model to come out). That was Sept. 24, 2020, almost 3 years ago. See the issue at hand? ;)
It is all relative. I am happy with my purchase, and that is all that should matter. Are you not happy with yours?
 
Joined
Dec 25, 2020
Messages
4,908 (3.91/day)
Location
São Paulo, Brazil
Yep, well, I guess there have been improvements since launch. It never ceases to amaze me how people will argue with owners about how their cards are supposed to perform. [attached AMD Adrenalin screenshot]


It is all relative. I am happy with my purchase, and that is all that should matter. Are you not happy with yours?

I sure am! But I resent that there's no upgrade path other than the 4090, 3 years later :(
 
Joined
May 17, 2021
Messages
3,005 (2.70/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
I totally disagree with this argument that only people who have card X can talk about card X; that is absurd. It's not like this is an ultra-secret thing and we can't all know everything about any card.
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,527 (2.37/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
People can definitely talk about things they don't own - but folks can stop trolling. If your input isn't constructive to the OP, don't waste your keycaps.
 
Joined
Feb 24, 2023
Messages
2,299 (4.97/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / RX 480 8 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / Viewsonic VX3276-MHD-2
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / FSP Epsilon 700 W / Corsair CX650M [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
I also disagree with the concept that "whoever doesn't own X can't discuss X"; it's as invalid as arguments get.
 
Joined
Feb 18, 2005
Messages
5,351 (0.76/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
I love how people are using this image as evidence that AMD isn't rubbish at low-load power consumption. Look at the bottom 5 worst GPUs. LOOK AT THEM. Who makes them?

Then look at the 3090 and 3090 Ti. They're near the bottom, yet their successors are in the middle of the pack - almost halving power consumption. It's like one company is concerned with making sure their product is consistently improving in all areas generation-to-generation, while the other is sitting in the corner with their thumb up their a**.

The fact that AMD has managed to bring down low-load power consumption since the 7000-series launch IS NOT something to praise them for, because if they hadn't completely BROKEN power consumption with the launch of that series (AND THEIR TWO PREVIOUS GPU SERIES), they wouldn't have had to FIX it.

Now all of the chumps are going to whine "bUt It DoESn'T mattER Y u MaKIng a fUSS?" IT DOES MATTER because it shows that one of these companies cares about delivering a product where every aspect has been considered and worked on, and the other just throws their product over the fence when they think they're done with it and "oh well it's the users' problem now". If I'm laying down hundreds or thousands of dollars on something, I expect it to be POLISHED - and right now only one of these companies does that.
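Whether low-load power draw "matters" in money terms is easy to estimate. A minimal sketch; the usage hours and electricity rate below are assumed example numbers, not anything from the chart:

```python
# Yearly cost of an avoidable power-draw delta (e.g. high multi-monitor idle).
def yearly_cost_eur(extra_watts: float, hours_per_day: float, eur_per_kwh: float) -> float:
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000
    return kwh_per_year * eur_per_kwh

# 40 W extra for 8 h/day at 0.30 EUR/kWh -- all assumed figures.
print(round(yearly_cost_eur(40, 8, 0.30), 2))  # 35.04 EUR/year
```

Not ruinous, but it compounds across a card's lifetime, and it is pure waste for work the competing silicon does for free.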

I totally disagree with this argument that only people who have card X can talk about card X; that is absurd. It's not like this is an ultra-secret thing and we can't all know everything about any card.
It's an incredibly lazy non-argument used by those who are intellectually bankrupt. The best course of action is to ignore said people.
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
I am not against discussing other products; making desultory comments with no context is my issue. You will never hear me complain about the performance of Nvidia cards. I can talk about my opinion on ray tracing, DLSS and FSR, but I can't say one is better than the other, because I have not used them both.
 
Joined
Jan 17, 2018
Messages
388 (0.17/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
Another person pointed to the newer numbers, but you probably didn't read the later posts. That being said, the over-40 W power consumption in TPU's review is still pretty high. Please read the rest of the posts before replying; I'm not going to repeat what is already written.

As for the 7600, you are missing the point about performance. More efficient? What do you base this on?
Do you not actually read any of the reviews on this site? You seem to run on old information, or just plain wrong information. The efficiency of the 7600 over the 6650 XT is clear as day in the TPU review. Sure, the 6650 XT isn't in the 'efficiency' chart, but the more power-efficient 6600 XT is, and the 6600 XT itself is around 15% more efficient than the 6650 XT was.

I was being kind when I said the 7600 is 20% more efficient. It's probably closer to 30%.
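The chained comparison in that argument (7600 vs 6600 XT, then 6600 XT vs 6650 XT) compounds multiplicatively. A sketch with placeholder ratios purely for illustration, not TPU's actual measurements:

```python
# Efficiency measured as FPS per watt; relative gains chain multiplicatively.
def efficiency(fps: float, watts: float) -> float:
    return fps / watts

# Placeholder numbers: if the 7600 were 13% more efficient than the 6600 XT,
# and the 6600 XT is ~15% more efficient than the 6650 XT, then vs the 6650 XT:
gain_vs_6650xt = 1.13 * 1.15
print(f"{gain_vs_6650xt - 1:.0%}")  # 30%
```

That compounding is how a "20%" headline figure can plausibly land closer to 30% against the older card.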
 
Joined
Sep 10, 2018
Messages
5,603 (2.68/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R9 5950X Stock
Motherboard X570 Aorus Master/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory 32 GB 4x8GB 4000CL15 Trident Z Royal/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) LG G2 65/LG C1 48/ LG 27GP850/ MSI 27 inch VA panel 1440p165hz
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Corsair K95 RGB Platinum/ Logitech G Pro
I sure am! But I resent that there's no upgrade path other than the 4090, 3 years later :(

Yeah, that does suck. I expected the 4080 to be $800 and a decent upgrade, but at its price it really sucks. I also expected the 7900 XTX to be better after how good the 6900 XT/6950 XT were, and was disappointed, leaving the 4090 as my only decent upgrade option. These are honestly, in my opinion, the most meh high-end products from both companies, due to pricing.

The 7800 XT is shaping up to be a really meh product: slightly faster than a 4070 in raster but worse at every other metric, likely for around the same price. I guess we already have the real 7800 XT in the 7900 XT; it's just priced stupidly, though it's starting to make more sense in the low $700s.
 
Joined
Mar 25, 2023
Messages
72 (0.17/day)
Processor Celeron G5905, I3 10100, I5 10400F, I7 10700F
Motherboard Asrock H410, B460, B560, Gigabyte B560
Cooling Zalman CNPS80G
Memory each System 16 or 32GB: Kingston 2666 CL12, 2933 CL14
Video Card(s) Arc A380/A770, IGP
Storage SSD and some HDD
Display(s) Philips 24inch 1080p 165Hz IPS and 32 inch 1440p 165Hz VA
Case Antec, Corsair, Nanoxia
Audio Device(s) Different AVR, Speakers: Klipsch, Polk .....
Power Supply FSP, Deepcool
Mouse Logitech G
Keyboard Logitech G
Software Win 10, Bodhi Linux, Deepin
No need for this garbage; both Nvidia and AMD leveled up.

7600 300€ vs 6650 249€
RTX 4060 6GB 350€ vs RTX 3060 12GB 300€


Arc A750 8GB 229€
Arc A380 6GB 134€: even with its 6GB it's still cheaper than garbage old cards like the RX 6400 (139€), GTX 1630 (148€), and GTX 1650 (172€).
 
Joined
Jan 8, 2017
Messages
9,108 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
because if they hadn't completely BROKEN power consumption with the launch of that series

I have never had this problem and I have the card since launch.
 
Joined
Sep 6, 2013
Messages
3,050 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Do you not actually read any of the reviews on this site? You seem to run off of old information, or just plain wrong information. The efficiency of the 7600 over the 6650XT is clear as day in the review on TPU. Sure, the 6650XT isn't in the 'efficiency' chart, but the more power-efficient 6600XT is, which itself is around 15% more efficient than the 6650XT was.

I was being kind when I said the 7600 is 20% more efficient. It's probably closer to 30%.
Just go one page back from the one you point to in that review and explain those numbers to me.
AMD Radeon RX 7600 Review - For 1080p Gamers - Power Consumption | TechPowerUp

Before answering, remember this:
The 7600 is a new arch on a new node; both should bring better efficiency. But what do we see? In some cases the 6600 XT has lower power consumption, in other cases the 7600 does.

Unfortunately, I have to agree with @Assimilator's comment above. And I mention Assimilator here because he knows my opinion of him; simply put, he is on my ignore list!
But his comment here is correct.
The fact that AMD has managed to bring down low-load power consumption since the 7000-series launch IS NOT something to praise them for, because if they hadn't completely BROKEN power consumption with the launch of that series (AND THEIR TWO PREVIOUS GPU SERIES), they wouldn't have had to FIX it.
...
it shows that one of these companies cares about delivering a product where every aspect has been considered and worked on, and the other just throws their product over the fence when they think they're done with it and "oh well it's the users' problem now".

When AMD was on a tight budget, think the Polaris period, that was almost their standard way of doing business: never power-efficiency and performance improvements in the same generation, only one of the two. Now that they DO have money, they should be able to do both: work the final product in all aspects and come up with a new product that is better in everything compared to the previous one. But unfortunately they focused so much on CPUs that they missed the boat with GPUs. And NOW, with the AI explosion, they are realizing that GPUs are becoming more important. But are they going to throw money at all GPU architectures, or only at Instinct? If they focus only on Instinct, they will lose the next generation of consoles; at minimum Microsoft, who I bet has realized by now that to beat PlayStation they need to do something drastic. I hope AMD fears the possibility of MS or Sony or both going with Intel and Nvidia next gen. Those would be more expensive consoles to make, but gaming cards in PCs are also much more expensive, so higher console prices aren't something we should rule out as a possibility.

I have never had this problem and I have the card since launch.
You have the card; we are looking at the numbers from the TPU review. Either your card works as it should, which would be great, and something went wrong with the review numbers, or something else is happening.
 
Joined
Apr 14, 2018
Messages
480 (0.21/day)
Probably referencing AMD's slide where it says that Navi 31 is designed to go over 3.0 GHz, and the higher power consumption in some scenarios (if I am not mistaken: idle, multi-monitor, video playback, something like that).


I do read many tea leaves here; you, on the other hand, don't read what I wrote correctly. :p I never compared the 6800 XT with the 7900 XT.

Also, the 7600 should have been as fast as the 6700, just as a generational rule: the x600 should be as fast as or faster than the previous x700, in both Nvidia's and AMD's case.
But the problem is not 7600 vs 6700. It's 7600 vs 6650 XT. Same specs, same performance, meaning RDNA2 to RDNA3 = minimal gains.

Do you bring this bad take into every AMD thread? There are PROVEN performance gains and PROVEN efficiency gains. Off the top of my head: 6900 XT vs 7900 XT in Cyberpunk 2077, with RDNA3 being roughly 25% more efficient in that specific scenario per TPU's own benchmarks. Please stop with the misinformation.
 
Joined
May 17, 2021
Messages
3,005 (2.70/day)
Now that they DO have money they should have been able to do that.

money doesn't solve those problems, ask Intel
 
Joined
Sep 6, 2013
Messages
3,050 (0.78/day)
Location
Athens, Greece
Do you bring this bad take into every AMD thread? There are PROVEN performance gains, and PROVEN efficiency gains. Off the top of my head 6900XT vs 7900XT in cyberpunk 2077; RDNA3 being roughly 25% more efficient in that specific scenario from TPUs own benchmarks. Please stop with misinformation.
It's funny. I have been an AMD fan for over 20 years, always choosing AMD hardware when I have the option. An RX 6600 at 190 euros does look like a nice little upgrade for my RX 580, and I really regret that the 7600 in Greece is still overpriced (over 280 euros). All my posts are usually AMD-friendly. But expressing STRONG CONCERN about the company, its choices, where it is going, and whether it will remain a true competitor to Intel and Nvidia for years to come is apparently enough for you, who obviously have no idea about my posts, to start accusing me of "bringing this bad take into every AMD thread". Nice. Bravo!!!! :clap:

People were expecting the 7900 series to be killers, in raster at least, and at lower power consumption. Based on TPU numbers, which probably everyone here disputes, the 7900 XT is on average 14% faster than the 6950 XT with more shading units, more ROPs, more RT cores, more VRAM, and more memory bandwidth. It's RDNA3, and it still looks like a bigger RDNA2 chip. Do you even understand where my concerns are?
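That concern can be made concrete by comparing the spec uplift with the measured uplift. A sketch using the published CU counts and the 14% TPU average quoted above, ignoring clock and bandwidth differences for simplicity:

```python
# 6950 XT: 80 CUs; 7900 XT: 84 CUs; both 64 shaders per CU.
spec_ratio = (84 * 64) / (80 * 64)   # raw shader-count uplift: 1.05
perf_ratio = 1.14                    # TPU average uplift quoted above

per_cu_gain = perf_ratio / spec_ratio - 1
print(f"spec uplift: {spec_ratio - 1:.0%}")    # 5%
print(f"per-CU perf gain: {per_cu_gain:.1%}")  # ~8.6%
```

A single-digit per-CU gain from a new architecture on a better node is exactly the "bigger RDNA2" impression being described.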

Maybe I should stop posting sincere concerns and start posting pictures like the one below to make you happy.
 
Joined
Sep 17, 2014
Messages
21,210 (5.98/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Look, we see things differently. It's not about 50 W of power in gaming, for example, or in Furmark; it's in video playback. I mean, why waste an extra 30 W of power while watching a movie? I am not 15 years old, playing games all day; today I spend more time watching YouTube videos and movies than gaming, so power consumption in video playback is important to me. Does AMD have the luxury of telling me, "If you want low power consumption in video playback, go buy a competing card"? As I said, AMD should start looking for advantages it CAN achieve, not trying to play catch-up with Nvidia while Nvidia dictates the rules. Where is FreeSync 3.0 with frame generation? I'm throwing that out as an example. AMD should be looking at improving its hardware in various ways, not just following where Nvidia wants to drive the market.
Well, if you go about life wondering if you might waste 30W somewhere, I think you're on a good path to suck all the fun out of it :D

I don't think AMD is trying to play catch-up; I think we have lots of examples of them really doing their own thing to carve out a unique competitive position for the company.
- When Nvidia had RTX in play, AMD's response was "that's cool, but it won't go places until it gets mainstream market reach; midrange GPUs need to carry it comfortably".
- RDNA2 had RT support but barely sacrificed hardware / die size for it.
- RDNA3 is much the same, even with what it does have.

Meanwhile, they rolled out a path of RDNA focused on chiplets. We're too early in the race to say whether they made the right bet. But here are the long-term trends we are looking at:
- Chiplets have caught AMD up to Intel where monolithic designs couldn't.
- Chiplets now enable them to remain close to Nvidia's top end with, frankly, not a whole lot of trouble.
- Raster performance clearly remains a high-performance affair; looking at newer engines, you'll keep a strong hardware requirement on it, with or without RT.
- RT performance seems easy to lift through gen-to-gen general performance gains; it just takes time. Even now, 3.5 gens past RT's release, there is only a handful of titles you can only really play well on Nvidia, so AMD's laid-back approach to RT implementation isn't really hurting them.
- They still own the consoles and therefore have the final say in widespread market adoption of RT.

I'm really not seeing a badly positioned company or GPU lineup here overall. That idea only applies to the minority that chases the latest and greatest in graphics, the early-adopter crowd, because that's really still where RT is at. AMD isn't playing catch-up; they're making a minimal effort on the stuff they don't care much about, which seems like the proper rationale to me.

It's funny. I am an AMD fan for over 20 years, always choosing AMD hardware when having the option, an RX 6600 at 190 euros does look like a nice little upgrade for my RX 580 and I really regret that 7600 in Greece is still overpriced(over 280 euros), all my posts are usually AMD friendly, but expressing STRONG CONCERN about the company and it's choices and where it goes and if it will be able to remain a true competitor to Intel and Nvidia for years to come is enough excuse for you who obviously have no idea about my posts to start accusing me of "bringing this bad take into every AMD thread". Nice. Bravo!!!! :clap:

People where expecting 7900 series to be killers, in raster at least and at a lower power consumption. Based on TPU numbers that probably every one here disputes, 7900 XT is on average 14% faster than 6950XT with more shading units, more ROPS, more RT cores, more VRAM, more memory bandwidth. It's RDNA3 and still looks like a bigger RDNA2 chip. Do you even understand where my concerns are?

Maybe I should stop posting sincere concerns and start posting pictures like the one below to make you happy.
View attachment 302198
You say barely anything happened between RDNA2 and 3, but...
the energy efficiency is virtually on par, both last and current gen. And the only GPU they can't match is the 4090 in raw perf.

(attached: energy-efficiency chart)
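For anyone who wants to check that efficiency-parity claim against a review of their choice, perf-per-watt is just average FPS over board power. A minimal sketch, with placeholder numbers you'd replace with the figures from whichever chart you're citing:

```python
# Frames-per-watt from a review's average FPS and board power draw.
# The numbers below are illustrative placeholders, not measured values.
cards = {
    "RX 7900 XTX": {"avg_fps": 100.0, "power_w": 356.0},
    "RTX 4080":    {"avg_fps": 100.0, "power_w": 304.0},
}

for name, d in cards.items():
    fps_per_watt = d["avg_fps"] / d["power_w"]
    print(f"{name}: {fps_per_watt:.3f} FPS/W")
```

Run it with the average-FPS and gaming-power numbers from the same review, since mixing sources is how these efficiency arguments usually go sideways.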
 
Last edited:
Joined
Sep 6, 2013
Messages
3,050 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
money doesn't solve those problems, ask Intel
Intel is stuck on an old manufacturing node. Thanks to the money they have, they remain competitive in CPUs (at 300 W power consumption, of course), and they managed to hire some people to bring out some GPUs that mostly work, even as a first generation. So money does work in tech. If they fix their manufacturing, what do you expect to happen? In fact, that's what I fear. AMD concentrated so much on CPUs that if tomorrow Intel manages to present a working 5 nm node (real nanometers, not Intel marketing numbers), we could have a repeat of how things progressed after the Intel Core Duo introduction.

AMD obviously does have the engineers to build great CPUs and GPUs. Now it needs to start investing its money where it should be invested. They threw 40 billion at Xilinx. It's time to start throwing more money at GPUs and software.

@Vayra86 I agree with most of your post. I was screaming about RT performance and was being called an Nvidia shill back then. With the 7900 XTX offering 3090/Ti RT performance, how could someone be negative about that? How can 3090/Ti performance be bad today when it was a dream yesterday?
The answer is marketing. I believe people buy RTX 3050 cards over the RX 6600 because of all that RT fuss. RT is useless on an RTX 3050, but people still buy it, I believe for that reason. "RTX from Nvidia". Marketing. The same marketing won market share for AMD with the first Ryzen series: much lower IPC compared to Intel CPUs, but "8 CORES". The same marketing has won market share for Intel in the last few years: "16-core CPU", and half of them are E-cores. But people love to read higher numbers.
AMD should have doubled RT Cores in RDNA 3 even if that meant more die area, more expensive chips, lower profit margins.

PS
AMD Radeon RX 7600 Review - For 1080p Gamers - Power Consumption | TechPowerUp
Efficiency might be high in those charts, but the charts on the previous page say otherwise, or at least show that the chip's efficiency isn't the same in every task. Seems that in some cases, like video playback, there is a problem.
 
Last edited:
Joined
Sep 17, 2014
Messages
21,210 (5.98/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
AMD has been using higher VRAM as a selling point for years; it was like that in the Polaris days almost a decade ago, and it has kept at it ever since. It's not a new move or a miscalculation of RDNA3.
It materialized for Polaris 8GB too.

It will materialize for RDNA3 too. But that could still make it a miscalculation: 16GB would have been fine on the 7900 XT's raw perf as it stands now... I think @londiste made a fine point there.

I'm talking about the aforementioned absurdly high wattage in video playback and multi-monitor usage. It doesn't matter if it gets fixed; the only thing that matters is that they launched a product unable to perform efficiently. You can afford to take money from customers for a pre-alpha testing privilege when you're miles ahead of your competition, not when you're stone-age behind and not competition at all yourself.

The 7900 XTX launched marginally cheaper than the 4080 and has nothing to brag about. +8 GB VRAM does nothing, since the 4080 can turn DLSS3 on and yecgaa away from the 7900 XTX. "Sooper dooper mega chiplet arch" also does nothing when the 4080 can preserve 60ish framerates with RT on whilst the 7900 XTX goes for shambly 30ish FPS with permanent stuttering inbound. More raw power per buck? WHO CARES?
Since the overwhelming majority of performance is still in raster, and certainly not always with DLSS3 support, yeah, who cares, huh.

Consider also the fact that playing most new stuff at launch right now, as it has been for the last year or more, is an absolute horror show. Nothing releases without breaking issues. To each their own; I've always gamed extremely well by loosely following the early-adopter wave of each gen. In GPUs it's the same thing as with games: we get all sorts of pre-alpha shit pushed on us. The entire RTX brand has been a pre-alpha release since Turing, and we could say they're in open beta now with Ada, until you turn on path tracing. You're paying every cent twice over for Nvidia's live beta here, make no mistake. We didn't see them discount Ampere heavily like AMD did RDNA2, for example ;) In the meantime, I honestly can't always tell if RT really adds much of anything beneficial to the scene. In Darktide, it's an absolute mess. In Cyberpunk, the whole city becomes a glass house (sunsets are nice, though), but when you turn RT off, not a single drop of atmosphere is lost while the FPS doubles.

Anyone not blindly sheeping after the mainstream headlines cares.

Also, consider the fact that Ampere 'can't get DLSS3' for whatever reason, the main one being an obvious Nvidia-milking-you-hard situation. The power-per-buck advantage isn't just AMD's pricing; it's Nvidia's approach to your upgrade path much more so.
 
Last edited:
Joined
Apr 14, 2018
Messages
480 (0.21/day)
It's funny. I have been an AMD fan for over 20 years, always choosing AMD hardware when I have the option. An RX 6600 at 190 euros does look like a nice little upgrade for my RX 580, and I really regret that the 7600 in Greece is still overpriced (over 280 euros). All my posts are usually AMD-friendly, but expressing STRONG CONCERN about the company, its choices, where it's going, and whether it will be able to remain a true competitor to Intel and Nvidia for years to come is apparently enough excuse for you, who obviously have no idea about my posts, to start accusing me of "bringing this bad take into every AMD thread". Nice. Bravo!!!! :clap:

People were expecting the 7900 series to be killers, in raster at least, and at lower power consumption. Based on TPU numbers that probably everyone here disputes, the 7900 XT is on average 14% faster than the 6950 XT, with more shading units, more ROPs, more RT cores, more VRAM, and more memory bandwidth. It's RDNA3 and still looks like a bigger RDNA2 chip. Do you even understand where my concerns are?

Maybe I should stop posting sincere concerns and start posting pictures like the one below to make you happy.
View attachment 302198

It doesn’t matter what you’re a fan of. The opinion you keep reiterating is flat out wrong.
 
Joined
Sep 20, 2014
Messages
36 (0.01/day)
AMD is far less competitive than it has been in years.

It reminds me of the initial pascal launch.

Navi 31 is basically Vega without the HBM2, but with worse power consumption; it had to shift down a tier to look competitive with its competition, and likely costs more to make than the tier it now competes in, because it is slower than anticipated. Nvidia's flagship is also far and away faster. If the RTX 4090 were less cut down, we would be looking at a generational difference between the two flagships.

If Nvidia wanted to, they could likely price the RTX 4080 at $699 and the RTX 4070 Ti (which should be an RTX 4070) at $500 and destroy AMD's lineup while still making a decent margin. However, there seems to be some price fixing going on, or NV does not want to depreciate their last gen so much, since there is still so much of it on the market.

What's making this fight appear closer than it is, is Nvidia spending silicon on tensor cores, which enabled AMD to get a bit closer in performance. However, that silicon cost is well spent, since it enables Nvidia to tackle the AI market simultaneously with the same lineup. If AMD is losing ground on raster while going all out on raster, the division has a poor future. The market share and revenue from the gaming market simply won't justify the R&D expense, on top of the large dies/low profit compared to the other sectors AMD is competing in.
 
Joined
May 17, 2021
Messages
3,005 (2.70/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Intel is stuck on an old manufacturing node. Thanks to the money they have, they remain competitive in CPUs (at 300 W power consumption, of course), and they managed to hire some people to bring out some GPUs that mostly work, even as a first generation. So money does work in tech. If they fix their manufacturing, what do you expect to happen? In fact, that's what I fear. AMD concentrated so much on CPUs that if tomorrow Intel manages to present a working 5 nm node (real nanometers, not Intel marketing numbers), we could have a repeat of how things progressed after the Intel Core Duo introduction.

AMD obviously does have the engineers to build great CPUs and GPUs. Now it needs to start investing its money where it should be invested. They threw 40 billion at Xilinx. It's time to start throwing more money at GPUs and software.


What you call the manufacturing node problem is just an engineering problem, like the one you attribute to AMD. Money doesn't solve that; the past is full of richer, dead companies that couldn't solve those kinds of problems with money.
 