
AMD Radeon RX 7900 XTX

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,858 (0.33/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASUS ROG Strix X670E-I Gaming WiFi
Cooling ID-COOLING SE-207-XT Slim Snow
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage 2TB Samsung 990 Pro NVMe
Display(s) AOpen Fire Legend 24" (25XV2Q), Dough Spectrum One 27" (Glossy), LG C4 42" (OLED42C4PUA)
Case ASUS Prime AP201 33L White
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight (White), G303 Shroud Edition
Keyboard Wooting 60HE / NuPhy Air75 v2
VR HMD Oculus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.3447
They need to get their drivers to read the monitor's EDID properly so that idle clocks are on par with both NVIDIA and Intel. Using CRU is still the workaround for that issue.
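The EDID angle can be made concrete. The behavior is commonly attributed to the driver only retraining VRAM during vertical blanking: if the monitor's EDID advertises a timing with a very short blanking window, the card parks memory clocks at full speed even at idle, and CRU works around it by extending the blanking. A minimal sketch of the arithmetic, with illustrative timing values (not taken from any real EDID):

```python
# Sketch of the vblank-window arithmetic behind the idle-clock issue.
# All timing values below are illustrative, not from any specific monitor.

def vblank_window_us(v_blank_lines, h_total_pixels, pixel_clock_hz):
    """Time available in vertical blanking per frame, in microseconds."""
    line_time_s = h_total_pixels / pixel_clock_hz  # time to scan one line
    return v_blank_lines * line_time_s * 1e6

# A tight "reduced blanking" timing leaves a short window...
tight = vblank_window_us(v_blank_lines=56, h_total_pixels=3920,
                         pixel_clock_hz=1290e6)
# ...while a CRU-extended timing leaves a longer one.
relaxed = vblank_window_us(v_blank_lines=120, h_total_pixels=3920,
                           pixel_clock_hz=1290e6)
print(f"{tight:.0f} us vs {relaxed:.0f} us")
```

Whether a given window is "long enough" depends on how long the memory retrain takes, which would explain why different monitors trip the bug differently.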

After that, pretty much Adrenalin is fine for what it is (aside from Enhanced Sync being useless nowadays).



Yes, of course I've reported this issue through their Bug Report tool. It's been 3 years actually and still no notable movement on this. (Find my 5700 XT thread)
 
Last edited:
Joined
Nov 9, 2011
Messages
407 (0.09/day)
System Name ChaoticG8R / Lil'G8R
Processor Intel i7-970 / Intel i7-4790K
Motherboard ASUS Sabertooth X58 / ASUS ROG Impact VII Z97
Cooling Swiftech H220 AIO / Cooler Master Nepton 140XL
Memory Mushkin Enh. Radioactive 3x4GB DDR3 1600 / Corsair Vengeance 2x8GB DDR3 2133
Video Card(s) 2 x EVGA GTX 680 4GB / Zotac AMP! GTX 980
Storage 240GB Crucial M500 + RAID1 2x2TB WD BlackCaviar / Raid0 3x256 GB Crucial MX100 + 2TB WD BlackCaviar
Display(s) QNIX QX2710 @ 120hz + 2x Sceptre 20"
Case Cooler Master Storm Sniper Black Edition / Thermaltake Core V1
Power Supply BFG LS1200 1250W / Corsair AX860i 860W
Mouse ROCCAT Kone XTD
Keyboard ROCCAT Ryos MK Pro
Benchmark Scores Runs daily @ 4.5Ghz; Runs stable @ 4.8Ghz; Max OC on air: 5.0Ghz
I'm optimistic that the board partners will push the XTX even further. It's fairly obvious that NVIDIA has put in some incredible effort to ensure their in-house cooler/design is competitive with AIBs, while what AMD provides feels more like a true bare-minimum "reference" design.

I would assume we'll have some reviews of the AIB cards tomorrow on launch, @W1zzard, or is AMD going to stagger them again like before? If you can't give a yes or no due to NDA, can you at least acknowledge that?

Thanks in advance :)
 
Joined
Mar 10, 2010
Messages
11,878 (2.29/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad + 360 EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
I'm gonna leave this here
Top Nvidia shareholders

Vanguard Group Inc. representing 7.7% of total shares outstanding

BlackRock Inc. representing 7.2% of total shares outstanding

Top AMD shareholders

Vanguard Group Inc. representing 8.28% of total shares outstanding

BlackRock Inc. representing 7.21% of total shares outstanding

Brilliant post, again... not.
I think the thread you're after is the actual Project Callisto poor-performance thread.
You can shitpost in peace there.

OP: equals a 3090 Ti in ray tracing, edges the 4080 in 4K raster, and is cheaper than the competition. Not bad; not the outright win many (I) wanted, but not bad.

Come on Santa, you shit, sort it out. (Unlikely, but I can dream.)
 
Joined
May 24, 2007
Messages
5,406 (0.87/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 Gb - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 Gb
Storage Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
Performance/watt should be better on AIB cards given their three and four 8-pin connectors. You will see larger power draw along with larger performance increases.
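For reference, those connector counts translate directly into a board-power ceiling under the PCIe spec: 150 W per 8-pin connector plus 75 W from the x16 slot. A quick sketch:

```python
# Board power ceiling implied by connector count, per the PCIe spec:
# 150 W per 8-pin PCIe power connector plus 75 W from the x16 slot itself.
def board_power_ceiling_w(n_8pin_connectors):
    return n_8pin_connectors * 150 + 75

print(board_power_ceiling_w(2))  # reference-style card: 375
print(board_power_ceiling_w(3))  # 525
print(board_power_ceiling_w(4))  # 675
```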

Energy efficiency. Clear advantage over 4080 in regard to rasterization and price. Ray Tracing in 4K still isn’t ready for prime time yet.

@W1zzard The argument that you should buy a 4080 over a 7900 XTX for ray tracing is ludicrous in my opinion. Averaging 29 FPS at 4K on the 4080 versus 21 FPS on the 7900 XTX in Cyberpunk is comical. It's like that almost across the board, game to game.

Well-deserved Editor's Choice.
 
Last edited:
Joined
Sep 8, 2020
Messages
204 (0.15/day)
System Name Home
Processor 5950x
Motherboard Asrock Taichi x370
Cooling Thermalright True Spirit 140
Memory Patriot 32gb DDR4 3200mhz
Video Card(s) Sapphire Radeon RX 6700 10gb
Storage Too many to count
Display(s) U2518D+u2417h
Case Chieftec
Audio Device(s) onboard
Power Supply seasonic prime 1000W
Mouse Razer Viper
Keyboard Logitech
Software Windows 10
I have a fan. :laugh:
 
Joined
Jun 2, 2017
Messages
8,041 (3.16/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Sorry, i'll be keeping my 6800XT for another gen i guess.
That is fine it is your money to do with as you please. I play TWWH3 and when I saw 145 FPS in the review I was sold.
 
Joined
Oct 15, 2010
Messages
951 (0.19/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
barely beating the 4080.
It's identical in raster, unless you can detect a 3% difference?
RT nvidia +16%.

But both are $700 gpus.

I saw 145 FPS in the review I was sold.
Yeah, that's sad, it's like approving those price hikes. Next gen 8900XTX $1500. And you'll see 185 FPS in TWWH3 and be sold again.
 
Joined
Jan 2, 2008
Messages
3,296 (0.55/day)
System Name Thakk
Processor i7 6700k @ 4.5Ghz
Motherboard Gigabyte G1 Z170N ITX
Cooling H55 AIO
Memory 32GB DDR4 3100 c16
Video Card(s) Zotac RTX3080 Trinity
Storage Corsair Force GT 120GB SSD / Intel 250GB SSD / Samsung Pro 512 SSD / 3TB Seagate SV32
Display(s) Acer Predator X34 100hz IPS Gsync / HTC Vive
Case QBX
Audio Device(s) Realtek ALC1150 > Creative Gigaworks T40 > AKG Q701
Power Supply Corsair SF600
Mouse Logitech G900
Keyboard Ducky Shine TKL MX Blue + Vortex PBT Doubleshots
Software Windows 10 64bit
Benchmark Scores http://www.3dmark.com/fs/12108888
If I was in the market for a GPU, I'd rather go for a used RTX3090 than a 7900 XTX tbh.
 
Joined
May 21, 2019
Messages
12 (0.01/day)
Ah, yes, launch driver bugs; very typical of AMD. You know, more of you should be demanding "FineWine" performance at launch, not 6-12 months later. That simply means the driver compiler is still not optimized for RDNA3 after all of this development time; yes, it's resource-intensive, but most people base decisions on launch performance.

Overall, RDNA3 isn't terrible; however, we should all remember that this generation AMD and Nvidia are both on TSMC again. Nvidia is using "4N", an Nvidia-optimized N4(P?), while AMD is using N5 (also with AMD-customized libraries) for the GCD and N6 for the MCDs.

They're not quite on equal terms: N4P (closest to 4N) offers 11% more performance and 22% better efficiency than N5 (just 4% and 7% over N5P). Density improves by only about 6% vs N5. These are TSMC's own numbers. AMD could refresh the Navi 31 GPU in 8-12 months on N4P at higher wafer cost, and that would close the pricing gap between an AMD 7950 XTX and an eventual Nvidia RTX 4080 Ti.
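Plugging TSMC's quoted multipliers into a quick back-of-the-envelope (these are marketing figures, not measurements, and the 355 W board power is just the current XTX's spec used as a stand-in):

```python
# TSMC's claimed node-to-node deltas as quoted above (marketing figures).
n4p_vs_n5 = {"perf": 1.11, "power_eff": 1.22, "density": 1.06}
n4p_vs_n5p = {"perf": 1.04, "power_eff": 1.07}

# Hypothetical: a ~355 W Navi 31 re-targeted on N4P at equal performance
# would land near this board power...
iso_perf_power_w = 355 / n4p_vs_n5["power_eff"]
# ...or, at equal power, performance could rise by the perf multiplier.
iso_power_speedup = n4p_vs_n5["perf"]
print(f"{iso_perf_power_w:.0f} W, or +{(iso_power_speedup - 1):.0%} perf")
```

Either way, the delta is real but modest, which is why the refresh argument hinges on wafer cost as much as on the node itself.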

Nvidia screwed themselves with Ampere by going to Samsung 8LPP.
 
Joined
Sep 17, 2014
Messages
21,049 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Compact and loud go together. Too bad; why can't they design these things better?
But other than RT performance, it's quite great.

The XT draws only about 20 W more than the 6800 XT; that's a big increase in efficiency.
39 dBA at a 58°C core temp is pretty neat, I'd say; you can't look at noise in isolation, imho.

These either OC to the moon, or they can go pretty damn silent under load as configured in the review; on the OC page you can see the XTX hitting near 3 GHz.

It does not, however, extract a whole lot of FPS from that in Heaven, and I don't think all of it is attributable to CPU limitations. The lack of refinement we see there echoes throughout the review; apparently it clocks very aggressively at the beginning of a run and is quickly forced to pull back to remain within the 350 W power target.

That is fine it is your money to do with as you please. I play TWWH3 and when I saw 145 FPS in the review I was sold.
Yeah, somehow that game cripples my 1080 too; I'm looking at 30 FPS on the campaign map. Still wondering why it drops so low; TWWH2 does 50% more on average. It's not exactly paid off by a massive increase in fidelity...

It's identical in raster, unless you can detect a 3% difference?
RT nvidia +16%.
The RT results AND raster results are pretty far apart in some cases, but if you consider that and then still see a 3% advantage for AMD, that's meaningful. Some games push 20% more FPS, while the biggest loser is, what, 15% in favor of Nvidia? The number of games where it scores better is also higher. Given the lacking optimization elsewhere in the product, it's safe to say that gap can widen further in AMD's favor. Not a given, but definitely plausible.
 
Last edited:
Joined
Feb 21, 2006
Messages
1,999 (0.30/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Ca.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) AMD Radeon RX 7900 XTX 24GB (24.5.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 14TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c
Joined
Nov 26, 2021
Messages
1,372 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
It's identical in raster, unless you can detect a 3% difference?
We agree; you didn't quote the full sentence.
I was expecting the 7900 XTX to be around 90% of a 4090 in traditionally rasterized games, but it is barely beating the 4080
 
Joined
Jun 2, 2017
Messages
8,041 (3.16/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Yeah, that's sad, it's like approving those price hikes. Next gen 8900XTX $1500. And you'll see 185 FPS in TWWH3 and be sold again.
You can call it what you want. Who knows; it depends, but yes, I do buy AMD cards. Even the Vega 64 has been worth every single penny, and my RX 570 is still going strong too. I am not taking food out of my mouth to buy a $1000 GPU, so it really is moot. I do not like the prices, but a PS5 is $999 all day long, and pre-builts with cards like the 3090 and Ti versions are still $4000+, so it is what it is.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
  • Considerably lower ray tracing performance than RTX 4080

Cough.




Ray Tracing in 4K still isn’t ready for prime time yet.
No offense, 3090Ti.
 
Joined
Jul 22, 2009
Messages
183 (0.03/day)
Location
Bucharest
So what happens if I connect a Dell U2720Q or a similar monitor that has a Type-C DisplayPort input to the Type-C DisplayPort 2.1 output on the 7900 XTX?
 
Joined
Sep 17, 2014
Messages
21,049 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
  • Considerably lower ray tracing performance than RTX 4080

Cough.




No offense, 3090Ti.
Yeah the RT perf should not be a deal breaker on its own, I do agree.

But then you see that the perf/$ gap to the 4080 is also exactly 17%, and it kinda does get you thinking. That balance isn't tipping decisively in AMD's favor; after all, RDNA3 is missing the mark on a few other aspects, most notably idle power, plus a historical drawback: people are seeing history repeat, with cards getting released that could have been refined more.
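The perf/$ point is just relative performance divided by price. With the launch MSRPs ($999 vs $1,199) and placeholder relative-performance numbers (illustrative, not the review's exact figures):

```python
# Perf-per-dollar comparison. The MSRPs are the launch prices; the relative
# performance numbers are placeholders, not the review's exact figures.
def perf_per_dollar(rel_perf, price_usd):
    return rel_perf / price_usd

xtx = perf_per_dollar(rel_perf=100.0, price_usd=999)       # XTX normalized to 100
rtx_4080 = perf_per_dollar(rel_perf=97.0, price_usd=1199)  # ~3% behind in raster
gap = xtx / rtx_4080 - 1
print(f"XTX leads by {gap:.0%} in raster perf/$")
```

Weighting in the RT deficit instead of the raster lead shrinks that lead, which is roughly how an overall figure like 17% comes about.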

And to be fair to them, I see that too. New technology, blah blah, fantastic; but if it serves to brute-force your way to the sub-top of the leaderboards while leaving refinement on the table, that ain't good. RDNA2 was great exactly because it did tick all those boxes. And this is a literal repeat of what we've seen since Fury. HBM never materialized in the gaming space. If MCM isn't paying off big-time, what's it doing here? After all, if it makes products cheaper by being non-monolithic, why aren't we seeing that?

I'm kinda withholding judgment because I expect two things in the near future:
- price drops
- driver refinement
 
Last edited:

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,858 (0.33/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASUS ROG Strix X670E-I Gaming WiFi
Cooling ID-COOLING SE-207-XT Slim Snow
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage 2TB Samsung 990 Pro NVMe
Display(s) AOpen Fire Legend 24" (25XV2Q), Dough Spectrum One 27" (Glossy), LG C4 42" (OLED42C4PUA)
Case ASUS Prime AP201 33L White
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight (White), G303 Shroud Edition
Keyboard Wooting 60HE / NuPhy Air75 v2
VR HMD Oculus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.3447
So what happens if I connect a Dell U2720Q or similar monitor that has a Type-C display port using the Type-C DisplayPort 2.1 in the 7900 XTX?
It should just treat it as a normal DisplayPort connection.
 
Joined
Oct 6, 2021
Messages
1,481 (1.55/day)
It's identical in raster, unless you can detect a 3% difference?
RT nvidia +16%.

But both are $700 gpus.


Yeah, that's sad, it's like approving those price hikes. Next gen 8900XTX $1500. And you'll see 185 FPS in TWWH3 and be sold again.
Why? Did you roll a die and decide on that price? The reality is that, contrary to what you think, these GPUs don't have the very high profit margins of the past; $700 should be more or less the cost of production. Over $300 just for memory (and that's optimistic). It would be great if it could maintain that performance with just 12 GB or 16 GB of VRAM, so it could cost $150-200 less.
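That cost claim can be written down as a toy bill of materials. Every line item here is an assumption for illustration; none of these figures are confirmed by AMD, and the total matches the claim only by construction:

```python
# Toy bill-of-materials for the "$700 cost of production" claim above.
# Every figure is a guess for illustration; none are confirmed by AMD.
bom_usd = {
    "24 GB GDDR6": 300,        # the poster's "over $300" memory estimate
    "GCD + 6 MCDs": 150,       # hypothetical silicon cost
    "PCB, VRM, cooler": 150,   # hypothetical
    "assembly and test": 50,   # hypothetical
    "packaging, misc": 50,     # hypothetical
}
total = sum(bom_usd.values())
print(f"sketch BOM: ${total}")
```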

My main and only complaint is the huge variation in performance between games: in some cases it even beats the 4090, and in others it loses by about 20% to the 4080. Plus, it seems to be a rule that games based on Unreal Engine run badly on AMD GPUs; I wonder if AMD doesn't have the money to do something about it.

Anyway, it's still the most interesting product AMD has released in recent years, too bad there aren't decent games to effectively use so much computational power.
 
Joined
Jun 11, 2019
Messages
492 (0.27/day)
Location
Moscow, Russia
Processor Intel 12600K
Motherboard Gigabyte Z690 Gaming X
Cooling CPU: Noctua NH-D15S; Case: 2xNoctua NF-A14, 1xNF-S12A.
Memory Ballistix Sport LT DDR4 @3600CL16 2*16GB
Video Card(s) Palit RTX 4080
Storage Samsung 970 Pro 512GB + Crucial MX500 500gb + WD Red 6TB
Display(s) Dell S2721qs
Case Phanteks P300A Mesh
Audio Device(s) Behringer UMC204HD
Power Supply Fractal Design Ion+ 560W
Mouse Glorious Model D-
Anyway, it's still the most interesting product AMD has released in recent years, too bad there aren't decent games to effectively use so much computational power.
IMO the 6900 XT was way more interesting: actually faster than Nvidia's best in raster, and at a time when ray tracing was less relevant and in way fewer games. This gen feels like a genuine step back again; they can't even convincingly beat a cut-down die while still lagging in features (or gimmicks, if you wish; it doesn't really matter, because one has them and the other doesn't, at least right here and right now, without FSR3 and so on).
 
Joined
Sep 17, 2014
Messages
21,049 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Why? Did you roll a die and decide on that price? The reality is that, contrary to what you think, these GPUs don't have the very high profit margins of the past; $700 should be more or less the cost of production. Over $300 just for memory (and that's optimistic). It would be great if it could maintain that performance with just 12 GB or 16 GB of VRAM, so it could cost $150-200 less.

My main and only complaint is the huge variation in performance between games: in some cases it even beats the 4090, and in others it loses by about 20% to the 4080. Plus, it seems to be a rule that games based on Unreal Engine run badly on AMD GPUs; I wonder if AMD doesn't have the money to do something about it.

Anyway, it's still the most interesting product AMD has released in recent years, too bad there aren't decent games to effectively use so much computational power.
Historically Nvidia has run margins of 40-60%, and they're peaking above 60% today. So, strangely enough, as nodes shrink, both revenue and margins shoot up, and not one at the cost of the other.

$700 cost of production? Got a source, or are you just pulling that from what's been going around the internet in random blurbs? I'm interested if you have real info. And if AMD has to cut into its margins so much more than Nvidia does with its monolithic approach, what the fuck is AMD even on right now? Crack? Or is this the only way they can realistically move forward? Or is it an investment in the future? If so, they should damn well make sure to release it in a better state.

The story just doesn't make sense. If MCM is better with respect to yields on smaller nodes, it should end up cheaper.

Note: here's AMD's gross margin. Note that the CPU side has been chasing the chiplet route for a while now too.
[attached chart: AMD gross margin over time]
 
Last edited:
Joined
Sep 6, 2013
Messages
3,034 (0.78/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
I'm a huge fan of new intel, but they're not even close to 'biting' AMD from behind. If intel can match the 6800xt with battlemage next year I will be impressed, but I doubt it.

This architecture does have similarities to bulldozer, but at the end of the day it pushes the FPS needed to compete with the 80 class.

They will need to drop price, for sure -- but hopefully the silicon savings allow for that.
They don't need to reach 6800 XT levels of performance. Nvidia is securing the high end, and Intel will take market share from AMD in the low-to-mid-range market by selling to big OEMs. Intel can sell multiple times more products than AMD to OEMs, even inferior products. Intel is also doing something smart: it's investing in RT performance from the beginning. AMD is the only one of the three GPU manufacturers that still considers RT performance secondary. That's a mistake in my opinion, and now that they are having problems destroying the RTX 4080 in raster, it's going to cost them heavily in sales. I can already see a drop of $100 in price, at least for the XT, even if Nvidia doesn't drop the RTX 4080's price, and/or the RTX 4070 Ti comes in at $800. I agree with you on this one.
 
Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
IMO the 6900 XT was way more interesting: actually faster than Nvidia's best in raster, and at a time when ray tracing was less relevant and in way fewer games. This gen feels like a genuine step back again; they can't even convincingly beat a cut-down die while still lagging in features (or gimmicks, if you wish; it doesn't really matter, because one has them and the other doesn't, at least right here and right now, without FSR3 and so on).

Yeah, that's a good point. The 4080 isn't using a fully enabled AD103, and then, worryingly, neither is the 4090 close to a fully enabled AD102. A fully enabled AD102 on the presumed 4090 Ti is gonna bitch-slap everything (and melt wallets).
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,102 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
0.1 V is too low; you need at least 0.5-0.6 V to open a transistor. The clocks aren't real either.
This is what the monitoring circuitry in the card returns, I have no way to disprove these results

View attachment 274179View attachment 274182

TFW the 7900 XTX, pushed balls to the wall, beats top-end RDNA2 by a whopping ~43 FPS in raster.
View attachment 274181
5800X vs 13900K though. Check the deltas for cards present in both tests

These either OC to the moon
Actually these don't OC at all, as mentioned in the review. When you change GPU clock you get 3 results:
- No change in performance
- Loss in performance
- Crash

What you don't get is any performance gain.

What works for "OC" is undervolting, so that the boost algorithm increases clocks (without you manually touching them), and increasing the power limit, so that the boost algorithm increases clocks (without you manually touching them).

Others would say "OC is broken", if that's the way it works now.
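That boost behavior, i.e. clocks rising when you free up voltage or power headroom rather than when you move the clock slider, can be caricatured with a toy power model: dynamic power scales roughly with f·V², so at a fixed power limit the sustainable clock is P / (k·V²). The constant k below is tuned purely for illustration; nothing here is a real RDNA3 V/F curve.

```python
# Toy model of power-limited boosting: P ≈ k * f * V^2, so at a fixed power
# limit the sustainable clock is f = P / (k * V^2). Undervolting therefore
# raises clocks without the clock slider being touched at all.
# k is tuned purely for illustration; this is not a real RDNA3 V/F curve.
def sustained_clock_mhz(power_limit_w, voltage_v, k=1.75e-7):
    return power_limit_w / (k * voltage_v ** 2) / 1e6

stock = sustained_clock_mhz(355, 0.90)        # ~2500 MHz
undervolted = sustained_clock_mhz(355, 0.85)  # higher clocks, same power cap
raised_limit = sustained_clock_mhz(400, 0.90) # higher clocks via power limit
print(f"{stock:.0f} -> {undervolted:.0f} MHz")
```

In this caricature, raising the clock target directly does nothing, because the limiter, not the target, sets the sustained clock; that matches the "no change / loss / crash" outcomes described above.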

Greetings. Awesome review as always... small error though:

4090 is $2400 now???
That's the current market price. I tried to find one at a lower price this morning; not possible, all sold out.
 
Joined
Sep 17, 2014
Messages
21,049 (5.96/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Actually these don't OC at all, as mentioned in the review. When you change GPU clock you get 3 results:
- No change in performance
- Loss in performance
- Crash

What you don't get is any performance gain.

What works for "OC" is undervolting, so that the boost algorithm increases clocks (without you manually touching them), and increasing the power limit, so that the boost algorithm increases clocks (without you manually touching them).

Others would say "OC is broken", if that's the way it works now.
Yeah, I noticed that, and the GPU is also at peak power all the time. Looks remarkably similar to Zen 4.

What do you estimate the chances of this changing with future updates?
 