System Name | Project Kairi Mk. III "Lunar Tear" |
---|---|
Processor | 13th Gen Intel Core i9-13900KS Special Edition |
Motherboard | MSI MEG Z690 ACE (MS-7D27) BIOS 1.D0 |
Cooling | ID-COOLING Frostflow X 360 w/ Thermalright BCF and Thermal Grizzly Kryonaut Extreme |
Memory | G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2 |
Video Card(s) | ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition |
Storage | 2x WD Green SN350 480GB + 1x XPG Spectrix S40G 512GB NVMe + 4x WD VelociRaptor WD3000HLFS 300GB HDDs |
Display(s) | LG OLED evo G3 55" - 4K 120 Hz HDR supremacy |
Case | Cooler Master MasterFrame 700 in bench mode |
Audio Device(s) | EVGA Nu Audio (classic) + Sony MDR-V7 cans |
Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
Mouse | Razer DeathAdder Essential Mercury White |
Keyboard | Redragon Shiva Lunar White |
Software | Windows 11 Enterprise 23H2 |
Benchmark Scores | "Speed isn't life, it just makes it go faster." |
> AMD doesn't believe in ray-tracing. They said it will be fully supported and available only in the cloud.
> So, not mainstream and not for you.
> View attachment 270730
You're taking this (very old, pre-RDNA2) slide quite out of context. AMD isn't going to be running raytracing server farms for Radeon owners; the cloud computing mentioned there is aimed at the application-specific market.
And not for me? Not sure what you mean by that. I know I'm only a hobo still running an RTX 3090 (smh, no 4090 yet, what am I, poor?), but I enjoy raytraced gaming, even if my wooden GPU only manages 100 fps or whatever at 1080p without frame generation.

I see people talking about cost per mm² as the driver of production costs. Sure, that has increased. But did you even look at the die area each GPU actually uses?
- RTX 3080 (GA102): 628.4 mm²
- RTX 4080 (AD103): 379 mm²
Even if the per-mm² cost has increased, the die area has shrunk drastically, by roughly 40%. That definitely does NOT justify the massive price increase on these cards. Nvidia is being greedy; it's a corporation, after all, so we expect that. The real problem is that AMD isn't being competitive, and neither is Intel (in this high-end space). Nvidia has the market by the balls: you don't buy AMD because it isn't very future-proof, and you don't buy Nvidia (but you will) because it's too expensive.
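To put rough numbers on that, here's a back-of-the-envelope sketch. The wafer prices below are loose assumptions on my part (neither Samsung 8N nor TSMC 4N contract pricing is public); only the die areas come from the figures above, and the dies-per-wafer estimate ignores edge loss and defect yield.

```python
import math

WAFER_DIAMETER_MM = 300  # standard 300 mm wafer

def dies_per_wafer(die_area_mm2: float) -> int:
    """Crude estimate: wafer area / die area, ignoring edge loss and yield."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

# (die area in mm², ASSUMED wafer price in USD -- hypothetical figures)
dies = {
    "GA102 / RTX 3080 (Samsung 8N)": (628.4, 6_000),
    "AD103 / RTX 4080 (TSMC 4N)":    (379.0, 17_000),
}

for name, (area, wafer_cost) in dies.items():
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${wafer_cost / n:.0f} per die")
```

Even under an assumed near-tripling of the wafer price, the raw per-die cost doesn't come close to tripling, let alone explaining a $699 to $1,199 MSRP jump.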
To those who say they don't believe in Ray Tracing: go live on Intel integrated graphics and tell me you're still fine with it for gaming. Graphics moves forward. Ray Tracing solves problems that typical shader-based raster techniques scale badly on, problems we've struggled to remedy for a decade without dedicated hardware. Traditional triangle rasterization is at its limit of efficiency, and, counterintuitive as it sounds, taking the Ray Tracing route is about making certain effects MORE efficient: otherwise you have to brute-force them with traditional shader programs, which end up slower (grab a GTX 1080 and run Quake 2 RTX on it; the entire ray tracing stack runs in shaders there).
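For a concrete sense of what "running the ray tracing stack in shaders" means, here's a minimal toy sketch of my own (not Quake 2 RTX's actual code) of the innermost operation every ray tracer performs. RT cores exist to run tests like this, plus BVH traversal, in fixed-function hardware instead of stealing shader ALU time:

```python
import math
from typing import Optional

def ray_sphere_hit(origin, direction, center, radius) -> Optional[float]:
    """Return distance t along the ray to the nearest hit, or None on a miss."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t (a quadratic).
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None

# One ray against one sphere; a ray-traced frame fires millions of these.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```

Without dedicated hardware, every one of those millions of per-frame tests competes with regular shading work for the same ALUs, which is exactly why Quake 2 RTX crawls on a GTX 1080.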
The explanation for the die sizes is quite straightforward: the RTX 4080 is built on a much more advanced lithography node and uses a lower-segment ASIC (AD103) rather than the top-tier die (AD102), while the RTX 3080 used a seriously cut-down GA102: only 68 of the 84 SMs present in GA102 are enabled on an RTX 3080. That number rises to 70 on the 3080 12GB, 80 on the 3080 Ti and 82 on the 3090, with only the 3090 Ti getting a fully enabled processor. Using heavily harvested (i.e. salvage-grade) large dies with several disabled units tends to hurt power efficiency, and with first-generation GDDR6X memory on top, it's no wonder the original Ampere has seen better days.
But indeed, I agree: Jensen has raised prices quite a bit this generation. Not only that, Nvidia has also left an insane amount of room for refresh SKUs, including a comfortable gap above the RTX 4090 for an eventual 4090 Ti or a potential 30th-anniversary Titan Ada, since the RTX 4090 has only 128 of AD102's 144 SMs enabled.
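Putting those enabled-unit counts side by side makes the headroom obvious; a quick sketch using the figures above (the full AD102 has 144 SMs):

```python
# Enabled SM counts per SKU, taken from the figures quoted above.
skus = {
    "RTX 3080 (GA102)":      (68, 84),
    "RTX 3080 12GB (GA102)": (70, 84),
    "RTX 3080 Ti (GA102)":   (80, 84),
    "RTX 3090 (GA102)":      (82, 84),
    "RTX 3090 Ti (GA102)":   (84, 84),
    "RTX 4090 (AD102)":      (128, 144),
}
for name, (enabled, total) in skus.items():
    print(f"{name}: {enabled}/{total} SMs enabled ({enabled / total:.0%})")
```

Compare the RTX 3090 at 98% of GA102 with the RTX 4090 at only 89% of AD102: there's far more silicon left in reserve for a refresh this time around.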