
NVIDIA Plans GeForce RTX 4060 Launch for Summer 2023, Performance Rivaling RTX 3070

Joined
Jun 10, 2014
Messages
2,900 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
The products themselves aren't the disaster. Their pricing is. And so is the insult of being told that it's okay.
If we want lower prices, then AMD needs to show serious competition, and not just on paper: they need high availability of better-priced cards globally.

And if the RTX 4080 is a "pricing disaster", is the RX 6950 XT one too? And whichever you choose, are you basing that conclusion on MSRP, the pricing in TPU's reviews, or your local pricing?
Because these can lead to very different conclusions. The RX 6950 XT shows up as a better value in TPU's reviews because they use Newegg's pricing at the time of writing, and right now Newegg has a couple of RX 6950 XTs well below MSRP. So is that price representative of the global market?
In my area, pricing for the RX 6950 XT is all over the place and varies a lot day by day, with very few in stock at or near MSRP. Many are priced comparably to the RTX 4080.
When I compare products, I base my conclusions on US MSRP, which isn't perfect, but is probably still more representative for a relative comparison than any specific shop's pricing (at least now that shops have stock again).
 

ARF

Joined
Jan 28, 2020
Messages
3,934 (2.55/day)
Location
Ex-usa
If we want lower prices, then AMD needs to show serious competition, and not just on paper: they need high availability of better-priced cards globally.

And if the RTX 4080 is a "pricing disaster", is the RX 6950 XT one too? And whichever you choose, are you basing that conclusion on MSRP, the pricing in TPU's reviews, or your local pricing?
Because these can lead to very different conclusions. The RX 6950 XT shows up as a better value in TPU's reviews because they use Newegg's pricing at the time of writing, and right now Newegg has a couple of RX 6950 XTs well below MSRP. So is that price representative of the global market?
In my area, pricing for the RX 6950 XT is all over the place and varies a lot day by day, with very few in stock at or near MSRP. Many are priced comparably to the RTX 4080.
When I compare products, I base my conclusions on US MSRP, which isn't perfect, but is probably still more representative for a relative comparison than any specific shop's pricing (at least now that shops have stock again).

Well, the RX 6950 XT is poor value.
Prices in Germany in euro:

Radeon RX 6800 - 499.00
Radeon RX 6800 XT - 648.00
Radeon RX 6900 XT - 699.00
Radeon RX 6950 XT - 880.00

In this group the RX 6800 is the sweet deal if you don't care about the relatively painful performance drop.
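For what it's worth, the gap is easy to put in numbers: divide each card's price by a relative performance index. A quick sketch in Python, using the German prices listed above; note the performance indices below are hypothetical placeholders for illustration, not benchmark data.

```python
# Euros-per-performance sketch for the German prices listed above.
# The performance indices are hypothetical placeholders (RX 6800 = 100),
# NOT measured benchmark data; substitute real numbers from reviews.
prices_eur = {
    "RX 6800": 499.00,
    "RX 6800 XT": 648.00,
    "RX 6900 XT": 699.00,
    "RX 6950 XT": 880.00,
}
perf_index = {
    "RX 6800": 100,   # baseline
    "RX 6800 XT": 115,
    "RX 6900 XT": 122,
    "RX 6950 XT": 130,
}
for card, price in prices_eur.items():
    print(f"{card}: {price / perf_index[card]:.2f} EUR per performance point")
```

With these assumed indices, the RX 6800 comes out cheapest per performance point and the RX 6950 XT most expensive, which matches the "sweet deal" read above.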

 
Joined
Jan 14, 2019
Messages
9,839 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
If we want lower prices, then AMD needs to show serious competition, and not just on paper: they need high availability of better-priced cards globally.

And if the RTX 4080 is a "pricing disaster", is the RX 6950 XT one too? And whichever you choose, are you basing that conclusion on MSRP, the pricing in TPU's reviews, or your local pricing?
Because these can lead to very different conclusions. The RX 6950 XT shows up as a better value in TPU's reviews because they use Newegg's pricing at the time of writing, and right now Newegg has a couple of RX 6950 XTs well below MSRP. So is that price representative of the global market?
In my area, pricing for the RX 6950 XT is all over the place and varies a lot day by day, with very few in stock at or near MSRP. Many are priced comparably to the RTX 4080.
When I compare products, I base my conclusions on US MSRP, which isn't perfect, but is probably still more representative for a relative comparison than any specific shop's pricing (at least now that shops have stock again).
I don't know where you live, but here in the UK, AMD offers better prices than Nvidia all across their product range. The 3060 is about £50-100 more expensive than the 6600, the 3070 is £100-150 more expensive than the 6700 XT, the 3080 is also £150 more expensive than the 6800 XT, and the 6900 XT is £50 cheaper than the 12 GB 3080. What more do you want?
 

ARF

Joined
Jan 28, 2020
Messages
3,934 (2.55/day)
Location
Ex-usa
I don't know where you live, but here in the UK, AMD offers better prices than Nvidia all across their product range. The 3060 is about £50-100 more expensive than the 6600, the 3070 is £100-150 more expensive than the 6700 XT, the 3080 is also £150 more expensive than the 6800 XT, and the 6900 XT is £50 cheaper than the 12 GB 3080. What more do you want?

Lower prices from AMD. The 6800 XT is still too expensive compared to the historical trend of prices falling as a card ages; it is still at around its two-year-old original MSRP.
AMD sells Navi 21 from 499 to 880 euros depending on the binning, so there is plenty of room for price cuts on the 6800 XT, 6900 XT, and 6950 XT.
 
Joined
Dec 25, 2020
Messages
4,596 (3.79/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA
Keyboard Logitech K400 Plus
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
What unmitigated disaster?
The RTX 4080 and RTX 4090 have proven to be great performers; the RTX 4080 performed better than the "leaks" were projecting. The pricing can change if AMD offers some serious competition.

As for the rest of the product lineup, we actually don't know yet. But let's use any opportunity to bash Nvidia prematurely anyway!

I am not bashing prematurely. The RTX 4090 costs about 60% more than it should, and the RTX 4080 is irrationally priced considering the 7900 XTX is likely going to whoop it - and they still wanted to charge $900 for an even slower version of it based on a midrange chip(!). And no, NVIDIA doesn't do price wars anymore. They'd rather limit volume and spam SKUs than lower prices, and the Ada lineup has PLENTY of space for SKU spam, above and below both the 4080 and 4090, including room for a faster AD103-based card and a SIGNIFICANTLY faster AD102-based card above the RTX 4090.

Like I said - I'm not an NVIDIA stakeholder or Jensen's dad; I want a competitively priced yet high-quality product. NVIDIA cannot deliver that this generation. The cards may be high quality, but the price is an absurdity, and the black-box ecosystem makes it even worse. I'll pass, and this is coming from an RTX 3090 owner.

I owned a Radeon VII back in the day. Lovely card, but Vega 20 was never a gaming GPU; it's little wonder that specific processor became the bedrock of the CDNA architecture. It also didn't cost thousands of dollars, which goes a long way toward excusing it.
 
Joined
Jan 14, 2019
Messages
9,839 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
Lower prices from AMD. The 6800 XT is still too expensive compared to the historical trend of prices falling as a card ages; it is still at around its two-year-old original MSRP.
AMD sells Navi 21 from 499 to 880 euros depending on the binning, so there is plenty of room for price cuts on the 6800 XT, 6900 XT, and 6950 XT.
They're still cheaper than the equal offerings from Nvidia.
 
Joined
Jun 10, 2014
Messages
2,900 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I am not bashing prematurely. The RTX 4090 costs about 60% more than it should, and the RTX 4080 is irrationally priced considering the 7900 XTX is likely going to whoop it
I too think the pricing is way off. ~$1,000 for the RTX 4090 and ~$800 for the RTX 4080 is the maximum I think makes sense, and that's even factoring in the high inflation we have right now.
But as I've been saying, the solution is more competition, real competition. AMD needs to ship enough volume to make a dent in Nvidia's sales across markets; then prices will drop rapidly.

NVIDIA doesn't do price wars anymore. They'd rather limit volume and spam SKUs than lower prices, and the Ada lineup has PLENTY of space for SKU spam…
This is just nonsense.
Nvidia isn't limiting volume.

and they still wanted to charge $900 for an even slower version of it based on a midrange chip(!)…
And this is where you venture into nonsense territory.
The chip segmentation is fairly arbitrary and varies from generation to generation, so the term "midrange chip" makes no sense. Even the naming of the chips just reflects the order in which they were designed; it says nothing about whether a chip will end up in a mid-range product. So if AD103 performs like a high-end chip, then it's a high-end chip.

Keep in mind that back in the Kepler era, GK104 was used for the GTX 680 (because GK100 was faulty). In the Maxwell generation, GM204 was used for the GTX 980, the original top model of the lineup. The same goes for Pascal: the GTX 1080 (GP104) was the top model for almost a year until the GTX 1080 Ti arrived.
 
Joined
Dec 25, 2020
Messages
4,596 (3.79/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA
Keyboard Logitech K400 Plus
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
I too think the pricing is way off. ~$1,000 for the RTX 4090 and ~$800 for the RTX 4080 is the maximum I think makes sense, and that's even factoring in the high inflation we have right now.
But as I've been saying, the solution is more competition, real competition. AMD needs to ship enough volume to make a dent in Nvidia's sales across markets; then prices will drop rapidly.

This is just nonsense.
Nvidia isn't limiting volume.

And this is where you venture into nonsense territory.
The chip segmentation is fairly arbitrary and varies from generation to generation, so the term "midrange chip" makes no sense. Even the naming of the chips just reflects the order in which they were designed; it says nothing about whether a chip will end up in a mid-range product. So if AD103 performs like a high-end chip, then it's a high-end chip.

Keep in mind that back in the Kepler era, GK104 was used for the GTX 680 (because GK100 was faulty). In the Maxwell generation, GM204 was used for the GTX 980, the original top model of the lineup. The same goes for Pascal: the GTX 1080 (GP104) was the top model for almost a year until the GTX 1080 Ti arrived.

They aren't now (beyond holding back lower SKUs in the stack to move the remaining Ampere stock, the same "ancient" GPUs they call unworthy of DLSS 3; but as a certified poor, I digress), but they would rather do that than engage in a price war, especially considering how much room for SKUs they have.

On the last bit, nonsense? I wasn't the one who announced a card and then unlaunched it; that was NVIDIA. No matter how you slice it, even in those older generations (the GTX 980 wasn't the top model for long; two GPUs ended up above it, the 980 Ti and the Titan X), the xx104-class cards have always been the middle of the pack. Even with Kepler, the GTX 600 series was quickly complemented by the 700 series, which introduced the GK110, sizably faster than the GK104. Similarly, the GTX 500 series (and the 580) launched only eight months after the GTX 480 and fixed the 400 series' terrible thermals; it was wise of NVIDIA at the time not to repeat the GF100.

Anyway, the ill-fated 4080 12 GB (full AD104) was no different: relative to the full AD102 it has about 40% of the shader count, and NVIDIA quickly realized it wasn't going to stick. If they had gone through with it, the 4080 12 GB would have been laughed out of every publication, which would only have hyped the 7900 XT instead. The 103 segment is new; in Ampere it was only used for the RTX 3060 Ti in a heavily cut-down configuration, and for the mobile RTX 3080 Ti. Like AD103 next to AD102, GA103 was a smaller, less powerful processor than GA102. You could call it high-end, but it was never intended to lead the pack either. It might have worked... if AMD hadn't crashed the party in that segment with the 7900 XT, which gives you a Navi 31 chip with one MCD and some shaders disabled, and should perform more than competitively with the RTX 4080 16 GB.
 

ARF

Joined
Jan 28, 2020
Messages
3,934 (2.55/day)
Location
Ex-usa
They're still cheaper than the equal offerings from Nvidia.

First we have to define what "equal" actually means. Maybe for some people the RTX 3060 is "equal" to the RX 6800 XT.
We know there is an enormous performance difference, but they can argue that the RTX 3060 is much cheaper because, you know...

Look, there is an 80/20 market share split against AMD.
AMD has always been the cheaper, value option, and despite this, it has not been enough to improve the company's market situation.
So AMD needs to step up with something completely different, on top of the discounts, of course.
 
Joined
Oct 15, 2010
Messages
951 (0.19/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
First we have to define what "equal" actually means. Maybe for some people the RTX 3060 is "equal" to the RX 6800 XT.
We know there is an enormous performance difference, but they can argue that the RTX 3060 is much cheaper because, you know...

Look, there is an 80/20 market share split against AMD.
AMD has always been the cheaper, value option, and despite this, it has not been enough to improve the company's market situation.
So AMD needs to step up with something completely different, on top of the discounts, of course.
You need AMD to be much more competitive so that Nvidia lowers their prices and you can then buy Nvidia? That's not how it works. What needs to happen is that Nvidia loses market share to AMD, and for that to happen, people need to buy more from AMD.

From the 6800 (even the 6800 XT at $520) and below, AMD is wayyyyyy better value than the Nvidia counterparts, but hey, Nvidia mindshare doesn't understand how an RX 6600 is light-years ahead of an RTX 3060 in value.
 

ARF

Joined
Jan 28, 2020
Messages
3,934 (2.55/day)
Location
Ex-usa
You need AMD to be much more competitive so that Nvidia lowers their prices and you can then buy Nvidia? That's not how it works. What needs to happen is that Nvidia loses market share to AMD, and for that to happen, people need to buy more from AMD.

From the 6800 (even the 6800 XT at $520) and below, AMD is wayyyyyy better value than the Nvidia counterparts, but hey, Nvidia mindshare doesn't understand how an RX 6600 is light-years ahead of an RTX 3060 in value.

This means that Nvidia has created a large fanbase loyal to the brand, and also a brand recognisable as the "go-to" in the graphics card market, no matter the performance and no matter the price.
It's like voodoo black magic or something...
 
Joined
Dec 25, 2020
Messages
4,596 (3.79/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA
Keyboard Logitech K400 Plus
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
This means that Nvidia has created a large fanbase loyal to the brand, and also a brand recognisable as the "go-to" in the graphics card market, no matter the performance and no matter the price.
It's like voodoo black magic or something...

This is not very far from the truth; however, AMD also has its diehards and loyalists. It's just that the segments loyal to AMD are either people who have wised up to and been repulsed by NVIDIA's predatory business practices, old-timers nostalgic for ATI, or Linux users. All combined, it's a very small minority of people.

NVIDIA's marketing machine is extraordinarily effective. When someone mentions ray tracing, what comes to mind? RTX. When someone brings up upscaling solutions, what comes to mind? DLSS. By successfully capturing the public's attention, they have built a perceived trust - and are now capitalizing on the brand name.
 
Joined
Sep 13, 2021
Messages
86 (0.09/day)
This is not very far from the truth; however, AMD also has its diehards and loyalists. It's just that the segments loyal to AMD are either people who have wised up to and been repulsed by NVIDIA's predatory business practices, old-timers nostalgic for ATI, or Linux users. All combined, it's a very small minority of people.
Creatives have no alternative; nVidia rules due to its broad software support. 4090 sales are fine, the price is OK.

NVIDIA's marketing machine is extraordinarily effective. When someone mentions ray tracing, what comes to mind? RTX. When someone brings up upscaling solutions, what comes to mind? DLSS. By successfully capturing the public's attention, they have built a perceived trust - and are now capitalizing on the brand name.
Ray tracing was introduced into gaming by nVidia. DLSS was introduced into gaming by nVidia.
Being the technology leader in HPC/visualization and AI has its perks. AMD lags behind here. These are not marketing tricks; AMD's future depends on catching up. These technologies grow in importance from generation to generation. For the mid-tier gamer this may not matter: if you only want rasterization, AMD is the better solution. If you want more than just playing with the card, or you like RT effects and eye candy, nVidia has a clear advantage.
If the 4080 16 GB came in at $700, as some here are demanding, AMD could pack up. If the 4080's rasterization is about the same as the 7900 XTX's at $1,000, then $1,100 for the 4080 would be OK compared to AMD, given the better RT and Tensor cores. We will see.
 
Joined
Dec 25, 2020
Messages
4,596 (3.79/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA
Keyboard Logitech K400 Plus
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Creatives have no alternative; nVidia rules due to its broad software support. 4090 sales are fine, the price is OK.


Ray tracing was introduced into gaming by nVidia. DLSS was introduced into gaming by nVidia.
Being the technology leader in HPC/visualization and AI has its perks. AMD lags behind here. These are not marketing tricks; AMD's future depends on catching up. These technologies grow in importance from generation to generation. For the mid-tier gamer this may not matter: if you only want rasterization, AMD is the better solution. If you want more than just playing with the card, or you like RT effects and eye candy, nVidia has a clear advantage.
If the 4080 16 GB came in at $700, as some here are demanding, AMD could pack up. If the 4080's rasterization is about the same as the 7900 XTX's at $1,000, then $1,100 for the 4080 would be OK compared to AMD, given the better RT and Tensor cores. We will see.

People have more money than sense. I only bought a 3090 back then because I saw a window at launch; I had the money and I took it. A week later, the crypto-mining boom sent GPU prices skyrocketing, and at its height the card sold for almost triple the already absurd sum I spent on it. But that doesn't make the price fine, even if sales are as expected for corporate. There's little justification beyond "just 'cause we can" for its pricing.

Not to mention I'm not so sure about that either: after stores sold the initial batch of 4090s that reached my country, I haven't seen any restocks yet...

And no, NVIDIA just seized the moment. They didn't invent ray-traced graphics; they were just first to market with a product ready for it. AMD, Intel, and NVIDIA all worked with Microsoft to design the specification. Also... the 4080 16 GB isn't enough to make AMD pack up even if they didn't have something better than the 6900 XT on the way. It's just not that good a product.
 

ARF

Joined
Jan 28, 2020
Messages
3,934 (2.55/day)
Location
Ex-usa
People have more money than sense. I only bought a 3090 back then because I saw a window at launch; I had the money and I took it. A week later, the crypto-mining boom sent GPU prices skyrocketing, and at its height the card sold for almost triple the already absurd sum I spent on it. But that doesn't make the price fine, even if sales are as expected for corporate. There's little justification beyond "just 'cause we can" for its pricing.

Not to mention I'm not so sure about that either: after stores sold the initial batch of 4090s that reached my country, I haven't seen any restocks yet...

And no, NVIDIA just seized the moment. They didn't invent ray-traced graphics; they were just first to market with a product ready for it. AMD, Intel, and NVIDIA all worked with Microsoft to design the specification. Also... the 4080 16 GB isn't enough to make AMD pack up even if they didn't have something better than the 6900 XT on the way. It's just not that good a product.

It is very possible that RTX is the new PhysX and will share the same fate.

People have more money than sense. I only bought a 3090 back then because I saw a window at launch; I had the money and I took it. A week later, the crypto-mining boom sent GPU prices skyrocketing, and at its height the card sold for almost triple the already absurd sum I spent on it.

Did you sell it for a large profit?
 
Joined
Dec 14, 2011
Messages
944 (0.21/day)
Location
South-Africa
Processor AMD Ryzen 9 5900X
Motherboard ASUS ROG STRIX B550-F GAMING (WI-FI)
Cooling Corsair iCUE H115i Elite Capellix 280mm
Memory 32GB G.Skill DDR4 3600Mhz CL18
Video Card(s) ASUS RTX 3070 Ti TUF Gaming OC Edition
Storage Sabrent Rocket 1TB M.2
Display(s) Dell S3220DGF
Case Corsair iCUE 4000X
Audio Device(s) ASUS Xonar D2X
Power Supply Corsair AX760 Platinum
Mouse Razer DeathAdder V2 - Wireless
Keyboard Corsair K70 MK.2 Low-Profile Rapidfire
Software Microsoft Windows 11 Pro (64-bit)
And the 1070 was on par with the 980 Ti; it seems a lot of people have forgotten what a true generational leap is.

Indeed; hopefully some hardware website will make a chart like this, to put things back into perspective and make people aware of what a "ride" nGreedia is taking them for.
 
Joined
Sep 13, 2021
Messages
86 (0.09/day)
People have more money than sense.
No, they have other opinions. But there is a disturbing tendency regarding the ability to objectively discuss differing opinions.

I only bought a 3090 back then because I saw a window at launch; I had the money and I took it. A week later, the crypto-mining boom sent GPU prices skyrocketing, and at its height the card sold for almost triple the already absurd sum I spent on it. But that doesn't make the price fine, even if sales are as expected for corporate. There's little justification beyond "just 'cause we can" for its pricing.
Has anyone here claimed that cryptomining has ensured fair prices for gaming GPUs?

Not to mention I'm not too sure about that either. After the stores sold the initial batch of 4090s that got to my country, I haven't seen any restocks occur yet...
There are many sources claiming the 4080 is in stock while the 4090 sold out fast. So what is your argument here?

And no, NVIDIA just seized the moment. They didn't invent raytraced graphics. They just were the first to market with a product ready for it. AMD, Intel, and NV worked with Microsoft to design the specification. Also... the 4080 16GB isn't enough to make AMD pack up even if they didn't have something better than the 6900 XT on the way. It's just not that good a product.

"Ray tracing was introduced by nVidia into gaming". 100% fact. Your answer assumes I said otherwise. Can you explain why? Raytracing is industry standard visualization for a long time. Due to high computing demand, it was for professionals only before nVidia implemented it to the rtx 20x0 series.
You failed to explain your statement. I said: 4080 for 700$ and 7900 would be DOA for 900-1000$, if they have the same rasterization power. Have you any argument?

It is very possible that RTX is the new PhysX and will share the same fate.
Wrong. GPUs without hardware RT support can now only be found in the entry-level class. The higher the price, the more important RT performance and broad software support become. AMD will lose the enthusiast customer base if they don't catch up on RT, just like they lost the pro segment. It won't be long before the first games developed on the new generation of game engines hit the market. Then it gets serious.
 

ARF

Joined
Jan 28, 2020
Messages
3,934 (2.55/day)
Location
Ex-usa
Wrong. GPUs without hardware RT support can now only be found in the entry-level class. The higher the price, the more important RT performance and broad software support become. AMD will lose the enthusiast customer base if they don't catch up on RT, just like they lost the pro segment. It won't be long before the first games developed on the new generation of game engines hit the market. Then it gets serious.

I disagree.
RT is too far from being practical; you can't run it without very deep and aggressive upscaling (DLSS and the like)...

I am also an enthusiast, and I don't care about ray tracing. Traditional lighting is good enough for me; games have much more serious problems than lighting alone.
This is why I can't wait to order a brand-new AMD Radeon RX 7900 XT 20 GB.
 
Joined
Aug 10, 2021
Messages
166 (0.17/day)
System Name Main
Processor 5900X
Motherboard Asrock 570X Taichi
Memory 32GB
Video Card(s) 6800XT
Display(s) Odyssey C49G95T - 5120 x 1440
I will......

When hell freezes over.
so now?
:D:p


"Ray tracing was introduced by nVidia into gaming". 100% fact. Your answer assumes I said otherwise. Can you explain why? Raytracing is industry standard visualization for a long time. Due to high computing demand, it was for professionals only before nVidia implemented it to the rtx 20x0 series.
Ehh, nah, kinda. It's a vague half-truth. Saying it like that makes it sound like nVidia just straight-up "invented" RT for gaming/real-time, which isn't really the truth.
Do you think MS just scraped together DXR in the month between the 2080 release and DXR's public release? Or that the PS5 and Xbox just added (weak) RT capability in the two years since (or RDNA2 in the 6000 series; inb4 "it is weak", yes, not the point)?
Yes, nVidia was first to market with an RT-"capable" product, and did some extremely good marketing work by using the RTX branding for their initial RT.


Wrong. GPUs without hardware-supported RT can only be found in the entry-level class. The higher the price, the more important the RT performance and a broad software support.
lmao, I quoted this in between it being edited, so it's a bit mangled

It's too early to say if it'll get the "PhysX" treatment, but I doubt it. Given that RT isn't a vendor-specific product but is exposed through the APIs (DXR, Vulkan RT), it'll probably spread and remain usable in the future (a matter for discussion, of course). As it looks now, it'll probably end up either as the full, sole light source, or used in a hybrid way (but more so) like now, or both.
 
Joined
Sep 13, 2021
Messages
86 (0.09/day)
so now?
:D:p


ehh, nah, kinda. It's a vague half-truth. Saying it like that makes it sound like nVidia just straight up "invented" RT for gaming/real-time, which isn't really the truth.
I said nVidia introduced RT to gaming, which is 100% factual.

Do you think MS just scraped together DXR in the month between the 2080 release and DXR's public release? Or that PS5 and Xbox just added (weak) RT capability in the two years since (or RDNA2 in the 6000 series (inb4 "it is weak" - yes, not the point))?
Yes, nVidia was first to market with an RT-"capable" product, and did some extremely good marketing work using the RTX branding for their initial RT cards.
Game engine development needs about 4-5 years to catch up. The better effects are coming over the next few years. PS5 and Xbox won't be able to offer good RT anytime soon. Whoever wants cutting-edge graphics needs cutting-edge GPUs, not mainstream ones.
 

Deleted member 185088

Guest
Indeed; hopefully there will be a hardware website somewhere that will make a chart like this, to put things back into perspective and hopefully make people aware of what a "ride" nGreedia is taking them for.
You can check AdoredTV's videos; he was one of the few who didn't fall for nVidia's marketing for Ampere.
 
Joined
Sep 13, 2021
Messages
86 (0.09/day)
I disagree.
RT is too far away from being reality, you can't run it without very deep and aggressive upscaling (DLSS and similar)...

I am also an enthusiast, and I don't care about ray-tracing - traditional lighting is good enough for me, and games have much more serious problems than lighting alone.
This is why I can't wait to order a brand new AMD Radeon RX 7900 XT 20 GB.
They all have problems, because RT effects have so far been retrofitted into mature game engines. The next generation of game engines has RT as an integral part, which will give a different level of quality.
 
Joined
Apr 14, 2018
Messages
459 (0.21/day)
It is very possible that RTX is the new PhysX and will share the same fate.

Anything with a closed ecosystem is bad. Eventually everything will be ray traced, and RTX will certainly be a thing of the past.
 
Joined
Dec 25, 2020
Messages
4,596 (3.79/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1G
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA
Keyboard Logitech K400 Plus
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
Did you sell it for a large profit?

No. I kept it because I needed it, and I still do.

Also, I don't think it is going anywhere soon. DirectX Raytracing is not a closed NVIDIA-only thing, and since Ampere it's supported from the bottom up through every tier - and it works.
 
Joined
Jun 10, 2014
Messages
2,900 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
They aren't now (beyond holding back lower SKUs in the stack to move the same Ampere stock they call ancient GPUs unworthy of DLSS 3, but as a certified poor, I digress), but they would rather do that than engage in a price war, especially considering how much room for SKUs they have.
So you're saying Nvidia could potentially exploit something so we are going to assume they are evil?
Where is the evidence of Nvidia holding back the lower SKUs? (beyond nonsense from some YouTube channels)
It's normal that lower chips follow in sequence. I thought people would remember this by now.

On the last bit, nonsense? I wasn't the one who announced a card and then unlaunched it, it was NVIDIA.
Renaming a product due to market backlash? How is this relevant to your claims?

No matter where you put it, even in these generations of old (the GTX 980 wasn't the top model, there were two GPUs above it, the 980 Ti and the Titan X),
GTX 980 was the top model for about half a year, and it remained in the high-end segment until it was succeeded by Pascal.

the xx104-class cards have always been the middle-of-the-pack ones. Even with Kepler, the GTX 600 series was quickly complemented by the 700 series, which introduced the GK110, sizably faster than the GK104. Similarly, the GTX 500 series (and the 580) launched only 8 months after the GTX 480 and fixed the 400 series' terrible thermals; it was wise of NVIDIA at the time not to repeat the GF100.
The mid-range cards of the 600 series used both GK106 and GK104 chips.
The 600 series was "short-lived" compared to the current release tempo. Back then Nvidia used to release a full generation and a refreshed generation (with new silicon) every ~1.25-1.5 years or so.
Geforce GTX 480 was delayed due to at least three extra steppings.

And back in the 400 series they used the GF100 chip in the GTX 465, which scaled terribly.
You should spend some time looking through the list of Nvidia GPUs. The naming is arbitrary; in one generation an 06 chip is the lowest, in others the 08 chip is. What they do is design the biggest chip in the family first, then "cut down" the design into as many chips as they want, and name them accordingly: 0, 2, 3, 4, 6, 7, 8. Sometimes they make it even more complicated with 110, 114, etc., which seem like minor revisions to 100 and 104 respectively.
So listen and learn, or keep digging…
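The naming pattern above can be sketched as a toy decoder (purely illustrative, nothing official from Nvidia; the chip codenames are real, but the parsing rules just mirror the pattern described in the post):

```python
# Toy decoder for the GFxxx/GKxxx codenames discussed above.
# Assumed pattern: two-letter family prefix, a two-digit design number
# ("10" = original, "11" = refresh, e.g. GF110 vs GF100), and a final
# tier digit where LOWER means a BIGGER chip (0 > 4 > 6 > 8).

FAMILIES = {"GF": "Fermi", "GK": "Kepler"}

def decode(codename: str):
    family = FAMILIES.get(codename[:2], "unknown")
    revision = codename[2:4]   # "10" original design, "11" refresh (110, 114, ...)
    tier = int(codename[4])    # 0 = biggest die, 8 = smallest
    return family, revision, tier

# The biggest die is designed first, then cut down into smaller chips,
# so sorting by tier digit orders the family from big chip to small chip:
fermi = ["GF104", "GF100", "GF108", "GF106"]
print(sorted(fermi, key=lambda c: decode(c)[2]))
# ['GF100', 'GF104', 'GF106', 'GF108']
```

Note that GK110 decodes as a "refresh-numbered" big die even though a GK100 never shipped, which fits the post's hedge that the 110/114 numbers only "seem like" minor revisions.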


AMD has always been the cheaper, value option and despite this, it is not enough to improve the market situation of the company.
So, AMD needs to step up with something completely different, plus the discounts, of course.
This might be your impression, but it doesn't match reality. Back in the ATI days they used to offer higher value in the upper mid-range to lower high-end segments, but since then they have been all over the place.
The Fury cards didn't start things off well: low availability and a high price. They were followed by the RX 480/580, which were very hard to come by at a good price, compared to the competing GTX 1060, which sold in massive amounts and was still widely available, even below MSRP at times. The RX Vega series was even worse; most have now forgotten that the $400/$500 price tag initially came with a game bundle, and it took months before the cards were somewhat available close to that price. Over the past 5+ years, AMD's supplies have been too low. Quite often the cheaper models people want are out of stock, while Nvidia's counterparts usually are in stock. This is why I said AMD needs to have plenty of supply to gain market share.

We need to stop painting Nvidia/AMD/(Intel) as villains or heroes. They are not our friends, they are companies who want to make money, and given the chance, they will all overcharge for their products.

It is very possible that RTX is the new PhysX and will share the same fate.
RTX is their term for the overarching GPU architecture.

I doubt it will go away until their next major thing.
 