
AMD Plays the VRAM Card Against NVIDIA

Joined
Jan 12, 2023
Messages
190 (0.39/day)
System Name IZALITH (or just "Lith")
Processor AMD Ryzen 7 7800X3D (4.2Ghz base 5.0Ghz boost, -30 PBO offset)
Motherboard Gigabyte X670E Aorus Master Rev 1.0
Cooling Deepcool Gammaxx AG400 Single Tower
Memory Corsair Vengeance 64GB (2x32GB) 6000MHz CL40 DDR5 XMP (XMP enabled)
Video Card(s) PowerColor Radeon RX 7900 XTX Red Devil OC 24GB (3.29Ghz boost -40mV UV)
Storage 2x1TB SSD, 2x2TB SSD, 2x 8TB HDD
Display(s) Samsung Odyssey G51C 27" QHD (1440p 165Hz) + Samsung Odyssey G3 24" FHD (1080p 165Hz)
Case Corsair 7000D Airflow Full Tower
Audio Device(s) Corsair HS55 Surround Wired Headset/LG Z407 Speaker Set
Power Supply Corsair HX1000 Platinum Modular (1000W)
Mouse Corsair RGB Harpoon PRO Wired Mouse
Keyboard Corsair K60 RGB PRO Mechanical Gaming Keyboard
Software Windows 11 Professional
I was going to try to write a more technical post, but I don't have the knowledge to back it up (yet), so I'll just reiterate: corporate marketing, TAME corporate marketing at that, is a silly thing to get into a mud fight over. AMD and Nvidia are not your friends, and you gain nothing by defending them.

From a common-sense perspective: more VRAM is more future-proof. You have more resources available when future games demand more. AMD sells higher-VRAM-equipped cards at more price points than Nvidia, ergo this advantage belongs to them, and it is a valid critique of the competition, just as Nvidia's ray tracing prowess is a future-proof advantage that belongs to Nvidia. Once again, don't get offended; just buy the equipment that better suits your needs. Saddling yourself to a single tribe is doing yourself a disservice, and defending that tribe with empty ad hominems and no data is a good way to look foolish.
 
Joined
Sep 26, 2014
Messages
4 (0.00/day)
As long as AMD matches the rasterization performance, DLSS 2.1, the Reflex feature for competitive games, and the RT performance of Nvidia's mid-range cards (3080/4070/4070 Ti),
while having reliable drivers like they do now, I'll give them a try.

The things stopping me: better support for DLSS, better RT on Nvidia, DLSS 2.1 looking better than FSR 2.1, and Nvidia having Frame Generation for CPU bottlenecks in some games.
Reflex can be replicated by capping the framerate, so that is a non-issue.
Also, AMD sometimes has quirky performance in some games, usually RT performance in new titles, with multiple outliers at times that only get fixed some way down the pipeline.

Nvidia, meanwhile, has always had optimization dialed in for all games, even on beta launch days.
The only major caveat with Nvidia for me at the moment is the lack of VRAM, and a minor con is pricing.
This is the first time I'm considering AMD since the ATI days, so well over a decade.

AMD had better capitalise on Nvidia's big mistake with lacking VRAM.
I am sure I am not the only one who thinks this way.

Currently that only means the RX 7900 series, which is out of my price range, as I am looking at spending at most 700€ including tax.
Waiting for the RX 7800 in June.
That is, as long as Nvidia doesn't release a refresh by then to address the VRAM issue, with a bigger chip coming down a tier.
They can do that when they are selling what is at best a 60-tier card as a 70-tier one.
 
Joined
Jan 14, 2019
Messages
10,032 (5.15/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
As long as AMD matches rasterization performance
They do at every price range with an AMD card in it.

FSR 2.0 equals it in quality and speed. Those who say it doesn't are living in a placebo world.

Reflex feature for competitive games
They haven't got that, but they've got Anti-Lag which is theoretically similar.

RT performance of Nvidia's mid-range cards (3080/4070/4070 Ti)
That's a valid point, although those cards are a fair bit more expensive than their AMD equivalents in raster performance. I'm not sure I'd pay 20-30% more for 10-15% better RT performance.

While having reliable drivers like they do now, I'll give them a try.
They do have reliable drivers. The 5700 XT days are over.

Seriously, if you're looking for an upgrade, consider all the above. ;)
 
Joined
Sep 26, 2014
Messages
4 (0.00/day)
They do at every price range with an AMD card in it.


FSR 2.0 equals it in quality and speed. Those who say it doesn't are living in a placebo world.


They haven't got that, but they've got Anti-Lag which is theoretically similar.


That's a valid point, although those cards are a fair bit more expensive than their AMD equivalents in raster performance. I'm not sure I'd pay 20-30% more for 10-15% better RT performance.


They do have reliable drivers. The 5700 XT days are over.

Seriously, if you're looking for an upgrade, consider all the above. ;)
Multiple techtubers say that DLSS looks better to them, Quality mode vs Quality mode.
That's about it, though.

The only thing I forgot: I'm really excited about RTX Remix for old games.
That's one thing that is keeping me away from going with AMD.

I am debating the RX 6950 XT, but I may just wait and see until June for the RX 7800.
 
Joined
Jan 14, 2019
Messages
10,032 (5.15/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Multiple techtubers say that DLSS looks better to them, Quality mode vs Quality mode.
That's about it, though.
I'd never listen to techtubers, personally. They're all about clickbait and ad revenue these days. I've seen some terrible content with false/biased info come out of even the major ones at some point.

I've seen both DLSS 2.0/2.1 and FSR 2.0 first-hand, and I can't tell the difference. You can also check out game reviews here on TPU and see if there's any difference yourself. Just be careful: you may spot small details on a still image (I'm not saying that you will, though) that would be unrecognisable in a live game.

The only thing I forgot: I'm really excited about RTX Remix for old games.
Yeah, the tech looks promising. I'm just not sure what will become of it. We'll see.

I am debating the RX 6950 XT, but I may just wait and see until June for the RX 7800.
The 6950 XT is a fine card, and its drivers have matured to the point where you can expect zero issues. Waiting for future releases is always a gamble, in my opinion.
 
Joined
Sep 26, 2022
Messages
216 (0.36/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
Waiting for future releases is always a gamble, in my opinion.
Two reasons not to wait:
  1. Low performance/€ improvement (so far)
  2. Growing tensions between China and Taiwan
Bonus:
The 7700/7700 XT will likely have similar performance to the 6800 XT, but with (only) 12 GB of VRAM
 
Joined
Jan 20, 2019
Messages
1,295 (0.67/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
I'm glad AMD's playing the VRAM card, especially with some of the HU stuff I've just viewed in this thread. A bit of a shocker!

AMD had better back up the glad tidings with abundant VRAM offerings on 7600/7800-tier cards. Hopefully, NVIDIA will... nah, toss that; "obsolescence" pays well. The green captain is seeing too much green to be concerned with the minority.

I'm still praying for some messianic intervention ... 16GB+ XTX/4080s rolling for $700/$800. Intel the messiah? Perhaps AMD playing santa with weighty discounts? Or maybe the Green monster with its pandemic-prolonged-price-strategy catching the covid bug (they're certainly asking for it)? who knows.
 
Joined
Sep 15, 2011
Messages
6,484 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
This is so much BS that it's ridiculous. I have yet to play a game that requires more than 10 GB of VRAM. Even the crappiest port ever released, "The Last of Us Part I", is smooth as butter on Ultra with G-Sync on, even if the so-called in-game VRAM usage is around 12.8 GB.
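For anyone who wants to sanity-check claims like this, here's a minimal sketch of how one could log actual VRAM allocation outside the game on an NVIDIA card (this assumes Python with the pynvml bindings installed; note the figure reported is memory allocated, which is not necessarily the amount a game needs to run without stuttering):

Code:
import pynvml  # NVIDIA Management Library bindings (pip install nvidia-ml-py)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # .total / .free / .used, in bytes
print(f"VRAM used: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()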
 
Joined
Mar 19, 2023
Messages
153 (0.36/day)
Location
Hyrule Castle, France
Processor Ryzen 5600x
Memory Crucial Ballistix
Video Card(s) RX 7900 XT
Storage SN850x
Display(s) Gigabyte M32U - LG UltraGear+ 4K 28"
Case Fractal Design Meshify C Mini
Power Supply Corsair RM650x (2021)
I have yet to play a game that requires more than 10 GB of VRAM.
The keyword being "yet".
In just 3 months, we've had 4 large games where 8 GB started being a serious problem. I predict that the trend isn't going to stop at all during the next two years.
We'll see just how the 10 GB 3080 lasts, and I think the 12 GB 4070/Ti will fare worse. At least with the 3080 you had 2 good years of PS4-era holdovers until the requirements climbed hard. The 4070s feel sufficient "for now", I'm sure, but will that even last 2 years? I highly doubt it.
 
Joined
Apr 14, 2022
Messages
670 (0.88/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Asus XG35VQ
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Software Windows 11 64bit
For a gaming PC, CPUs have become stupidly strong; a few years ago I'd be hyped, but right now, for gaming, you really need a 4-5 year upgrade plan... I'd rather save money, take out a mortgage and get an RTX 6090 than a Zen 6, for sure.

For GPUs, if they weren't so damn expensive I'd be changing them more often, but at least they are the ones making a real gaming difference.

I wouldn't say the CPUs have become strong enough. Yes, their MT performance is amazing, but they are miles behind in supporting high-end GPUs like the 4080/4090/7900 XT/XTX at ''low'' resolutions.

The GPUs increase their performance drastically every generation while the CPUs do not.

The keyword being "yet".
In just 3 months, we've had 4 large games where 8 GB started being a serious problem. I predict that the trend isn't going to stop at all during the next two years.
We'll see just how the 10 GB 3080 lasts, and I think the 12 GB 4070/Ti will fare worse. At least with the 3080 you had 2 good years of PS4-era holdovers until the requirements climbed hard. The 4070s feel sufficient "for now", I'm sure, but will that even last 2 years? I highly doubt it.

In 3 months, 4 console games have been released (all of them AMD-sponsored, as someone on the forum would say, but I think it's a coincidence); all of them were broken at launch, all of them got or will get a patch to make them run properly, and none of them justify the VRAM or any other requirement for the graphics they deliver.
While at the same time we had games like A Plague Tale: Requiem and CP77 (even the Overdrive mode) that require less than half the VRAM of the aforementioned games and look light years better.

When Crysis released, no one could play it and no one had a problem with that,
because we all knew it was a game where the devs just put stuff into a graphics engine that couldn't handle it, just to show what they could achieve. Even today, Crysis or ARMA 3 etc. cannot be played at high fps because of their crap engines.

Nvidia doesn't put much VRAM on its cards because they have a ton of uses apart from gaming.
AMD's VRAM is useless to nearly any professional software out there, so it's a cost-free advantage and a line for the presentations and ads.
 
Joined
Jan 14, 2019
Messages
10,032 (5.15/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
I wouldn't say the CPUs have become strong enough. Yes, their MT performance is amazing, but they are miles behind in supporting high-end GPUs like the 4080/4090/7900 XT/XTX at ''low'' resolutions.
Nobody plays at low resolutions with a 4090. If you do, having 200 FPS instead of 300 will mean nothing.

In 3 months, 4 console games have been released (all of them AMD-sponsored, as someone on the forum would say, but I think it's a coincidence); all of them were broken at launch, all of them got or will get a patch to make them run properly, and none of them justify the VRAM or any other requirement for the graphics they deliver.
While at the same time we had games like A Plague Tale: Requiem and CP77 (even the Overdrive mode) that require less than half the VRAM of the aforementioned games and look light years better.

When Crysis released, no one could play it and no one had a problem with that,
because we all knew it was a game where the devs just put stuff into a graphics engine that couldn't handle it, just to show what they could achieve. Even today, Crysis or ARMA 3 etc. cannot be played at high fps because of their crap engines.

Nvidia doesn't put much VRAM on its cards because they have a ton of uses apart from gaming.
AMD's VRAM is useless to nearly any professional software out there, so it's a cost-free advantage and a line for the presentations and ads.
It looks to me like game devs have started increasing system requirements just to tick the RT box lately, instead of actually increasing visual quality, which is sad.
 
Joined
Apr 14, 2022
Messages
670 (0.88/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Asus XG35VQ
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Software Windows 11 64bit
Nobody plays at low resolutions with a 4090. If you do, having 200 FPS instead of 300 will mean nothing.


It looks to me like game devs have started increasing system requirements just to tick the RT box lately, instead of actually increasing visual quality, which is sad.
1440p is not a low resolution, and in most games the 4090 (even the 4080) cannot be fully utilised. And that's because the CPUs are ''slow''.
The term ''slow'' is not accurate, though. The truth is that the CPUs are not fully utilised either, because of development issues.

You're right, most games add RT shadows, reflections, AO etc. when it's not necessary.
E.g. RDR2, TLOU, God of War, Uncharted, A Plague Tale: Innocence/Requiem etc. You don't need RT if you have the time/money and develop the game properly.
But at the same time, you can't ask for a truckload of VRAM without delivering a next-gen result.
 

jarpe

New Member
Joined
Feb 12, 2023
Messages
4 (0.01/day)
1. I quoted UK prices at Scan. The 7900 XTX is £1000, the 4080 is £1300. That's a 30% price increase for roughly 2% less raster performance. Even in RT, it's only 16% faster. You must be crazy to think it's a good deal.
2. The difference between DLSS 2.0 and FSR 2.0 is placebo at best. Personally, I prefer native resolution to either of the two. I don't care about magic frames, either. The only valid argument is ray tracing, but even that isn't worth 30% extra money when it's only 16% faster.

Even for 1200 USD, the 4080 offers terrible value.
The average price difference in the UK between the 7900 XTX and the 4080 is 10%, not 30%; at OCUK, for example, the cheapest 7900 XTX costs £1050 and the cheapest 4080 £1100, a 5% price difference.

Also, in games that heavily use RT (where the RT actually makes a difference) the 4080 is 25-45% faster.
DLSS 3 is far superior to FSR.
In addition, the 4080 has lower power consumption.

In Cyberpunk, for example, with RT Ultra at 4K (DLSS + FG) the 4080 gets 110 fps and the 7900 XTX just 20 fps at the same graphics settings; that is 5.5 times faster, and that is the real-world difference between the two. As you can see, the 4080 offers so much more for only 10% more money; this gen, AMD managed to be greedier than Nvidia.
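Since this exchange is mostly about percentages, here is the arithmetic behind the figures quoted above as a quick sketch (the prices and frame rates are the ones cited in this thread, not independent measurements, and the 5.5x figure compares the 4080 with DLSS + Frame Generation enabled against the 7900 XTX without it):

Code:
# Cheapest OCUK listings quoted above, in GBP
xtx_price, rtx4080_price = 1050, 1100
premium = (rtx4080_price / xtx_price - 1) * 100
print(f"4080 price premium over 7900 XTX: {premium:.1f}%")   # ~4.8%

# Cyberpunk RT Ultra 4K figure quoted above (DLSS + FG on the 4080 only)
fps_4080_fg, fps_xtx = 110, 20
print(f"frame-rate ratio: {fps_4080_fg / fps_xtx:.1f}x")     # 5.5x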
 
Joined
Jan 14, 2019
Messages
10,032 (5.15/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
DLSS 3 is far superior to FSR.
That is an opinion, not fact. Besides, FSR 3.0 is just around the corner; I'd wait for it before passing judgement.

In Cyberpunk, for example, with RT Ultra at 4K (DLSS + FG) the 4080 gets 110 fps and the 7900 XTX just 20 fps at the same graphics settings; that is 5.5 times faster, and that is the real-world difference between the two. As you can see, the 4080 offers so much more for only 10% more money; this gen, AMD managed to be greedier than Nvidia.
Whether you like FG or not depends on how sensitive you are to input latency. The technology is not without issues at the moment, especially when you generate extra frames from a low frame rate situation.

Personally, I consider FG frame rates irrelevant for comparison.

Also, in games that heavily use RT (where the RT actually makes a difference) the 4080 is 25-45% faster.
Not really.


The average price difference in the UK between the 7900 XTX and the 4080 is 10%, not 30%; at OCUK, for example, the cheapest 7900 XTX costs £1050 and the cheapest 4080 £1100, a 5% price difference.
Fair enough - I mostly shop at Scan, that's why I was comparing prices from there. A 5% difference might actually be worth it.
 
Joined
Mar 19, 2023
Messages
153 (0.36/day)
Location
Hyrule Castle, France
Processor Ryzen 5600x
Memory Crucial Ballistix
Video Card(s) RX 7900 XT
Storage SN850x
Display(s) Gigabyte M32U - LG UltraGear+ 4K 28"
Case Fractal Design Meshify C Mini
Power Supply Corsair RM650x (2021)
In 3 months, 4 console games have been released (all of them AMD-sponsored, as someone on the forum would say, but I think it's a coincidence); all of them were broken at launch, all of them got or will get a patch to make them run properly, and none of them justify the VRAM or any other requirement for the graphics they deliver.
"It's the devs' fault", or the alternative "AMD sponsored them", taken by some as the "AMD VRAM conspiracy"
While at the same time we had games like A Plague Tale: Requiem and CP77 (even the Overdrive mode) that require less than half the VRAM of the aforementioned games and look light years better.
"I've had better"
When Crysis released, no one could play it and no one had a problem with that,
because we all knew it was a game where the devs just put stuff into a graphics engine that couldn't handle it, just to show what they could achieve. Even today, Crysis or ARMA 3 etc. cannot be played at high fps because of their crap engines.
"10 years ago, there was one game that was also beeeg and this is just more of the same, even though it's basically 3 titles out of 4"
Nvidia doesn't put much VRAM on its cards because they have a ton of uses apart from gaming.
AMD's VRAM is useless to nearly any professional software out there, so it's a cost-free advantage and a line for the presentations and ads.
"the corporation known for highly reliable compute doesn't put a lot of VRAM because their cards are only for gaming, while the corporation that doesn't have good compute support shoves a ton of VRAM for kicks"

Right, it's the same things I've been reading to the point of complete distaste, so I'm just going to put it in a nice image:
[attached: a rough graph of the VRAM-requirement trend over time]

Since the denialists are in full force right now, and logic isn't very useful against them, I'm just going to ignore these pointless ramblings and let time run its course.
Within about a year or so, I expect that all of them will have repeated "bad PC port" or "another AMD-sponsored title" until their tongues fall off.

And then they'll do what we've been advocating all this time: buy a 16 GB VRAM card and end this charade.

Oh, and BTW, I've been told that I'm being extremely optimistic about 16 GB being enough in 3-4 years' time. It's more of an "at least 16 GB" situation, according to some.
(Also, no need to tell me about the graph's inaccuracy; I didn't make it accurate because it's not about the numbers but the trend, and the trend is very, very clear.)
 
Joined
Nov 3, 2011
Messages
690 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H115i Elite Capellix XT
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K IPS FreeSync/GSync DP, LG 27UL600 27in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
I wouldn't say the CPUs have become strong enough. Yes, their MT performance is amazing, but they are miles behind in supporting high-end GPUs like the 4080/4090/7900 XT/XTX at ''low'' resolutions.

The GPUs increase their performance drastically every generation while the CPUs do not.

In 3 months, 4 console games have been released (all of them AMD-sponsored, as someone on the forum would say, but I think it's a coincidence); all of them were broken at launch, all of them got or will get a patch to make them run properly, and none of them justify the VRAM or any other requirement for the graphics they deliver.
While at the same time we had games like A Plague Tale: Requiem and CP77 (even the Overdrive mode) that require less than half the VRAM of the aforementioned games and look light years better.

When Crysis released, no one could play it and no one had a problem with that,
because we all knew it was a game where the devs just put stuff into a graphics engine that couldn't handle it, just to show what they could achieve. Even today, Crysis or ARMA 3 etc. cannot be played at high fps because of their crap engines.

Nvidia doesn't put much VRAM on its cards because they have a ton of uses apart from gaming.
AMD's VRAM is useless to nearly any professional software out there, so it's a cost-free advantage and a line for the presentations and ads.
VRAM is important in pro apps, i.e. having enough of it avoids VRAM-related errors.



https://www.pugetsystems.com/recomm...-Premiere-Pro-CC-143/Hardware-Recommendations

The RTX 4080's 16 GB of VRAM is sufficient, like the professional RTX A4000's 16 GB.

An "8 GB VRAM" card is just a toy video card.

Your "AMD's VRAM is useless to nearly any professional software out there" argument is flawed, since professional video cards like the GA104-based RTX A4000 have 16 GB of VRAM.

VRAM affects the game's primary artwork quality. In the RTX 4060 Ti / 4070 / 4070 Ti price range, I wouldn't be looking at NVIDIA. I can afford the RTX 4080's 16 GB of VRAM.

The average price difference in the UK between the 7900 XTX and the 4080 is 10%, not 30%; at OCUK, for example, the cheapest 7900 XTX costs £1050 and the cheapest 4080 £1100, a 5% price difference.

Also, in games that heavily use RT (where the RT actually makes a difference) the 4080 is 25-45% faster.
DLSS 3 is far superior to FSR.
In addition, the 4080 has lower power consumption.

In Cyberpunk, for example, with RT Ultra at 4K (DLSS + FG) the 4080 gets 110 fps and the 7900 XTX just 20 fps at the same graphics settings; that is 5.5 times faster, and that is the real-world difference between the two. As you can see, the 4080 offers so much more for only 10% more money; this gen, AMD managed to be greedier than Nvidia.
Lesser NVIDIA SKUs with "8 GB VRAM" didn't deliver the minimum PS5 experience, e.g. in TLOU Part I. TLOU Part I and Resident Evil 4 Remake are not the only games that have killed 8 GB of VRAM.

The RTX 4080's 16 GB of VRAM matches the minimum for professional cards like the RTX A4000 with 16 GB. The RTX 4080's AD103 chip is reused for mobile RTX 4090 SKUs.

The RTX 4080's 16 GB of VRAM debunks pro-NVIDIA shills' "8 GB VRAM is enough" arguments. VRAM affects the game's primary artwork quality.

Very bold of them to make another VRAM-related blog post after the whole fiasco with the 6500 XT launch, where they removed their blog post stating 4 GB VRAM GPUs are bad.
The PS5 / XSX console GPU equivalent on PC is at least an RX 6700 XT 12 GB. The 6500 XT is below PS5 specs.

Technically it is possible when higher-density memory chips become available; on the last gen, there was a guy offering to make "custom" 3080 20 GB cards (desoldering the 1 GB chips and soldering 2 GB ones). This doesn't require any modification to the die; the bus width is still the same. The probability that we see this from Nvidia is very, very small, though.

NVIDIA didn't combine the best aspects of the RTX 3070 Ti's 6144 CUDA cores with the RTX 3060's 12 GB of VRAM, since it would be close to the GA104-based RTX A4000 with 16 GB.

The RTX 3060's 12 GB of VRAM uses 2 GB-density memory chips.

There are 16 GB VRAM mods for the RTX 3070, and they work.
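For context on why these capacities come in the steps they do, here is a rough sketch of the relationship between bus width, memory chip density, and total VRAM (each GDDR6/GDDR6X chip occupies a 32-bit slice of the bus, so the chip count is fixed by the bus width; clamshell configurations that double up chips per channel are ignored here):

Code:
def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    """Total VRAM = number of 32-bit memory channels x capacity per chip."""
    chips = bus_width_bits // 32
    return chips * chip_density_gb

print(vram_gb(256, 1))   # 8  GB: RTX 3070-class card with 1 GB chips
print(vram_gb(256, 2))   # 16 GB: same 256-bit bus with 2 GB chips (A4000 / 16 GB 3070 mod)
print(vram_gb(192, 2))   # 12 GB: 192-bit bus with 2 GB chips (RTX 3060)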
 
Joined
Sep 26, 2022
Messages
216 (0.36/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
There are 16 GB VRAM mods for the RTX 3070, and they work.
Remember Point of View? IIRC they did something similar in the days of the GTX 570, launching a 2560 MB version; it was especially useful for SLI setups.
 

bug

Joined
May 22, 2015
Messages
13,265 (4.04/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
"It's the devs fault" or the alternative "AMD sponsored them" taken by some as the "AMD VRAM conspiracy"

"I've had better"

"10 years ago, there was one game that was also beeeg and this is just more of the same, even though it's basically 3 titles out of 4"

"the corporation known for highly reliable compute doesn't put a lot of VRAM because their cards are only for gaming, while the corporation that doesn't have good compute support shoves a ton of VRAM for kicks"

Right, it's the same things I've been reading to the point of complete distaste, so I'm just going to put it in a nice image:
View attachment 291651
Since the denialists are in full force right now, and that logic isn't very useful against them, I'm just going to ignore these pointless ramblings and let time run its course.
Within about a year or so, I expect that all of them will have repeated "bad pc port" or "another AMD sponsored title" until their tongues fall off.

And then they'll do as we've been advocating for all this time, buy a 16Go VRAM card and end this charade.

Oh and BTW, I've been told that I'm extremely optimistic about the 16Go by 3-4 years time. It's more of a "at least 16Go" situation according to some.
(also no need to tell me about the graph's inaccuracy, I didn't make it accurate because it's not about the numbers but the trend, and the trend is very, very clear)
Ah, you can draw a graph. I didn't know that; that obviously changes everything.

The issue is really simple: it doesn't matter how much VRAM a video card has, anyone can come up with a texture that won't fit. Hardware limits will always be surpassed by software, simply because software changes faster. It is perfectly reasonable for me to require 4, 6 or 8 GB of VRAM, while you won't accept anything under 16 GB. It doesn't make either of us wrong (unless we buy those cards to play Solitaire, maybe).
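To put a rough number on how quickly texture data alone can exceed any fixed budget, here is a back-of-the-envelope sketch (sizes are for a single 4096x4096 RGBA texture; the 4/3 factor approximates a full mip chain, and BC7 block compression stores 1 byte per texel):

Code:
def texture_mib(width: int, height: int, bytes_per_texel: float, mipmaps: bool = True) -> float:
    base = width * height * bytes_per_texel
    return base * (4 / 3 if mipmaps else 1) / 2**20   # full mip chain adds ~1/3 on top

uncompressed = texture_mib(4096, 4096, 4)   # RGBA8: ~85 MiB with mips
bc7 = texture_mib(4096, 4096, 1)            # BC7: ~21 MiB with mips
print(f"{uncompressed:.0f} MiB uncompressed vs {bc7:.0f} MiB BC7-compressed")
print(f"8 GiB of VRAM holds roughly {8 * 1024 / uncompressed:.0f} uncompressed 4K textures")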
 
Joined
Mar 19, 2023
Messages
153 (0.36/day)
Location
Hyrule Castle, France
Processor Ryzen 5600x
Memory Crucial Ballistix
Video Card(s) RX 7900 XT
Storage SN850x
Display(s) Gigabyte M32U - LG UltraGear+ 4K 28"
Case Fractal Design Meshify C Mini
Power Supply Corsair RM650x (2021)
The issue is really simple: it doesn't matter how much VRAM a video card has
By all means, do talk. I am merely waiting for your tongue to fall off, as I said. I expect that'll take a year or so?
No need to test your sarcasm on me in the meantime, I'm afraid the time for sarcasm is past and now it's only about waiting for Irony to come and establish its kingdom yet again.
 
Joined
Nov 3, 2011
Messages
690 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H115i Elite Capellix XT
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K IPS FreeSync/GSync DP, LG 27UL600 27in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
Remember Point of View? IIRC they did something similar in the days of the GTX 570, launching a 2560 MB version; it was especially useful for SLI setups.
GA104 can support 16 GB of VRAM, as seen in the RTX A4000 SKU and the RTX 3070 Ti 16 GB prototype.

[photo of the RTX 3070 Ti 16 GB prototype board]


Source: MEGAsizeGPU
 

pmerritt

New Member
Joined
Jan 13, 2023
Messages
8 (0.02/day)
Given the 3070 seems to struggle at 720p in some games, I don't think turning textures down one notch really fixes the issue.

Also, your dichotomy is false. Frequently it is two GPUs with similar performance, like the 3070 vs 6800 or the 3060 Ti vs 6700 XT, where the latter, while performing about the same, has more VRAM. Those who chose the 3060 Ti or 3070 over the similarly priced AMD cards for the ray tracing are suffering in the latest games at settings the 6800 or 6700 XT can play perfectly fine.
What games are you playing where a 3070 has trouble at 720p? I have a 1070 Ti, and the ONLY game I turned down that low was Hogwarts, and that was only to get higher framerates in a couple of areas, because everywhere else it was about 90 fps... and people say there is nothing wrong with its optimization. There could be something wrong with your card.
 
Joined
Sep 26, 2022
Messages
216 (0.36/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
GA104 can support 16 GB of VRAM, as seen in the RTX A4000 SKU and the RTX 3070 Ti 16 GB prototype.
I understand that. And the GTX 570 could also support (and fully access/use) 2560 MB of VRAM; the point is, even though such a variant was never "announced" by Nvidia, some AIB stepped up and filled that gap.
This time, nobody "had the balls" to do it.

 
Joined
Nov 3, 2011
Messages
690 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H115i Elite Capellix XT
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K IPS FreeSync/GSync DP, LG 27UL600 27in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
I understand that. And the GTX 570 could also support (and fully access/use) 2560 MB of VRAM; the point is, even though such a variant was never "announced" by Nvidia, some AIB stepped up and filled that gap.
This time, nobody "had the balls" to do it.

FYI, the GF110 chip design can support up to 6 GB of VRAM, as on the server Tesla X2090 or the workstation Quadro 7000: https://www.techpowerup.com/gpu-specs/tesla-x2090.c1887 or https://www.techpowerup.com/gpu-specs/quadro-7000.c1840

The GTX 570 SKU uses a defect-recovered GF110 chip.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
16,374 (4.68/day)
Location
Kepler-186f
Processor Ryzen 7800X3D -30 uv
Motherboard AsRock Steel Legend B650
Cooling MSI C360 AIO
Memory 32gb 6000 CL 30-36-36-76
Video Card(s) MERC310 7900 XT -50 uv
Display(s) NZXT Canvas IPS 1440p 165hz 27"
Case NZXT H710 (Red/Black)
Audio Device(s) HD58X, Asgard 2, Modi 3
Power Supply Corsair RM850W

TechSpot just did a new review of VRAM, posted about 50 minutes ago.

AMD is such good value. Man, I would never buy an 8 GB VRAM card in today's gaming world. Wild that Nvidia still does this.
 