> Dislike the 32gb vram though, 24 would be enough.

You're complaining about having more RAM? WTH?!?
System Name | Mean machine |
---|---|
Processor | AMD 6900HS |
Memory | 2x16 GB 4800C40 |
Video Card(s) | AMD Radeon 6700S |
> You're complaining about having more RAM? WTH?!?

Costs more, sucks down power, it's not needed for a 5080 tier GPU. 16 is meh, 24 is good.
> Costs more, sucks down power, it's not needed for a 5080 tier GPU. 16 is meh, 24 is good.

That would be a fair point IF VRAM used a lot of power. It doesn't. VRAM accounts for less than 7% of a card's total power profile. It's just not a problem. 32GB is solid.
We'll have to see how expectations match reality when The Witcher 4 releases.
"Leading analyst firm Ampere says more Switch 2 consoles will be sold this year than the PC gaming handheld landscape has sold in its entirety."
It would be interesting if these "lower tier" cards at least had the hardware to be used as testbeds (maybe showcase is the right word) for things like Neural texture compression etc so they could punch above their weight, like the Switch 2 supposedly does with its custom SOC.
> My issue with VRAM of this magnitude is that it'll get scalped by the people who want to run LLMs, because 32 GB is enough to run bare bones versions.

That's not as good a point as it appears. There are several that can run on much less. Besides, the public is getting wise to all of that nonsense and are just not paying it.
> What I'm trying to convey is that if they offered a 24 GB version it'd limit the option for people to run bootleg LLMs.

It would do nothing, because even a 16GB card can run LLMs. Making a 32GB card into a 24GB card to "attempt" to avoid scalping makes as much sense as a screen door on a submarine.
> The benefit to this would be a decrease in street cost, because they wouldn't be bought up by people trying to run LLMs...which would keep the street price lower (and thus be more affordable to people whose intention is gaming).

No, it would not.
> What I'm reading from you is that the public is too smart to pay too much...?

Context is important. I said:

> Besides, the public is getting wise to all of that nonsense and are just not paying it.

...which is what I'm seeing happening on eBay.
> What am I missing here?

You think that making a 32GB card into a 24GB card is going to make any difference at all. It will not. It's an interesting thought, but it's not grounded in reality.
Processor | 5950x |
---|---|
Motherboard | B550 ProArt |
Cooling | Fuma 2 |
Memory | 4x32GB 3200MHz Corsair LPX |
Video Card(s) | 2x RTX 3090 |
Display(s) | LG 42" C2 4k OLED |
Power Supply | XPG Core Reactor 850W |
Software | I use Arch btw |
> That would be a fair point IF VRAM used a lot of power. It doesn't. VRAM accounts for less than 7% of a card's total power profile. It's just not a problem. 32GB is solid.

It's not that insignificant.
> It's not that insignificant.

While that's technically true, it's not significant enough to be a concern. Bring on the 32GB!
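As a rough sanity check on that "less than 7%" figure, here is a back-of-envelope sketch; the module count, per-module wattage and board power below are assumed round numbers for illustration, not measurements of any real card.

```python
# Back-of-envelope: what share of board power does the GDDR memory draw?
# All figures are assumptions for illustration (per-module draw varies by speed and voltage).
modules = 16              # e.g. 16 x 2 GB = 32 GB
watts_per_module = 2.5    # assumed GDDR6/GDDR7 draw per module under load
board_power = 450         # assumed total board power limit in watts

vram_power = modules * watts_per_module
share = vram_power / board_power
print(f"VRAM draws ~{vram_power:.0f} W of a {board_power} W board ({share:.1%})")
# With these assumptions: ~40 W, roughly 9%, the same ballpark as the 7% claim.
# Doubling capacity with denser modules adds far less power than doubling the module count.
```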
Laser focus. You are justifying that people are unwilling to pay because of eBay listings. Let me get that absolutely straight.
Most people buying something to use in an AI farm or LLM do not buy from eBay. Most people buy them new, and consider the cards useless after use given that they are not treated gingerly. This is like using the price of a cheeseburger to figure out what the price of wholesale beef stock is. You're looking at something 5 rungs into a completely different value chain.
What I'm looking at is that today, people running AI farms buy whatever GPUs will let them perform what they need. If I were looking to run an AI farm or LLM, I'd spend the $1000+300 (MSRP + street tax) for a theoretical 9080xt before I plunk down $10000 on some AI accelerator from Nvidia, every day of the week. That business-use new purchase would absolutely be reasonable, and it would justify the +$300 street tax, thereby denying somebody who wanted a $1000 card for its performance from purchasing it. Now, the prices are theoretical, but the truth of this is that it doesn't influence the eBay market, because as yet eBay sells no new product from AMD or Nvidia. It sells used stuff, and I can try to sell my aging 3060ti for $350. That doesn't influence the cost of the new 9060xt on the store shelf at an MSRP of $300.
Taken more enthusiastically, the price of tea in China rarely equates to the price of cattle in Montana. If your thread of logic is that both of them are farmed, then I respect your ability to pull a connection, but I cannot fathom why you think one is tied to the other...when the cost of fertilizer is present and actually directly influences the cost of tea.
You are trying to create a solution for a problem without meritful and credible information to support it. You're effectively saying: "We should reduce the VRAM because it might discourage scalpers." This is not a perspective that card engineers should ever consider. It's a marketing and distribution problem, NOT a card design and engineering problem. How well it would work is dubious in the extreme. Highly unlikely would be another way of putting it.
You show me a design engineer who has final say at Nvidia or AMD, and I'll show you a person who doesn't exist.
I'll also show you precedent. You seem to think it's OK for you not to support your ideas, but everybody else must... When Nvidia had all of their cards being gobbled up by miners, they released the LHR version of each card: a version specifically designed to counter mining, which by your admission should never happen. That said, reality seems to argue quite substantially with your magic version of events. It's great to live in a world with your rules.
Unfortunately, I live in the real one. A world where driving at 60 mph on a road with a maximum of 50 is a thing, and where police will actually ignore those drivers to catch the people going 70 on the same road. A place where the only way to stop things like scalping and speeding is to make them impossible...and that would be via not having enough VRAM to run the LLM, or having a retarder on the engine that limits speed so that no matter how you smash the gas pedal you cannot move faster than the limit.
I do enjoy the hypocrisy of claiming eBay is your source, then arguing about card designers, and finally coming (accidentally) to the fact that the people who control this position themselves in the market to get the greatest profit (i.e., marketing drives sales). So we are clear, that's good functioning capitalism (and before you guess, no, that's not a problem). I don't admire that you somehow think engineering controls anything. Let me introduce you to the real world. In it, price points are set before design begins, designs target market segments, and things that can do a job much cheaper than specialized, expensive hardware will absolutely be used for it. In this case a high-performance GPU is about 1/10th to 1/5th the cost of an AI hardware accelerator, and if it's capable of 25% of the performance at 25% of the price, it's suddenly within people's means to justify projects that otherwise could not be, given the large input cost.
You seem to want to pretend I'm imagining solutions for immaterial problems. I counter that you are being inaccurate, and the three letters that should make you reconsider are LHR. If you need more than that, my response is "China, China, China! China number one!" Tom's Hardware on smuggled H200 accelerators in China
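To put rough numbers on the earlier "25% of the performance at 25% of the price" point: if the cheaper card matches or beats the big accelerator on performance per dollar, it becomes the rational buy for anyone who can split a workload across several cards. The prices and throughput figures below are illustrative placeholders, not benchmarks.

```python
# Hypothetical performance-per-dollar comparison (all numbers are placeholders).
accelerator_price, accelerator_perf = 10_000, 100.0   # dedicated AI accelerator
gpu_price, gpu_perf = 1_300, 25.0                     # consumer GPU incl. street tax

for name, price, perf in [("accelerator", accelerator_price, accelerator_perf),
                          ("consumer GPU", gpu_price, gpu_perf)]:
    print(f"{name}: {perf / price * 1000:.1f} perf units per $1000")
# With these placeholders the GPU delivers ~19 units per $1000 vs ~10 for the accelerator,
# which is exactly why LLM buyers end up competing with gamers for the same cards.
```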
System Name | Blytzen |
---|---|
Processor | Ryzen 7 7800X3D |
Motherboard | ASRock B650E Taichi Lite |
Cooling | Deepcool LS520 (240mm) |
Memory | G.Skill Trident Z5 Neo RGB 64 GB (2 x 32 GB) DDR5-6000 CL30 |
Video Card(s) | Powercolor 6800XT Red Dragon (16 gig) |
Storage | 2TB Crucial P5 Plus SSD, 80TB spinning rust in a NAS |
Display(s) | MSI MPG321URX QD-OLED (32", 4k, 240hz), Samsung 32" 4k |
Case | Coolermaster HAF 500 |
Audio Device(s) | Logitech G733 and a Z5500 running in a 2.1 config (I yeeted the mid and 2 satellites) |
Power Supply | Corsair HX850 |
Mouse | Logitech G502X lightspeed |
Keyboard | Logitech G915 TKL tactile |
Benchmark Scores | Squats and calf raises |
You know, not having a source makes whatever you say pretty sketchy. It's telling that a simple Google search sees half a dozen articles pulling the exact same crap, almost like you're parroting talking points without really considering what they mean.
Let me offer a link with information beyond the click-bait title. Ampere Analytics information
If we take a second to verify this, the quotes include:

> .....
> - The console segment did not show growth in 2024. It is expected to stagnate until 2026 (when GTA VI is anticipated to launch). The release of Nintendo Switch 2 and increased game sales for it are not expected to significantly boost the revenue of the segment.
> - Ampere Analysis believes that by 2030, 103.1 million Nintendo Switch 2 consoles will be sold worldwide. Between late 2026 and 2028-2029, the active audience for both Nintendo Switch consoles will be around 130 million.
Do you want to interpret this, or should I? What I'm reading is that the Switch 2 will not sell as well as the Switch. I'm reading that if the Switch 2 sells to their figures, at least (130-103.1)/130, or roughly 20.7%, of the Nintendo market (likely much more, given that this includes people who buy more than one console or a replacement) will still be on the Switch. That's almost 5 years after launch. If we're to match that up with the PC market, this is like having 20% of Nvidia users still on a 10-series video card today...and it's worse when you consider that today we're having the 8 GB of VRAM debate where the 10 series had the 3 GB vs 6 GB debate (on the 1060). Imagine for just a moment somebody slamming their way into the conversation and telling you that having more than 6 GB of VRAM is silly, because today Nvidia is selling a model with 3 GB and it's only barely hitting its limits (and whatever AMD equivalent you'd like to make up as well).
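As a quick check on that share arithmetic (the 130 million and 103.1 million figures come from the Ampere projection quoted above; nothing else is added):

```python
# Share of the projected combined Switch audience still on the original Switch.
active_audience = 130.0   # millions, both Switch generations (Ampere projection)
switch2_sold = 103.1      # millions, Switch 2 units projected by 2030

legacy_share = (active_audience - switch2_sold) / active_audience
print(f"At least {legacy_share:.1%} of the active audience would still be on the original Switch")
# ~20.7% as a floor, and higher in practice, since some of those 103.1 million units
# are second consoles or replacements rather than unique users.
```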
Consoles were excellent when a 1060 was $400, and a 30 series card was unobtanium. The thing is, today, a 10 series card is a joke. Even the venerable 1080 is a bit of a joke. Why in Hades would you pretend the Switch 2 has anything to do with my disappointment in a $300 MSRP 9060 at 8 GB...unless you're somehow trying to transition this discussion into a "well, low spec consoles sell" discussion. One that isn't part of this thread.
Forget cost. Forget power.
My issue with VRAM of this magnitude is that it'll get scalped by the people who want to run LLMs, because 32 GB is enough to run bare-bones versions. It's the same as the mining issue, where GPUs that were (presumably inadvertently) designed to be good miners were unobtainable for gaming. That's my selfish reason, not the power draw or similar. I just want something that satisfies us gamers without it being bought out by people who want to use it for other purposes.
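For a rough sense of why 32 GB (and even 16 GB) cards attract local-LLM buyers, a weight-memory estimate is enough; the parameter counts, quantization widths and overhead factor below are generic assumptions, not figures for any specific model.

```python
# Rough VRAM needed just for model weights: params * bytes per param, plus some overhead
# for the KV cache and runtime buffers (the 1.2 factor is an assumption).
def weight_vram_gb(params_billion: float, bits_per_param: int, overhead: float = 1.2) -> float:
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total * overhead / 1e9

for params in (7, 13, 32, 70):
    for bits in (16, 4):
        print(f"{params}B @ {bits}-bit: ~{weight_vram_gb(params, bits):.1f} GB")
# With these assumptions a 4-bit 13B model fits easily in 16 GB, while 4-bit 30B-class
# models need roughly 20 GB before long contexts, which is why 24 GB and 32 GB cards
# are the ones that get snapped up.
```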
System Name | Tiny the White Yeti |
---|---|
Processor | 7800X3D |
Motherboard | MSI MAG Mortar b650m wifi |
Cooling | CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3 |
Memory | 32GB Corsair Vengeance 30CL6000 |
Video Card(s) | ASRock RX7900XT Phantom Gaming |
Storage | Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB |
Display(s) | Gigabyte G34QWC (3440x1440) |
Case | Lian Li A3 mATX White |
Audio Device(s) | Harman Kardon AVR137 + 2.1 |
Power Supply | EVGA Supernova G2 750W |
Mouse | Steelseries Aerox 5 |
Keyboard | Lenovo Thinkpad Trackpoint II |
VR HMD | HD 420 - Green Edition ;) |
Software | W11 IoT Enterprise LTSC |
Benchmark Scores | Over 9000 |
> I'll play devil's advocate. Assuming you agree with the below:
> 5090 32gb = Good
> 5080 16gb = not terrible but should have 24
> 5070ti 16gb = okay
> 5070 12gb = not terrible but 16 would have been preferable
> Now we are left with the 5060 and the 5060ti. Assuming you agree that 16 GB on the 5070ti is fine, I don't see why a 5060 shouldn't have 8, frankly. It costs less than half of the 5070ti and the 9070xt.

The 5060 is practically a 4060, which makes it effectively a 5050; so sure, but then the 5060ti should certainly have more in every rendition of it, and not just be a power-starved 16GB upgrade of what is another 8GB card at its core.
> That would be a fair point IF VRAM used a lot of power. It doesn't. VRAM accounts for less than 7% of a card's total power profile. It's just not a problem. 32GB is solid.

It's a matter of balance: a 5080-level card really doesn't need more than 24. It goes both ways; I don't see a life in any card with current-gen raster performance using 8GB, just as I see little life in a high-end card (for gaming, not semi-pro like the x90) using 32.
System Name | AM5_TimeKiller |
---|---|
Processor | AMD Ryzen 7 9800X3D |
Motherboard | ASUS ROG Strix B650E-F Gaming |
Cooling | Arctic Freezer II 420 rev.7 (with 6 fans in push-pull setup) |
Memory | G.Skill Trident Z5 Neo RGB, 2x16 GB DDR5, Hynix A-Die, 6400 MHz @ CL30-39-39-102-141 1T @ 1.40 V |
Video Card(s) | ASUS TUF Radeon RX 9070 XT GAMING |
Storage | Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB |
Case | Corsair 7000D Airflow |
Audio Device(s) | Creative Sound Blaster X-Fi Titanium |
Power Supply | Seasonic Prime TX-850 |
Mouse | Logitech wireless mouse for 15€, 6y old |
Keyboard | Logitech wireless keyboard, 12y old |
> I'll play devil's advocate. Assuming you agree with the below:
> 5090 32gb = Good
> 5080 16gb = not terrible but should have 24
> 5070ti 16gb = okay
> 5070 12gb = not terrible but 16 would have been preferable
> Now we are left with the 5060 and the 5060ti. Assuming you agree that 16 GB on the 5070ti is fine, I don't see why a 5060 shouldn't have 8, frankly. It costs less than half of the 5070ti and the 9070xt.

5090 32gb = Good
> 5050 = makes no sense

This is an opinion that not everyone shares. There is a need for single-slot, low-profile GPUs, and a 5050/5050ti would do nicely there, so it makes plenty of sense.
> It's a matter of balance

True.
> a 5080-level card really doesn't need more than 24.

Mildly disagree. I can see uses where more than 16GB would be needed for optimal performance.
> It goes both ways; I don't see a life in any card with current-gen raster performance using 8GB, just as I see little life in a high-end card (for gaming, not semi-pro like the x90) using 32.

True, it does go both ways and in both cases.
> While there's zero chance AMD makes a 9080

Why is it zero? Please explain.
> there's also nothing wrong with more vram

True! And that's why I said bring it on!
> a theoretical 9080 would benefit more from a 24gb/384-bit config.

That is highly debatable.
> Current market on top of tariff nonsense won't support a healthy market for the card that would likely see a street price of ~1200 usd.

I'm not debating any of that nonsense. I am debating only the viability of a 9080XT 32GB card, and my strong assertion is that AMD should do it. A 9080 24GB and a 9080XT 32GB? That would be excellent!
Processor | Intel |
---|---|
Motherboard | MSI |
Cooling | Cooler Master |
Memory | Corsair |
Video Card(s) | Nvidia |
Storage | Western Digital/Kingston |
Display(s) | Samsung |
Case | Thermaltake |
Audio Device(s) | On Board |
Power Supply | Seasonic |
Mouse | Glorious |
Keyboard | UniKey |
Software | Windows 10 x64 |
Some really do need to work on their bias and stop trashing AMD threads with toxic brand favoritism.
I understand that Nvidia was trying to avoid a second cryptocurrency mining boom-and-bust cycle after getting burned by the first one, but the proper response to cryptocurrency mining is not an engineering response like LHR. The proper way to deal with cryptocurrency mining is to prosecute the miners as pyramid-scheme fraudsters, because when you boil down what cryptocurrencies are, they are pyramid schemes that will collapse once each scheme's last token (like a bitcoin) is mined: mining the last token will set off a sell-off panic, since there are no tokens left to mine and miners can only collect transaction commissions. This frustrated me when I wanted to buy a GPU for GPGPU work other than cryptocurrency mining, like Folding@home or BOINC. I would like to use my GPU to help solve real-world math and science problems when I am not playing games or using the GPU for GPGPU productivity work. Such GPGPU work could trigger false positives in the LHR algorithm and slow that productivity work down. To get a non-LHR GPU and avoid the false-positive hash detections, I had to buy the highest-end Nvidia GPU when a non-LHR-crippled, then-midrange GPU would have served my needs. AMD was not an option back then because its drivers were buggy trash that repeatedly crashed in Microsoft Outlook, of all apps. AMD has since rewritten the drivers, and they are now good.
AI could be a great boon for productivity. However, it needs to be done ethically, to stop unethical practices such as the art theft and copyright infringement that DeviantArt's DreamUp, Midjourney, Stability AI's Stable Diffusion, Runway AI, OpenAI, and others commit and which should be prosecuted. Why do we have to rely on the cloud to do ethical AI when it can be done locally on a GPU or NPU, with better privacy and security protections than the cloud can offer? Ethical AI could help disabled people enjoy their lives without needing caretakers 24/7 to serve all of their needs, so that caretakers can focus on tasks that require a human to perform.
Also, having more VRAM can save energy and lower wear and tear on the GPU when dealing with large GPGPU workloads. When you have a big task like video editing or image editing and the VRAM is filled up, the GPU has to either waste energy by staying in the full power state to process the data that is swapped in over the PCIe bus, or drop to low-power modes while waiting for the data to be swapped, which also wears down the GPU through thermal expansion and contraction.
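The swap penalty described above is visible from the bandwidth gap alone; the link and memory bandwidth figures below are ballpark assumptions for a PCIe 4.0 x16 slot and a typical mid-range card, used only for illustration.

```python
# How long does it take to touch a 4 GB working set that spilled out of VRAM?
spill_gb = 4.0
pcie4_x16_gbps = 32.0   # assumed practical ceiling for PCIe 4.0 x16, in GB/s
vram_gbps = 500.0       # assumed on-card memory bandwidth for a mid-range GPU, in GB/s

print(f"Over PCIe: {spill_gb / pcie4_x16_gbps * 1000:.0f} ms per pass")
print(f"In VRAM:   {spill_gb / vram_gbps * 1000:.0f} ms per pass")
# Roughly 125 ms vs 8 ms per pass over the data: the GPU either sits at full power
# waiting on the bus or bounces between power states, which is the wear and energy point above.
```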
I will give you that the excess VRAM can be an energy waster in casual gaming and idle scenarios where the VRAM is not being fully used. However, more VRAM can save resources by not requiring the user to replace the video card when it runs out of VRAM because the user got a newer high-resolution monitor, or because a well-optimized but detailed next-generation game requires a lot of VRAM.
As for why I find the rumor of a high-end RX 9000 series GPU very plausible: from what I have read or heard, it sounds like AMD was working on a high-end GPU, but the executives shut it down during development, thinking that Nvidia would hit yet another home run that everyone would buy no matter the Nvidia premium, due to existing vendor lock-in such as PhysX and CUDA. The executives probably had no idea that Nvidia would screw over its customers this hard and remove most of the features that ensured vendor lock-in to Nvidia, so there is a market for high-end GPUs sitting unfilled. AMD and Intel now have the vendor lock-in on 32-bit OpenCL workloads, since Nvidia abandoned that market with Blackwell.
EDIT: I forgot to include the copyright infringement that OpenAI and others have committed. Adding them.
I was thinking more about respective strengths. If you can give someone a taste of them at a fraction of the cost by leveraging fancy new technology, then I'm all for it (concessions aside). Just so long as that's not your starting point, hence the mention of The Witcher 4, given CDPR's issues with trying to get Cyberpunk working on certain hardware. RTX Mega Geometry raising the floor of older cards springs to mind (as does that Eurogamer article that asks "Where's the imagination in it all? Where's the Nintendo in it all, the toy maker?").
System Name | Mean machine |
---|---|
Processor | AMD 6900HS |
Memory | 2x16 GB 4800C40 |
Video Card(s) | AMD Radeon 6700S |
> 5090 32gb = Good
> 5080 should have had 24 GB, as this card could really leverage more than just 16 GB; now the 5080S will get more VRAM but no change in compute units
> 5070ti 16gb = okay, or could have had 20 GB; a 5070 Ti with 20 GB VRAM would be an amazing deal
> 5070 12gb = it's on the edge, should have had 16 GB, as it costs way too much for a 12 GB card
> 5060 Ti = 12 GB
> 5060 = 8 GB
> 5050 = makes no sense

So we are largely in agreement..?