
NVIDIA Announces New CMP Series Specifically Designed for Cryptocurrency Mining; Caps Mining Performance on RTX 3060

Raevenlord, there's a typo in the title on Facebook.
 
Didn't they do this before and fail because miners flashed their BIOS anyway, bypassing whatever trickery Nvidia put in place to limit mining?
 
Checks in the driver are so easy to circumvent; it won't take experienced people more than a day.
Soldering video outputs onto the card would take longer.
They brought out mining cards previously, such as the 1080-based ones with no video outputs or at best one, but apparently those ports were covered and didn't work with the specific BIOS installed on the card.
I say make all mining cards headless and cheaper so miners snap those up instead.
 
Can't say that I agree with gimping the mining performance of the RTX 3060, and someone will find a way around the blocks anyway.

Releasing mining-specific cards is a good idea in the current situation, but they will have to be priced very cheaply to be attractive to miners, since the cards have no resale value afterwards.

The best idea of all, though, would be to have enough capacity to simply make enough cards for everybody and forget about cut-down mining cards. Looks like real-world realities prevent this, though. :ohwell:
 
Just looked up Radeon VII prices on eBay -hugs my Radeon VII- :laugh:
It's insane right? I sold mine recently which paid for a lot of the cost of a 3090.
 
It feels to me like Nvidia is testing out this whole "let's split mining cards out into their own line, Quadro-style" idea with driver lockouts on these lower-end cards. Then they can release the next generation of cards with hardware-based lockouts and drivers that have been fully broken, re-locked down, broken, and re-locked again over several iterations.

That way, they can release a new product stack (perhaps the Super line?) and prevent this situation from ever happening again. I'm fine with that. Let GPU mining die already; it's been a pox upon the industry for way too long. Go build an ASIC and leave gamers in peace. And if you think it's illegal to differentiate your product stack arbitrarily, well, Intel's been doing it for years, and Nvidia has had Quadro and Tesla. It's not illegal in any way, and it's the way out of this GPU mining mess.

And I can't wait until they lock people out. I wish they could do it right this second (or even at the 3000-series launch), but unfortunately that genie's already out of the bottle. Hopefully they can get it fixed with the next product stack.
 
Huh? Your comment is genuinely incomprehensible to me. I don't understand what you're trying to say.
BTW, the 3070 100% does not need more than 8GB. I have one and played at 4K, and I rarely got above 7GB in new AAA titles. At 1440p it's absolutely enough.
With low-quality textures, I guess.
My 6800 16GB has used up to 15(!) GB playing SOTR at 4K ultra + RT ultra + MSAA. Many other games here use over 10GB at 4K with everything set to ultra.
Buying an expensive new card and having to dial down picture quality so it doesn't burst through the 8/10GB limit doesn't make sense to me.
 
It feels to me like Nvidia is testing out this whole "let's split mining cards out into their own line, Quadro-style" idea with driver lockouts on these lower-end cards. Then they can release the next generation of cards with hardware-based lockouts and drivers that have been fully broken, re-locked down, broken, and re-locked again over several iterations.

That way, they can release a new product stack (perhaps the Super line?) and prevent this situation from ever happening again. I'm fine with that. Let GPU mining die already; it's been a pox upon the industry for way too long. Go build an ASIC and leave gamers in peace. And if you think it's illegal to differentiate your product stack arbitrarily, well, Intel's been doing it for years, and Nvidia has had Quadro and Tesla. It's not illegal in any way, and it's the way out of this GPU mining mess.

And I can't wait until they lock people out. I wish they could do it right this second (or even at the 3000-series launch), but unfortunately that genie's already out of the bottle. Hopefully they can get it fixed with the next product stack.
The same hardware they use to mine gave the computational quest for a Rona cure (and others) the equivalent of the entire top 500 supercomputers combined, or more.
Now, I'm a bit annoyed that I'm still on an old GPU too, but calm down. Unless you game 16 hours a day, the butthurt isn't warranted; if you're just upset that you don't have a new GPU for a few hours a night, only to leave it sitting switched off the rest of the time, then chill. Some of us are trying to use our GPUs for more.
 
Yeah, cool, I bet there will be custom drivers to remove the limits within weeks, if not days. Also, they're not making extra GPUs, right? They're just carving another product line out of the silicon they've already ordered. I'm happy to be proven wrong on this, but methinks nothing will change drastically because of this move.
The only entity benefitting from this is Nvidia: Once the crypto bubble bursts, every mining card without a display output is useless to anyone else and so cannot be sold back into the used market.

This is a move that 100% benefits Nvidia and hurts consumers who, having been unable to buy gaming cards for six months, are then also deprived of the wave of cheap used cards that comes at the end of the bubble, because these mining-only models can't flow back into the market. Adding display outputs costs very little, but leaving them off definitively restricts what the card can be used for.
 
I'm here for the popcorn! :D

Can't wait to see the pricing and how long supply lasts, or whether this will be another paper launch.
 
They need a new silicon spin (ideally one with actually better performance per watt for crypto workloads) or this is dead on arrival.
 
With low-quality textures, I guess.
My 6800 16GB has used up to 15(!) GB playing SOTR at 4K ultra + RT ultra + MSAA. Many other games here use over 10GB at 4K with everything set to ultra.
Buying an expensive new card and having to dial down picture quality so it doesn't burst through the 8/10GB limit doesn't make sense to me.
VRAM allocation and VRAM usage aren't the same thing. I just switched from a 980 Ti to a 1080 Ti, and Afterburner shows more VRAM being used even in a game that ran perfectly fine on the 980 Ti and its 6GB.
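If you're curious what overlays like Afterburner are actually reporting, here's a rough sketch (assuming the pynvml Python bindings, purely illustrative) that reads the figures straight from the driver. Note these are allocation numbers, not how much a game actually touches each frame:
Code:
import pynvml

# Rough sketch (assumes the pynvml package is installed): dump device and
# per-process VRAM allocation. These are the numbers overlays report - memory
# reserved by each process, not the working set a game needs per frame.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device total: {mem.total / 1024**2:.0f} MiB, used: {mem.used / 1024**2:.0f} MiB")

# Graphics (D3D/OpenGL/Vulkan) clients; compute clients have a separate call.
# On Windows/WDDM the per-process figure may be unavailable (None).
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    alloc = proc.usedGpuMemory
    label = f"{alloc / 1024**2:.0f} MiB" if alloc is not None else "n/a"
    print(f"PID {proc.pid}: {label} allocated")

pynvml.nvmlShutdown()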
 
I wonder if this makes any sense, considering miners also make money by reselling GPUs after they have been used to death...
 
I wonder if this makes any sense, considering miners also make money by reselling GPUs after they have been used to death...
They've already done this for years?
 
Nvidia said:
limit the hash rate, or cryptocurrency mining efficiency, by around 50 percent.
With CMP, we can help miners build the most efficient data centers while preserving GeForce RTX GPUs for gamers.
A step in the right direction! Now I can finally get a card with 1GB more VRAM than my 1080 Ti close to MSRP; I'll consider price/perf (the 6700 XT launches on March 18th) before I do so, though. All that said, way to go Nvidia!

Raevenlord said:
I'll allow myself some emotion now: finally!
Indeed!
 
With low-quality textures, I guess.
My 6800 16GB has used up to 15(!) GB playing SOTR at 4K ultra + RT ultra + MSAA. Many other games here use over 10GB at 4K with everything set to ultra.
Buying an expensive new card and having to dial down picture quality so it doesn't burst through the 8/10GB limit doesn't make sense to me.
1. Allocated does not mean used. I doubt a 2+ year old game uses 15GB in reality when brand-new AAA open-world games barely use 8GB at 4K. Hell, when I play Black Ops 2, which came out 9 years ago, my 3090 reports 23GB used! Does that mean the people of 2012 were being shafted with their maximum of 4GB on the 690, when in reality games needed 24GB? Or could it perhaps be that COD games just allocate as much VRAM as they can? Nope, definitely the former...
The only way we could check whether SOTR really does use 15GB is to have two cards with identical performance but different VRAM amounts (8GB/16GB, for example) and see if the game performs any worse when the VRAM is cut. Or perhaps there is a program that artificially reserves VRAM without a performance penalty, so you can effectively 'emulate' 12GB/8GB/4GB, etc. on a 16GB card (a rough sketch of that idea is below)? I'd be interested to see that comparison, as that's the only way to objectively measure it.

2. I was talking about 1440p, so a comparison to 4K is moot.

3. NVIDIA will use different compression algorithms so that 8GB will go further than previously. Also DirectStorage will alleviate a lot of VRAM usage as more data can be kept on SSDs instead of the VRAM.
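For what it's worth, the 'emulate a smaller card' idea from point 1 is doable in a crude way: reserve a chunk of VRAM as ballast so anything running alongside only sees the remainder. A rough sketch, assuming PyTorch with CUDA; WDDM paging can blur the numbers, so treat it as illustrative only:
Code:
import torch

def reserve_vram(gigabytes: float) -> torch.Tensor:
    """Allocate a dummy buffer of roughly the given size on the GPU and keep it alive."""
    n_bytes = int(gigabytes * 1024**3)
    ballast = torch.empty(n_bytes, dtype=torch.uint8, device="cuda")  # 1 byte per element
    ballast.fill_(0)  # touch the memory so the allocation is actually backed
    return ballast

if __name__ == "__main__":
    free_before, total = torch.cuda.mem_get_info()
    ballast = reserve_vram(8.0)  # pretend a 16GB card only has ~8GB free
    free_after, _ = torch.cuda.mem_get_info()
    print(f"Total: {total / 1024**3:.1f} GiB, "
          f"free before: {free_before / 1024**3:.1f} GiB, "
          f"free after ballast: {free_after / 1024**3:.1f} GiB")
    # Keep the process alive while the game/benchmark runs; exiting releases the VRAM.
    input("VRAM reserved - run the benchmark now, press Enter to release...")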
 
double dipping for moar money while pretending to care about gamer card plight
 
1. Allocated does not mean used. I doubt a 2+ year old game uses 15GB in reality when brand-new AAA open-world games barely use 8GB at 4K. Hell, when I play Black Ops 2, which came out 9 years ago, my 3090 reports 23GB used! Does that mean the people of 2012 were being shafted with their maximum of 4GB on the 690, when in reality games needed 24GB? Or could it perhaps be that COD games just allocate as much VRAM as they can? Nope, definitely the former...
The only way we could check whether SOTR really does use 15GB is to have two cards with identical performance but different VRAM amounts (8GB/16GB, for example) and see if the game performs any worse when the VRAM is cut. Or perhaps there is a program that artificially reserves VRAM without a performance penalty, so you can effectively 'emulate' 12GB/8GB/4GB, etc. on a 16GB card? I'd be interested to see that comparison, as that's the only way to objectively measure it.

2. I was talking about 1440p, so a comparison to 4K is moot.

3. NVIDIA will use different compression algorithms so that 8GB will go further than previously. Also DirectStorage will alleviate a lot of VRAM usage as more data can be kept on SSDs instead of the VRAM.
#3 DirectStorage will not substitute SSD for VRAM. It will let games load data faster from the disk, but the latency will remain the same.
 
#3 DirectStorage will not substitute SSD for VRAM. It will let games load data faster from the disk, but the latency will remain the same.
Less data needs to be preloaded into VRAM, which will lead to lower VRAM usage with DirectStorage. If more texture data can stay on the drive until shortly before it's needed, less has to be cached in VRAM, and that means more room for other textures or lower overall VRAM usage.
 
Less data needs to be preloaded into VRAM, which will lead to lower VRAM usage with DirectStorage. If more texture data can stay on the drive until shortly before it's needed, less has to be cached in VRAM, and that means more room for other textures or lower overall VRAM usage.
Rendering a scene will still use the same assets. This will simply result in faster initial loading times and possibly improved texture streaming (which has already been implemented with varying degrees of success). Less pressure on VRAM? Most likely. Significantly less? My money's on "no".
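To illustrate where we disagree: streaming changes how much has to sit in VRAM ahead of time, not how much the current frame needs. A conceptual sketch (not the actual DirectStorage API, just a toy LRU texture cache with a fixed budget):
Code:
from collections import OrderedDict

# Toy model: a texture cache with a fixed VRAM budget. Faster storage makes
# misses cheaper to refill, so less has to be pre-loaded - but everything the
# current frame uses must still be resident at once.
class TextureCache:
    def __init__(self, budget_bytes: int):
        self.budget = budget_bytes
        self.used = 0
        self.resident = OrderedDict()  # texture_id -> size, in LRU order

    def request(self, texture_id: str, size: int) -> None:
        if texture_id in self.resident:
            self.resident.move_to_end(texture_id)  # cache hit, mark recently used
            return
        # Cache miss: evict least-recently-used textures until the new one fits.
        while self.used + size > self.budget and self.resident:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        # A real engine would issue the SSD read here (e.g. via DirectStorage).
        self.resident[texture_id] = size
        self.used += size

cache = TextureCache(budget_bytes=6 * 1024**3)  # pretend 6 GiB is left for textures
cache.request("rock_albedo_4k", 64 * 1024**2)
cache.request("rock_normal_4k", 64 * 1024**2)
print(f"Resident: {len(cache.resident)} textures, {cache.used / 1024**2:.0f} MiB")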
 