Thursday, September 3rd 2020

NVIDIA Reserves NVLink Support For The RTX 3090

NVIDIA has dealt another major blow to multi-GPU gaming with its recent RTX 30 series announcement. The only card in the new generation to support NVLink SLI will be the RTX 3090, and it will require a new NVLink bridge costing 79 USD. In the Turing generation, NVIDIA limited NVLink support to the RTX 2070 Super, RTX 2080 Super, RTX 2080, and RTX 2080 Ti. AMD's CrossFire multi-GPU solution has likewise become irrelevant, with support dropped as of RDNA. Developer support for the feature has also declined due to the high cost of implementation, a small user base, and often poor performance scaling. With the NVIDIA RTX 3090 set to retail for 1,499 USD, a dual-card setup will cost at least 3,077 USD (two cards plus the bridge), reserving the feature for only the wealthiest of gamers.
Source: NVIDIA

81 Comments on NVIDIA Reserves NVLink Support For The RTX 3090

#1
Sandbo
They are really worried that we can use the card for machine learning, I guess.

I really hope AMD can do something about the situation.
#2
Raendor
Sandbo said: They are really worried that we can use the card for machine learning, I guess. I really hope AMD can do something about the situation.

Yes, Huang can't sleep at night because he's afraid someone will use a 3080, or God forbid a 3070, for professional machine learning activities. Don't be ridiculous.
#3
Tomgang
SLI is dead and gone; it's been dead since Pascal. I really do miss it. I'd been running SLI ever since the GTX 285 was the big deal, but the GTX 1080 Ti was the first card I ever ran in a single-card config.
#4
theGryphon
Well, multi-GPU is dead because it makes absolutely zero sense for both NVIDIA and AMD.
Not only does it require a huge effort in driver and hardware support across a huge variety of games and applications, both companies also have a financial incentive for it to disappear.

You want 1.5X-2X performance? Without multi-GPU support, you can't just slap in a second-hand card to double up ;)
Also observe the line-ups: for 1.5X-2X the performance, you always have to pay more than 2X the price...
#5
Caring1
So apart from the fancy hourglass shape, what is different about the link?
If the fingers on the cards are the same as on previous versions, what is stopping people from using those older bridges?
#6
Blueberries
I just want to know if the 8K upscaled, 60 FPS, ray-traced demo required NVLink.
#7
trog100
I already have a 2080 Ti; it might be a good idea to buy another one and SLI them... maybe... he he.

I reckon about £500 for a good used one..

trog
#8
Dux
If you buy an RTX 3090 for gaming, something is wrong with you. I don't care if you are rich; it just means you are a fool with too much money. It's a card for professional use, and you would be better off buying an RTX 3080, watercooling it, and OC-ing the shit out of it. Btw, does anyone doubt that there will be an RTX 3080 Ti?
#9
AusWolf
Absolutely understandable. Paying double the price for marginal performance improvements, possible VRAM limitations, and poor compatibility never made any sense, imo. Besides, if this new generation delivers what it promises, no one will need a second GPU anyway.
#10
Daisho11
NVIDIA is approaching Apple-tier ridiculous pricing. But do they have the brainwashed, cult-like following that Apple does, to get away with fleecing their customers?
#11
bug
Does anyone still care about mGPU?
The tech never lived up to its initial promise (does anyone remember "don't get rid of your old video card, just install a new one alongside it and enjoy"?) and support was always spotty.
#13
bug
DuxCro said: Btw, does anyone doubt that there will be an RTX 3080 Ti?

There's certainly room for one, but its existence depends heavily on how RDNA2 turns out.
#14
ncrs
DuxCro said: If you buy an RTX 3090 for gaming, something is wrong with you. I don't care if you are rich; it just means you are a fool with too much money. It's a card for professional use, and you would be better off buying an RTX 3080, watercooling it, and OC-ing the shit out of it. Btw, does anyone doubt that there will be an RTX 3080 Ti?

It's a gaming card. Professionals use Quadros (or, rarely, Titans) because of the certified drivers, which can make a 1060-equivalent Quadro P2000 over 7x faster than a 2080 Ti in professional use cases.
#15
bug
It's not a gaming card, it's a halo product. Like all halo products, it's way overpriced.
If you have that much disposable income, get one, by all means. But most of us won't.
#16
Space Lynx
Astronaut
DuxCro said: If you buy an RTX 3090 for gaming, something is wrong with you. I don't care if you are rich; it just means you are a fool with too much money. It's a card for professional use, and you would be better off buying an RTX 3080, watercooling it, and OC-ing the shit out of it. Btw, does anyone doubt that there will be an RTX 3080 Ti?

Not really. I enjoy high-refresh gaming, and my GTX 1070 still can't even play the first Witcher game from 2009 above 105 FPS; I'd really like 165 FPS for that smooth clarity. So if you buy a 4K 144 Hz monitor, even a 3090 probably won't be enough for full smoothness in AAA games like Cyberpunk 2077 maxed out.
#17
ncrs
bugIt's not a gaming card, it's a halo product. Like all halo products, it's way overpriced.
If you have that much disposable income, get one, by all means. But most of us won't.
I'm pretty sure the A100 is the Ampere generation's halo product ;)
#18
RedelZaVedno
There's a simple truth to SLI on lower-end tiers: NVIDIA doesn't want you to use it, period. You could pair two 3070s and effectively get near-3090 performance for 2/3 of the price and similar power usage in supported titles. And let's be realistic here: who would want to use 2x 3090 for gaming anyway? Besides the $3,077 bill, AIBs' OC'd 3090s will likely consume 350-400 W each, so up to 800 W for the GPUs alone; add a 10900K to the mix and you're drawing over 1 kW. That's enough power to heat a 15 m² room in winter. Now imagine the heat in a closed 9 m² room during the summer months after 5 hours of gaming on such a monster. You've got yourself a sauna.
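As a rough sanity check on those numbers, here is a back-of-the-envelope sketch under the commenter's figures (the 250 W draw for a 10900K under gaming load is an assumption, as is the worst-case 400 W per GPU):

```cpp
// Back-of-the-envelope heat/power math for the dual-3090 scenario above.
#include <cstdio>

int main() {
    const double gpu_watts = 2 * 400.0;  // two OC'd RTX 3090s, upper estimate
    const double cpu_watts = 250.0;      // 10900K under gaming load (assumption)
    const double total_kw  = (gpu_watts + cpu_watts) / 1000.0;
    const double hours     = 5.0;        // the 5-hour session from the comment

    // Virtually all electrical power a PC draws ends up as heat in the room.
    std::printf("System draw:      %.2f kW\n", total_kw);                   // 1.05 kW
    std::printf("Heat over %.0f h:   %.2f kWh\n", hours, total_kw * hours); // 5.25 kWh
    return 0;
}
```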
#19
Dux
lynx29 said: Not really. I enjoy high-refresh gaming, and my GTX 1070 still can't even play the first Witcher game from 2009 above 105 FPS; I'd really like 165 FPS for that smooth clarity. So if you buy a 4K 144 Hz monitor, even a 3090 probably won't be enough for full smoothness in AAA games like Cyberpunk 2077 maxed out.

4K 144 Hz... I honestly don't know if such a monitor exists. Wouldn't you be better off buying a 1080p/360 Hz one? I think you would enjoy that refresh rate more than the higher resolution.
#20
Space Lynx
Astronaut
DuxCro said: 4K 144 Hz... I honestly don't know if such a monitor exists. Wouldn't you be better off buying a 1080p/360 Hz one? I think you would enjoy that refresh rate more than the higher resolution.

Not for me; I prefer 1440p 240 Hz. Anything past around 210 FPS is really diminishing returns, but I can tell the difference between 165 Hz and 210 FPS on a 240 Hz monitor.

And yes, that monitor has existed for over 2 years now...
#21
dinmaster
It's been mentioned before: the NVLink bridge really isn't needed. DX12 supports mGPU, where the cards act as one using only the PCIe lanes. This news article is a non-issue in my eyes..
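For context, here is a minimal, hypothetical sketch of the D3D12 explicit multi-adapter approach the comment refers to: enumerate every hardware GPU and create a device on each, after which an engine can split work across them over PCIe. This is illustrative only, not code from the article; cross-adapter resource sharing and work scheduling are omitted:

```cpp
// Enumerate all hardware adapters and create one ID3D12Device per GPU.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip WARP

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::printf("GPU %u: %ls\n", i, desc.Description);
            devices.push_back(device); // renderer can now target each GPU explicitly
        }
    }
    // A real engine would create a command queue per device and share surfaces
    // between GPUs via cross-adapter heaps (D3D12_HEAP_FLAG_SHARED_CROSS_ADAPTER).
    return 0;
}
```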
#22
DemonicRyzen666
RedelZaVedno said: There's a simple truth to SLI on lower-end tiers: NVIDIA doesn't want you to use it, period. You could pair two 3070s and effectively get near-3090 performance for 2/3 of the price and similar power usage in supported titles. And let's be realistic here: who would want to use 2x 3090 for gaming anyway? Besides the $3,077 bill, AIBs' OC'd 3090s will likely consume 350-400 W each, so up to 800 W for the GPUs alone; add a 10900K to the mix and you're drawing over 1 kW. That's enough power to heat a 15 m² room in winter. Now imagine the heat in a closed 9 m² room during the summer months after 5 hours of gaming on such a monster. You've got yourself a sauna.

I agree with you on this, but it goes even further back than that.
I can't find the article, but there was a review back in the ATI HD 5000 series days.
They ran a 5870 in CrossFire vs. TriFire 5670s priced lower than one 5870, and the three cards were beating the dual-card 5870 setup!
#23
bug
DemonicRyzen666 said: I agree with you on this, but it goes even further back than that. I can't find the article, but there was a review back in the ATI HD 5000 series days. They ran a 5870 in CrossFire vs. TriFire 5670s priced lower than one 5870, and the three cards were beating the dual-card 5870 setup!

Age-old story. You may get better performance, but you need per-title support, and even then you run into micro-stutter and whatnot.

The tech has been with us for over 15 years and was never able to break 5% market adoption (if that). It was clearly going nowhere.
#24
DemonicRyzen666
bug said: Age-old story. You may get better performance, but you need per-title support, and even then you run into micro-stutter and whatnot. The tech has been with us for over 15 years and was never able to break 5% market adoption (if that). It was clearly going nowhere.

And high-end cards have been around even longer.
store.steampowered.com/hwsurvey/videocard/

The RTX 2080 Ti sits at 0.98%, so why keep making those?

That's less than 5% too.
#25
QUANTUMPHYSICS
Sandbo said: They are really worried that we can use the card for machine learning, I guess. I really hope AMD can do something about the situation.

The only "situation" AMD needs to worry about is THE 3070, which obsoletes every single thing they currently have.

Why should NVIDIA keep putting out a feature that less than 1% of all buyers use?

The 3090 will more than likely be the "professional"-level card, and only professionals would be spending that kind of money to have more than one.