
NVIDIA TITAN V Lacks SLI or NVLink Support

I have run SLI for years. Almost every game that claimed not to support it, I got running in SLI anyway. Currently running two 1070 Tis in SLI and all my games run at 4K with no issues at all. Average fps is 100+ at 4K. A single card ran around 30 fps, which for me is not enough; even 60 fps I personally don't like. I've talked to so many new gamers who said SLI sucked and didn't work. 80% of the time, lol, SLI wasn't even enabled in their drivers; 10% of the time it wasn't configured correctly; and maybe 10% of the time the frame rendering mode was never set up.

Fallout 4 I got working in SLI just fine.

Also, games run so much smoother in SLI. I've been working with SLI since my Diamond Monster Voodoo 2s.


THIS THIS THIS. A lot of people don't understand how to configure a setup like this. I generally only recommend SLI/CF if you play with multi-monitor, high refresh rates, and/or 4K+ resolutions. Many people discounting SLI/CF only hear stories, or have experience with low-to-midrange GPUs, which aren't even worth pairing in the first place. This setup requires you to get the best single GPU available and double it for maximum performance.

Heroes of Newerth, CS:S and Fallout 3 all worked terribly on my SLI setup, so I got off that train real fast. Not everyone has problems, but I did. That's why I would rather stick with what will most likely work: a single-card solution.

What GPUs and monitors were you using?
 
I love it when I tell someone wanting to go SLI not to, and they still do. Then they experience the bugs and lack of support.

The sad thing is I remember running 2 x HD 6950s and getting good performance in every game I played. I said to myself, "Some day I will have the money to build an ultra 3-way SLI/CF rig."

Now here I am and it's completely awful. Instead I built an excellent ITX build (which is handy because I travel a lot), and honestly I don't see mega PCs being viable again for at least 2-3 years...
 
There's no way NVLink would work, even if it were enabled. It's for IBM's POWER only, AFAIK.
 
THIS THIS THIS. A lot of people don't understand how to configure a setup like this. I generally only recommend SLI/CF if you play with multi-monitor, high refresh rates, and/or 4K+ resolutions. Many people discounting SLI/CF only hear stories, or have experience with low-to-midrange GPUs, which aren't even worth pairing in the first place. This setup requires you to get the best single GPU available and double it for maximum performance.
It didn't use to be this way. In the Fermi and Kepler days, you could SLI two mid-range (460's, 560's and 660's) for about half the price of a high end and achieve near the same average frame rate. SLI can still work this way, but requires way more experimentation with NVidia Inspector and setting custom flags.
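To make the "custom flags" part concrete: what Inspector-style tools ultimately do is write SLI compatibility bits into the game's driver profile through NVAPI's driver-settings (DRS) interface. Here's a rough, untested C++ sketch of that idea; the profile name, setting ID and value below are placeholders (look the real ones up in NvApiDriverSettings.h or in Profile Inspector), so treat it as an illustration rather than a recipe:

```cpp
// Hypothetical sketch: writing an SLI compatibility-bits value into a game's
// driver profile via NVAPI DRS (roughly what NVIDIA Profile Inspector does).
// The profile name, setting ID and value are placeholders, not real bits.
#include <nvapi.h>
#include <cstdio>

int main() {
    if (NvAPI_Initialize() != NVAPI_OK) return 1;

    NvDRSSessionHandle session{};
    NvAPI_DRS_CreateSession(&session);
    NvAPI_DRS_LoadSettings(session);            // read the current driver profiles

    // Placeholder profile name -- substitute the game's actual driver profile.
    NvAPI_UnicodeString profileName{};
    const wchar_t* name = L"Fallout 4";
    for (size_t i = 0; name[i] && i < NVAPI_UNICODE_STRING_MAX - 1; ++i)
        profileName[i] = static_cast<NvU16>(name[i]);

    NvDRSProfileHandle profile{};
    if (NvAPI_DRS_FindProfileByName(session, profileName, &profile) == NVAPI_OK) {
        NVDRS_SETTING setting{};
        setting.version         = NVDRS_SETTING_VER;
        setting.settingId       = 0x00000000;   // placeholder: SLI compatibility-bits setting ID
        setting.settingType     = NVDRS_DWORD_TYPE;
        setting.u32CurrentValue = 0x00000000;   // placeholder: compatibility bits to experiment with
        NvAPI_DRS_SetSetting(session, profile, &setting);
        NvAPI_DRS_SaveSettings(session);        // persist the change to the driver store
        std::printf("profile updated\n");
    }

    NvAPI_DRS_DestroySession(session);
    NvAPI_Unload();
    return 0;
}
```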

I'll bite... what in 2-3 years do you see making "mega" PCs (whatever that is) viable?
Wild guess... the crash of cryptos everywhere making people hold their fiat closer to their chests.
 
There's no way NVLink would work, even if it were enabled. It's for IBM's POWER only, AFAIK.
Why? IBM is using it for Power9 but this is primarily developed and used by NVidia. NVLink can do both GPU-CPU and GPU-GPU connections, supposedly in a mesh configuration. The closest competitor to the same functionality today is AMD's Infinity Fabric.
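For the GPU-to-GPU side you can see it from the software end with the CUDA runtime: a minimal sketch (assuming a machine with two GPUs and the CUDA toolkit installed) that asks whether one card can directly address the other's memory, which is what an NVLink or PCIe peer-to-peer path provides. It says nothing about SLI rendering, just the compute-side link:

```cpp
// Minimal sketch: query and enable GPU-to-GPU peer access with the CUDA runtime.
// Assumes at least two CUDA-capable GPUs; works over NVLink or PCIe P2P.
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);
    if (deviceCount < 2) {
        std::printf("need at least two GPUs\n");
        return 0;
    }

    int canAccess = 0;
    cudaDeviceCanAccessPeer(&canAccess, /*device=*/0, /*peerDevice=*/1);
    std::printf("GPU0 -> GPU1 peer access: %s\n", canAccess ? "yes" : "no");

    if (canAccess) {
        cudaSetDevice(0);                    // subsequent calls apply to GPU 0
        cudaDeviceEnablePeerAccess(1, 0);    // map GPU 1's memory into GPU 0's address space
        // cudaMemcpyPeer(...) or direct loads/stores over the link are now possible.
    }
    return 0;
}
```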
 
Why? IBM is using it for Power9 but this is primarily developed and used by NVidia. NVLink can do both GPU-CPU and GPU-GPU connections, supposedly in a mesh configuration. The closest competitor to the same functionality today is AMD's Infinity Fabric.
Well, that's what I'm saying: it works only with POWER CPUs. It was, after all, developed in collaboration with IBM for Summit.

It's proprietary, not unlike IF from AMD or Intel's Omni-Path (a competitor to InfiniBand). This doesn't mean it can't work with x86 or ARM CPUs in the future, just that it cannot right now.
 
It didn't use to be this way. In the Fermi and Kepler days, you could SLI two mid-range (460's, 560's and 660's) for about half the price of a high end and achieve near the same average frame rate. SLI can still work this way, but requires way more experimentation with NVidia Inspector and setting custom flags.


Wild guess... the crash of cryptos everywhere making people hold their fiat closer to their chests.

Yea I do remember those days, it was a great value at the time.
 
I'll bite... what in 2-3 years do you see making "mega" PCs (whatever that is) viable?

It's pretty obvious Crossfire and SLI worked incredibly well 8 years ago because the tech was new but maturing, and AMD and Nvidia were busy trying to beat each other on a major feature (like G-Sync vs FreeSync now).

But eventually it just wasn't profitable to keep throwing so much manpower at an awesome but WAY underutilized feature. AMD got higher performance, but Nvidia's more consistently worked (some years). Nvidia phased it out slowly: first it limited the original Titan to 3-way, then Pascal went 2-way, and now the new Titan has ZERO SLI support. Oh, and AMD only supported it through Polaris because they had to (but now Vega is out).

For now, all of the competing APIs and engines make mGPU pointless to support. AMD was right when they pushed for DX12 to move mGPU up to the engine/application level, but DX12/Vulkan won't be the standard for at least 2-3 years. Once they ARE the standard, I expect mGPU may become bigger than it ever has been... but that is a BIG "maybe."
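For anyone wondering what "engine-level" mGPU actually looks like, here's a minimal, illustrative Vulkan 1.1 sketch: linked GPUs show up as a device group that the application, not the driver, has to enumerate and drive. Listing the groups is as far as this goes; a real engine would create one logical device over the group and split the work itself:

```cpp
// Illustrative sketch: enumerate Vulkan 1.1 device groups (explicit multi-GPU).
// Assumes a Vulkan 1.1 loader/driver is present; prints how many linked GPUs
// each group contains. Driving them (AFR, split-frame, etc.) is the app's job.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;          // device groups are core in 1.1

    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ci.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);

    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount, {VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i)
        std::printf("group %u: %u physical device(s)\n", i, groups[i].physicalDeviceCount);

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```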
 
I have run SLI for years. Almost every game that claimed not to support it, I got running in SLI anyway. Currently running two 1070 Tis in SLI and all my games run at 4K with no issues at all. Average fps is 100+ at 4K. A single card ran around 30 fps, which for me is not enough; even 60 fps I personally don't like. I've talked to so many new gamers who said SLI sucked and didn't work. 80% of the time, lol, SLI wasn't even enabled in their drivers; 10% of the time it wasn't configured correctly; and maybe 10% of the time the frame rendering mode was never set up.
....
So you're basically saying that adding another card in SLI is increasing performance for you by... 333%?
Cool story bro.
 
So you're basically saying that adding another card in SLI is increasing performance for you by... 333%?
Cool story bro.

I stacked a Geforce 930 with hypermemory and a tweaked NV/AMD crossfire bridge right next to my RX480 8GB and got 320 FPS in Doom on 4K ultra.
 
There's no competition above the 1080, so they're selling the crappiest SKU first for the most amount of money. Economics of opportunity.

This is GV100 fully maxed out; it doesn't get better than this.
 
This is GV100 fully maxed out; it doesn't get better than this.
It is not fully maxed out... the caches have been trimmed, and the narrower memory interface means lower memory bandwidth.
But it is still good. Priced like a Quadro V6000, though.
 
The only sad thing about losing SLI, is that it means Voodoo is truly dead.
 
The only sad thing about losing SLI, is that it means Voodoo is truly dead.
(hugs his Voodoo3) don't say that, you scare the little ones.
 
It is not fully maxed out... the caches have been trimmed, and the narrower memory interface means lower memory bandwidth.
But it is still good. Priced like a Quadro V6000, though.


Well, as maxed out as a desktop GPU is going to get. There aren't shaders disabled the way there were on previous early Titan releases.
 
I don't really get most of the complaints in this thread. This is a V100 with reduced memory bandwidth at roughly a 70% discount. What is the problem here? It isn't aimed at gamers in any way; it's meant to be a reduced-price card for people working with deep learning and AI software, not for playing Crysis. If it were designed for gaming they'd have disabled the tensor cores and some of the FP64. Personally, I'm amazed at how good a value this card is for what it's designed for.
 
Don't worry, NVLink support will come in the TITAN V Gold Edition, available only from NVIDIA at the low, low price of $4999. With partially disabled drivers, of course. Wouldn't want to touch the sales of the GV100 Quadro/Tesla products, would we? :)
 
The funny thing is I WOULD buy this in a second if it turns out its tensor cores and bandwidth can be put to use for mining at say 100 ETH + 5000 MH/s...
 
Well, that's what I'm saying: it works only with POWER CPUs. It was, after all, developed in collaboration with IBM for Summit.

It's proprietary, not unlike IF from AMD or Intel's Omni-Path (a competitor to InfiniBand). This doesn't mean it can't work with x86 or ARM CPUs in the future, just that it cannot right now.
You literally bolded the parts supporting your narrative and completely ignored the rest. NVLink also works with GPU-GPU connections.
Both Omni-Path and InfiniBand are in a slightly different niche, generally for connecting systems one step higher up.
 
HAHAHAHA

This is the final middle finger to all those noobs saying Titan is 'a gaming card'. "But if I want to and I can, why wouldn't I buy one for gaming?"

Suck on that
 
Well, as maxed out as a desktop GPU is going to get. There aren't shaders disabled the way there were on previous early Titan releases.

It has 4 disabled SMs, as the full chip has 5376 shaders.
 
You literally bolded the parts supporting your narrative and completely ignored the rest. NVLink also works with GPU-GPU connections.
Both Omni-Path and InfiniBand are in a slightly different niche, generally for connecting systems one step higher up.
Yes & it's an enterprise feature geared towards compute heavy systems, that's why it'll not be enabled on desktop systems, probably with the exception of pro cards like Quadro.
 
HAHAHAHA

This is the final middle finger to all those noobs saying Titan is 'a gaming card'. "But if I want to and I can, why wouldn't I buy one for gaming?"

Suck on that
If they can afford it, I don't think they care what it's for. If it's expensive, people will want it for that reason alone, just so they can brag.
 