
NVIDIA GV102 Prototype Board With GDDR6 Spotted, Up to 525 W Power Delivery. GTX 1180 Ti?

They are a business, any extra income is welcome...

Here's 15 pages of SLI-supported games.. https://www.geforce.com/hardware/technology/sli/games
Here's another way to look at that: after 14 years in existence, less than 400 titles have added support for SLI. And that's when most of the work was being done in the video driver. Now that more of the burden has shifted onto the developers/game engine, you can figure out where that figure is going.
 
Here's another way to look at that: after 14 years in existence, less than 400 titles have added support for SLI. And that's when most of the work was being done in the video driver. Now that more of the burden has shifted onto the developers/game engine, you can figure out where that figure is going.
That is a game every two weeks..
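A quick sanity check of that rate, using the round numbers from the quoted post (roughly 400 titles over 14 years), nothing more precise than that:

```python
# Rough check of the "one SLI game every two weeks" figure,
# using the round numbers quoted above (~400 titles over ~14 years).
titles = 400
years = 14
days_per_title = years * 365 / titles
print(f"one new SLI-supported title every {days_per_title:.1f} days on average")
# -> about every 12.8 days, i.e. roughly one every two weeks
```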
 
Prototypes give a fascinating insight into the development process of products. I wish Nvidia and other hardware manufacturers either provided an online archive or donated the hardware to museums, after launch of course. Most of the public never gets to see the cool innovations and experimentation in prototypes and development kits. I find this stuff very fascinating, whether it's old Commodore, Amiga, Nintendo, 3dfx or brand new Nvidia and AMD parts.
 
That is a game every two weeks..

Ya one thing I'd add here also is *MOST* games that are released do not need SLI. SLI is really only mandatory for the AAA titles that are incredibly demanding on the GPU, and those are the games that should have SLI to help the users out. I think most SLI users are reasonable and aren't asking for support on every given title, just the ones that are very demanding on the GPU. So I'd say if that's one game every 2 weeks, that's actually pretty encouraging for SLI support, IMO.
 
Make of it what you wish.
Not wishing anything...that's just how it is...

Make of it what you wish.
Vermintide 2 and FarCry5 are two recent SLI games and they aren't even on that list... stop trying to wish SLI away
 
Nvidia will keep SLI alive as long as the professional market demands it. Why do you think their Linux drivers support SLI and "stereoscopic 3D"? Let me give you a hint, it's not because of gaming…
 
Vermintide 2 and FarCry5 are two recent SLI games and they aren't even on that list... stop trying to wish SLI away
Vermintide is surprising, seeing how it uses a little-known engine (owned by Autodesk). The Dunia engine (all Far Cry games use it) has supported SLI since the beginning, so ongoing support isn't surprising.
 
Nvidia will keep SLI alive as long as the professional market demands it. Why do you think their Linux drivers support SLI and "stereoscopic 3D"? Let me give you a hint, it's not because of gaming…
Professional usage may be legit (maybe relegated to Quadros?). But still, for consumers this technology is as good as dead.
 
Professional usage may be legit (maybe relegated to Quadros?). But still, for consumers this technology is as good as dead.
Professionals certainly use Teslas, Quadros and Titans when there are reasons to do so. But both GeForce cards and consumer CPUs are quite common for workstations and test setups. I've seen developers order pretty decent shipments of higher-end consumer cards, and even sometimes use them in SLI. A good portion of upper mid-range and high-end consumer graphics cards are in fact sold to companies.
 
Professional usage may be legit (maybe relegated to Quadros?). But still, for consumers this technology is as good as dead.
Why do you keep saying that? It's clearly not. I just gave you clear examples of how it's not dying. Where are your examples, besides just your gut?
 
Why do you keep saying that? It's clearly not. I just gave you clear examples of how it's not dying. Where are your examples, besides just your gut?
Because after all this time it's still at around 1% adoption, and clearly both Nvidia and AMD are scaling back support. But don't let that stand in your way, you have a list.
 
Because after all this time it's still at around 1% adoption, and clearly both Nvidia and AMD are scaling back support. But don't let that stand in your way, you have a list.
And new games are still coming out with it, don't forget that part, the part where the devs are still doing work on it... get real, man... try not to let all the evidence smack you in the face too hard..
 
And new games are still coming out with it, don't forget that part, the part where the devs are still doing work on it... get real, man... try not to let all the evidence smack you in the face too hard..
Games still sporting support are an indication that SLI is not dead yet, not that it isn't dying.
I mean, can you name any other technology that became a success after failing to break 2% market penetration 10+ years after its commercial release?
 
If Micron's GDDR6 is packaged in 2 GB chips as rumored, then there's 24 GB of VRAM on that board, which is probably too costly to manufacture for a consumer video card. Maybe a Tesla V?
 
If Micron's GDDR6 is packaged in 2 GB chips as rumored, then there's 24 GB of VRAM on that board, which is probably too costly to manufacture for a consumer video card. Maybe a Tesla V?
Did you read my article? The memory part number is listed there, and how to decode it to get the memory capacity is shown in a picture. These chips are without any doubt 256 Mb x 32 = 8 Gb = 1 GB.

Unless there are more memory chips on the other side of the PCB, which seems unlikely.
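A quick sanity check of that decode; the 12-chip count below is my assumption based on a 384-bit bus with one 32-bit GDDR6 chip per channel, not something stated in the article:

```python
# Decode of the memory part number discussed above:
# 256 Mb x 32 organization = 8 Gb per chip = 1 GB per chip.
bits_per_chip = 256 * 2**20 * 32            # 256 Mb x 32 = 8 Gb
gb_per_chip = bits_per_chip / 8 / 2**30     # -> 1.0 GB

# Assumed board layout (not from the article): 384-bit bus, 32 bits per chip.
chips = 384 // 32                           # 12 chips
print(f"{gb_per_chip:.0f} GB per chip x {chips} chips = {gb_per_chip * chips:.0f} GB total")
# With the rumored 2 GB (16 Gb) packages instead, the same layout would give 24 GB.
```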
 
I guess SLI/Crossfire's death is not because they're completely useless, but because the ROI is not there. Too few people use that feature.

Up thru the 900 series, well over half the gaming builds we did for users were SLI... with 95% at least "SLI ready". SLI's problem had more to do with nVidia than customers... and nVidia's problem was this: two 970s were the same price as a 980. And it had no downside. For most games, you saw a 70% average increase in FPS; on the games that mattered, many were over 95%. That 70% gave you 40% more frames than the 980. In the few popular games that did not support it, you lost maybe 12% in fps... weighing one against the other, it was an easy decision. The problem for nVidia was, they make more money selling one 980 than they do twin 970s... even more so when game bundles are involved. With the 700, 600 and 500 series, the dollars favored SLI even more.

With the 10xx series, nVidia made it harder. For reasons never explained, SLI average scaling at 1080p was only 18%... at 1440p, about 30%. Yet for some reason, at 2160p it was well over 55%, with some games scaling near 100%. Looking at the numbers, it does seem that the price equation was no longer valid at 1080p and 1440p, where the top-tier card was now keeping you well above 60 fps. But where it couldn't, SLI still made sense at 2160p. With the 4K HDR panels about to drop at $1,999 or so... they will be too high priced for mainstream users, and I don't see a change in the paradigm until these things drop into the range of affordability. But as monitor technology improvements continue to bring us more pixels, I don't see nVidia abandoning a multi-GPU option at the high end.
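Rough numbers behind that 970-SLI-vs-980 claim, assuming for illustration that a single 970 delivers about 82% of a 980's frame rate (my figure, not from the post):

```python
# Back-of-the-envelope check of the 970-SLI-vs-980 argument above.
# Assumption (not from the post): a single GTX 970 delivers ~82% of a GTX 980's FPS.
gtx_970_relative = 0.82   # single 970 vs single 980 (assumed)
sli_scaling = 1.70        # +70% average SLI scaling quoted in the post

sli_970_relative = gtx_970_relative * sli_scaling
print(f"970 SLI vs 980: {sli_970_relative:.2f}x "
      f"(~{(sli_970_relative - 1) * 100:.0f}% more frames)")
# -> roughly 1.39x, i.e. about 40% more frames than a single 980 at the same total price
```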
 
Up thru the 900 series, well over half the gaming builds we did for users were SLI... with 95% at least "SLI ready". SLI's problem had more to do with nVidia than customers... and nVidia's problem was this: two 970s were the same price as a 980. And it had no downside. For most games, you saw a 70% average increase in FPS; on the games that mattered, many were over 95%. That 70% gave you 40% more frames than the 980. In the few popular games that did not support it, you lost maybe 12% in fps... weighing one against the other, it was an easy decision. The problem for nVidia was, they make more money selling one 980 than they do twin 970s... even more so when game bundles are involved. With the 700, 600 and 500 series, the dollars favored SLI even more.

With the 10xx series, nVidia made it harder. For reasons never explained, SLI average scaling at 1080p was only 18%... at 1440p, about 30%. Yet for some reason, at 2160p it was well over 55%, with some games scaling near 100%. Looking at the numbers, it does seem that the price equation was no longer valid at 1080p and 1440p, where the top-tier card was now keeping you well above 60 fps. But where it couldn't, SLI still made sense at 2160p. With the 4K HDR panels about to drop at $1,999 or so... they will be too high priced for mainstream users, and I don't see a change in the paradigm until these things drop into the range of affordability. But as monitor technology improvements continue to bring us more pixels, I don't see nVidia abandoning a multi-GPU option at the high end.

Great post - I haven't gamed below 4k for about 3-5 years now so I can't comment on the scaling for 1440p / 1080p, but on 4k I can absolutely confirm the scaling is a huge help. I would not be able to play my games in 4k / 60fps like I do without SLI. I certainly hope it sticks around. When 4k starts becoming mainstream, no doubt the next resolution up will start its rise in popularity where SLI will be needed yet again.
 
Up thru the 900 series, well over half the gaming builds we did for users were SLI... with 95% at least "SLI ready". SLI's problem had more to do with nVidia than customers... and nVidia's problem was this: two 970s were the same price as a 980. And it had no downside. For most games, you saw a 70% average increase in FPS; on the games that mattered, many were over 95%. That 70% gave you 40% more frames than the 980. In the few popular games that did not support it, you lost maybe 12% in fps... weighing one against the other, it was an easy decision. The problem for nVidia was, they make more money selling one 980 than they do twin 970s... even more so when game bundles are involved. With the 700, 600 and 500 series, the dollars favored SLI even more.

With the 10xx series, nVidia made it harder. For reasons never explained, SLI average scaling at 1080p was only 18%... at 1440p, about 30%…
So, we're venturing into conspiracy theories now?
SLI has never worked as well as you describe. And Nvidia has not deliberately sabotaged their Pascal cards, but SLI support in top games has decreased over the last few years, due to most top games being console ports. SLI support is highly dependent on how the game engine works, and Nvidia, with all their driver tricks, can't make a game that isn't suited for it scale with SLI. Current games commonly use multi-pass rendering with synchronization and post-processing effects, which limits multi-GPU scaling.

SLI has never been a good choice to get "cheap high-end performance"; it has only worked well in certain games, and has often been full of glitches and riddled with stutter. SLI is, and has always been, just an option for those who "need" more performance than a single card can offer, in select titles.
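A rough way to see why that synchronization matters, using an Amdahl's-law-style toy model (illustrative numbers only, nothing specific to Nvidia's driver or any particular engine):

```python
# Toy Amdahl's-law model of multi-GPU frame rendering: the fraction of each
# frame that cannot be split across GPUs (synchronization, post-processing
# passes that need the whole frame, etc.) caps how much a second GPU helps.
# The serial fractions below are made-up values for illustration only.
def multi_gpu_speedup(serial_fraction: float, gpus: int = 2) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / gpus)

for serial in (0.05, 0.20, 0.40):
    print(f"{serial:.0%} serial work -> {multi_gpu_speedup(serial):.2f}x with 2 GPUs")
# -> ~1.90x, ~1.67x, ~1.43x: even modest per-frame sync kills "perfect" scaling
```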
 
Did you read my article? The memory part number is listed there, and how to decode it to get the memory capacity is shown in a picture. These chips are without any doubt 256 Mb x 32 = 8 Gb = 1 GB.

Unless there are more memory chips on the other side of the PCB, which seems unlikely.

No, sorry. I skimmed it. Good catch.
 
Nvidia is not AMD.

And I mean massive TDP, like Vega 56/64 and the water-cooled 64. A loser's choice from AMD.

Vega 64's power draw gets close to 500 W with overclocks; it's so high that the GPU should just be banned and AMD handed a penalty ticket.

Nvidia's policy is far away from that "we don't care" attitude.

300 W is the limit for Nvidia, and only for special GPUs like the Titan.

A Turing 1060 will beat Vega 64.
 
Nvidia is not AMD.

And I mean massive TDP, like Vega 56/64 and the water-cooled 64. A loser's choice from AMD.

Vega 64's power draw gets close to 500 W with overclocks; it's so high that the GPU should just be banned and AMD handed a penalty ticket.

Nvidia's policy is far away from that "we don't care" attitude.

300 W is the limit for Nvidia, and only for special GPUs like the Titan.

A Turing 1060 will beat Vega 64.
It's not that Nvidia are geniuses or something. The power efficiency comes from the fact that Nvidia also makes SoCs (mostly for automotive these days) and they have a vested interest in reusing the same design. But yes, that translates into a quite nice advantage for their dGPUs.
 
Nvidia is not AMD.

And I mean massive TDP, like Vega 56/64 and the water-cooled 64. A loser's choice from AMD.

Vega 64's power draw gets close to 500 W with overclocks; it's so high that the GPU should just be banned and AMD handed a penalty ticket.

Nvidia's policy is far away from that "we don't care" attitude.

300 W is the limit for Nvidia, and only for special GPUs like the Titan.

A Turing 1060 will beat Vega 64.

What... I don't even... oh, my brain hurts. Vega 64 has a 300 W limit (the water-cooled one 400 W), so where are you getting 500 W?
 