
Next GeForce series won't be Volta but Pascal-Refresh

"Over a year ago, Nvidia launched the Pascal architecture, which is being used for the current generation of GeForce video cards. There has been a lot of talk and speculation lately about the future Volta architecture. However, the following GeForce series will not be based on this. According to recent reports, it concerns a further development of Pascal.

With the further development of Pascal, Nvidia seems to be choosing a kind of tick-tock strategy, as we know from Intel, for example. Among other things, this means we should not expect HBM2 memory on the upcoming GeForce series, so even the high-end cards would top out at GDDR5X video memory. The production process is slightly modified, though: current chips are made on TSMC's 16 nm FinFET process, and this would shrink to 12 nm. However, that is more an optimization of the current process than something new, so we should not expect any miracles from it.
"

https://nl.hardware.info/nieuws/525...erie-wordt-doorontwikkeling-pascal-geen-volta
 
I feel better about my card. Wooooo.
 
So AMD's Vega will be competing with Nvidia's Volta in HPC, and also a Pascal refresh?
 
It seems that Nvidia doesn't feel the need for much faster cards yet, even with Vega coming.
 
AMD did the same with RX 4## and RX 5##. They were thought to be Vega until it was obvious that they weren't.
 
A positive thing about refresh chips is better optimized drivers.
 
Source is Fudzilla.

We'll need to wait and see. Though Volta is heavily about AI/ML, as suggested, so it may not be a surprise if they diverge the two lineups for PC and compute.
 
It seems that Nvidia doesn't feel the need for much faster cards yet, even with Vega coming.

I can't blame them... I can't believe the best GPU AMD has is the (IIRC) 580. It is in line with the 9xx GPUs released in 2015, performance-wise. :shadedshu:
 
I can't blame them... I can't believe the best GPU AMD has is the (IIRC) 580. It is in line with the 9xx GPUs released in 2015, performance-wise. :shadedshu:
Yup, clearly NVIDIA aren't worried about competition from AMD, so they're extracting more profits from the current design at our expense. AMD really need to pull a Ryzen and then some with their flagship GPU to get competition back on track.

And from the articles I've seen, Vega is rumoured to perform like a 1080 at best, not even the 1080 Ti, so no wonder. This hardware.info article just helps to give credibility to those rumours.
 
Yup, clearly NVIDIA aren't worried about competition from AMD, so they're extracting more profits from the current design at our expense. AMD really need to pull a Ryzen and then some with their flagship GPU to get competition back on track.

And from the articles I've seen, Vega is rumoured to perform like a 1080 at best, not even the 1080 Ti, so no wonder. This hardware.info article just helps to give credibility to those rumours.

I actually had high hopes for the RX 5XX series :( ..... I remember when I pulled out my GTX 970 to test the RX 580, which has twice as much memory and doesn't suffer from the 0.5 GB of "slow VRAM" the 970s "suffered" with, although that never made any difference for me. The switch was almost unnoticeable, aside from the normal subtle differences you get when switching from an Nvidia to an AMD GPU. If it weren't for this massive boom in crypto coin value, I wonder where the price of the RX series would be right now. $150 for a 580? Who knows.

Don't get me wrong, I'm not biased toward either brand; it's simply a matter of hoping for a healthy, competitive market so that we benefit in the end (of course, "we" being the collective consumer market). To that end: please, AMD, get off your a$$es.
 
It seems that Nvidia doesn't feel the need for much faster cards yet, even with Vega coming.
The problem is availability, and they probably exploit this; miners don't care about drivers, connectivity or such things.
 
"Over a year ago, Nvidia launched the Pascal architecture, which is being used for the current generation of GeForce video cards. There has been a lot of talk and speculation lately about the future Volta architecture. However, the following GeForce series will not be based on this. According to recent reports, it concerns a further development of Pascal.

With the further development of Pascal, Nvidia seems to be choosing a kind of tick-tock strategy, as we know from Intel, for example. Among other things, this means we should not expect HBM2 memory on the upcoming GeForce series, so even the high-end cards would top out at GDDR5X video memory. The production process is slightly modified, though: current chips are made on TSMC's 16 nm FinFET process, and this would shrink to 12 nm. However, that is more an optimization of the current process than something new, so we should not expect any miracles from it.
"

https://nl.hardware.info/nieuws/525...erie-wordt-doorontwikkeling-pascal-geen-volta
Not a surprise to me. I called this out months ago: Volta is for next Christmas, not this one, and it never was going to be. They're just putting it on PCIe now for pro users, and the consumer version isn't even in play yet, though an engineering sample or two has likely taped out.
 
Kepler does not hold up anymore because it is not a compute-oriented architecture.

Not sure about that, Kepler was fine architecturally speaking. The point is we had two generations of it and nothing major changed driver-wise. On top of that, when something like this happens, the next generation, which is a big step up, is pretty much guaranteed to put the previous one in the shadow in terms of driver optimizations.
 
Not sure about that, Kepler was fine architecturally speaking. The point is we had two generations of it and nothing major changed driver-wise. On top of that, when something like this happens, the next generation, which is a big step up, is pretty much guaranteed to put the previous one in the shadow in terms of driver optimizations.
No, Kepler is the least capable computational architecture. Fermi, Maxwell and Pascal are orders of magnitude better.

Nothing ever changed as drastically as AMD's driver optimizations, and that is not because AMD's driver team is better, but the other way around.

And having products not getting optimizations is nothing new either, and absolutely nothing unique to Kepler. I don't know where this obsession started.

Not a surprise to me. I called this out months ago: Volta is for next Christmas, not this one, and it never was going to be. They're just putting it on PCIe now for pro users, and the consumer version isn't even in play yet, though an engineering sample or two has likely taped out.
Well, it's not exactly a confirmation, is it?
 
No, Kepler is the least capable computational architecture. Fermi, Maxwell and Pascal are orders of magnitude better.

Nothing ever changed as drastically as AMD's driver optimizations, and that is not because AMD's driver team is better, but the other way around.

And having products not getting optimizations is nothing new either, and absolutely nothing unique to Kepler. I don't know where this obsession started.

Well, it's not exactly a confirmation, is it?
No, but I'll still be right when they are released, I assure you. I analysed the facts adequately ages ago, like six months ago, and the lead times for such architecture changes are such that I can't be wrong. Nvidia's use of PR to deflate AMD's releases has gone exactly as expected; only big Volta got shown off. Why? Because that's the main GPU they have been working on for release next. Why? Because their love-child market is under attack big time, and not from AMD yet (Instinct will change that); no, it's feckin' Google and their TensorFlow chips stealing Nvidia's thunder in AI, so they needed a counter-punch.
Also, being realistic, they didn't need to do a consumer Volta, surely we all agree there. I won't be buying a Pascal refresh though, as I can't see it being worth it fiscally.
 
And having products not getting optimizations is nothing new either, and absolutely nothing unique to Kepler. I don't know where this obsession started.

The observation stems from Nvidia's tradition of making major revamps of its architecture every two generations or so, while AMD does not; their design philosophy dictates incremental improvements and small changes to a base design. Because of this, their drivers never showed any major improvement and probably never will, but performance doesn't degrade over time because of it either.

Making major changes comes with the inevitable effect of putting previous generations of hardware on the back-burner, which happens more often with Nvidia than with AMD as a result. We can all agree Kepler falls short now; one particular title, Doom, shows this more than anything, where a 780 Ti can only just get you 60 fps at 1080p on ultra, while something like a 970, which is on par computationally speaking, delivers way more than that. It may be difficult to discern the reason for the loss in performance over time, but it's evident that most of it comes from a lack of driver optimization. Not saying it's new, but it's more pronounced here.
 
No, but I'll still be right when they are released, I assure you. I analysed the facts adequately ages ago, like six months ago, and the lead times for such architecture changes are such that I can't be wrong. Nvidia's use of PR to deflate AMD's releases has gone exactly as expected; only big Volta got shown off. Why? Because that's the main GPU they have been working on for release next. Why? Because their love-child market is under attack big time, and not from AMD yet (Instinct will change that); no, it's feckin' Google and their TensorFlow chips stealing Nvidia's thunder in AI, so they needed a counter-punch.
Also, being realistic, they didn't need to do a consumer Volta, surely we all agree there. I won't be buying a Pascal refresh though, as I can't see it being worth it fiscally.
Of course, that's not what I'm saying either. I'm just saying that this is a rumor like any other.


The observation stems from Nvidia's tradition of making major revamps of its architecture every two generations or so, while AMD does not; their design philosophy dictates incremental improvements and small changes to a base design. Because of this, their drivers never showed any major improvement and probably never will, but performance doesn't degrade over time because of it either.

Making major changes comes with the inevitable effect of putting previous generations of hardware on the back-burner, which happens more often with Nvidia than with AMD as a result. We can all agree Kepler falls short now; one particular title, Doom, shows this more than anything, where a 780 Ti can only just get you 60 fps at 1080p on ultra, while something like a 970, which is on par computationally speaking, delivers way more than that. It may be difficult to discern the reason for the loss in performance over time, but it's evident that most of it comes from a lack of driver optimization. Not saying it's new, but it's more pronounced here.
They don't show additional improvement because that's not how their driver works at a fundamental level. It's already very efficient at what it does. The benefits they apply are on a game-by-game basis, making each game run as it should.

AMD has a "problem" in that they don't have the same kind of driver and rely more on hardware to make their product work. So specifically having to tell a game what to do is where AMD get their gains. You are almost guaranteed not to be able to use 100% of the performance because of it.

Again, a GTX 970 can do so much more than a 780 Ti could computationally. They are absolutely not "on par". There are no optimizations to be done, since the end result is not worth it.
Have you tried developing with OpenCL/CUDA on Kepler versus Maxwell? The difference is night and day. As soon as compute is put on the GPU, Kepler just gives up (especially with OpenCL, it's just trash).
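Just to illustrate the kind of thing I mean, here is a minimal CUDA micro-benchmark sketch one could build with nvcc, run on both a Kepler card (say, a 780 Ti) and a Maxwell card (say, a 970), and compare the printed times. The kernel name, launch sizes and iteration count are made-up placeholders for illustration, not from any real test:

// Hypothetical compute micro-benchmark: run the same binary on a Kepler
// and a Maxwell card and compare the reported kernel times.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void fma_loop(float *out, int iters)
{
    int idx = blockIdx.x * blockDim.x + threadIdx.x;
    float a = 1.0001f, b = 0.9999f;
    // A long chain of fused multiply-adds keeps the ALUs busy so the
    // per-architecture compute throughput difference becomes visible.
    for (int i = 0; i < iters; ++i)
        a = a * b + 1.0f;
    out[idx] = a; // store the result so the compiler can't remove the loop
}

int main()
{
    const int threads = 256, blocks = 4096, iters = 100000; // illustrative sizes
    float *d_out;
    cudaMalloc(&d_out, threads * blocks * sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    fma_loop<<<blocks, threads>>>(d_out, iters); // warm-up launch
    cudaEventRecord(start);
    fma_loop<<<blocks, threads>>>(d_out, iters); // timed launch
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("kernel time: %.2f ms\n", ms);

    cudaFree(d_out);
    return 0;
}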
 
Again, a GTX 970 can do so much more than a 780 Ti could computationally. They are absolutely not "on par". There are no optimizations to be done, since the end result is not worth it.
Have you tried developing with OpenCL/CUDA on Kepler versus Maxwell? The difference is night and day. As soon as compute is put on the GPU, Kepler just gives up (especially with OpenCL, it's just trash).

I have to agree with this; even in mining the cards have been a joke. I mean, the 780 Ti pulls numbers similar to the 960, which wasn't exactly a good card.
 
Again, a GTX 970 can do so much more than a 780 Ti could computationally. They are absolutely not "on par". There are no optimizations to be done, since the end result is not worth it.
Have you tried developing with OpenCL/CUDA on Kepler versus Maxwell? The difference is night and day. As soon as compute is put on the GPU, Kepler just gives up (especially with OpenCL, it's just trash).

We seem to be talking about different things. I know Kepler falls short when it comes down to GPGPU, but that's not what I have been referring to. Kepler, as with anything Nvidia has made, works with no drawbacks for DirectX/OpenGL; as a matter of fact, on OpenGL Nvidia is known to consistently outperform AMD. That's why I gave you the example of Doom, which runs on OpenGL, yet the performance is not what it should have been with something like a 780 Ti. Poor CUDA and OpenCL performance is due to the lack of hardware scheduling, but that doesn't matter remotely as much for games as some people might be led to believe.

I am referring strictly to gaming, and the lack of driver optimization is evident to me. For GPGPU, yes, Kepler is inherently slower, but again that has hardly anything to do with the drivers.
 
We seem to be talking about different things. I know Kepler falls short when it comes down to GPGPU, but that's not what I have been referring to. Kepler, as with anything Nvidia has made, works with no drawbacks for DirectX/OpenGL; as a matter of fact, on OpenGL Nvidia is known to consistently outperform AMD. That's why I gave you the example of Doom, which runs on OpenGL, yet the performance is not what it should have been with something like a 780 Ti.

I am referring strictly to gaming, and the lack of driver optimization is evident to me. For GPGPU, yes, Kepler is inherently slower, but again that has hardly anything to do with the drivers.
OpenCL is used in all OpenGL-based games. Again, a feature Kepler is rubbish at.

If you need to run anything compute-related on the GPU, Kepler has a natural disadvantage.
 
Kinda strange to see AMD actually doing worse in the GPU market than they are in the CPU market lately. Their GPU lineup desperately needs something on par with Ryzen; otherwise Nvidia will be content to sit back complacently, like Intel.
 
Kinda strange to see AMD actually doing worse in the GPU market than they are in the CPU market lately. Their GPU lineup desperately needs something on par with Ryzen; otherwise Nvidia will be content to sit back complacently, like Intel.

It might get worse; cryptocurrency is massively screwing up GPU pricing and availability. Vega could barely make it to the average Joe who just wants a very fast card for gaming, and that's not going to help their reputation. Another reason why AMD is stalling: they would be trading relevance and reputation for sales. Mining won't last forever, and right now they need the reputation.
 
It might get worse; cryptocurrency is massively screwing up GPU pricing and availability. Vega could barely make it to the average Joe who just wants a very fast card for gaming, and that's not going to help their reputation. Another reason why AMD is stalling: they would be trading relevance and reputation for sales. Mining won't last forever, and right now they need the reputation.
Reminds me of that similar toy-money craziness a few years ago.

I guess there's no need to upgrade from my 970 yet.
 