
NVIDIA to Unveil "Ampere" Based GeForce Product Next Month

Makes more sense for sure, it's been a year since the GTX 1080 Ti was unveiled after all.

Who knows, maybe time and Nvidia don't wait for the competition to show up.

They don't. Nvidia has seen what can happen to companies that slow down progress. Public opinion shifts rapidly and you may end up digging a hole you can never work your way out of (cue ATI/AMD, or any of the old GPU brands now gone).

Besides, the demand for a powerful 4K card is growing, so it'd be crazy not to cater to that because it represents the highest margin segment of the whole stack. And to do that, you need to move the whole stack forward. Another argument: Nvidia has a proven successful release and business model right now, why change it?
 
What I'm waiting for is passable 4K performance from an x60 card. The next gen won't do that, so I'm not really waiting for anything this round.
Thing is, this is a white whale that will be chased to the ends of the Earth but never caught.

As time goes on, games will only get more demanding at every resolution, while GPUs are likely to keep falling short of a consistently smooth 60 fps experience at 4K.

So while you are looking two generations ahead for this to happen, games will also be advancing at that rate, as they always have.
 
Personally I don't really care about mid-end cards. x70 is the bare minimum for me. Probably going x80 tho.

Yeah, if everyone was after the same thing, we'd only get one SKU per generation ;)
Thing is, this is a white whale that will be chased to the ends of the Earth but never caught.

As time goes on, games will only get more demanding at every resolution, while GPUs are likely to keep falling short of a consistently smooth 60 fps experience at 4K.

So while you are looking two generations ahead for this to happen, games will also be advancing at that rate, as they always have.

I've been gaming since the days of CGA graphics. And whether I was waiting for 640x480, 1024x768, 1280x1024 or FHD, at some point there was always an affordable mid range card to do the job.
 
Yeah, if everyone was after the same thing, we'd only get one SKU per generation ;)


I've been gaming since the days of CGA graphics. And whether I was waiting for 640x480, 1024x768, 1280x1024 or FHD, at some point there was always an affordable mid range card to do the job.
But not at the top resolution. Even top-end cards cannot achieve the 4K 60 fps nirvana in every game (with settings as the game makers intended them to be seen). Every new gen, people think that will happen.

If we cannot even achieve that, because new, more demanding games are continually released, how is a mid-range card going to achieve it? Heck, mid-range cards haven't even always been able to do full HD (1080p) at 60 fps with all the visuals on.
 
Yeah, if everyone was after the same thing, we'd only get one SKU per generation ;)

But it seems kinda strange that you only want to pay for a mid-end card, yet expect it to run at a very high resolution. Even the 1080 Ti struggles with 2160p at 60 fps in many demanding games.

If you can only afford mid-end, then maybe you should settle for 1440p?
 
But not at the top resolution. Even top-end cards cannot achieve the 4K 60 fps nirvana in every game (with settings as the game makers intended them to be seen). Every new gen, people think that will happen.

If we cannot even achieve that, because new, more demanding games are continually released, how is a mid-range card going to achieve it? Heck, mid-range cards haven't even always been able to do full HD (1080p) at 60 fps with all the visuals on.
Depends on how you look at it.
"All the visuals on" usually mean "enable something devs threw in there knowing fully well it was over the top". Some options can be taken down a notch or two with minimal or no visual impact. Some options give you +10% or more fps when taken down a notch. I've always bought mid-range and always managed to get anything working just fine, there's no need to show me it can't be done ;)
But it seems kinda strange that you only want to pay for a mid-end card, yet expect it to run at a very high resolution. Even the 1080 Ti struggles with 2160p at 60 fps in many demanding games.

If you can only afford mid-end, then maybe you should settle for 1080p-1440p.
Like I said, I don't expect that now. I also said I expect passable performance. High-end is there for a reason. It's just above what I'm willing to spend on a video card.
Also, 60 fps is more of an arbitrary threshold these days. People say it's not enough for fast shooters (which I haven't played in a while). At the same time, slower-paced titles are fine with <60 fps and adaptive V-Sync.
 
@bug 60 fps is enough, and more is better. It's rather similar to resolution in that regard, I think.
 
@bug 60 fps is enough, and more is better. It's rather similar to resolution in that regard, I think.
I've never been hindered by 60fps, but people that play more competitively than I do seem to think otherwise. Since I've never been in their shoes, I can't refute their claims.
 
I've never been hindered by 60fps, but people that play more competitively than I do seem to think otherwise. Since I've never been in their shoes, I can't refute their claims.

I do play competitively, but the niche for which higher than 60 fps leads to a noticeable (documented!) increase in scores or placements is smaller than the percentage of gamers on 4K right now. For most it's a placebo in that regard, but the additional smoothness is something most people do notice. And, as with resolution, the higher you go, the smaller the added benefit.

99% of all games have something in the pipeline that updates slower than 60 times per second anyway, most notably server tick rates, so anyone who claims otherwise, well... the facts say something else :)
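
To put rough numbers on that, here's a quick back-of-envelope sketch in Python (the frame rates and server tick rates below are just commonly cited examples assumed for illustration, not figures from any specific game):

```python
# Back-of-envelope: client frame intervals vs. typical server tick intervals.
# The tick rates listed are commonly cited examples (an assumption for illustration),
# not measurements from any particular game.

def interval_ms(rate_hz: float) -> float:
    """Time between updates, in milliseconds, for a given update rate."""
    return 1000.0 / rate_hz

client_fps = [60, 100, 144]             # common display/target frame rates
server_tick_rates = [20, 30, 64, 128]   # Hz, example multiplayer server tick rates

for fps in client_fps:
    print(f"{fps:>3} fps     -> new frame every {interval_ms(fps):5.1f} ms")

for tick in server_tick_rates:
    print(f"{tick:>3} Hz tick -> new world state every {interval_ms(tick):5.1f} ms")

# If the server only sends fresh state every ~8-50 ms, rendering above that rate
# mostly redraws interpolated/predicted data rather than genuinely newer information.
```

The point isn't that high frame rates are useless, just that past the server's update rate the extra frames smooth the presentation rather than deliver new game state.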
 
I've always bought mid-range and always managed to get anything working just fine, there's no need to show me it can't be done ;)

You are right, but the difference between 1920x1200 and 3840x2160 is pretty huge in terms of GPU power requirements.
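
Just to put a rough number on that (simple pixel-count arithmetic; actual GPU load doesn't scale perfectly linearly with pixels, so treat this as a first-order estimate):

```python
# Rough pixel-count comparison between the resolutions discussed in this thread.
# GPU load does not scale exactly linearly with pixel count, so this is only a
# first-order estimate of the extra work per frame.

resolutions = {
    "1920x1200": (1920, 1200),
    "2560x1440": (2560, 1440),
    "3840x2160": (3840, 2160),
}

base_pixels = 1920 * 1200  # reference point used above

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>10,} pixels ({pixels / base_pixels:.1f}x of 1920x1200)")

# 3840x2160 pushes ~3.6x the pixels of 1920x1200 every frame, which is why asking
# the same GPU tier to jump from 1200p to 2160p is such a big step.
```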

60 fps is fine for most games, but more is better and I prefer 100+ in shooters

It will probably take ~5 years before x60 will do 2160p @ 60 fps in demanding games without hitching (and without getting dated too fast)

I still see 1440p as the sweet spot between performance and IQ
 
Depends on how you look at it.
"All the visuals on" usually mean "enable something devs threw in there knowing fully well it was over the top". Some options can be taken down a notch or two with minimal or no visual impact. Some options give you +10% or more fps when taken down a notch. I've always bought mid-range and always managed to get anything working just fine, there's no need to show me it can't be done ;)

Like I said, I don't expect that now. I also said I expect passable performance. High-end is there for a reason. It's just above what I'm willing to spend on a video card.
Also 60fps is more of an arbitrary threshold these days. People say it's not enough for fast shooters (which I haven't played in a while). At the same time slower paced titles are ok with <60fps and adaptive V-Sync.
Fair enough! :) I threw in 60 fps because, for the majority, that is where it at least has to be, or it's “zomg! What's wrong with my performance!”

I will also take 45 or 50 if it's smooth, as long as I can get the visuals that were programmed. I say “that were programmed” because I'm not a visuals whore; gameplay and story are most important for me. I just believe in using all the visuals when they are provided.
 
You are right, but the difference between 1920x1200 and 3840x2160 is pretty huge in terms of GPU power requirements.

60 fps is fine for most games, but more is better and I prefer 100+ in shooters

It will probably take ~5 years before x60 will do 2160p @ 60 fps in demanding games without hitching (and without getting dated too fast)

I still see 1440p as the sweet spot between performance and IQ
Yeah, 1440p is something I'd like to skip because of my photo processing. But it would be a sensible intermediate step for pretty much anyone else.
 
They don't. Nvidia has seen what can happen to companies that slow down progress. Public opinion shifts rapidly and you may end up digging a hole you can never work your way out of (cue ATI/AMD, or any of the old GPU brands now gone).

Besides, the demand for a powerful 4K card is growing, so it'd be crazy not to cater to that because it represents the highest margin segment of the whole stack. And to do that, you need to move the whole stack forward. Another argument: Nvidia has a proven successful release and business model right now, why change it?

And now that the old 16 nm stock is sold out to miners, they can build new cards on TSMC's 12 nm FFN/FFC manufacturing process with new GDDR6 memory, without major shortages of crucial components or manufacturing capacity. Which made me wonder what kind of memory they will use on the GTX xx70 and below; if it's GDDR5, they might stumble over short supply. GDDR5X, or better yet slower GDDR6, would probably be in better supply.
 
Actually, RTG is doing great. They managed to increase their GPU shipment market share by 8% quarter to quarter, and this probably doesn't even include the Raven Ridge APUs ... It appears their GPUs are efficient enough for the people buying them, but you are welcome to prove me wrong.
First of all, get out of your head the idea that someone has to prove something to you. I told you once and I will tell you again, this is a childish way of thinking. We are simply discussing tech news here; we can give our opinions, but in the end none of us can pretend to hold the truth. That being said, let's move on to the real deal.

You say RTG is doing great (AMD in this case) because they managed to increase their market share by 8% quarter to quarter (in graphics chips, which includes integrated graphics); fair enough, you have the right to think so. On my side, I believe that reading numbers is good, but understanding what they mean is even better.

Why do I say this? Because you can't judge the financial health of an entity simply by comparing quarter-to-quarter results. It's like trying to predict a stock based on one day's results; it simply doesn't work. When you look at AMD's market share from last year, to get a bigger picture, you can see that AMD is still in the red. Not only that, but as explained in Jon Peddie's article, this relative AMD/RTG success this quarter is mainly driven by mining. There is no need to say you don't build your business on something as volatile as mining. Gamers are a much more stable base; quote from the article: "gaming has been and will continue to be the primary driver for GPU sales," said Dr. Jon Peddie, President of Jon Peddie Research. Hence why those last-quarter results mean little to the financial health of AMD/RTG from a long-term perspective.

On top of that, as I said before, RTG is focused on discrete graphics, and when you look at discrete graphics market share https://cdn.wccftech.com/wp-content...17-Discrete-GPU-Market-Share-NVIDIA-AMD_1.png you can say anything but that RTG is doing great! Especially when it comes to gaming; there is not even a need to look at the numbers when you know that Nvidia is ready to launch a new arch while RTG's latest and greatest GPU barely manages to keep up with a close to two-year-old GTX 1080.

Now, I do believe people are here to discuss mainly Nvidia's next generation of gaming GPUs (not whether RTG is doing great or not), so this will be my last reply on that matter.
 
First of all, get out of your head the idea that someone has to prove something to you. I told you once and I will tell you again, this is a childish way of thinking. We are simply discussing tech news here; we can give our opinions, but in the end none of us can pretend to hold the truth. That being said, let's move on to the real deal.

First of all, please stop calling people "childish" when they don't agree with you.

Gamers are a much more stable base; quote from the article: "gaming has been and will continue to be the primary driver for GPU sales," said Dr. Jon Peddie, President of Jon Peddie Research. Hence why those last-quarter results mean little to the financial health of AMD/RTG from a long-term perspective.

On top of that, as I said before, RTG is focused on discrete graphics, and when you look at discrete graphics market share https://cdn.wccftech.com/wp-content...17-Discrete-GPU-Market-Share-NVIDIA-AMD_1.png you can say anything but that RTG is doing great! Especially when it comes to gaming; there is not even a need to look at the numbers when you know that Nvidia is ready to launch a new arch while RTG's latest and greatest GPU barely manages to keep up with a close to two-year-old GTX 1080.

APUs (incl. consoles) and Intel's MCM parts have GPUs too, and you can play games on them. I don't see why they should be ignored when looking at market share ...

AMD doesn't need to launch anything until demand is high enough. They can easily skip one gen or do a refresh and spend the saved money on a future product: Navi, or something that comes after it.

We will see in summer what nVidia can do with the "12 nm" process; it's only an improved "16 nm" process after all. They might create a gaming-only arch, or the performance jump will be small ...
 
First of all, please stop calling people "childish" when they don't agree with you.

You know very well I didn't call you "childish" because you don't agree with me (since I made my reasons clear), so please stop acting like you don't understand!

We will see in summer what nVidia can do with the "12 nm" process; it's only an improved "16 nm" process after all. They might create a gaming-only arch, or the performance jump will be small ...

Yes, what they call 12 nm is indeed "only" an improved 16 nm process (quote from TSMC's page: "This process maximizes die cost scaling by simultaneously incorporating optical shrink and process simplification. Furthermore, 12nm FinFET Compact Technology (12FFC) drives gate density to the maximum"), but it should nonetheless allow higher core counts and/or higher clocks. Combined with the possible introduction of an entirely new architecture, it should logically yield good results. That being said, we of course have to wait and see.
 
Interesting to see what these offer; I wonder if the midrange XX104 chip will beat the GTX 1080 Ti.
Did midrange Pascal (16 nm) beat the 980 Ti (28 nm)?
 