
NVIDIA GTC 2019 Kicks Off Later Today, New GPU Architecture Tease Expected

According to the GPU database here, the 1080 Ti and 2080 Ti launches were 1.5 years apart.

Apologies, I meant 980 Ti to 1080 Ti... that was a little over 2 years apart.

Turing is not getting replaced for 2-3 years, I promise you that.
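For anyone who wants to check those gaps themselves, here's a quick sketch. The launch dates are approximate and from memory, so treat the GPU database here as the authority on the exact days:

```python
from datetime import date

# Approximate launch dates (from memory; the GPU database has the exact days).
launches = {
    "GTX 980 Ti": date(2015, 6, 2),
    "GTX 1080 Ti": date(2017, 3, 10),
    "RTX 2080 Ti": date(2018, 9, 27),
}

# Walk consecutive launches and report each gap in years.
names = list(launches)
for older, newer in zip(names, names[1:]):
    years = (launches[newer] - launches[older]).days / 365.25
    print(f"{older} -> {newer}: {years:.1f} years")

# With these dates: roughly 1.8 years, then roughly 1.5 years.
```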
 
I too bet this conference is focused on AI and their automotive progress...

No, don't announce any more Nvidia Certified products like... Nvidia Certified Gaming chairs!!!!
 
Apologies, I meant 980 Ti to 1080 Ti... that was a little over 2 years apart.

Turing is not getting replaced for 2-3 years, I promise you that.

No need to apologize. We all make mistakes.

Nvidia may have to if Intel manages to bring some serious gaming GPUs next year, but that may just be wishful thinking.
 
They only launched Turing GeForce and Quadro parts, so it's time for a Tesla T100 to replace the V100.
 
Good point, but I disagree about big-die Kepler. They eventually did release the 780 Ti, which had more cores than the Titan and was faster than it until the Titan Black was released, but the 780 Ti wasn't good for compute.

I am of the opinion, though I can't prove it, that if Turing didn't need die space for the RT and Tensor cores, there would have been more CUDA cores and we would have seen the kind of performance increase Pascal had over Maxwell.

Absolutely correct, though technically that was the Kepler refresh, which had its own product stack with a refreshed GK104 as well (faster memory) and even included the first-gen Maxwell chips (750 (Ti)). In addition, it wasn't a given that Nvidia was going to release Big Kepler at any point when Kepler launched. The GTX 690 was the halo card, with dual GPU 104s. And even when they did launch it, they teased people into buying Titans at a major price bump. The 780 was probably only released to counter AMD in the end.

As for the second bit, yes, I think that is clear as day, though one tiny caveat here is TDP budgets. Nvidia doesn't like to exceed 250 W for the fastest part, and they only do this marginally with their FE 2080 Ti (260 W, I think?). I strongly doubt that more CUDA cores, or even the same number they could fit on a similar die size, would have been possible at this TDP.
 
lol, every goddamn time the conversation eventually needs to include the same people discussing Kepler SKUs for the hundredth time.

yaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaawn
 
Looking forward to a price tease.

But I guess not, 'cause leather jackets are expensive.
 
lol, every goddamn time the conversation eventually needs to include the same people discussing Kepler SKUs for the hundredth time.

yaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaawn

Y U No Talk About Maxwell?! Soon to be 4 yrs and the 980 Ti is still taking names.
 
Looking forward to a price tease.

But I guess not, 'cause leather jackets are expensive.
Sure thing. Maybe it's time for alligator-skin jackets :)
 
Y U No Talk About Maxwell?! Soon to be 4 yrs and the 980 Ti is still taking names.
Lol, you are here again; this thread is going down the toilet fast.
 
Lol, you are here again; this thread is going down the toilet fast.

It's an Nvidia topic. It's already shit. They can't announce any new consumer cards, because they can't get rid of Turding (plus new Turing cards are yet to be released, so this is pretty obvious).

Rumors for clicks, again.
 
Jacket Leather back at it again
 
They need more RTX games; it's been just two, and one is an MP shooter. This technology will be dead if this continues.

That's perfectly fine and rather good for the gaming market. We need ray tracing that can run on all hardware, rather than Nvidia using its market dominance to push its proprietary technology onto developers and drive out fair competition.
 
It's an Nvidia topic. It's already shit. They can't announce any new consumer cards, because they can't get rid of Turding (plus new Turing cards are yet to be released, so this is pretty obvious).

Rumors for clicks, again.
More like a cry for attention from the forgotten crowd.
 
Not to mention that the flagship $1,200 2080 Ti was dying on people shortly after launch.

That was a QC issue, and there weren't any more failed cards than in any other launch, from what anyone can tell. It's the internet echo chamber that blew that piece of "news" out of proportion.

Also, to the people concerned about the 2080 Ti cards that have Hynix memory: nothing to be worried about there, folks. It works just fine; it just doesn't universally clock as well as the Samsung GDDR6 does.
 
Well, the 280 series too was "just released" 6 months before Nvidia moved to the 285 and dropped the price on that part from $650 to $350, so the 2080 is pretty much going to see the same thing. Back then, 65 nm to 55 nm provided only a small shrink, from 576 to 476 mm²; this time, 12 nm to 7 nm would take 545 mm² down to about 220 mm². That can barely fit a 256-bit bus, so I guess they will add more cores, for a total of 4096, and bump clocks to 2.45 GHz.
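A back-of-the-envelope sketch of that shrink arithmetic; the ~2.5x density gain assumed for TSMC 12 nm to 7 nm is my own guess, not a vendor figure:

```python
# Rough die-shrink arithmetic for the figures quoted above.
# Density gains are assumptions, not vendor-published numbers.

def shrunk_area(area_mm2: float, density_gain: float) -> float:
    """Estimate die area after a shrink with the given density gain."""
    return area_mm2 / density_gain

# GT200: the 65 nm -> 55 nm optical shrink took 576 mm2 to 476 mm2,
# implying only about a 1.2x density gain:
print(f"65 -> 55 nm implied gain: {576 / 476:.2f}x")

# TU104 (RTX 2080) is 545 mm2 on TSMC 12 nm. Assuming a ~2.5x density
# gain for 12 nm -> 7 nm gives roughly the ~220 mm2 quoted above:
print(f"TU104 on 7 nm: ~{shrunk_area(545, 2.5):.0f} mm2")  # ~218 mm2
```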
 
That was a QC issue, and there weren't any more failed cards than in any other launch, from what anyone can tell. It's the internet echo chamber that blew that piece of "news" out of proportion.

Also, to the people concerned about the 2080 Ti cards that have Hynix memory: nothing to be worried about there, folks. It works just fine; it just doesn't universally clock as well as the Samsung GDDR6 does.

Even 2060s were dying. The Micron RAM is garbage. People got multiple replacements after launch with the same problem, and Nvidia just continued to deny it was a systemic problem.
 
Nvidia needs the 7 nm process more than some people think; it's not a matter of competition.
They will probably release the 7 nm 80 Ti model along with the 80 and 70 class, just like they did with Turing, because if their top-end Turing barely manages to handle RT at 1440p in 2019, it will probably only be enough for 1080p by 2020.
For Nvidia it's a matter of showing off RTX at its best, and nothing AMD-related (except for consoles, maybe?).
 
The Micron memory thing is coming back up again? Samsung-equipped cards died too, in a similar manner... come on now, let's get with it. :)

But they would work normally when the memory was downclocked... All of the reports I read were Micron.
 
GTC is a compute-oriented conference. Nvidia usually reveals something new, and a new architecture is quite possible, but consumer products at GTC are fairly rare.
 
lol, every goddamn time the conversation eventually needs to include the same people discussing Kepler SKUs for the hundredth time.

yaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaawn

Those Keplers... Well, the thing is, Kepler was the first time Nvidia went for a compute-only GPU (GK210). After that there have been two compute-heavy GPU chips (GP100, GV100), which haven't been released to consumers at all (I don't count the Titan V, with its $3,000 price tag, as a consumer card). The release gap between GP100 and GV100 was only a year (because of contracts), and GV100 is now a two-year-old chip. So it might be time for a next-gen compute-heavy card, now on 7 nm. One thing is certain, though: it won't be released right away. Nvidia usually introduces new things at GTC and releases actual products on a much later timeline (what did it take for the GV100 Teslas? GTC in May 2017, and actual products in Q4 2017/Q1 2018?).

Nvidia needs the 7 nm process more than some people think; it's not a matter of competition.
They will probably release the 7 nm 80 Ti model along with the 80 and 70 class, just like they did with Turing, because if their top-end Turing barely manages to handle RT at 1440p in 2019, it will probably only be enough for 1080p by 2020.
For Nvidia it's a matter of showing off RTX at its best, and nothing AMD-related (except for consoles, maybe?).

Well, 12 nm TSMC is a proven process for producing gigantic chips, so it might actually be cheaper for Nvidia to produce current-size Turings on that process than to move to the much denser and unproven 7 nm TSMC for a die shrink. They might actually skip plain 7 nm altogether and wait for 7 nm EUV before a full-range 7 nm GPU release. With that in mind, I very much doubt Nvidia will release any consumer 7 nm graphics card this year; the TU116 was just released and the TU117 is to be released soon. Navi might come at the end of Q2 at the earliest (rumors say October), and it probably won't be any faster than the Vega 7. And even if it turns out very competitive with its Nvidia counterparts, Nvidia has its six-month head start, and with their mindshare they can wait until Q1 2020 for 7 nm.
 