
NVIDIA Releases First Internal Performance Benchmarks for RTX 2080 Graphics Card

2020: Nvidia Ampere at 7 nm (3000 series), with second-generation ray tracing,

vs Intel Arctic Sound,
vs AMD Navi.

Unfortunately, there is no competition in 2018 and 2019.

Vega 20 at its best won't match 2080 Ti performance by any means.

Nvidia has a market monopoly until 2020.
 
Why are they comparing the 2080 to the 1080, when the 2080 is essentially the 1080 Ti's replacement and the 2080 Ti is the Titan's replacement? They should be comparing the 2070 to the 1080. Sure, it'll still be better, but nowhere near as much as they are claiming. It's a cynical attempt to fool people with absurdly priced cards with features most people don't need. Who the hell needs Tensor cores for gaming? That should be a Ti and Quadro feature only. Even the ray tracing is a gimmick at this stage and should also be exclusive to the Ti. The lower models could have been much smaller chips without RTX and Tensor cores, and much cheaper: a GTX 2080 and 2070 to directly replace the 1080 and 1070, with the RTX 2080 Ti replacing the 1080 Ti and Titan. AMD can make a killing if they get the 7nm update's pricing and performance right.


Well, the tensor cores are there to accelerate anti-aliasing; that's why the DLSS mode is so much faster. They use the AI engine to figure out where to apply AA and where not to, thus increasing performance. The tensor cores are also used to denoise the ray-traced images. If FPS is all you care about, how about getting 2x Vega 64 and calling it a day?
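To make that concrete: the rough idea (my own toy sketch in Python, not anything NVIDIA has published) is that you shade far fewer pixels and let a post-pass, which runs on the tensor cores in the real thing, rebuild the full-resolution frame. Plain pixel duplication stands in for the network here:

```python
# Toy illustration of the idea behind DLSS-style reconstruction (not NVIDIA code):
# shade fewer pixels, then let a post-process pass rebuild the full-res frame.
# Nearest-neighbor duplication stands in for the tensor-core network.
import numpy as np

def render(width, height):
    """Pretend renderer: cost is proportional to the number of shaded pixels."""
    y, x = np.mgrid[0:height, 0:width]
    return np.sin(x * 0.1) * np.cos(y * 0.1)   # fake shading

def upscale2x(img):
    """Cheap 2x upscale, standing in for the AI reconstruction pass."""
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

native = render(3840, 2160)            # shade ~8.3M pixels
dlss_in = render(1920, 1080)           # shade only ~2.1M pixels (4x fewer)
dlss_out = upscale2x(dlss_in)          # reconstruct to 3840x2160

print(native.shape, dlss_out.shape)    # both (2160, 3840)
```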
 
BTW, why would anyone need any kind of AA on 4K displays unless they're >40-inch TVs?

Might be setting the groundwork for BFGD.

The PR releases and the crap slides combined remind me of AMD publicity, complete with everyone’s reactions! :laugh:

I can't see RTX without thinking AMD.
 
When are reviewers getting cards?
 
Why are they comparing the 2080 to the 1080, when the 2080 is essentially the 1080 Ti's replacement and the 2080 Ti is the Titan's replacement? They should be comparing the 2070 to the 1080. Sure, it'll still be better, but nowhere near as much as they are claiming. It's a cynical attempt to fool people with absurdly priced cards with features most people don't need. Who the hell needs Tensor cores for gaming? That should be a Ti and Quadro feature only. Even the ray tracing is a gimmick at this stage and should also be exclusive to the Ti. The lower models could have been much smaller chips without RTX and Tensor cores, and much cheaper: a GTX 2080 and 2070 to directly replace the 1080 and 1070, with the RTX 2080 Ti replacing the 1080 Ti and Titan. AMD can make a killing if they get the 7nm update's pricing and performance right.
Because the 2080 is NOT the 1080 Ti's replacement, that's why. It is the 1080's replacement.

Don't get confused just because a model one tier down usually outperforms the previous generation's model one tier higher. Models always replace the exact same model from the prior generation, just one or more tiers better in performance. That is simply part of improvement; otherwise there would be no point in ever making new models.
 
When are reviewers getting cards?

They got them at the show (according to YouTubers), and others are being sent out. The leaked slide deck has an embargo date of September 14, so four weeks from presentation to reviews.
 
From what I understood from the reveal, Nvidia will process each game in their labs for the deep learning/AI training. The resulting DLSS profile for each title will then be delivered through driver updates. I could be wrong, though.
Now that you mention it, I think I remember something like that from the press conference or an article. Maybe someone has to play the game nonstop (LEL), or maybe the assets are analyzed, etc., so it can develop a way to smooth the jaggies in different scenarios (rough sketch of that idea below).

Also, more info:
[Image: NVIDIA TU102 GPU block diagram]
[Image: NVIDIA Turing vs Pascal shader performance]
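Just to make the speculation above concrete, here is a minimal, entirely hypothetical Python/PyTorch sketch of what "training a per-game DLSS profile" could look like: a small network learns to map aliased frames to supersampled ground-truth frames, and the resulting weights are what would ship through a driver update. Every name and detail here is an assumption, not NVIDIA's actual pipeline.

```python
# Hypothetical sketch of a per-title training loop (NVIDIA has not published
# DLSS internals; all names and shapes here are made up for illustration).
import torch
import torch.nn as nn

# Tiny stand-in for the reconstruction network shipped per game via drivers.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()

for step in range(100):
    # Random tensors stand in for captured frames: aliased input vs a
    # heavily supersampled "ground truth" target of the same scene.
    low_quality = torch.rand(4, 3, 64, 64)
    ground_truth = torch.rand(4, 3, 64, 64)
    pred = model(low_quality)
    loss = loss_fn(pred, ground_truth)
    opt.zero_grad()
    loss.backward()
    opt.step()

# torch.save(model.state_dict(), "game_title_profile.pt")  # shipped via driver update
```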
 
One thing both Nvidia and AMD have taught me over two decades of watching their presentations: if the slides don't show a stable +75% or more (vs the previous gen) in their cherry-picked titles, then real-world gains will be sub-30%. Which would be fine (although questionable, given the huge time gap from the previous gen), but why the hell a +50% price increase? What or who can justify that? Those "RTX ON" 30 fps Full HD gameplays?
 
It looks like DLSS allows all the AA and upscaling to be moved to the tensor units, giving the regular shader units room to breathe (or even allowing rendering at a lower resolution eventually). This can explain such a big performance difference, since 4x MSAA can cost you half your FPS.
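Quick back-of-the-envelope on that, with invented numbers just to show the shape of the trade-off (the 1.5 ms reconstruction cost is a pure guess):

```python
# Illustrative frame-time math for "4x MSAA can cost half your FPS".
# All numbers are invented; real costs vary per game and GPU.
base_ms = 10.0                                   # 10 ms/frame = 100 FPS, no AA
msaa_ms = base_ms * 2.0                          # worst case: 4x MSAA doubles frame time
print(1000 / base_ms, 1000 / msaa_ms)            # 100.0 FPS -> 50.0 FPS

# DLSS-style approach: shade at a lower internal resolution, then pay a
# (guessed) fixed reconstruction cost on the tensor cores instead.
shaded_fraction = (1920 * 1080) / (2560 * 1440)  # 0.5625 of the pixels
dlss_ms = base_ms * shaded_fraction + 1.5        # 7.125 ms/frame
print(1000 / dlss_ms)                            # ~140 FPS
```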
 
It's normalized, for Christ's sake. That's why it's the same for all. God, if someone who is already skeptical doesn't know how to interpret these charts, I can only imagine what sort of effect they have on less "tech savvy" individuals.
What does normalized mean?
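It just means every card's score is divided by the baseline card's score, so the chart shows relative performance ("1.5x") instead of raw FPS. A tiny example with made-up numbers:

```python
# "Normalized" = every result divided by the baseline, so the baseline
# card reads 1.0 and everything else is a multiple of it.
# FPS values are invented purely for illustration.
fps = {"GTX 1080": 60.0, "RTX 2080": 88.0}

baseline = fps["GTX 1080"]
normalized = {gpu: round(value / baseline, 2) for gpu, value in fps.items()}

print(normalized)  # {'GTX 1080': 1.0, 'RTX 2080': 1.47} -> "1.47x" on the slide
```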
 
You know, it's sad: these new techniques might have been met with positive interest if it weren't for the offensive pricing.
 
@W1zzard What date will your RTX reviews be published?
 
We want actual numbers and FPS, not PowerPoint graphs. XD
 
Ha... Nvidia has mastered the trick of emptying the wallets of tech victims. Great news for technology and for us. So, AMD is almost gone from the market, Nvidia is doing crazy things with pricing, and everybody seems happy. Things are going great.
 
Has anyone seen the JayzTwoCents video about the new RTX cards?


Found it kind of interesting, and it seemed to make a little sense, price hikes and all. Just wondered what your opinions are. :)
Although I do find Jay to be entertaining, the thing to keep in mind is Jay is somewhat Nvidia biased. He even said last year he would no longer accept AMD review samples. So, just take what he says with “two cents” worth of salt.
 
Although I do find Jay to be entertaining, the thing to keep in mind is Jay is somewhat Nvidia biased. He even said last year he would no longer accept AMD review samples. So, just take what he says with “two cents” worth of salt.

A dead giveaway is also the fact that he feels the need to point out that he isn't biased.
 
Although I do find Jay to be entertaining, the thing to keep in mind is Jay is somewhat Nvidia biased. He even said last year he would no longer accept AMD review samples. So, just take what he says with “two cents” worth of salt.

It's gotten worse. I used to find him entertaining as well, but then he started slyly complaining about the free hardware he gets for his custom builds. He comes across as an entitled child.

One of his recent videos had another tech YouTuber (not one of the ones that hang around each other) disagreeing with him in the comments, and it was the most upvoted. It's no longer there; it looks like most of the comments that agreed with the other YouTuber got deleted.
 
Extremely expensive ray-traced AA that's barely as good as regular SSAA. Worth avoiding.

It's anything but expensive. It's a very fast AI-based AA method running through the tensor cores. Whether it's any good from an image-quality perspective, that is the real question. It works well in a scripted demo, but will it work well in real-world games?
 
"Unfortunately, due to multiple non-disclosure agreements we signed, we can only tell you that Shadow of the Tomb Raider looks stunning with ray tracing turned on. "

Where have I heard this one before?!? Oh yeah... remember that NDA that said reviewers can only write things that benefit NVIDIA?

So you get to release an article a month before everybody else to get the maximum amount of traffic to your site ($$$$$), and the only thing you have to do is get your article approved by NVIDIA. This is pretty much what people said would happen, and there you have it.

Wanna bet that other review sites don't want that early release traffic ($$$$$) on their site? gtfooh
 