
NVIDIA Releases First Internal Performance Benchmarks for RTX 2080 Graphics Card

old [----]
new [-------]

any questions?

I bet a 2080 Ti that some will still get confused even with this level of simplification :laugh::laugh::laugh:
 
Unimpressed with the price; that's enough to really put me off. Maybe the prices are so high because of mining, or is it because they've gone from GTX to RTX? Or maybe the bosses who thought that up needed a bigger bonus or something...

Either way, I'll stick with my first word of the post: unimpressed.
 
Wth is DLSS ????????????
 
300W+ under load, I bet; throw in overclocking and you might scratch 350-400W.
That puts it about in line with a decent 1080 Ti, and I'd be willing to bet the 2080 Ti just edges out the Titan Xp Service Pack 3 or whatever we're on at this point. Pre-launch, it seems like RTX is setting itself up for failure and for becoming the line of cards people say isn't gaming-oriented. I'm personally very interested in the fact that the cards come with tensor cores, and ignoring the ray tracing crap, I think the 2070 and 2080 may become go-to cards for AI hobbyists and devs in the future.
 
Let's wait for some figures comparing the 2080 vs 1080 without all the tensor/RT features. That more 'raw' comparison is what will really show the performance of the new generation, not cherry-picked FPS scores that show off the tech in the new cards. Looking at the FLOPS of the 20 series, they're not far beyond the 10 series, so even with the boosted memory bandwidth we may see only 20% to 30% gains over the outgoing generation.

GTX 1080 8227.8 GFLOPS vs RTX 2080 8920.3 GFLOPS at base clocks.
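For what it's worth, those figures line up with the usual back-of-the-envelope peak-FP32 math (shader count x 2 FLOPs per clock x clock speed). A quick sketch using the published core counts and base clocks:

```python
# Rough peak FP32 throughput: CUDA cores x 2 FLOPs per clock (FMA) x base clock (GHz)
def peak_gflops(cuda_cores, base_clock_ghz):
    return cuda_cores * 2 * base_clock_ghz

gtx_1080 = peak_gflops(2560, 1.607)   # ~8227.8 GFLOPS
rtx_2080 = peak_gflops(2944, 1.515)   # ~8920.3 GFLOPS
print(f"GTX 1080: {gtx_1080:.1f} GFLOPS, RTX 2080: {rtx_2080:.1f} GFLOPS")
print(f"Raw FP32 uplift at base clocks: {rtx_2080 / gtx_1080 - 1:.1%}")  # roughly +8%
```

On that math alone the raw bump is only about 8%, so most of any real-world gain has to come from the memory bandwidth and architectural changes, which is exactly why the non-DLSS bars matter.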
 
Wth is DLSS ????????????

It stands for Deep Learning Super Sampling, it takes the idea of a simple graph and tries to understand why some people don't get it.

Joking aside, it's a new AA technique that uses those shiny new tensor cores to create a better image; how it pans out in practice remains to be seen.
 
It stands for Deep Learning Super Sampling, it takes the idea of a simple graph and tries to understand why some people don't get it.

Joking aside, it's a new AA technique that uses those shiny new tensor cores to create a better image; how it pans out in practice remains to be seen.
Thanks for the update. I'm still confused about how you gain that huge performance increase just by enabling it?
 
Not looking good. Focus on the dark green bars (minus DLSS, why do you need AA at 4K?). Consider that not a lot of these are current AAA titles, and that you're comparing a 215W TDP chip to a 180W TDP chip (GTX 1080), or 19% higher TDP right off the bat. Certainly doesn't warrant these prices.
 
BTW, why would anyone need any kind of AA on 4K displays unless they're >40-inch TVs?
 
Not looking good. Focus on the dark green bars (minus DLSS). Consider that not a lot of these are current AAA titles, and that you're comparing a 215W TDP chip to a 180W TDP chip (GTX 1080), or 19% higher TDP right off the bat. Certainly doesn't warrant these prices.

Exactly my thought.

Cherry-picked titles with the shiny DLSS bar on top of the actual dark green bar to divert public attention... 19% higher TDP and 40-50% higher MSRP for an average 30% (dark green bar only) performance increase; not worth it.

Thanks for the update. I'm still confused about how you gain that huge performance increase just by enabling it?

As far as I understand it (at a very basic level), DLSS (a type of AA) involves heavy matrix operations, and these tensor cores are specialised in running them as fast as possible.
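To make that a bit more concrete: the one operation a tensor core is built for is a small fused matrix multiply-accumulate on half-precision tiles. A toy NumPy sketch of that operation (sizes and data are made up; it's only meant to show the kind of math DLSS leans on):

```python
import numpy as np

# Tensor cores essentially compute D = A @ B + C on small half-precision tiles
# (4x4 on Turing), with FP32 accumulation, millions of times per frame.
A = np.random.rand(4, 4).astype(np.float16)   # FP16 inputs
B = np.random.rand(4, 4).astype(np.float16)
C = np.zeros((4, 4), dtype=np.float32)        # FP32 accumulator

D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D)
```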
 
BTW, why would anyone need any kind of AA on 4K displays unless they're >40-inch TVs?
No clue. I haven't used AA in a long while. Not at 1080p on a 22" screen, not at 1440x900 on this 19" TV, not at 1680x1050 on a 20" screen. In fact, the only time I've thought something could use some AA is when playing PS2 games... on an actual PS2. In PCSX2, just upping the resolution (even 3x native, which is close to 1920x1080) looks really good. Some of the lines on the actual PS2, though, kinda look like, well, this:

[attached screenshot]
 
Wth is DLSS ????????????
Imagine a hardware part dedicated to super sampling (I think this was being imagined many years ago, like free 4x AA with no performance hit), but like they say, it has yet to be seen in action, and we'll see how much of a difference it makes on the 2080 Ti, 2080 and 2070.

Them tensor cores are deep-learning AI processors, and they try to predict the ideal smooth image per frame using their knowledge and an algorithm (probably set up by the devs of the game), so it's like they're trying to fill in the pixels and eliminate jaggies based on what they understand to look smooth.
 
At higher resolutions, SMAA injected with SweetFX is the best AA tech ever.
 
The PR releases and the crap slides combined remind me of AMD publicity, complete with everyone’s reactions! :laugh:
 
These "benchmarking results" look very shady to me. There's no mention of what settings are used, there's no mention what the baseline on the graphs is... and the titles themselves look incredibly dodgy to me. They're either "HDR-tested" whereas we know Nvidia GPUs can have trouble with HDR and performance losses, are games or applications that aren't out yet, or have unclear testing methodology (like dx11 or dx12 was used for hitman?), Ark Survival is on the graph on the left, but we don't have FPS numbers for the game which we could compare with existing 10xx line benchmarks. We don't have PUBG numbers either. Not a single reliable information.

I have no doubt it'll be better, since I doubt the performance regressed, but these "internal benchmarks" are all fishy to me.
 
All internal benchmarks are fishy. Just wait a month until reviewers get their hands on it, or maybe for some leaks.
 
Considering the cards' prices, if the 2070 is not faster than or at least at the same perf level as the 1080 Ti, I will consider those cards a complete failure from nVidia.
 
Thanks for the update. I'm still confused about how you gain that huge performance increase just by enabling it?

I don't see that anyone took a stab at answering that. If I understand correctly, the benchmarks were run with some unspecified level of AA, and that AA was then replaced by DLSS running on the "other" cores, freeing up previously used resources to render more frames.

So, the benchmarks were run at probably a very high/expensive level of AA + 4K, something you wouldn't normally ask a 1080FE to do ...
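A crude way to see why that inflates the FPS bars; the millisecond figures below are purely hypothetical, just to illustrate the frame-budget argument:

```python
# Purely hypothetical 4K frame budget, just to illustrate the argument
shading_ms  = 22.0   # base rendering work on the shader cores
heavy_aa_ms = 6.0    # an expensive conventional AA pass, also on the shader cores

fps_with_aa = 1000 / (shading_ms + heavy_aa_ms)    # ~35.7 FPS
# If DLSS handles the AA on the tensor cores instead, the shader pipeline
# only pays for the base rendering work:
fps_with_dlss = 1000 / shading_ms                  # ~45.5 FPS

print(f"{fps_with_aa:.1f} FPS -> {fps_with_dlss:.1f} FPS "
      f"(+{fps_with_dlss / fps_with_aa - 1:.0%}) without the GPU getting any faster")
```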
 
Considering the cards' prices, if the 2070 is not faster than or at least at the same perf level as the 1080 Ti, I will consider those cards a complete failure from nVidia.
Yeah, x70 cards are normally faster than the previous high end in raw performance, but they're saying the 2070 is faster than the old Titan in RTX or other metrics rather than in raw performance.
 
I don't see that anyone took a stab at answering that. If I understand correctly, the benchmarks were run with some unspecified level of AA, and that AA was then replaced by DLSS running on the "other" cores, freeing up previously used resources to render more frames.

So, the benchmarks were run at probably a very high/expensive level of AA + 4K, something you wouldn't normally ask a 1080FE to do ...
8x MSAA then blur fest for 20 series, sounds about right for nvidia marketing
 
should have released these slides on day 1 itself...
there wouldn't have been so much negativity/uncertainty then...
 
Why are they comparing the 2080 to the 1080, when the 2080 is essentially the 1080 Ti replacement and the 2080 Ti is the Titan replacement? They should be comparing the 2070 to the 1080. Sure, it'll still be better, but nowhere near as much as they're claiming. It's a cynical attempt to fool people with absurdly priced cards carrying features most people don't need. Who the hell needs tensor cores for gaming? That should be a Ti and Quadro feature only. Even the ray tracing is a gimmick at this stage and should also be exclusive to the Ti. The lower models could have been much smaller chips without RTX and tensor cores, and much cheaper: GTX 2080 and 2070 to directly replace the 1080 and 1070, and an RTX 2080 Ti to replace the 1080 Ti and Titan. AMD can make a killing if they get the 7nm update's pricing and performance right.
 
Them tensor cores are deep-learning AI processors, and they try to predict the ideal smooth image per frame using their knowledge and an algorithm (probably set up by the devs of the game), so it's like they're trying to fill in the pixels and eliminate jaggies based on what they understand to look smooth.
From what I understood from the reveal, Nvidia will process each game in their labs for the deep learning/AI training. The resulting DLSS profile for each title will then be delivered through driver updates. I could be wrong, though.
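If that's how it works, the split would look roughly like the sketch below: fit a per-title model offline against super-sampled reference frames, ship the resulting weights in a driver update, and only run inference at play time. Everything here is a heavily simplified stand-in (a one-parameter "model" instead of a deep network) purely to illustrate that offline-training / runtime-inference split, not anything Nvidia has published:

```python
import numpy as np

rng = np.random.default_rng(0)
rendered  = rng.random((8, 64, 64))       # stand-in for normally rendered frames
reference = 0.9 * rendered + 0.05         # stand-in for super-sampled "ground truth"

# "Training" in Nvidia's lab: fit a trivial one-parameter model that maps the
# rendered frames onto the reference frames (real DLSS trains a deep network).
w = np.vdot(rendered, reference) / np.vdot(rendered, rendered)
dlss_profile = {"weights": w}             # what a driver update would then carry

def apply_dlss(frame, profile):
    # At play time: inference only (on the tensor cores in the real thing)
    return profile["weights"] * frame

enhanced = apply_dlss(rendered[0], dlss_profile)
print(enhanced.shape, dlss_profile["weights"])
```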
 