November 22nd 2018

NVIDIA GeForce RTX 2060 Shows Up in Final Fantasy XV Benchmarks

The RTX family debuted with top-of-the-line graphics cards, but the Turing era has only just started and new members will be joining those first products. One of the most anticipated is the RTX 2060, and this new graphics card has now been spotted in the Final Fantasy XV benchmark database. The information should be taken with a grain of salt, but this listing has revealed upcoming products in the past, such as the Radeon RX 590, so the evidence is quite interesting. According to this data, the RTX 2060 would perform slightly below the Radeon RX Vega 56 and the NVIDIA GeForce GTX 1070, but its numbers are considerably better than those of the GTX 1060.

NVIDIA itself has confirmed there will be a "mainstream" product in the Turing family in the future, and although the company currently seems focused on selling through its excess inventory of mid-range Pascal graphics cards (Black Friday could help there), the new GPU could be announced in the next few weeks, and some analysts expect it to be available in Q1 2019. It'll be interesting to confirm whether the data in our TPU database is correct, but we're especially curious about the price point it'll launch at.
Source: Overclock 3D
Add your own comment

121 Comments on NVIDIA GeForce RTX 2060 Shows Up in Final Fantasy XV Benchmarks

#26
Turmania
This year has been a major disappointment in GPU tech. Still, the best of the worst is NVIDIA with their RTX range. The RTX 2070, although not a world beater, is quite good in terms of performance, price, and even power usage for small form factor case users: almost GTX 1080 Ti performance with less heat and a lower price.
Posted on Reply
#27
M2B
Th3pwn3r said:
Really? By beat you mean its price-to-performance ratio is better? I guess we should all consider the most pointless card ever then.
Cheapest Custom RTX 2070:
https://m.newegg.com/products/N82E16814932091

Cheapest Custom Vega 64:
https://m.newegg.com/products/N82E16814932031

The RTX 2070 and Vega 64 are both in the same price range, while the 2070 is far more energy efficient, faster, quieter, and generally runs cooler.
And no, no one would buy the reference Vega 64 because it's hot garbage.
Posted on Reply
#28
Th3pwn3r
M2B said:
Cheapest Custom RTX 2070:
https://m.newegg.com/products/N82E16814932091

Cheapest Custom Vega 64:
https://m.newegg.com/products/N82E16814932031

The RTX 2070 and Vega 64 are both in the same price range, while the 2070 is far more energy efficient, faster, quieter, and generally runs cooler.
And no, no one would buy the reference Vega 64 because it's hot garbage.
That's actually an extremely good deal for the 2070 if you factor in what BF V would cost. I wonder if that's just a Black Friday price. And you're wrong about nobody buying the reference Vega 64, because plenty of people already have and more will.
Posted on Reply
#29
Mistral
Pretty sure this isn't going to be called RTX. We already know how the 2070 fares in BF V; I can't imagine the 2060 with the extra effects on.

On the plus side, it'll probably be a much more reliable card.
Posted on Reply
#30
moproblems99
M2B said:
The most pointless card ever beats AMD's fastest card ever, interesting.
Keep looking up the graph for Vega. Granted, I don't know which Vegas they are.
Posted on Reply
#31
Vayra86
xkm1948 said:
Dominating the opponent in absolute performance and energy efficiency, bringing in awesome new features to move the industry forward. Yet according to some people they are "losers". Talk about mental gymnastics.
Did Huang give you a free T shirt or something?

Where is this industry moving forward, I might ask? Where's that awesome content with RT in it? The only thing I'm seeing is stagnation in performance and a price point that is well out of reach for the vast majority, for questionable IQ differences.

The mental gymnastics are with you, thinking that a 1200-dollar product that is half viable is somehow going to change the game. This RT implementation is a dead end and a cheap knockoff of the Quadro line, which in turn is a cheap knockoff of Volta. Cheap at 1200 bucks, mind you.

All we have today is empty promises and lackluster performance. If that is progress, then yes, Nvidia is doing fantastic.
Posted on Reply
#32
Th3pwn3r
Vayra86 said:
Did Huang give you a free T shirt or something?

Where is this industry moving forward, I might ask? Where's that awesome content with RT in it? The only thing I'm seeing is stagnation in performance and a price point that is well out of reach for the vast majority, for questionable IQ differences.

The mental gymnastics are with you, thinking that a 1200-dollar product that is half viable is somehow going to change the game. This RT implementation is a dead end and a cheap knockoff of the Quadro line, which in turn is a cheap knockoff of Volta. Cheap at 1200 bucks, mind you.

All we have today is empty promises and lackluster performance. If that is progress, then yes, Nvidia is doing fantastic.
If you can't admit Nvidia had a great idea then you must be in denial. The problem was the execution.
Posted on Reply
#33
R0H1T
Great idea like PhysX? Like I said in the other thread, real-time ray tracing may well be too computationally intensive to be viable for high-end gaming. Cheap effects, sure, but once we hit the silicon (physics) wall it'll be interesting to see where real-time RT stands in mainstream gaming relative to other evolutionary tech like AI.
Posted on Reply
#34
john_
I guess as we move forward many games will get ray tracing support, where low settings will be enough for the 2060 at 1080p, medium settings will be for the 2070, and high/ultra settings for the 2080 and 2080 Ti. NVIDIA will try to sell ray tracing the same way it tried to sell PhysX in the past.
Posted on Reply
#36
T4C Fantasy
CPU & GPU DB Maintainer
People are looking at the Mac Vegas and thinking the 2060 is better than the real Vega 56/64.
Posted on Reply
#37
theoneandonlymrk
Th3pwn3r said:
If you can't admit Nvidia had a great idea then you must be in denial. The problem was the execution.
What, you think that one guy thought up all of NVIDIA's patents? They screwed over companies and bought them. This ray tracing thing is also not his idea; it's quite old, in fact, and has been in use for years, just not (much) in games.
The idea to sell AI tech that could technically path trace to consumers was his idea.
I applaud him; pushing rays is in earnest something I want in the world, but I'm not paying his bonus this time out, he's taking the #@ss.

Nice post change, that's not what you originally said.
Posted on Reply
#38
medi01
Pruny said:
Rip rx590
Yeah, that slower-than-1070 card at the 1070's price will surely show it!

Even the goddamn 1060's street price is at the 590's level; how much would this piece of... hardware cost?
Posted on Reply
#39
krykry
Th3pwn3r said:
If you can't admit Nvidia had a great idea then you must be in denial. The problem was the execution.
You know what the problem with NVIDIA's "great idea" is? It's that they DID NOT make ray tracing for gamers. Ray tracing in games is merely a side project to them. The whole thing with ray tracing and AI cores was built for professional GPUs, especially for the movie-making industry (where ray tracing is BIG and everyone used CPUs for it until now).
Then they came up with rebranding those GPU chips and marketing it as a gaming feature despite the fact that this tech isn't ready for wide adoption. As a result you lose 50% of your performance in exchange for mirrors made out of puddles of water and every other even slightly reflective surface.
Posted on Reply
#40
Valantar
Pruny said:
Rip rx590
You think this will cost $279? The 2070 is $599...
Posted on Reply
#41
stimpy88
M2B said:
The most pointless card ever beats AMD's fastest card ever, interesting.
If it has RTX then I suppose you will be more than happy to game at 720p 30Hz for "only" $400. I, however, won't.

But what has AMD got to do with this? I don't remember bringing them up...
Posted on Reply
#42
Valantar
IMO, Nvidia has painted themselves into a corner here. The RTX cards have no real performance advantage over their similarly priced predecessors, just a superficial step down in naming tier. Efficiency is also identical, though at idle they're worse thanks to the massive die. Then there are those massive dice, which make the cards very expensive to produce. And we know RTRT isn't viable even on the 2070, so including it on the 2060 would be a waste of silicon. So, where do they go from here? Launch a 2060 without RT that matches the 1070, costs the same, uses the same power, but has a new name? If so, what justifies the price for a 60-tier card? Why not just keep the 1070 in production? Or do they ditch RT but keep the tensor cores for DLSS? That would do something, I suppose, but you'd still expect a hefty price drop from losing the RT cores. The thing that makes the most sense logically is to launch a card without RT that matches the 2070 at a cheaper price (losing both RT and normal gaming perf ought to mean double savings, right?), but then nobody would buy the 2070, as RTRT won't be viable for years.

Knowing Nvidia and the realities of Turing, they'll launch an overpriced mid-range card that performs decently but has terrible value.

I'll wait for Navi, thanks.
Posted on Reply
#43
T4C Fantasy
CPU & GPU DB Maintainer
I think tensor cores are already being used in protein folding, because my 2080 Ti folds 2 to 3x faster than the fastest 1080 Ti, which makes the pricing justified in folding rigs, MSRP vs MSRP.

My 1080 Ti folds for 850k PPD with no CPU slot; the 2080 Ti does 2.6M, also with no CPU.
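Putting rough numbers on that "MSRP vs MSRP" comparison (the launch prices below are assumptions for illustration, not figures from this thread):

```python
# Points-per-day (PPD) per dollar, using the folding numbers quoted above.
# MSRPs are assumed launch prices, not taken from the thread.
cards = {
    "GTX 1080 Ti": {"ppd": 850_000, "msrp": 699},    # assumed $699 launch MSRP
    "RTX 2080 Ti": {"ppd": 2_600_000, "msrp": 1199}, # assumed Founders Edition price
}
for name, c in cards.items():
    print(f"{name}: {c['ppd'] / c['msrp']:,.0f} PPD per dollar")
```

Even at the higher assumed price, the 2080 Ti comes out ahead (roughly 2,168 vs 1,216 PPD per dollar under these assumptions).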
Posted on Reply
#44
theoneandonlymrk
T4C Fantasy said:
I think tensors cores are already adopted in folding proteins because my 2080ti folds 2 to 3x faster than the fastest 1080ti which makes pricing justified in folding rigs msrp vs msrp

My 1080ti folds for 850k no cpu slot
2080ti 2.6m no cpu
They're quite useful in academia and AI circles too, but more expensive.
My Vega does 750k at conservative, reduced clocks and load (1500 core/1000 mem), so it (the 2080 Ti) is doing well there.
Posted on Reply
#45
Vya Domus
Maybe I'm mistaken, but isn't this like the nth time the "2060 showed up" in the same Final Fantasy benchmark?

Th3pwn3r said:
If you can't admit Nvidia had a great idea then you must be in denial. The problem was the execution.
First of all, the "idea" was always there; nothing revolutionary on that front. And the problem was the execution? Then what portion of it was alright? Might as well say it failed in its entirety.

Vayra86 said:

Where is this industry moving forward, I might ask?
It certainly is moving, slightly backwards. PC gaming is starting to look like the fashion industry: 1080p under 60 fps is the new black.

Vayra86 said:

The mental gymnastics are with you, thinking that a 1200 dollar product that is half viable is somehow going to change the game.
I honestly wouldn't blame him; who wouldn't try to justify purchasing said product? But yeah, the shirt must have been worth it.
Posted on Reply
#46
T4C Fantasy
CPU & GPU DB Maintainer
Vya Domus said:
Maybe I am mistaken but isn't this like the nth time the "2060 showed up" in the same Final Fantasy benchmark ?
I dunno if the benchmark is real, but the 2060 is. I'm excited :D
Posted on Reply
#47
Kamgusta
krykry said:
You know what the problem with NVIDIA's "great idea" is? It's that they DID NOT make ray tracing for gamers. Ray tracing in games is merely a side project to them. The whole thing with ray tracing and AI cores was built for professional GPUs, especially for the movie-making industry (where ray tracing is BIG and everyone used CPUs for it until now).
Then they came up with rebranding those GPU chips and marketing it as a gaming feature despite the fact that this tech isn't ready for wide adoption. As a result you lose 50% of your performance in exchange for mirrors made out of puddles of water and every other even slightly reflective surface.
Ray tracing in videogames is the step toward photorealism we have been expecting for 15 years.
http://www.cse.chalmers.se/~uffe/xjobb/Readings/GPURayTracing/Ray%20Tracing%20Fully%20Implemented%20on%20Programmable%20Graphics%20Hardware.PDF
Yes, current RTX cards have horrible ray tracing performance, but this is going to get better and better.
I remember the first AA, T&L, and bump mapping implementations (GeForce 3, early 2000s, boys), and they had over-50% performance hits. Now they are "free".
And I must say, ray tracing effects really add a lot to realism.
Posted on Reply
#48
Th3pwn3r
theoneandonlymrk said:
What, you think that one guy thought up all of NVIDIA's patents? They screwed over companies and bought them. This ray tracing thing is also not his idea; it's quite old, in fact, and has been in use for years, just not (much) in games.
The idea to sell AI tech that could technically path trace to consumers was his idea.
I applaud him; pushing rays is in earnest something I want in the world, but I'm not paying his bonus this time out, he's taking the #@ss.

Nice post change, that's not what you originally said.
LOL. Where does my post say Edited? It doesn't; lay off the booze, haha. I never edited that post, I had no reason to.
Posted on Reply
#49
krykry
Kamgusta said:
Ray tracing in videogames is the step toward photorealism we have been expecting for 15 years.
http://www.cse.chalmers.se/~uffe/xjobb/Readings/GPURayTracing/Ray%20Tracing%20Fully%20Implemented%20on%20Programmable%20Graphics%20Hardware.PDF
Yes, current RTX cards have horrible ray tracing performance, but this is going to get better and better.
I remember the first AA, T&L, and bump mapping implementations (GeForce 3, early 2000s, boys), and they had over-50% performance hits. Now they are "free".
And I must say, ray tracing effects really add a lot to realism.
The thing is, we don't get much ray tracing anyway; it's just some reflective surfaces being implemented. To get a true feel for ray tracing, we need ten times the GPU power we have now, which is why I say this isn't ready for adoption. I completely agree with AMD saying that the low and mid range also need to be capable of it before they actually start implementing the first features.
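A back-of-the-envelope ray budget illustrates the gap; the samples-per-pixel and bounce counts below are assumptions for illustration, not measured figures:

```python
# Rays per second needed for fully path-traced 1080p at 60 fps, versus the
# current hybrid approach (roughly one reflection ray per pixel per frame).
width, height, fps = 1920, 1080, 60
samples_per_pixel = 16  # assumed: enough samples for a low-noise full render
bounces = 4             # assumed: indirect lighting depth

full_path_tracing = width * height * fps * samples_per_pixel * bounces
hybrid_reflections = width * height * fps  # ~1 reflection ray per pixel

print(f"full path tracing:  {full_path_tracing / 1e9:.1f} Grays/s")   # ~8.0
print(f"hybrid reflections: {hybrid_reflections / 1e9:.2f} Grays/s")  # ~0.12
```

Under these assumptions the full workload is about 64 times the hybrid one, which is why current RTX cards trace only a single effect at reduced ray counts and lean heavily on denoising.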
Posted on Reply
#50
T4C Fantasy
CPU & GPU DB Maintainer
krykry said:
The thing is, we don't get much ray tracing anyway; it's just some reflective surfaces being implemented. To get a true feel for ray tracing, we need ten times the GPU power we have now, which is why I say this isn't ready for adoption. I completely agree with AMD saying that the low and mid range also need to be capable of it before they actually do.
This isn't a bad move by NVIDIA though; you need to start somewhere or it never gets adopted in the first place. The pricing is insane, but the tech is what we needed to begin with.
Posted on Reply