Wednesday, November 22nd 2017

NVIDIA GeForce RTX 2060 Shows Up in Final Fantasy XV Benchmarks

The RTX family debuted with top-of-the-line graphics cards, but the Turing era has only just begun, and new members will be joining those first products. One of the most anticipated is the RTX 2060, and this new graphics card has now been spotted in the Final Fantasy XV benchmark database. This information should be taken with a grain of salt, but in the past this listing has shown us upcoming products such as the Radeon RX 590, so the evidence is quite interesting. According to this data, the RTX 2060 would perform slightly below the Radeon RX Vega 56 and NVIDIA GeForce GTX 1070, but its numbers are noticeably better than those of the GTX 1060.

NVIDIA itself confirmed there would be a "mainstream" product in the Turing family in the future, and although the company now seems focused on selling off its excess inventory of mid-range Pascal graphics cards (Black Friday could help there), the new GPU could be announced in the next few weeks, and some analysts expect it to be available in Q1 2019. It'll be interesting to confirm whether the data in our TPU database is correct, but we're especially curious about its price point.
Source: Overclock 3D
Add your own comment

121 Comments on NVIDIA GeForce RTX 2060 Shows Up in Final Fantasy XV Benchmarks

#51
moproblems99
Valantar said:
IMO, Nvidia has painted themselves into a corner here. The RTX cards have no real performance advantage over their similarly priced predecessors, just a superficial step down in naming tier.
I don't think they painted themselves into a corner. They'd have to be Jedis to pull off a masterful release like this, considering a lot of the cards are sold out... They even managed to sell their three-year-old cards at MSRP with crypto in the toilet? Seriously, they must be Jedis... or Siths?... Hmmmm

Kamgusta said:
Raytracing in videogames is the step in photoreality we have been expecting for 15 years.
First, I don't have an RTX series card and I have only seen pictures, BUT... what they are doing right now is not photorealistic at all and looks like absolute trash. Worse than not using RTRT. It looks like mirrors... everywhere... Like that, Master Mirror would. Yeesssssss.
Posted on Reply
#52
Th3pwn3r
krykry said:
You know what the problem about Nvidia's "great idea" is? It's that they DID NOT make raytracing for gamers. Raytracing in games is merely a side project to them. The whole thing with raytracing and AI cores was for professional GPUs, especially for movie-making industry (where raytracing is BIG and everyone used CPUs for until now).
Then they came up with rebranding those GPU chips and making it a gaming feature, despite the fact that this tech isn't really ready for wide adoption. As a result you lose 50% of performance in exchange for making mirrors out of puddles of water and every other even slightly reflective surface.
Who said Nvidia MADE raytracing for gamers? Did someone say Nvidia made raytracing? Nvidia has STARTED IMPLEMENTING raytracing, and while the execution definitely isn't great, the idea behind it is; I believe the intentions were good. Not to mention they may actually make it perform decently with upcoming drivers.


stimpy88 said:
If it has RTX then I suppose you will be more than happy to game at 720p 30Hz for "only" $400. I however, won't.

But what has AMD go to do with this? I don't remember bringing them up...
I don't know anyone personally that wants to game at 720p 30Hz, period. AMD always has something to do with Nvidia since they both make GPUs; we need AMD and Nvidia in order to create a competitive market for us consumers, and it's just not really competitive right now.

Valantar said:
IMO, Nvidia has painted themselves into a corner here. The RTX cards have no real performance advantage over their similarly priced predecessors, just a superficial step down in naming tier. Efficiency is also identical, though at idle they're worse thanks to the massive die. Then there are these massive dice in question, making them very expensive to produce. And we know RTRT isn't viable even on the 2070, so including it on the 2060 would be a waste of silicon. So, where do they go from here? Launch a 2060 without RT that matches the 1070, costs the same, uses the same power, but has a new name? If so, what justifies the price for a 60-tier card? Why not just keep the 1070 in production? Or do they ditch RT but keep the tensor cores for DLSS? That would do something, I suppose, but you'd still expect a hefty price drop from losing the RT cores. The thing that makes the most sense logically is to launch a card without RT that matches the 2070 at a cheaper price (losing both RT and normal gaming perf ought to mean double savings, right?), but then nobody would buy the 2070, as RTRT won't be viable for years.

Knowing Nvidia and the realities of Turing, they'll launch an overpriced mid-range card that performs decently but has terrible value.

I'll wait for Navi, thanks.
Ha, you can say Nvidia has painted themselves into a corner, but the corner they painted themselves into happens to be 75% of the whole area.

T4C Fantasy said:
I think tensor cores are already being used in protein folding, because my 2080 Ti folds 2 to 3x faster than the fastest 1080 Ti, which makes the pricing justified in folding rigs, MSRP vs MSRP.

My 1080 Ti folds for 850k with no CPU slot
2080 Ti: 2.6M, no CPU
Awesome, that's a pro that doesn't get mentioned at all. First time I'm hearing about how powerful the 2080ti can be.

Vya Domus said:
Maybe I am mistaken but isn't this like the nth time the "2060 showed up" in the same Final Fantasy benchmark ?



First of all, the "idea" was always there, nothing revolutionary on that front. And the problem was the execution? Then what portion of it was alright? Might as well say it all failed in its entirety.



It certainly is moving, slightly backwards. PC gaming is starting to look like the fashion industry, 1080p under 60fps is the new black.



I honestly wouldn't blame him, who wouldn't try to justify purchasing said product. But yeah the shirt must have been worth it.
How many prototypes do you have lying around that work right off the bat? How many of your prototypes are perfect right out of the build house? That's the point, and others have made it as well; you and others have to get it through your heads that we have to start somewhere.


krykry said:
The thing is that we don't get much raytracing anyway; it's just some reflective surfaces being implemented. To get a true feel of raytracing, we need ten times the GPU power we have now, which is why I say this isn't ready for adoption. I completely agree with AMD saying that the low and mid-range also need to be capable of it before they actually start implementing the first features.
Attempting to actually run raytracing right now is just silly; there's a lot of work that needs to be done. Had Nvidia mentioned it and advertised it better, it would have been interesting to see how it all panned out, BUT they tried to sell people on the 'feature'. That's where they went wrong, and the people that bought into it are super salty, I guess.


T4C Fantasy said:
This isn't a bad move by Nvidia though; you need to start from somewhere or it never gets adopted in the first place. The pricing is insane, but the tech is what we needed to begin with.
"You need to start from somewhere"? Well, that's just ridiculous! We all deserve the best right now, and we don't expect any further advancements in this 'new' technology and 'feature'. /sarcasm

Unfortunately a lot of people just don't understand how things are made. You would think people would be a lot more understanding, given all the revisions hardware goes through, especially on a tech forum, but... nope!

moproblems99 said:
I don't think they painted themselves into a corner. They'd have to be Jedis to pull off a masterful release like this, considering a lot of the cards are sold out... They even managed to sell their three-year-old cards at MSRP with crypto in the toilet? Seriously, they must be Jedis... or Siths?... Hmmmm



First, I don't have an RTX series card and I have only seen pictures, BUT... what they are doing right now is not photorealistic at all and looks like absolute trash. Worse than not using RTRT. It looks like mirrors... everywhere... Like that, Master Mirror would. Yeesssssss.
People like to believe that Nvidia is both the devil that they hate and doing poorly with their sales. Sure, they're not the most consumer-friendly company and they care more about the dollars they make, BUT that hasn't slowed down sales of these supposedly failed cards and technology.

The majority of consumers that they're selling these cards (2080 Ti) to probably don't give a damn about what we say here on TechPowerUp or any other forum. They probably have lots of disposable income and buy a couple of cards. They might not even know how to enable raytracing in the one game or application that they can. :D
Posted on Reply
#53
T4C Fantasy
CPU & GPU DB Maintainer
Th3pwn3r said:
Awesome, that's a pro that doesn't get mentioned at all. First time I'm hearing about how powerful the 2080ti can be.
The first one was my 1080 Ti with a combined CPU score
Posted on Reply
#55
Valantar
moproblems99 said:
I don't think they painted themselves into a corner. They have to be Jedis to pull off this master of a release considering a lot of the cards are sold out...They even managed to sell their three year old cards at MSRP with crypto in the toilet? Seriously, they must be Jedis....or Siths?........hmmmmm
None of what you said applies to what I was saying at all, but sure, Nvidia is very good at selling overpriced gear. Still doesn't mean their current portfolio can extend downwards in any reasonable way.

Th3pwn3r said:
Ha, you can say Nvidia has painted themselves into a corner but the corner they painted themselves into happen to be a corner that's 75% of the whole area.
Sure, it's not a small corner, but as I pointed out, there still isn't a sensible route out of it: no way of giving an actual performance upgrade to the millions who own 1060s (or 980s) at an acceptable price, so those people will just keep their current hardware.

When I bought my $700 Fury X in 2015, I didn't expect I'd have to pay significantly more to get a tangible performance upgrade nearly four years later. That doesn't make sense no matter how you look at it.
Posted on Reply
#56
Chloe Price
I'm pretty sure this is going to be a cut-down RTX 2070 with 6GB of 192-bit memory, since the 2070 doesn't share a die with the 2080.

And the x06 chip has been the -60 card's silicon for years; this time, though, it went into the 2070 instead of the familiar x04 used for the -80 card.
Posted on Reply
#57
Vya Domus
Th3pwn3r said:

How many prototypes do you have laying around that work right off the bat? How many of your prototypes do you have that are perfect right out of the build house?
You're funny. Don't try to pass these things off as "prototypes"; this ain't no prototype. It's a product out there, on the market. The even funnier thing is that not even Nvidia would agree with you: they claim Turing has been in development for 10 years, and that's plenty of time to work out the feasibility of the features they wanted to include. A prototype, after a decade in development? Wow, I am amazed by all the things you people come up with to put your favorite brand in a better light.

Th3pwn3r said:

we have to start somewhere.
Of course, but get this: some starting points are better than others.
Posted on Reply
#58
moproblems99
Valantar said:
Still doesn't mean their current portfolio can extend downwards in any reasonable way.
Ah but it does. And it can. Watch these lower cards because if they had asses that is all you will see....headed out the door without your phone number. Every single one will sell. Regardless of the strengths/weaknesses.
Posted on Reply
#59
Th3pwn3r
T4C Fantasy said:



The first one was my 1080 Ti with a combined CPU score
"IMPRESSIVE!"

Tsukiyomi91 said:
@Th3pwn3r well said
Thank you, sir. I just try to be logical and rational when thinking about and discussing things. Even in politics I usually don't have any bias, while others like to scream and raise pitchforks!

Valantar said:



Sure, it's not a small corner, but as I pointed out, there still isn't a sensible route out of it - no way of giving an actual performance upgrade to the millions who own 1060s (or 980s) for an acceptable price, so these people will just keep their current hardware.
Okay, so just because you don't agree with the pricing there is no way of getting an actual performance upgrade? What? Should people be mad at Rolls-Royce because they can't upgrade from their Honda Civic or Ford Focus? Your post sounds just as silly, albeit not as extreme. If you want things you have to pay for them; the upgrades we want are rarely out of necessity, we just want better things.

Vya Domus said:
You're funny. Don't try to pass these things as "prototypes", this ain't no prototype. It's a product out there, on the market. The even funnier thing is that not even Nvidia would agree with you, they claim Turing has been in development for 10 years, that's plenty of time to work out the feasibility of the features they wanted to include. A prototype, after a decade in development ? Wow, I am amazed by all the things you people come up with to put your favorite brand in a better light.



Of course, but get this : some starting points are better than others.
I used the term prototype hoping you might actually grasp what I was saying; it clearly didn't work. No, they are not prototypes. Yes, these cards are out on the market, BUT the cards AND the tech in them are still in the early stages. 10 years is quite a long time; I'm sure they had BF V and ray tracing for all of those 10 years too, and they just didn't even try, I guess. The classic go-to for some of you guys is to make someone seem like, or call them, a fanboy. I have an AMD-powered laptop, a desktop with an R9 290 in it, and have had a few Phenom processors. I am not the kind of person that has a favorite brand; I buy what I believe is best. Sorry to burst your bubble, but I have no brand loyalty. Well, unless you're talking about car audio, then I purchase all Digital Designs amplifiers and subwoofers.
Posted on Reply
#60
Valantar
moproblems99 said:
Ah but it does. And it can. Watch these lower cards because if they had asses that is all you will see....headed out the door without your phone number. Every single one will sell. Regardless of the strengths/weaknesses.
Anything Nvidia makes sells. That's not being questioned. Uninformed people with too much money are not a scarce resource. What is being questioned is whether it'll be anything but a terrible value, and the current series doesn't bode well there. Matching the 1070's performance for the same price, but with a bunch of die area spent on RT cores that can't be used for anything at a playable resolution? I'd rather buy a 1070, thanks. Matching the 1070's performance at the same price while ditching RT entirely? Why bother spending the money on designing a new chip instead of just keeping the 1070 in production? And what legitimizes the price not dropping further from the 2070 if it lacks RT entirely? The point is that the only sensible solution is to cut RT out entirely for this tier and price it at a proper upper-midrange level, around $300. Which is never, ever going to happen. Not only 'cause it leaves a whopping $300 gap in Nvidia's lineup, but because they've shown time and time again that they don't give a damn about providing any kind of value, and for this generation they seem laser-focused on keeping price-performance parity (or even an increase) with the previous one.
Posted on Reply
#61
Vya Domus
Th3pwn3r said:

I used the term prototype hoping you might actually grasp what I was saying; it clearly didn't work.
I know what you are saying: that these things are great under the hood and all their faults are to be overlooked because they are just first-generation technologies with great prospects. And yes, it didn't work, because why would it? Am I mandated to agree with that?

If Nvidia had waited just one more year, or perhaps even less than that, they could have come out with a product better equipped to fulfill the role they wanted to boast about. After all, everyone is convinced AMD won't have anything on the table for millennia to come, and even if they had, it's not like it would have marked the end of Nvidia's massive market share. Yet they didn't; they ventured on with features that clearly provide a poor experience.

So what was the hurry? Once you start looking at it in a slightly more than superficial way, you realize this was indeed a bad starting point, and no, it won't have great prospects because of that.

Th3pwn3r said:
I have an AMD-powered laptop, a desktop with an R9 290 in it, and have had a few Phenom processors. I am not the kind of person that has a favorite brand; I buy what I believe is best.
Nah, that argument never works. I have a 1060 and I've still been called an AMD fanboy more than a couple of times.
Posted on Reply
#62
Valantar
Th3pwn3r said:
Okay, so just because you don't agree with the pricing there is no way of getting an actual performance upgrade? What? Should people be mad at Rolls-Royce because they can't upgrade from their Honda Civic or Ford Focus? Your post sounds just as silly, albeit not as extreme. If you want things you have to pay for them; the upgrades we want are rarely out of necessity, we just want better things.
Frankly, I have no idea what you're trying to say here. Technological development has always worked more or less like this: A top-of-the-line product exists with performance level X, at price level A. The required technologies improve, and a new product is launched, with added performance (perhaps X+y%), but it stays at price level A. Slower products slot in below A, with ones matching the previous product's performance now being cheaper. This makes sense in terms of production costs, economies of scale, maintaining sales, and driving customer interest, and you build and maintain an expectation that for spending A each generation, you get a y% performance increase, or if you skip two generations you get double that. Instead, this generation you get a side-grade, with zero added value.

What Nvidia is doing now is instead keeping subsequent generations at price-performance parity, so that for both generations, you get performance X at price A, with added price tiers above A for added performance. This kind of change only makes sense if the product in question is so revolutionary as to legitimize this price. RTX isn't, and first-gen RTX will never be - it isn't powerful enough, as evidenced by BFV. And the price levels they're operating at are high enough that they're never going to sell as many 2070s as they sold 1070s. No way, no how. Sales match price tiers very closely, so it's likely a 2060 priced to match a 1070 will also roughly match its sales. The 2080Ti likely sells more than the Titan Xp did, but only because it's the only tangible performance upgrade from the previous generation, and at that point you're in the "wealthy enthusiast" segment anyhow, where price matters far less than it does below $600 or so. This is not smart. Period.
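That parity-vs-cadence point can be sketched with toy numbers (these figures are illustrative only, not actual GPU prices or benchmark scores):

```python
# Toy model of the generational pricing argument above.
# All numbers are made up for illustration, not real GPU data.

def perf_per_dollar(perf, price):
    return perf / price

# Traditional cadence: a new generation adds ~30% performance
# at the same price tier (performance X -> X+y%, price stays at A).
gen1 = {"perf": 100, "price": 400}               # hypothetical "70-tier" card
gen2_traditional = {"perf": 130, "price": 400}   # same tier, +30% perf

# Price-performance parity: more performance, but the price
# rises proportionally, so value per dollar doesn't move.
gen2_parity = {"perf": 130, "price": 520}

traditional_gain = perf_per_dollar(**gen2_traditional) / perf_per_dollar(**gen1) - 1
parity_gain = perf_per_dollar(**gen2_parity) / perf_per_dollar(**gen1) - 1

print(f"traditional upgrade value: {traditional_gain:+.0%}")  # +30%
print(f"parity 'upgrade' value:    {parity_gain:+.0%}")       # +0% -- a side-grade
```

Under parity, the buyer's performance-per-dollar is flat across generations, which is exactly the "side-grade with zero added value" described above.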
Posted on Reply
#63
moproblems99
Valantar said:
What Nvidia is doing now is instead keeping subsequent generations at price-performance parity, so that for both generations, you get performance X at price A, with added price tiers above A for added performance.
That is the whole point. They fully understand that they can sell a turd for the price of gold. And they do. And people love it. They have made it to Apple status.

Customers: It's too expensive!
Nvidia: Well, that is because the fan shroud is made from Unicorn tears. They are sad because we strip mined their natural habitat so that we could put extra star dust into these RT cores.
Customers: Take my money! It's awesome!

They have hit the point where what their product does or costs doesn't matter anymore. Only that it is new and shiny! And it might have a new feature... that you might be able to use. Someday. But likely you will have to buy the next-gen new and shiny thing.

I guess I should really say silver instead of turd as there is a decent non-rtrt increase (2080ti only).

Edited for spelling.
Posted on Reply
#64
Valantar
moproblems99 said:
That is the whole point. They fully understand that they can sell a turd for the price of gold. And they do. And people love it. They have made it to Apple status.
You're probably not wrong. I guess they looked back at the 700 series (which also had no new process node or arch compared to the previous gen) and thought, "We didn't make enough money with that. How about we tack on some stuff that barely works and charge double for it?"
Posted on Reply
#66
TheGuruStud
There's a reason this bench was chosen if anyone can figure it out...it's b/c the game is garbage.
Posted on Reply
#67
Valantar
TheGuruStud said:
There's a reason this bench was chosen if anyone can figure it out...it's b/c the game is garbage.
I don't know if the game is, but the benchmark has been proven to be so a long time ago, yeah. Entirely unreliable.
Posted on Reply
#68
T4C Fantasy
CPU & GPU DB Maintainer
Valantar said:
I don't know if the game is, but the benchmark has been proven to be so a long time ago, yeah. Entirely unreliable.
The benchmark isn't bad at all; the scores can vary because of CPU and memory, just like 3DMark. From experience it's a reliable benchmark.
Posted on Reply
#69
lexluthermiester
T4C Fantasy said:
The benchmark isn't bad at all; the scores can vary because of CPU and memory, just like 3DMark. From experience it's a reliable benchmark.
Exactly correct.
Posted on Reply
#71
T4C Fantasy
CPU & GPU DB Maintainer
EarthDog said:
3DMark is pretty consistent/repeatable. It also responds to other system changes like CPU speed, cores/threads, and memory speed/timings. For those, the CPU score (as you guys know) plays a role in the overall result.

RE: FF XV - https://www.gamersnexus.net/game-bench/3224-ffxv-disingenuous-misleading-benchmark-tool-gameworks-tests

For comparing it to the game, it's a poor bench. I don't think it was fixed(?), but a DLSS version came out! :p
It's consistent if you use the presets, like 3DMark, and there really is no difference except the FFXV benchmark looks better.

Gamers Nexus ran custom presets and ruined the benchmark's reputation. At least in my thread it's an even playing field, where scores are only allowed under the standard 1080p preset:
https://www.techpowerup.com/forums/threads/post-your-final-fantasy-xv-benchmark-results.242200/
Posted on Reply
#72
Prima.Vera
jormungand said:
god, I remember when I was in love with these cards
Duuude, I had exactly the same cards. Ironically, the HD 5870 was the last AMD card I ever owned, sadly...
Posted on Reply
#73
Valantar
T4C Fantasy said:
The benchmark isn't bad at all; the scores can vary because of CPU and memory, just like 3DMark. From experience it's a reliable benchmark.
lexluthermiester said:
Exactly correct.
A benchmark that doesn't do proper object culling and renders offscreen objects at all times is bad. Period. Poorly developed, poorly implemented, poorly optimized.
Posted on Reply
#74
efikkan
HTC said:
Is there any confirmation of whether or not it's GTX 2060 or RTX 2060?

The difference would be that RTX has RT capabilities while GTX doesn't. If this card doesn't, will nVidia still stick to the RTX naming scheme anyway?
No, and I wouldn't read too much into the naming, even if this benchmark itself is legit. In nearly all cases, unreleased cards will show up with just a device ID in pre-release drivers. Even if someone has gotten their hands on an engineering sample, or even a qualification sample identical to the upcoming product, it's not going to display the final product name. This is something people should keep in mind when reading rumors about any GPU, from both AMD and Nvidia.
Posted on Reply
#75
Vayra86
Th3pwn3r said:
If you can't admit Nvidia had a great idea then you must be in denial. The problem was the execution.
Ideas don't mean a thing if you cannot execute them properly. At that point an idea just turns into a scammy, crappy product and promise.
Posted on Reply