
NVIDIA GeForce RTX 2060 Shows Up in Final Fantasy XV Benchmarks

If this isn't another salty lie, I guess it will be a great card for $200-250... oops, sorry, I meant $400.
 
This year has been a major disappointment in GPU tech. Still, the best of the worst is Nvidia with their RTX range. The RTX 2070, although not a world beater, is quite good on performance, price, and even power usage for small form factor case users: almost GTX 1080 Ti performance with less heat and a lower price.
 
Really? By "beat" you mean its price-per-performance ratio is better? I guess we should all consider the most pointless card ever, then.


Cheapest Custom RTX 2070:
https://m.newegg.com/products/N82E16814932091

Cheapest Custom Vega 64:
https://m.newegg.com/products/N82E16814932031

The RTX 2070 and Vega 64 are both in the same price range, while the 2070 is way more energy efficient, faster, quieter, and runs cooler in general.
And no, no one would buy a reference Vega 64 because it's hot garbage.
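
If anyone would rather run the price-per-performance math than argue about it, here's a minimal Python sketch; the prices and average-FPS figures below are placeholders, not numbers taken from the Newegg listings or any review, so plug in current street prices and the benchmark averages of your choice:

# Minimal price-per-frame comparison. All numbers are placeholder assumptions;
# substitute current street prices and benchmark FPS averages before drawing conclusions.
def dollars_per_frame(price_usd, avg_fps):
    return price_usd / avg_fps

rtx_2070 = dollars_per_frame(499, 100)  # assumed price (USD) and assumed average FPS
vega_64 = dollars_per_frame(499, 95)    # assumed price (USD) and assumed average FPS
print(f"RTX 2070: ${rtx_2070:.2f} per frame, Vega 64: ${vega_64:.2f} per frame")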
 

That's actually an extremely good deal for the 2070 if you factor in what BF V would cost. I wonder if that's just a Black Friday price. And you're wrong about nobody buying the reference Vega 64; plenty of people already have, and more will.
 
Pretty sure this isn't going to be called RTX. We already know how good the 2070 is in BF5; I can't imagine the 2060 with the extra effects on.

On the plus side, it'll probably be a much more reliable card.
 
The most pointless card ever beats AMD's fastest card ever, interesting.

Keep looking up the graph for Vega. Granted, I don't know which Vegas they are.
 
Dominating the opponent in absolute performance and energy efficiency, bringing in awesome new features to move the industry forward, yet according to some people they are “losers”. Talk about mental gymnastics.

Did Huang give you a free T-shirt or something?

Where is this industry moving forward, I might ask? Where's that awesome content with RT in it? The only thing I'm seeing is stagnation in performance and a price point that is well out of reach for the vast majority, for questionable image-quality differences.

The mental gymnastics are with you, thinking that a 1200 dollar product that is half viable is somehow going to change the game. This RT implementation is a dead end and a cheap knockoff from the Quadro line, which in turn is a cheap knockoff from Volta. Cheap at 1200 bucks, mind you.

All we have today is empty promises and lackluster performance. If that is progress, then yes, Nvidia is doing fantastic.
 

If you can't admit Nvidia had a great idea then you must be in denial. The problem was the execution.
 
A great idea like PhysX? Like I said in the other thread, real-time ray tracing may well be too computationally intensive to be viable for high-end gaming. Cheap effects, sure, but when we hit the silicon (physics) wall it'll be interesting to see where real-time RT stands relative to mainstream gaming and other evolutionary tech like AI.
 
I guess as we move forward, many games will get ray tracing support, where low settings will be enough for the 2060 at 1080p, medium settings for the 2070, and high/ultra settings for the 2080 and 2080 Ti. Nvidia will try to sell ray tracing the same way it tried to sell PhysX in the past.
 
People are looking at the Mac Vegas and thinking the 2060 is better than the real Vega 56/64.
 
If you can't admit Nvidia had a great idea then you must be in denial. The problem was the execution.
What, you think that one guy thought up all of Nvidia's patents? They screwed over companies and bought them. This ray tracing thing is also not his idea; it's quite old in fact, and has been in use for years, just not in games (much).
The idea to sell AI tech that could technically path trace to consumers was his idea.
I applaud him; pushing rays is in earnest something I want in the world, but I'm not paying his bonus this time out, he's taking the #@ss.

Nice post change, that's not what you originally said.
 
RIP RX 590
Yeah, that slower-than-a-1070 card at the 1070's price will surely show it!

Even the goddamn 1060's street price is at the 590's level; how much would this piece of... hardware cost?
 
If you can't admit Nvidia had a great idea then you must be in denial. The problem was the execution.
You know what the problem with Nvidia's "great idea" is? It's that they DID NOT make ray tracing for gamers. Ray tracing in games is merely a side project to them. The whole thing with ray tracing and AI cores was for professional GPUs, especially for the movie-making industry (where ray tracing is BIG and where everyone used CPUs for it until now).
Then they came up with rebranding those GPU chips and making it a gaming feature, despite the fact that this tech isn't really ready for wide adoption. As a result, you lose 50% of your performance in exchange for turning puddles of water, and every other even slightly reflective surface, into mirrors.
 
The most pointless card ever beats AMD's fastest card ever, interesting.

If it has RTX then I suppose you will be more than happy to game at 720p 30 Hz for "only" $400. I, however, won't.

But what has AMD got to do with this? I don't remember bringing them up...
 
IMO, Nvidia has painted themselves into a corner here. The RTX cards have no real performance advantage over their similarly priced predecessors, just a superficial step down in naming tier. Efficiency is also identical, though at idle they're worse thanks to the massive die. Then there are those massive dies themselves, which make the cards very expensive to produce. And we know RTRT isn't viable even on the 2070, so including it on the 2060 would be a waste of silicon.

So, where do they go from here? Launch a 2060 without RT that matches the 1070, costs the same, uses the same power, but has a new name? If so, what justifies the price for a 60-tier card? Why not just keep the 1070 in production? Or do they ditch RT but keep the tensor cores for DLSS? That would do something, I suppose, but you'd still expect a hefty price drop from losing the RT cores. The thing that makes the most sense logically is to launch a card without RT that matches the 2070 at a cheaper price (losing both RT and normal gaming perf ought to mean double savings, right?), but then nobody would buy the 2070, as RTRT won't be viable for years.

Knowing Nvidia and the realities of Turing, they'll launch an overpriced mid-range card that performs decently but has terrible value.

I'll wait for Navi, thanks.
 
I think tensor cores are already being used for folding proteins, because my 2080 Ti folds 2 to 3x faster than the fastest 1080 Ti, which makes the pricing justified in folding rigs, MSRP vs MSRP.

My 1080 Ti folds for 850k with no CPU slot;
the 2080 Ti does 2.6M with no CPU.
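
Rough back-of-the-envelope on that MSRP-vs-MSRP point, as a Python sketch. Taking the folding numbers above as points per day, and assuming launch MSRPs of $699 for the GTX 1080 Ti and $1,199 for the RTX 2080 Ti Founders Edition (assumed figures, not quoted in this thread):

# PPD-per-dollar comparison using the folding numbers quoted above.
# MSRPs are assumptions ($699 GTX 1080 Ti, $1,199 RTX 2080 Ti FE), not from this thread.
cards = {
    "GTX 1080 Ti": {"ppd": 850_000, "msrp": 699},
    "RTX 2080 Ti": {"ppd": 2_600_000, "msrp": 1199},
}
for name, card in cards.items():
    print(f"{name}: {card['ppd'] / card['msrp']:,.0f} PPD per dollar")
# Prints roughly 1,216 vs 2,168 PPD per dollar, i.e. about 1.8x the folding value per dollar.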
 
They're quite useful in academia and AI circles too, but more expensive.
My Vega does 750k at conservative, reduced clocks and load (1500 core / 1000 mem), so it (the 2080 Ti) is doing well there.
 
Maybe I am mistaken, but isn't this like the nth time the "2060 showed up" in the same Final Fantasy benchmark?

If you can't admit Nvidia had a great idea then you must be in denial. The problem was the execution.

First of all, the "idea" was always there; nothing revolutionary on that front. And the problem was the execution? Then what portion of it was all right? Might as well say it failed in its entirety.

Where is this industry moving forward, I might ask?

It certainly is moving, just slightly backwards. PC gaming is starting to look like the fashion industry: 1080p under 60 fps is the new black.

The mental gymnastics are with you, thinking that a 1200 dollar product that is half viable is somehow going to change the game.

I honestly wouldn't blame him; who wouldn't try to justify purchasing said product? But yeah, the shirt must have been worth it.
 
Maybe I am mistaken, but isn't this like the nth time the "2060 showed up" in the same Final Fantasy benchmark?
I dunno if the benchmark is real, but the 2060 is. I'm excited :D
 
You know what the problem with Nvidia's "great idea" is? It's that they DID NOT make ray tracing for gamers. Ray tracing in games is merely a side project to them. The whole thing with ray tracing and AI cores was for professional GPUs, especially for the movie-making industry (where ray tracing is BIG and where everyone used CPUs for it until now).
Then they came up with rebranding those GPU chips and making it a gaming feature, despite the fact that this tech isn't really ready for wide adoption. As a result, you lose 50% of your performance in exchange for turning puddles of water, and every other even slightly reflective surface, into mirrors.
Ray tracing in video games is the step towards photorealism we have been expecting for 15 years.
http://www.cse.chalmers.se/~uffe/xjobb/Readings/GPURayTracing/Ray Tracing Fully Implemented on Programmable Graphics Hardware.PDF
Yes, current RTX cards have horrible ray tracing performance, but this is going to get better and better.
I remember the first AA, T&L, and BM implementations (GeForce 3 from the early 2000s, boys), and they had over 50% performance hits. Now they are "free".
And I must say, ray tracing effects really add a lot to realism.
 
What, you think that one guy thought up all of Nvidia's patents? They screwed over companies and bought them. This ray tracing thing is also not his idea; it's quite old in fact, and has been in use for years, just not in games (much).
The idea to sell AI tech that could technically path trace to consumers was his idea.
I applaud him; pushing rays is in earnest something I want in the world, but I'm not paying his bonus this time out, he's taking the #@ss.

Nice post change, that's not what you originally said.

LOL. Where does my post say Edited? It doesn't; lay off the booze, haha. I never edited that post, I had no reason to.
 
Ray tracing in video games is the step towards photorealism we have been expecting for 15 years.
http://www.cse.chalmers.se/~uffe/xjobb/Readings/GPURayTracing/Ray Tracing Fully Implemented on Programmable Graphics Hardware.PDF
Yes, current RTX cards have horrible ray tracing performance, but this is going to get better and better.
I remember the first AA, T&L, and BM implementations (GeForce 3 from the early 2000s, boys), and they had over 50% performance hits. Now they are "free".
And I must say, ray tracing effects really add a lot to realism.
The thing is, we don't get much ray tracing anyway; it's just some reflective surfaces that are being implemented. To get a true feel of ray tracing, we need ten times the GPU power we have now, which is why I say this isn't ready for adoption. I completely agree with AMD saying that the lower and mid ranges also need to be capable of it before they actually start implementing the first features.
 