Tuesday, January 7th 2020

EVGA GeForce RTX 2060 KO Pictured, Possibly NVIDIA's First Response to RX 5600 XT

At CES, we went hands-on with the EVGA GeForce RTX 2060 KO graphics card, and its price came as the biggest surprise: $299. This could very well be NVIDIA's first response to AMD's Radeon RX 5600 XT: a new line of RTX 2060 graphics cards under $300, with RTX support being the clincher. The EVGA card looks like it's built down to a cost: a roughly 20 cm length, a simple twin-fan cooling solution, and just three display connectors, including a legacy DVI-D. It still gets a full-length backplate. The KO runs at NVIDIA-reference clock speeds for the RTX 2060. EVGA is planning a premium KO Ultra SKU with factory-overclocked speeds comparable to those of the RTX 2060 iCX, priced at a small premium. EVGA says the RTX 2060 KO will launch next week (January 13 or later).
Add your own comment

95 Comments on EVGA GeForce RTX 2060 KO Pictured, Possibly NVIDIA's First Response to RX 5600 XT

#51
FordGT90Concept
"I go fast!1!11!1!"
notb
Image sharpening improves image quality?
Are you one of those people who think upscaling photos makes them look better? :)
RIS makes rendered edges sharper, which counters the graininess from upscaling. Look up reviews from people who tested the feature: it works great and will likely be standard in Scarlett/PS5.


This is off topic so I'm dropping it.
Posted on Reply
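(The sharpening-after-upscaling idea FordGT90Concept describes can be sketched with a plain unsharp-mask pass; the kernel, values and function names below are illustrative only, not AMD's actual CAS shader.)

```python
def upscale_nearest(img, factor):
    """Nearest-neighbour upscale of a 2D grayscale image (list of lists)."""
    return [[img[y // factor][x // factor]
             for x in range(len(img[0]) * factor)]
            for y in range(len(img) * factor)]

def sharpen(img, amount=0.5):
    """Unsharp-mask-style sharpen: push each pixel away from its
    4-neighbour average, clamped to the 0-255 range."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]  # border pixels left untouched
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            avg = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]) / 4
            val = img[y][x] + amount * (img[y][x] - avg)
            out[y][x] = max(0, min(255, int(round(val))))
    return out

# A vertical edge: dark grey on the left, light grey on the right.
src = [[50, 50, 200, 200] for _ in range(4)]
big = upscale_nearest(src, 2)   # 8x8 image, same edge
sharp = sharpen(big)            # undershoot/overshoot at the edge boosts
                                # local contrast, which reads as "sharper"
```

Flat regions are untouched (the neighbour average equals the pixel), so only edges gain contrast; that is why this class of filter can mask upscaling blur without actually adding detail.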
#52
cucker tarlson
FordGT90Concept
RIS makes rendered edges sharper, which counters the graininess from upscaling. Look up reviews from people who tested the feature: it works great and will likely be standard in Scarlett/PS5.


This is off topic so I'm dropping it.
I can vouch for that, DLSS with sharpening produces a much better image overall.
Posted on Reply
#53
Zubasa
bug
As opposed to other techniques that speed it up?
Tessellation incurs a performance hit. Shading incurs a performance hit. Lighting incurs a performance hit. If it weren't for all these pesky techniques, we'd be enjoying Wolfenstein at 1,000,000 fps by now.

Edit: More on topic, I think Nvidia has squeezed all there was from Turing by now. Going forward it's Ampere or bust (i.e. whoever didn't buy into Turing by now, most likely never will).
Tessellation was actually introduced to improve performance by reducing the number of draw calls for a given polygon count;
it also makes it easier to change the sub-division of the polygon mesh in real time.
As for shading, there are many techniques that are meant to improve performance, such as giving the illusion of surface texture or unevenness without exploding the polygon count with real geometry.
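(A minimal sketch of that shading trick: normal mapping fakes surface relief by perturbing the lighting normal per pixel instead of adding vertices. Pure Lambert diffuse; all values are illustrative, not from any real engine.)

```python
def normalize(v):
    """Scale a 3D vector to unit length."""
    m = sum(c * c for c in v) ** 0.5
    return tuple(c / m for c in v)

def lambert(normal, light_dir):
    """Lambert diffuse term: max(0, N . L)."""
    n, l = normalize(normal), normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

light = (0.0, 0.0, 1.0)                    # light pointing straight at the surface

flat   = lambert((0.0, 0.0, 1.0), light)   # flat polygon: full brightness
bumped = lambert((0.3, 0.0, 0.95), light)  # same polygon, normal tilted by a
                                           # "normal map" texel: pixel darkens
```

The tilted normal darkens the pixel as if the surface were uneven, at the cost of one dot product per pixel and zero extra geometry.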

cucker tarlson
the iq improvement is there, but not every scene will showcase it.
imo when rt gets adopted and improved over a few years we'll look at rasterized games with disgust.



TBH right now what disgusts me the most is that so many "influenza" (yes, they might as well be a virus) like Digital Foundry keep praising Metro as the best-looking game ever just because it has RT in it.
Meanwhile the texture quality in many places is just facepalm, and it is literally right in the foreground compared to the RT global illumination and shadows further away.
RT alone does not make a good-looking game; it makes more accurate lighting.
The notion has been that because something is more accurate it must be better, when games themselves are artistic representations.
Posted on Reply
#54
Vya Domus
Zubasa
RT alone does not make a good-looking game; it makes more accurate lighting.
The notion has been that because something is more accurate it must be better, when games themselves are artistic representations.
The worst thing is that with stuff such as GI you simply have no sense of how exactly something is supposed to look, because we've had what, 30 years of progress in traditional lighting techniques that have gotten us extremely close to the real thing.

Not only will you have to live with the massive hit in performance, you'll also have to convince yourself you're getting something that's actually better.
Posted on Reply
#55
Zubasa
Vya Domus
The worst thing is that with stuff such as GI you simply have no sense of how exactly something is supposed to look, because we've had what, 30 years of progress in traditional lighting techniques that have gotten us extremely close to the real thing.

Not only will you have to live with the massive hit in performance, you'll also have to convince yourself you're getting something that's actually better.
Performance hit aside:
when I look at a game I notice the texture quality and the details the developers put into the environment.
Many great-looking games were already made without RT, such as the RE2 remake or MHW.
I can tell the developers put the effort in to craft the environment to make it look just as they want it.

What I want to say is:
How often do I care that light X is off by 10 degrees to the right?
Or how long do I stare at a character in a mirror that I won't see except in some cut scenes in first-person games?
I am not going to do all the math in my mind to see if the lighting is exactly X degrees and reflected Y times.
There is far more important work done in games to make them look amazing; RT itself is just a means to an end.
Posted on Reply
#56
64K
I think it's not a good idea to judge the entire value of RTRT right now. It's fairly new stuff for developers, and they need time to learn how to implement it effectively and how not to use it to the extent that it destroys performance. Probably a lot of developers don't have any experience with it at all. There will come a time when they do, and the RTRT experience will be better than it is today.

The other big complaint about RTX cards is the expense. Well, when has new hardware not been expensive? The early adopters of GPUs with Tensor and RT cores will pay the price for the rest of us, and prices should come down over time just like they did with SSDs and 4K monitors. R&D costs have to be recouped from some customers.

Having said that I think Nvidia has been overcharging in addition to the above statement due to lack of competition. That will change this year hopefully.
Posted on Reply
#57
milewski1015
notb
And most will start to praise RTRT when AMD starts supporting it. That's the point.
I couldn't care less about it, regardless of which company is supporting it.

notb
In general: people criticize RTRT for not providing enough IQ improvement, but at the same time many assume games should only be played at highest settings.
Going from medium to high/ultra doesn't change much in many AAA titles, while fps can drop by 30% or even more.

Imagine the situation, when "medium" is the best setting we're used to and suddenly Nvidia adds a "magic feature" that provides higher modes (with the performance cost we observe today).
Bloodbath on forums.
RTRT doesn't provide enough IQ improvement and brings with it too much of a performance hit (at this point in time anyway) in my opinion to warrant it being a deciding factor when buying a GPU. Similar things could be said about other graphics settings too - I'll happily sacrifice things like uber-realistic shadows and volumetric lighting for better performance. As you mention, everybody feels like they have to play on ultra, when a majority of the time you can have a game that looks essentially the same as ultra while playing on a combo of medium/high while performing significantly better. I guess it's just dependent on each person's preferences.
Posted on Reply
#58
Cheeseball
Not a Potato
nguyen
EVGA is Nvidia's most prominent AIB, so they have some leeway to make their own products outside of Nvidia's product stack, like this 2070 Super with 15.5 Gbps GDDR6:
https://www.evga.com/products/Specs/GPU.aspx?pn=7e7e3085-1fe0-4dc1-98ec-74be1221441e
I guess this version of the 2060 exists within EVGA only.
There was a 2070 Super with the faster GDDR6? Where the hell was this 6 months ago when I was deciding between the 5700 XT and this Super?

And what the hell is this name: EVGA GeForce RTX 2070 SUPER FTW3 ULTRA+

Wait, nevermind. Screw this pricing of $609 (the MSRP of the 2070 Super is $499), I would rather get the 2080 Super instead.
Posted on Reply
#59
Assimilator
Is this the weekly "RTRT is a worthless technology by evil NVIDIA to artificially increase graphics card prices, waaaah" thread?

Allow me to repeat for the millionth time, nobody is forcing you to buy NVIDIA's products so why the f**k do you crybabies care about the cost? Buy an AMD GPU and in the process you'll solve 3 problems: you won't be giving money to "NGREEDIA", you won't be paying more for features that are supposedly useless (since AMD cards don't have them), AND you'll no longer have a reason to whine endlessly on forums. Everyone wins - especially the people who are tired of literally every GPU thread getting drowned by a fecal matter torrent of AMD fanboys telling us that RTRT is useless for the thousandth time.
Posted on Reply
#60
medi01
cucker tarlson
the iq improvement is there, but not every scene will showcase it.
imo when rt gets adopted and improved over a few years we'll look at rasterized games with disgust.



Looking at these for 2 minutes, I'm still not sure which of the two is supposed to be disgusting, cough. :D
Posted on Reply
#61
$ReaPeR$
Assimilator
Is this the weekly "RTRT is a worthless technology by evil NVIDIA to artificially increase graphics card prices, waaaah" thread?

Allow me to repeat for the millionth time, nobody is forcing you to buy NVIDIA's products so why the f**k do you crybabies care about the cost? Buy an AMD GPU and in the process you'll solve 3 problems: you won't be giving money to "NGREEDIA", you won't be paying more for features that are supposedly useless (since AMD cards don't have them), AND you'll no longer have a reason to whine endlessly on forums. Everyone wins - especially the people who are tired of literally every GPU thread getting drowned by a fecal matter torrent of AMD fanboys telling us that RTRT is useless for the thousandth time.
so, anyone who criticizes nshitia's RTRT is an amd fanboy.. but you, the wise and objective judge of character, support a feature that even the top-of-the-line 2080 Ti can barely use. but the critics are the "fanboys".. sad.

(on topic) the segmentation of the gpu market is becoming more ridiculous by the day. it will be very funny when we reach the point of each respective gpu offering 1fps difference from the previous or the next one in line.
Posted on Reply
#62
bug
Super XP
I agree. These image enhancements do take a performance hit, just not as deep as Ray Tracing. RT ain't polished yet.
We must remember things differently then. Pixel shading didn't incur as big a performance hit because it was adopted gradually over 10 years or so, but tessellation's performance hit was so big it took seven years between ATI's first implementation and DX adding support for it. To this day, we still cringe when we hear about HairWorks or TressFX.
Super XP
It needs to mature, I believe next gen consoles will be the answer because they really have no choice as both M$ and Sony have been touting about RT support.
No arguing there, but saying "it needs to mature" about a tech at its first generation is a truism.
Super XP
M$'s latest info that came out is 4k/120 w/ RT enabled. And they also spoke about 8k support. 8k is useless now and for the foreseeable future.
Sony's PlayStation 5 claims 4k/60 w/ RT enabled.
Despite what marketing would have you believe, consoles won't do anything remotely resembling 4k. They'll do what they always do: upscale.

FordGT90Concept
Image Sharpening and Boost. Boost dynamically lowers render resolution to get more FPS. Image Sharpening can make a lower-resolution render look like it is higher at little frame-time cost.
I was talking about stuff that improves image quality. It would be shocking to see something that downgrades quality to incur any kind of performance hit.
Posted on Reply
#63
notb
FordGT90Concept
RIS makes rendered edges sharper, which counters the graininess from upscaling. Look up reviews from people who tested the feature: it works great and will likely be standard in Scarlett/PS5.
OMG. So now we'll improve quality by upscaling and sharpening edges. That is just sad. :o
Have you ever (I mean: ever) read anything about digital photography editing? Even an article in Playboy?

You see, that's why RTRT has such a hard time being understood: it's just impossible to convince some people that more pixels, more sharpness and more saturation don't improve image quality.
It's not like I'm that surprised, since many people tend to prefer photos from smartphones over those from high-end cameras for the same reason.

So as I said: RTRT is just not for everyone. But it's also not compulsory, so no harm done, right? :)
Posted on Reply
#64
nguyen
Cheeseball
There was a 2070 Super with the faster GDDR6? Where the hell was this 6 months ago when I was deciding between the 5700 XT and this Super?

And what the hell is this name: EVGA GeForce RTX 2070 SUPER FTW3 ULTRA+

Wait, nevermind. Screw this pricing of $609 (the MSRP of the 2070 Super is $499), I would rather get the 2080 Super instead.
https://www.newegg.com/evga-geforce-rtx-2070-super-08g-p4-3175-kr/p/N82E16814487477?Description=evga%202070%20super%20sc%20ultra%2b&cm_re=evga_2070_super_sc_ultra%2b-_-14-487-477-_-Product

The SC2 Ultra+ model is currently being sold for $560 (after $20 off), so only $40 more than the normal SC2. This model could very well compete with the 2080 on equal footing.
Posted on Reply
#65
Zubasa
notb
OMG. So now we'll improve quality by upscaling and sharpening edges. That is just sad. :eek:
Have you ever (I mean: ever) read anything about digital photography editing? Even an article in Playboy?

You see, that's why RTRT has such a hard time being understood: it's just impossible to convince some people that more pixels, more sharpness and more saturation don't improve image quality.
It's not like I'm that surprised, since many people tend to prefer photos from smartphones over those from high-end cameras for the same reason.

So as I said: RTRT is just not for everyone. But it's also not compulsory, so no harm done, right? :)
There is a bit of a misconception about RIS: unlike DLSS, it doesn't actually reduce the native render resolution.
That some people use RIS to offset the blurriness of rendering the game at a lower resolution is another story; RIS does not change the native resolution on its own.
Right now there is no way to stop DLSS from rendering the game at a lower resolution and then upscaling it.
Posted on Reply
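(Zubasa's distinction can be made concrete with a toy shading-cost comparison: a DLSS-style path renders fewer pixels internally and upscales, while a RIS-style path shades every native pixel and only filters the finished frame. The numbers below are illustrative.)

```python
NATIVE = (3840, 2160)   # 4K output resolution

def pixels_shaded(resolution, render_scale=1.0):
    """Pixels the GPU actually shades at a given internal render scale."""
    w, h = resolution
    return int(w * render_scale) * int(h * render_scale)

native_cost = pixels_shaded(NATIVE)                   # RIS path: full ~8.3 MP
dlss_cost = pixels_shaded(NATIVE, render_scale=0.5)   # e.g. 1080p internal

ratio = dlss_cost / native_cost   # 0.25: a quarter of the shading work,
                                  # which is where the DLSS speed-up comes from
```

Sharpening at native resolution costs only a cheap post-process pass, which is why it has little frame-time cost but also cannot reduce shading work.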
#66
SIGSEGV
wtf. KO?
Ti, Super, KO, 11, 10, 20, 21.... :confused::confused::confused:
I expect 2960 WO edition, then I am gonna buy it. :laugh:
Posted on Reply
#67
Cheeseball
Not a Potato
nguyen
https://www.newegg.com/evga-geforce-rtx-2070-super-08g-p4-3175-kr/p/N82E16814487477?Description=evga%202070%20super%20sc%20ultra%2b&cm_re=evga_2070_super_sc_ultra%2b-_-14-487-477-_-Product

The SC2 Ultra+ model is currently being sold for $560 (after $20 off), so only $40 more than the normal SC2. This model could very well compete with the 2080 on equal footing.
I'm avoiding buying stuff from NewEgg due to their warranty and returns policies.

EDIT: I found an Amazon listing. God damn it @nguyen this is tempting.
Posted on Reply
#68
xkm1948
Cheeseball
I'm avoiding buying stuff from NewEgg due to their warranty and returns policies.

EDIT: I found an Amazon listing. God damn it @nguyen this is tempting.
I thought you already have a fairly good 5700XT?
Posted on Reply
#69
Super XP
bug
We must remember things differently then. Pixel shading didn't incur as big a performance hit because it was adopted gradually over 10 years or so, but tessellation's performance hit was so big it took seven years between ATI's first implementation and DX adding support for it. To this day, we still cringe when we hear about HairWorks or TressFX.

No arguing there, but saying "it needs to mature" about a tech at its first generation is a truism.

Despite what marketing would have you believe, consoles won't do anything remotely resembling 4k. They'll do what they always do: upscale.


I was talking about stuff that improves image quality. It would be shocking to see something that downgrades quality to incur any kind of performance hit.
Next Generation Consoles WILL do true 4K, that I'm 110% sure of.
4k TVs are a dime a dozen nowadays.
Posted on Reply
#70
Cheeseball
Not a Potato
xkm1948
I thought you already have a fairly good 5700XT?
I do, but I want to try out 2070 Super and higher cards.
Posted on Reply
#71
bug
Super XP
Next Generation Consoles WILL do true 4K, that I'm 110% sure of.
Because if a 2080Ti barely handles 4k and HDR at the same time, consoles will totally have no trouble breezing through that :kookoo:
Super XP
4k TVs are a dime a dozen nowadays.
Totally unrelated, but OK. And you're probably thinking of TVs without proper HDR support; those aren't that cheap.
Posted on Reply
#72
cucker tarlson
bug
Because if a 2080Ti barely handles 4k and HDR at the same time, consoles will totally have no trouble breezing through that :kookoo:
they will
upscaled, at medium-high pc settings

the new ps5 gpu is a 9 tflop rdna one, so basically a 5700xt with RT support.
Posted on Reply
#73
bug
cucker tarlson
they will
upscaled, at medium-high pc settings

the new ps5 gpu is a 9 tflop rdna one, so basically a 5700xt with RT support.
That's what I said, but then SuperXP felt the need to post he's 110% sure next gen consoles will do "real 4k".
Posted on Reply
#74
cucker tarlson
bug
That's what I said, but then SuperXP felt the need to post he's 110% sure next gen consoles will do "real 4k".
he's 100% sure amd will do 4k 60 RT with no RT hardware cause RT cores do nothing
Posted on Reply
#75
medi01
Assimilator
nobody is forcing
Nobody is forcing you to read everyone's comment, right?


The thread is about NV's alleged answer to AMD's product.
The main differentiating factor here is RTX.
If you think RTX on the 2060 is viable, good for you.
If it isn't, heck, good for you.

Why shut people up?
Posted on Reply
Add your own comment