
NVIDIA GeForce RTX 2000 Series Specifications Pieced Together

btarunr

Editor & Senior Moderator
Staff member
Later today (20th August), NVIDIA will formally unveil its GeForce RTX 2000 series consumer graphics cards. This marks a major change in the brand name, triggered by the introduction of the new RT cores, specialized components that accelerate real-time ray-tracing, a task too taxing for conventional CUDA cores. The RT cores speed up ray-tracing chores such as intersection testing, while the tensor cores crunch the 4x4x4 matrix multiply-accumulate operations needed for DNN acceleration (used, among other things, to denoise ray-traced output). The chips still have CUDA cores for everything else. This generation also debuts the new GDDR6 memory standard, although unlike GeForce "Pascal," the new GeForce "Turing" won't see a doubling in memory sizes.
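For readers unfamiliar with what a tensor core actually computes: its basic per-clock operation is a fused multiply-add on small matrices, D = A×B + C. Below is a plain-Python sketch of the 4x4 case; it illustrates the math only, not how drivers or CUDA actually program the hardware (which works on half-precision inputs with wider accumulation).

```python
def tensor_core_fma(a, b, c):
    """Return D = A*B + C for 4x4 matrices given as nested lists.

    A tensor core performs this whole operation in a single step;
    here we spell it out as ordinary row-by-column multiplication
    followed by an element-wise add of the accumulator matrix C.
    """
    n = 4
    return [[sum(a[i][k] * b[k][j] for k in range(n)) + c[i][j]
             for j in range(n)]
            for i in range(n)]

# Sanity check: A times the identity, plus a zero accumulator, is A again.
identity = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
zeros = [[0] * 4 for _ in range(4)]
a = [[i * 4 + j for j in range(4)] for i in range(4)]
assert tensor_core_fma(a, identity, zeros) == a
```

Deep-learning workloads (and denoising filters) decompose their large matrix multiplications into many of these small tiles, which is why a grid of such units speeds them up so dramatically.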

NVIDIA is expected to debut the generation with the new GeForce RTX 2080 later today, with market availability by the end of the month. Going by older rumors, the company could launch the lower-positioned RTX 2070 and the higher RTX 2080+ by late September, and the mid-range RTX 2060 series in October. Apparently the high-end RTX 2080 Ti could come out sooner than expected, given that VideoCardz already has some of its specifications in hand. Not a lot is known about how "Turing" compares with "Volta" in performance, but given that the TITAN V comes with tensor cores that can [in theory] be re-purposed for similar work, it could continue on as NVIDIA's halo SKU for the client segment.



The RTX 2080 and RTX 2070 series will be based on the new TU104 "Turing" silicon, which physically has 3,072 CUDA cores and a 256-bit-wide GDDR6-capable memory interface. The RTX 2080 Ti is based on the larger TU102 chip. Although the maximum number of CUDA cores on this chip is unknown, the RTX 2080 Ti is reportedly endowed with 4,352 of them, and has a slightly narrower 352-bit memory interface than the 384-bit the chip is capable of. As we mentioned earlier, NVIDIA doesn't seem to be doubling memory amounts, so we could expect 8 GB for the RTX 2070/2080 series and 11 GB for the RTX 2080 Ti.
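The capacity figures follow directly from the bus widths. GDDR6 chips hang off 32-bit channels, so assuming one 8 Gb (1 GB) chip per channel, and assuming the rumored 14 Gbps per-pin data rate (not a confirmed spec), the arithmetic works out like this:

```python
def gddr6_capacity_gb(bus_width_bits, chip_gb=1):
    """Capacity assuming one 8 Gb (1 GB) GDDR6 chip per 32-bit channel."""
    channels = bus_width_bits // 32
    return channels * chip_gb

def bandwidth_gb_s(bus_width_bits, data_rate_gbps=14):
    """Peak bandwidth: bus width in bytes times per-pin data rate."""
    return bus_width_bits // 8 * data_rate_gbps

print(gddr6_capacity_gb(256))  # 8  GB  -> RTX 2070/2080
print(gddr6_capacity_gb(352))  # 11 GB  -> RTX 2080 Ti
print(bandwidth_gb_s(256))     # 448 GB/s
print(bandwidth_gb_s(352))     # 616 GB/s
```

This also shows why the 2080 Ti's odd 11 GB figure arises: disabling one of TU102's twelve 32-bit channels takes the bus from 384-bit/12 GB down to 352-bit/11 GB.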

View at TechPowerUp Main Site
 
Do you really need 16 GB of RAM on your graphics card? We can't even max out the 8 GB the 1080 has.

Ofc we can max them out xD
 
There's only a handful of games where I'd see 9 GB used (at 4K+). Most stay under 6 GB.

Not fussed about more VRAM just yet; the increase in bandwidth is welcome, and I'm sure I'll be wanting more next round. For now, 11 GB will do for 4K, and I'd think 8 GB for lower resolutions won't be a problem for another gen or two.
 
So the real time raytracing feature is for consumers..? How is it going to work for games? Is it going to require specific Nvidia features to be coded into them (like Hairworks, etc)?
 
One of those two extremely similar 2080s seems pointless.
 
Really would have liked a model with 22 GB of RAM, or even 16 GB would have been fine.
 
So the real time raytracing feature is for consumers..? How is it going to work for games? Is it going to require specific Nvidia features to be coded into them (like Hairworks, etc)?

Yes

DX12 + DXR + GameWorks / RTX

Like Physics acceleration a la PhysX

Nvidia said:
The upcoming GameWorks SDK — which will support Volta and future generation GPU architectures — enables ray-traced area shadows, ray-traced glossy reflections and ray-traced ambient occlusion.
 
Last edited:
Maybe the cards will get the same ability to share texture cache over NVLink as the Quadros; then you just buy one more card and an NVLink bridge, and BOOM, you have 22 GB...

I maxed out my 8 GB 1080 when playing Raw Data in early development with a tweaked rendering distance, but I haven't seen that with my 1080 Ti.
 
Why would they only use 12 GB for the Titan V? Intentional hampering, maybe?
 
5 hours to go and we will be blown away by the new GPUs from NVIDIA... or maybe not? Not much longer to go.
 
One memory channel is disabled (hence 352-bit instead of 384-bit). Ray tracing uses less memory, as opposed to regular textures, which can get bloated.

RTX 4000 series next year: 2048/4096/6144 cores, or it stays the same at 1536/3072/4608, just shrunk to half the size.
 
Based on those specs, I don't understand why they call that Ti card a 2080; that beast of a card is more like a 2090...
 
Ofc we can max them out xD
I can confirm that I can max out my 1080 Ti in FF15 XD.

Cards with 11 GB of VRAM are not being maxed out, and there is no reason to have more for the foreseeable future.

You see “maxed out” only because the GPU is storing graphics there because it can, because it is there, NOT because it needs to.
 
My bet is the RTX 2080 on the level of GTX 1080 Ti performance, and the RTX 2080 Ti some 30~35% ahead.
 
I wanna see how the 2080 is going to perform vs the 2080+ with more CUDA cores.
Price is important too. If it's right, I'll think about it.
 
5 hours to go and we will be blown away by the new GPUs from NVIDIA... or maybe not? Not much longer to go.
Why? We already know the performance numbers for the Titan V, which are not impressive in games at all, tbh. Do you think the 2080 Ti will be faster, or worse, the 2080 vanilla??
I don't think so. So overall, I think this will turn out to be one of the most boring generations of cards in recent years...
 
Why? We already know the performance numbers for the Titan V, which are not impressive in games at all, tbh. Do you think the 2080 Ti will be faster, or worse, the 2080 vanilla??
I don't think so. So overall, I think this will turn out to be one of the most boring generations of cards in recent years...
The Titan V is not really a gaming card, and Volta is a bit different from Turing, so it remains to be seen.
 
Wow, same amount of VRAM...

So is it another same old, same old?

Except it's GDDR6, which brings a big jump in bandwidth over GDDR5X. So, yeah, same amount of memory (which, as has been stated, can't even be maxed out in most gaming scenarios), but much faster at what it does.
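To put a rough number on that jump: comparing the GTX 1080's 10 Gbps GDDR5X with the rumored 14 Gbps GDDR6 (an assumption, not a confirmed spec), both on a 256-bit bus:

```python
def bus_bandwidth_gb_s(width_bits, gbps_per_pin):
    """Peak memory bandwidth: bus width in bytes times per-pin data rate."""
    return width_bits // 8 * gbps_per_pin

gddr5x = bus_bandwidth_gb_s(256, 10)  # GTX 1080: 320 GB/s
gddr6 = bus_bandwidth_gb_s(256, 14)   # rumored RTX 2080: 448 GB/s
print(gddr6 / gddr5x)                 # 1.4x at these data rates
```

The "almost double" figure holds against plain 8 Gbps GDDR5, or against GDDR6's 16 Gbps spec ceiling; at the rumored launch data rate, it is a 40% uplift over GDDR5X at the same bus width.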
 
Based on those specs, I don't understand why they call that Ti card a 2080; that beast of a card is more like a 2090...

This is just my guess, but the 9 was probably used to indicate two GPUs on the same card, like the GTX 295, the GTX 590, or the GTX 690. Since the 700 series, that spot was taken once by the Titan Z, and after that no other dual-GPU card was ever released for consumers, so NVIDIA never used the 9 in a model number again.

I bet Nvidia would only consider that if AMD suddenly launched some overwhelmingly powerful and efficient GPU... which will probably not be the case for a long time.

EDIT: Honestly, Nvidia should just make the 2080+ card the "default" 2080, they are so similar I don't see the point of launching both of them as different products...
 