Tuesday, July 7th 2020

NVIDIA GeForce RTX 3070 and RTX 3070 Ti Rumored Specifications Appear

NVIDIA is slowly preparing to launch its next-generation Ampere graphics cards for consumers, after the A100 GPU arrived for data-center applications. The Ampere lineup is the subject of more leaks and speculation every day, so we can assume that the launch is near. The most recent round of rumors brings new information about the GPU SKUs and memory of the upcoming GeForce RTX 3070 and RTX 3070 Ti. Thanks to Twitter user kopite7kimi, whose past speculations have repeatedly proven accurate, we now have information that the GeForce RTX 3070 and RTX 3070 Ti use a GA104 GPU SKU, paired with GDDR6 memory. The catch is that the Ti version will feature new GDDR6X memory, which is faster and can reportedly reach speeds of up to 21 Gbps.

The regular RTX 3070 is supposed to have 2944 CUDA cores on a GA104-400 GPU die, while its bigger brother, the RTX 3070 Ti, is designed with 3072 CUDA cores on a GA104-300 die. Paired with the new technologies the Ampere architecture brings, along with the new GDDR6X memory, the GPUs are set to be very good performers. Both cards are estimated to reach a memory bandwidth of 512 GB/s. That is all we have so far. NVIDIA is reportedly in the Design Validation Test (DVT) phase with these cards and is preparing for mass production in August. The official launch should follow before the end of this year, with some speculation pointing to September.
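As a sanity check on that 512 GB/s figure, peak memory bandwidth is simply bus width times per-pin data rate. Assuming GA104 keeps a 256-bit memory bus (an assumption typical for this GPU class, not confirmed by the leak), the math works out as follows:

```python
def memory_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed 256-bit bus for GA104 (not confirmed by the leak).
print(memory_bandwidth_gbps(256, 16.0))  # 16 Gbps GDDR6  -> 512.0 GB/s
print(memory_bandwidth_gbps(256, 21.0))  # 21 Gbps GDDR6X -> 672.0 GB/s
```

Note that the rumored 21 Gbps GDDR6X would exceed 512 GB/s on a 256-bit bus, so the quoted bandwidth figure implies either a slower memory clock or a narrower bus on one of the cards.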
Sources: VideoCardz, TweakTown, kopite7kimi (Twitter)

104 Comments on NVIDIA GeForce RTX 3070 and RTX 3070 Ti Rumored Specifications Appear

#101
cat1092
How about some real video upgrades & not just more & faster memory, as well as bandwidth?

We've had HDMI 2.0(b) & DisplayPort 1.4, as well as HDR on capable hardware, since at least the GTX 1000 series in late 2015/early 2016 (even the lowly 2GB MSI GT 1030 GDDR5 can do this as a x4 card), & some of the 900 series can deliver the same with a DisplayPort firmware update released 2-3 years ago. Since then, 4K TV's have come a long way; there are several models now boasting HDMI 2.1 running 4K at 120 Hz. These cards cannot keep up with HDMI 2.1 speeds, unless future spec sheets show otherwise.
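To put rough numbers on why 4K at 120 Hz doesn't fit through HDMI 2.0: the uncompressed video data rate is pixels × refresh rate × bits per pixel. This back-of-the-envelope sketch ignores blanking overhead, so the real requirement is somewhat higher, but the conclusion is the same:

```python
def video_data_rate_gbps(width: int, height: int, refresh_hz: int,
                         bits_per_pixel: int = 24) -> float:
    """Approximate uncompressed video data rate in Gbps (active pixels only, no blanking)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_20_MAX_DATA_GBPS = 14.4  # HDMI 2.0: 18 Gbps link, ~14.4 Gbps after 8b/10b encoding
HDMI_21_MAX_DATA_GBPS = 42.7  # HDMI 2.1: 48 Gbps link, ~42.7 Gbps after 16b/18b encoding

rate = video_data_rate_gbps(3840, 2160, 120)  # 4K @ 120 Hz, 8-bit RGB
print(f"4K120 needs ~{rate:.1f} Gbps")  # ~23.9 Gbps: over HDMI 2.0, within HDMI 2.1
```

So even before blanking overhead, 4K 120 Hz at 8-bit RGB needs well over what HDMI 2.0 can deliver, while fitting comfortably inside HDMI 2.1.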

There are many of us who aren't hardcore gamers (if at all), but rather video enthusiasts, & in this respect the RTX 3000 series (as well as the 2000 series) doesn't deliver. It cannot run 4K at 120/144 Hz! There'll be more & more of these 4K TV's with HDMI 2.1, & they'll trickle down to value brands such as VIZIO. 4K at 120 Hz is an official spec of HDMI 2.1, as is 8K at 60 Hz (there are quite a few of these on the market as well). Try gaming on these & you'll be disappointed when the card cannot keep up with the native refresh rate of the TV. Pricing is getting competitive; soon we'll be seeing these for less than the cost of a mid-range (RTX 3070) GPU. Furthermore, HDMI 2.1 will make 2K gaming obsolete; those old monitors are pricey & few TV's shipped with the standard (the only one I've seen was at Walmart & I didn't look closely). 4K is dropping in price, as are a few 8K models.

BTW, eARC is also part of the HDMI 2.1 spec, & that alone is a big deal! HDMI 2.0(b) ARC runs at only about 1 Mbps, while eARC raises this to roughly 37 Mbps, far more than an optical cable can carry.

www.digitaltrends.com/home-theater/hdmi-arc-explained-works-care/

Every LG 4K TV released this year has HDMI 2.1 & all of its official specs in full glory. :clap:

www.techradar.com/news/hdmi-arc-vs-earc

Therefore, let's slow down with simply beefing up the memory, bandwidth & bus, & get in tune with 2020 HDMI 2.1/DP 2.0 video standards. After all, these are video cards (not 100% gaming cards), so why are we 5 years w/out an HDMI/DP upgrade? We must demand more in this regard, not only from NVIDIA but also from AMD. While I've been a fan of EVGA for 7-8 years, this could change if another brand provided a GPU in tune with the times. I'm not a fanboy of any brand, other than the one which meets my needs (which is why I've not upgraded my Z97 build with the i7-4790K). I have both Intel & AMD computers (most self-built), & any of my best ones could go another 5 years with an up-to-date GPU. :D

Cat
#102
Valantar
cat1092
How about some real video upgrades & not just more & faster memory, as well as bandwidth?

We've had HDMI 2.0(b) & DisplayPort 1.4, as well as HDR on capable hardware, since at least the GTX 1000 series in late 2015/early 2016 (even the lowly 2GB MSI GT 1030 GDDR5 can do this as a x4 card), & some of the 900 series can deliver the same with a DisplayPort firmware update released 2-3 years ago. Since then, 4K TV's have come a long way; there are several models now boasting HDMI 2.1 running 4K at 120 Hz. These cards cannot keep up with HDMI 2.1 speeds, unless future spec sheets show otherwise.

There are many of us who aren't hardcore gamers (if at all), but rather video enthusiasts, & in this respect the RTX 3000 series (as well as the 2000 series) doesn't deliver. It cannot run 4K at 120/144 Hz! There'll be more & more of these 4K TV's with HDMI 2.1, & they'll trickle down to value brands such as VIZIO. 4K at 120 Hz is an official spec of HDMI 2.1, as is 8K at 60 Hz (there are quite a few of these on the market as well). Try gaming on these & you'll be disappointed when the card cannot keep up with the native refresh rate of the TV. Pricing is getting competitive; soon we'll be seeing these for less than the cost of a mid-range (RTX 3070) GPU. Furthermore, HDMI 2.1 will make 2K gaming obsolete; those old monitors are pricey & few TV's shipped with the standard (the only one I've seen was at Walmart & I didn't look closely). 4K is dropping in price, as are a few 8K models.

BTW, eARC is also part of the HDMI 2.1 spec, & that alone is a big deal! HDMI 2.0(b) ARC runs at only about 1 Mbps, while eARC raises this to roughly 37 Mbps, far more than an optical cable can carry.

www.digitaltrends.com/home-theater/hdmi-arc-explained-works-care/

Every LG 4K TV released this year has HDMI 2.1 & all of its official specs in full glory. :clap:

www.techradar.com/news/hdmi-arc-vs-earc

Therefore, let's slow down with simply beefing up the memory, bandwidth & bus, & get in tune with 2020 HDMI 2.1/DP 2.0 video standards. After all, these are video cards (not 100% gaming cards), so why are we 5 years w/out an HDMI/DP upgrade? We must demand more in this regard, not only from NVIDIA but also from AMD. While I've been a fan of EVGA for 7-8 years, this could change if another brand provided a GPU in tune with the times. I'm not a fanboy of any brand, other than the one which meets my needs (which is why I've not upgraded my Z97 build with the i7-4790K). I have both Intel & AMD computers (most self-built), & any of my best ones could go another 5 years with an up-to-date GPU. :D

Cat
A) it is entirely expected that upcoming GPUs feature HDMI 2.1.
B) You're misrepresenting things by saying we've gone five years without an I/O upgrade - current gen cards mostly launched 1-2 years ago.
C) AFAIK there still isn't a certification process for HDMI 2.1 source devices, just for signal sinks (TVs etc.). How are they then to make something compliant? They certainly couldn't when they last launched a series of GPUs.
D) eARC isn't relevant for GPUs (unless you for some reason want your TV/monitor to send its audio back to your PC?), only for TVs/monitors and receivers/amplifiers/soundbars. As long as your TV and amp support eARC, it doesn't matter if your PC doesn't.
E) You're saying GPU makers should take it easy with performance increases (which are always wanted and "necessary" inasmuch as anything with a GPU can be said to be) to prioritize ... a trivial upgrade that likely represents <1% of the effort of said performance increases. I don't know about you, but I fully expect them to be able to deliver both.
#103
medi01
People should realize DVI is royalty-free, while HDMI is not, on top of HDMI being crippled by useless "OMG people might see it for free" encryption (hacked wide open, as usual) that complicates it unnecessarily.
#104
Valantar
medi01
People should realize DVI is royalty-free, while HDMI is not, on top of HDMI being crippled by useless "OMG people might see it for free" encryption (hacked wide open, as usual) that complicates it unnecessarily.
Did you mean DP? DVI tops out at 2560x1600@60Hz for dual-link cables/connectors. Hardly a modern interface.