
NVIDIA Readies TU104-based GeForce RTX 2070 Ti

btarunr

Editor & Senior Moderator
Staff member
Update: GIGABYTE has come out of the gates dismissing this as a typo on their part, which is disappointing, though not unexpected, considering that there is no real reason for NVIDIA to launch a new SKU into a market virtually absent of competition. More tiers of graphics cards just give the consumer more options, and why offer an option that might steer customers away from more expensive cards?

NVIDIA designed the $500 GeForce RTX 2070 around its third-largest "Turing" silicon, the TU106. Reviews posted late Tuesday summarize the RTX 2070 as offering roughly the same performance as the GTX 1080 from the previous generation, at the same price. Generation-to-generation, the RTX 2070 offers roughly 30% more performance than the GTX 1070, but at a 30% higher price, in stark contrast to the GTX 1070, which offered 65% more performance than the GTX 970 at just 25% more money. NVIDIA's RTX varnish is still nowhere in sight. That said, NVIDIA is not on solid ground with the RTX 2070, and there's a vast price gap between it and the $800 RTX 2080. GIGABYTE all but confirmed the existence of a SKU in between.
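For perspective, here is a quick back-of-the-envelope sketch of how performance-per-dollar moved across these generations, using only the performance and price deltas quoted above (the figures are the article's, not independently measured):

Code:
# Perf-per-dollar change vs. the predecessor, from the ratios quoted above.
def value_gain(perf_ratio, price_ratio):
    """Relative change in performance-per-dollar versus the previous card."""
    return perf_ratio / price_ratio - 1.0

# GTX 970 -> GTX 1070: +65% performance for +25% price
print(f"GTX 1070 vs GTX 970:  {value_gain(1.65, 1.25):+.0%} perf/$")  # +32%
# GTX 1070 -> RTX 2070: +30% performance for +30% price
print(f"RTX 2070 vs GTX 1070: {value_gain(1.30, 1.30):+.0%} perf/$")  # +0%

In other words, by the article's own numbers, the RTX 2070 delivers essentially zero generational improvement in performance-per-dollar, where the GTX 1070 delivered roughly a third more.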





Called the GeForce RTX 2070 Ti, this SKU will likely be carved out of the TU104 silicon, since NVIDIA has already maxed out the TU106 with the RTX 2070. The TU104 features 48 streaming multiprocessors (SMs), compared to the 36 on the TU106. NVIDIA has the opportunity to carve out the RTX 2070 Ti by cutting down the TU104's SM count to somewhere between those of the RTX 2070 and the RTX 2080, while leaving the memory subsystem untouched. With this, it could come up with a SKU that's sufficiently faster than the GTX 1070 Ti and GTX 1080 from the previous generation, and rake in sales around the $600-650 mark, or roughly half the price of the RTX 2080 Ti.
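To put rough numbers on that, here is a minimal sketch of where such a cut could land. Turing packs 64 CUDA cores per SM; the 2070 Ti entry below is purely a hypothetical midpoint, not a confirmed configuration:

Code:
# CUDA-core math for the known Turing SKUs plus a hypothetical 2070 Ti.
CORES_PER_SM = 64  # CUDA cores per Turing streaming multiprocessor

skus = [
    ("RTX 2070 (TU106, fully enabled)", 36),
    ("RTX 2070 Ti (hypothetical midpoint)", 41),  # assumption, not confirmed
    ("RTX 2080 (TU104, cut down)", 46),
    ("TU104 (full silicon)", 48),
]

for name, sm_count in skus:
    print(f"{name}: {sm_count} SMs -> {sm_count * CORES_PER_SM} CUDA cores")

A midpoint cut along these lines would land around 2,600 CUDA cores, comfortably between the RTX 2070's 2,304 and the RTX 2080's 2,944.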

View at TechPowerUp Main Site
 
Aren't Ti cards usually released a year after the initial release of the series? I wonder what's up with Nvidia trying to fill up the product stack ASAP
 
Nice, another Ti to fill the performance gap.
 
Nvidia is starting to develop a weird fetish of closing 20% performance gaps between cards with another card. The 1070 Ti is a prime example of it.

The price gap is also not that big between the $500-600 RTX 2070 and the $750-900 RTX 2080. This basically leaves a $700 slot. Just bizarre stuff.

Nvidia, what about the sub-$400 market, instead of shoving another overpriced Turing into this stack?

The only thing scary this Halloween is going to be RTX prices
 
You need to sacrifice a LOT of candy for the "free" t-shirt.
 
The RTX 2070 Ti/2070 cards, the most pointless cards ever released by nGreedia.
 
The RTX 2070 Ti/2070 cards, the most pointless cards ever released by nGreedia.
What exactly makes them pointless? From what I have seen in reviews and the gaming press, 2070s seem to perform a little better than the 1080 for the same price, and with some cards (notably EVGA's) coming in at MSRP, the pricing is not that outrageous either. The 2070 actually seems to end up being a reasonable Turing.
 
If anyone asks, I was the guy in the distance saying "what?".

What are they thinking? I hope RTG pulls their act together and crushes Nvidia's windpipe, because ever since the launch of Turing, everything is just a ****show in terms of pricing. Come to think of it, a lot has been on a downward spiral since the last mining craze.

It's that time again where I am just not impressed by what has happened in the past 1.5 years of the tech world.

I think they are trying to prove us right, that since they don't have competition they're just gonna price the cards up the bum and that's fair.
 
Let's not forget that NVIDIA's product stack has 13 different models if you combine the Pascal and Turing lineups. With the 2070 Ti being announced, that makes 14. With a $650 price tag, it's gonna be somewhere between "untouchable" and "barely affordable". Gonna stick with my soon-to-be-watercooled GTX 970 until Turing becomes relevant in another... 2-3 years, which is by the time most consumers will have adopted ray-tracing tech.
 
What exactly makes them pointless?

The price... the lowest non-blower ASUS DUAL starts at 699€ and the STRIX at 769€, while you can get a decent GTX 1080 for as low as 500€.

It consumes the same amount of power and is barely faster than a two-year-old GTX 1080, while the TU106 core will probably be close to useless for RTX, judging by the demos we have seen so far. So apart from DLSS, which is basically ML-assisted upscaling, what incentive would one have to go for an RTX 2070 instead of a GTX 1080 right now?
 
There is something completely wrong here. It's turned into a pure money harvester.

Is it intentional that they made the 2070 slower than the 1080 Ti, but pricier, so that the 2070 Ti would be justified?

And for what... there are no games out there to support the fancy ray tracing...
 
What exactly makes them pointless? From what I have seen in reviews and the gaming press, 2070s seem to perform a little better than the 1080 for the same price, and with some cards (notably EVGA's) coming in at MSRP, the pricing is not that outrageous either. The 2070 actually seems to end up being a reasonable Turing.
It is pointless because it is GTX 1080 performance for almost GTX 1080 Ti money... how is that not a pointless card? All reviews quote the "$499" MSRP, even though it will never be seen in the wild. That is all NVIDIA wanted: people like you see "+5% vs GTX 1080", see the "$499", add "but it is RTX future-proof", and conclude "well, it is not so bad at all, why the hate". And now NVIDIA will release this RTX 2070 Ti = "+12% vs GTX 1080" at maybe "$519" (like who cares, it is not like that number needs to be anywhere close to market prices), and you will be like: "now that is the deal of the century".
 
This is definitely a generation worth skipping. It's sad. I have the funds to pick up a 2080 Ti, but I would never buy it at that price. If the card had 2 to 3 times the performance of a 1080 Ti, I might have been able to justify breaking the $1,000 spending threshold on a GPU. But two years of development for a 15-20 fps increase at that price makes my stomach turn. Right now I have an ASUS Strix 1080 Ti I bought on launch day, driving to Newegg to personally pick it up and throw it into my living room gaming rig. Then I got bored last year and picked up a Zotac 1080 Ti Mini and threw it into my bedroom computer (Dell 3850 mini tower). My hobby is upgrading my systems, but Nvidia definitely lost my vote of confidence this generation. The 1080 Tis run all my games at 4K/60 already, or so close to 60 that the upgrade to the 20 series wouldn't matter.

I share others' sentiments: if AMD can release a GPU with 2080 Ti performance for $600-700, minus the currently useless ray tracing, I would buy my first Radeon card since the HD 5870.
 

getting a "fresh" Pascal GPU right now isn't going to do justice when developers starts implementing ray tracing to their games. Just no. What happens when games & 3D rendering programs only uses ray tracing as their go-to rendering choice? Your Pascal or Volta card may just be a relic of a silicon. As much as I hated how Nvidia does for their new product lineup, one thing I'll be certain is that next year Pascal cards will flood the used GPU market coz everyone wanted to ride the ray-tracing bandwagon. So it's either you future-proof now or regret later.
 
getting a "fresh" Pascal GPU right now isn't going to do justice when developers starts implementing ray tracing to their games. Just no. What happens when games & 3D rendering programs only uses ray tracing as their go-to rendering choice? Your Pascal or Volta card may just be a relic of a silicon. As much as I hated how Nvidia does for their new product lineup, one thing I'll be certain is that next year Pascal cards will flood the used GPU market coz everyone wanted to ride the ray-tracing bandwagon. So it's either you future-proof now or regret later.

I'm not sure if this is a joke post, but I'll bite. Are you saying that next year's cards won't have even better ray tracing capabilities, so we'd be crazy not to buy now? Or how about when there are, I don't know, like 5 ray-tracing-compatible games available to purchase around mid-2020? How long did it take for tessellation to get fully implemented in games? Well, in 2018, after all this time, tessellation is still not in most games that are released. You almost certainly have a couple of years and a few GPU generations before ray tracing is applied in game development in any meaningful way. Why? Because developers and publishers want to make money, and with nearly the entire customer base lacking the capability or horsepower (even with this 1st generation of RTX cards) to effectively run ray tracing (without cutting resolution and fps in half), they still want to sell games. I would choose 4K/60 with HDR over 1080p/30 with ray tracing any day of the week, and the current GPUs are not strong enough that we don't have to choose. Pascal cards are going to be fine until the new consoles hit in 2020. That would be a great year to upgrade, but not now.
 
getting a "fresh" Pascal GPU right now isn't going to do justice when developers starts implementing ray tracing to their games. Just no. What happens when games & 3D rendering programs only uses ray tracing as their go-to rendering choice? Your Pascal or Volta card may just be a relic of a silicon. As much as I hated how Nvidia does for their new product lineup, one thing I'll be certain is that next year Pascal cards will flood the used GPU market coz everyone wanted to ride the ray-tracing bandwagon. So it's either you future-proof now or regret later.


I totally get where you are coming from, but at the same time, games won't be "ray tracing only" for a good few years; it would not make sense for developers to limit their market to the small % of people who will be getting RTX cards compared to the masses that have older gens. There will easily be another few generations of GPUs out before they make that switch. It would be like the government taking away all petrol stations because electric cars are starting to come out. I know I'm keeping my 1070 for the next few years; you cannot just buy every new tech for the sake of "future proofing", because then you would be buying every new-gen GPU that comes out, which would not make it future proofing.
 
1080 Tis are available on eBay at a $499 Buy It Now price, and they are faster and cheaper than the RTX 2070.
 
Aren't Ti cards usually released a year after the initial release of the series? I wonder what's up with Nvidia trying to fill up the product stack ASAP
I've been thinking this as well. This and other factors suggest that NVidia might believe, or know, that AMD is about to release something big to answer their lineup. It is also equally possible that this hurried release schedule is a new business plan.
 
Imho it's not worth buying those GPUs now. I expect a 7 nm revision or a 3000 series next year. Additionally, new AMD GPUs coming next year are another reason for a refresh of the current GPUs. They probably rushed out all these products to milk as much as possible before the new 7 nm ones arrive.
 
I expected to hear about a 2060 Ti with a cut-down TU106.
Yeah, on 7 nm the nonexistent 1,000 mm² chip with 6144 CUDA cores gets shrunk to 300 mm², and then to 250 mm² on 7N+, and that becomes the 70/80-class 104 chip. So everything we see today is pretty much obsolete. I'm not buying until the 40 series, it seems. Not unless I see a 2060 (half a 2080, no RT) at $200.
 
The pricing with this generation of cards is insane. Each performance gain comes with a price hike of the same percentage. Anybody remember those ASUS Ares cards, or the first Titan with its $1k price tag, and didn't we back then think it was pure insanity buying those cards at those prices? How has it become normal to pay that much for a single component now...
 
Imho it's not worth buying those GPUs now. I expect a 7 nm revision or a 3000 series next year. Additionally, new AMD GPUs coming next year are another reason for a refresh of the current GPUs. They probably rushed out all these products to milk as much as possible before the new 7 nm ones arrive.

I totally agree; it seems Nvidia knows something we don't...
 
Yields are low, so this is a way to get rid of faulty 2080/2080 Ti chips.
 
I noticed that nearly everyone here has their panties on too tight. The article is pure speculation.

Has everyone got too much free time, so that they can spend it getting upset about speculation by our editor in chief (who sourced exactly nothing)?
 