
NVIDIA Stock Falls 2.1% After Turing GPU Reviews Fail to Impress Morgan Stanley

The shoe's on the other foot now. Usually it's AMD that comes up with something new and has to wait for games to use it before taking full advantage of any speed-ups it enables. With RTX, it's nVidia's turn to wait for games to use it before they get that advantage. AMD has had plenty of problems in those situations; let's see how nVidia fares with theirs.

I see two problems with nVidia's approach:

1 - RTX seems to be only for top-end cards, which account for quite a small minority of the whole card ecosystem, so I seriously doubt game developers will spend that many resources for the benefit of the very few

2 - by pricing the cards that have RTX so high, it only exacerbates problem #1 because it drives away potential buyers. Most people will skip the 2080 because its performance is mostly tied with the 1080 Ti while the 2080 is quite a bit more expensive (assuming they already have a 1080 Ti), and the 2080 Ti is far too pricey
 
The same people are angry because the RTX 2080 isn't 30% faster than the GTX 1080 Ti in old games?

Come again? The 2080, which costs about as much as a 1080 Ti, a two-year-old card, is how much faster than the 1080 Ti, which is, wait, still a two-year-old card?

#StopOffendingBritneyNvidia

 
Please buy our totally new Turinglake, totally different to Pascalake and Maxwellake. We increased prices because it has some new features we can't show you yet, but believe us, they are great.
 
Please buy our totally new Turinglake, totally different to Pascalake and Maxwellake. We increased prices because it has some new features we can't show you yet, but believe us, they are great.

Even better: buy more and save money while doing so...
 
Please buy our totally new Turinglake, totally different to Pascalake and Maxwellake. We increased prices because it has some new features we can't show you yet, but believe us, they are great.

Sadly, this will work. There isn't anything to turn people away from Nvidia.
 
Sadly, this will work. There isn't anything to turn people away from Nvidia.

Yes there is: price. It's really quite simple :D

Nvidia probably knows this, too. It's a deliberate pricing structure that suits the early-adoption status of RTRT and keeps Pascal relevant.
 
Yes there is: price. It's really quite simple :D

Nvidia probably knows this, too. It's a deliberate pricing structure that suits the early-adoption status of RTRT and keeps Pascal relevant.

I guess we'll see the results in a few months. I still think enough people will buy this, despite what Wall Street says at the moment.
 
I guess we'll see the results in a few months. I still think enough people will buy this, despite what Wall Street says at the moment.

Well, it depends on where you look. If you look at the whole marketplace, top-end GPUs are really a tiny minority, and they always have been. If you look at the whole of console + PC gaming, it's even worse. But in our direct environment, yes; look at TPU alone, despite how bad the proposition really is. This is also why many on this forum say new tech is the next best thing: they forget that there is a world outside the enthusiast scene, and in the end it's that world that needs to carry these changes.

One of the most telling figures you can find right now is 4K monitor adoption. I believe it's still below 5%. Among TPU users, I bet it's easily around 30% or more.
 
I had been using two GTX 560 Ti 2GB editions in SLI since 2012 and one burned out, so a few months ago I finally upgraded to Maxwell: a GTX 970 reference card @ 1500 MHz. I know it's an old generation, but I am loving it, so all that RTX stuff is not important to me.
 
I had been using two GTX 560 Ti 2GB editions in SLI since 2012 and one burned out, so a few months ago I finally upgraded to Maxwell: a GTX 970 reference card @ 1500 MHz. I know it's an old generation, but I am loving it, so all that RTX stuff is not important to me.
Considering Turing seems to be Maxwell 1.3 with Tensor on top of it, you're not losing much.
 

True, but the machine learning bits that basically make it possible are pretty much NVIDIA's proprietary baby.
Exactly! That was precisely my thinking behind that claim (in that rather hypothetical scenario)!

I thought it was disappointing initially, but actually it is brilliant. They already have a stockpile of the previous generation that is as good as, if not better than, the competition depending on use case. Now they've released a product that basically costs twice as much, with a feature that you can't test and benchmark until long after people buy it. That feature, to top it off, is basically useless, because nobody who buys top tier plays at 1080p.

So they will probably have limited availability because the dies are so large and yields will be poor, but they get to fully test the arch and RTX AND make a large profit doing it, because they priced it so high. Meanwhile, everyone who doesn't want terrible value along with high performance will buy up all the stock of the previous generation at above MSRP. After that, 10nm or 7nm, whatever they end up using, will be ready, and they can release this same product on the node shrink and gain even more performance without having to develop another arch. Plus, the enhancements they get from switching nodes, and whatever they learn from this beta test, will let the RTX tech actually be useful by the time the 3000 series arrives.

They will get to sell out all their old stock without having to drop prices, sell all the 2000 series at insane prices, and beta test RTX all in one go. …
I have had that train of thought since the very moment they announced the RTX 2xxx cards about a month ago, and even well before that date. It instantly popped into my head, and I literally saw their intentions that very moment. Though to be fair, I already knew back then that they were sitting on that pile of greed (for profit), so it was actually pretty clear. It was like reading an open book, and I was like, "Oh Jensen, you dirty son of a bitch! That's what you're going to do, huh? You're going to force your own customers to pay for your everlasting avarice just because YOU greedy bastards squandered yourselves on the mining boom, eh?"

So I wasn't really after stealing anyone's limelight, especially yours, but unfortunately I actually have to, since I've been saying this for weeks, even months, but no one believed me. Nah, just kidding!


I'm rather glad I'm not the only one who saw something like this coming quite soon (or the intentions behind the whole schedule). At first I thought I was going mad, or even insane, but quite soon I realised how this was going to unfold. That's, by the way, why nVidia postponed their high-end Volta for the consumer market at the very last minute, just hours before they were initially scheduled to present Volta for the masses at Hot Chips 30. I knew that day something was coming, since they were already sitting on huge stocks of GTX 10x0 cards.

The whole thing actually is indeed brilliant, and pretty dirty at the same time!


As far as I recall it was only disgruntled AMD fans that bashed Nvidia for lacking async and told us that it was such a great feature to have... while in real life the performance difference is zero. […]
The funny thing is that the impact (or the non-existent improvements, however you want to look at it) nVidia shows on Vulkan wouldn't be there if they hadn't dropped the hardware scheduler they were still featuring on Fermi (dropped with Kepler). They dropped it (and instead just crammed more execution units/shaders into the die space that freed up) to effectively force the games industry to stay on DirectX 11 (since nVidia GPUs were faster only there), rather than step up and build something for the future.

So they literally crippled their cards (for the future, i.e. for Vulkan and DirectX 12) to gain a speed improvement in present-day DirectX 11, and this was done completely on purpose.
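For anyone wondering what "async" actually means in this context: at the API level it comes down to the GPU exposing compute-capable queue families alongside the graphics one, so a game can keep compute shaders in flight while it renders. A minimal, headless Vulkan sketch (just queue-family enumeration, assuming the Vulkan SDK and loader are installed; not anyone's benchmark code from this thread) shows where that capability surfaces:

```cpp
// Minimal, headless Vulkan sketch: enumerate queue families to show what
// "async compute" means at the API level: a compute-capable queue family
// next to the graphics one, so compute work can be scheduled alongside
// rendering. Build with the Vulkan SDK and link against the loader (-lvulkan).
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Bare instance, no extensions or layers needed for enumeration.
    VkInstanceCreateInfo ici{};
    ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) {
        std::printf("no Vulkan instance available\n");
        return 1;
    }

    uint32_t gpu_count = 0;
    vkEnumeratePhysicalDevices(instance, &gpu_count, nullptr);
    std::vector<VkPhysicalDevice> gpus(gpu_count);
    vkEnumeratePhysicalDevices(instance, &gpu_count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        std::printf("%s\n", props.deviceName);

        uint32_t family_count = 0;
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &family_count, nullptr);
        std::vector<VkQueueFamilyProperties> families(family_count);
        vkGetPhysicalDeviceQueueFamilyProperties(gpu, &family_count, families.data());

        for (uint32_t i = 0; i < family_count; ++i) {
            int graphics = (families[i].queueFlags & VK_QUEUE_GRAPHICS_BIT) ? 1 : 0;
            int compute  = (families[i].queueFlags & VK_QUEUE_COMPUTE_BIT) ? 1 : 0;
            // A compute-capable family separate from the graphics one is what
            // lets a title submit compute work "asynchronously" to rendering.
            std::printf("  family %u: graphics=%d compute=%d queues=%u\n",
                        i, graphics, compute, families[i].queueCount);
        }
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

Whether work submitted to a separate compute queue actually overlaps with rendering, and how well, is exactly the scheduling question being argued about here.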
 
The funny thing is that the impact (or the non-existent improvements, however you want to look at it) nVidia shows on Vulkan wouldn't be there if they hadn't dropped the hardware scheduler they were still featuring on Fermi (dropped with Kepler). They dropped it (and instead just crammed more execution units/shaders into the die space that freed up) to effectively force the games industry to stay on DirectX 11 (since nVidia GPUs were faster only there), rather than step up and build something for the future.

So they literally crippled their cards (for the future, i.e. for Vulkan and DirectX 12) to gain a speed improvement in present-day DirectX 11, and this was done completely on purpose.

It would have, but the reality is that the adoption rate of these new APIs is extremely low, developing for them is more expensive if you want to extract any performance advantage at all, and the purpose of DX12 was not to advance gaming per se, but to lift the low end to greater performance: many-core, low-clockspeed CPUs such as the ones you find in mobile devices and laptops, but also the consoles. It's no coincidence DX12 happily coincided with the Xbox One (no, it wasn't Mantle pushing for change that caused MS to push out DX12, it was the console deal).

You have shown a good bit of insight that I can only agree with, but let's apply that here, too... ;) The trend has never changed: Nvidia optimizes around the market's (software) common denominator, and AMD optimizes around new or improved hardware features, where the software is expected to follow and make use of them. It's no surprise that the latter is the riskier approach. It's simply a choice in how you develop hardware, and it's the way these two companies try to differentiate.

The irony: RTX is completely counterintuitive to all of that :D
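The "many-core, low-clockspeed CPU" point is easier to picture with a sketch of what the explicit APIs permit: every core records its own command list and only the final submission is serialized, instead of one driver thread doing all the work as in DX11. This is a conceptual C++ sketch only; CommandList, record_commands() and submit() are hypothetical stand-ins rather than real DX12/Vulkan calls:

```cpp
// Conceptual sketch only: CommandList, record_commands() and submit() are
// hypothetical stand-ins for per-thread command-list recording and queue
// submission in DX12/Vulkan-style APIs; they are not real API calls.
#include <cstdio>
#include <functional>
#include <string>
#include <thread>
#include <vector>

// Stand-in for a real command list (ID3D12GraphicsCommandList, VkCommandBuffer).
struct CommandList { std::vector<std::string> commands; };

// Each worker thread records its own slice of the frame independently.
void record_commands(CommandList& list, int first_draw, int last_draw) {
    for (int i = first_draw; i < last_draw; ++i)
        list.commands.push_back("draw " + std::to_string(i));
}

// Single submission point: the queue receives the pre-recorded lists at once.
void submit(const std::vector<CommandList>& lists) {
    std::size_t total = 0;
    for (const auto& l : lists) total += l.commands.size();
    std::printf("submitted %zu commands recorded on %zu threads\n",
                total, lists.size());
}

int main() {
    const int threads = 4;             // e.g. a low-clocked many-core console CPU
    const int draws_per_thread = 1000;

    std::vector<CommandList> lists(threads);
    std::vector<std::thread> workers;

    // A DX11-style driver funnels this work through one thread; the explicit
    // APIs let slow, many-core CPUs spread recording across all cores.
    for (int t = 0; t < threads; ++t)
        workers.emplace_back(record_commands, std::ref(lists[t]),
                             t * draws_per_thread, (t + 1) * draws_per_thread);
    for (auto& w : workers) w.join();

    submit(lists);
    return 0;
}
```

On a high-clocked desktop chip the single-threaded DX11 path is often fast enough, which is part of why adoption of the new APIs has been slow, as argued above.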
 
If I could like this post 10 times, I would. I was thinking the same thing, but you put it into words better than I could...

Don't forget Intel is entering the dedicated GPU market in 2020. We will have 3 players playing ball as 7nm Vega 2 also hits in late 2019. Whether Intel is only focusing on the mid-tier, who knows, but I doubt it. Arizona should have Intel's 7nm factory built by 2021 or so... times might be changing around 2023 or so.

That being said... I'm going 7nm AMD CPU and GPU in late 2019 and not caring anymore for 5+ years.
 
Don't forget Intel is entering the dedicated GPU market in 2020. We will have 3 players playing ball as 7nm Vega 2 also hits in late 2019. Whether Intel is only focusing on the mid-tier, who knows, but I doubt it. Arizona should have Intel's 7nm factory built by 2021 or so... times might be changing around 2023 or so.

That being said... I'm going 7nm AMD CPU and GPU in late 2019 and not caring anymore for 5+ years.

I thought I read somewhere that Intel was first going after content producers / the high end... but I hope they'll eventually hit the mid-tier.
 
I thought I read somewhere that Intel was first going after content producers / the high end... but I hope they'll eventually hit the mid-tier.

High end to start would be best, to make Nvidia actually innovate some instead of milking us.
 
I thought I read somewhere that Intel was first going after content producers / the high end... but I hope they'll eventually hit the mid-tier.

They should, because the approach of only targeting the midrange is doomed to fail.
 
So after all this nonsense they're back over Morgan Stanley's $273 prediction, but AMD is currently down 5%, OMG!!!1

The joys of the stock market.
 
It's so easy you'll laugh. It will be the exact same thing they did with the 9xx series vs the 7xx: they will purposely cripple performance in new titles and call it "no more optimization done for older generations" ;)
That's... not gimping. Gimping a card would decrease its performance in OLDER titles, reducing already established performance.

Not enhancing performance for new titles is not gimping, it's being put on long-term support. Not optimizing != crippling. It's the same thing AMD did with every generation up to GCN, and the same thing Nvidia has done every single generation going back to the GeForce 256 days.

If you guys want never-ending driver updates, perhaps you shouldn't be complaining about a $1200 price tag.
 
That's... not gimping. Gimping a card would decrease its performance in OLDER titles, reducing already established performance.

Not enhancing performance for new titles is not gimping, it's being put on long-term support. Not optimizing != crippling. It's the same thing AMD did with every generation up to GCN, and the same thing Nvidia has done every single generation going back to the GeForce 256 days.

If you guys want never-ending driver updates, perhaps you shouldn't be complaining about a $1200 price tag.
You should be working for the nGreedia PR department, dude; you would be a perfect match there with that callous attitude towards customers. ;)
 