Thursday, September 14th 2023

NVIDIA GeForce RTX 4070 Could See Price Cuts to $549

NVIDIA's GeForce RTX 4070 12 GB graphics card finds itself embattled against the recently launched AMD Radeon RX 7800 XT, and board partners from NVIDIA's ecosystem plan to do something about it, reports Moore's Law is Dead. A GIGABYTE custom-design RTX 4070 Gaming OC graphics card was spotted listed at $549 on the web, deviating from the SKU's $599 MSRP, which hints at what new pricing for the RTX 4070 could generally look like. At $549, the RTX 4070 would still sell at a $50 premium over the RX 7800 XT, probably banking on better energy efficiency and features such as DLSS 3. NVIDIA partners could take turns pricing their baseline custom-design RTX 4070 products below the MSRP on popular online retail platforms; we don't predict an official price cut that applies across all brands, forcing them all to lower their prices to $549. We could also see NVIDIA partners review pricing for the RTX 4060 Ti, which faces stiff competition from the RX 7700 XT.
Source: Moore's Law is Dead (YouTube)

130 Comments on NVIDIA GeForce RTX 4070 Could See Price Cuts to $549

#76
fevgatos
AusWolfYou just contradicted yourself. First you said there's no marketing in GPUs, and now you're bringing up examples of it. Of course that's marketing!

And even if you already know about the existence of GPU X, you can still go to E3, Gamescom, etc. to drool over it and/or Nvidia/AMD's presentations, and/or try the card first-hand. This is what marketing is.

Or why do you think reviewers get free units? So that they can tell everyone what a great product it is (or at least generate some publicity) and more people will buy it. Simple.

There's TONS of marketing in GPUs!
I'm not contradicting anything. The type of marketing you are referring to (which I already described in my very first post, btw) is only directed at people who are already aware of Nvidia, AMD, and technology in general. Both companies have a similar amount of exposure when it comes to this type of marketing.
#77
Dr. Dro
AusWolfNvidia's partners have been crying about too-small profit margins for years; EVGA has even dropped out of the game. And now, they suddenly let the 4070 go $50 cheaper out of the kindness of their hearts, with no interference from Nvidia whatsoever. Tsk-tsk-tsk. :rolleyes:

Anyway, a 12 GB card should not cost more than $500 MSRP. At that price, a 16 GB (7800XT) vs DLSS (4070) battle would be exciting. $550 is still too much, imo.
If I may be perfectly honest, I don't think Nvidia feels threatened by the 7800 XT.

The RTX 4080 was never considered a very popular card due to its high cost, yet it still has more than twice the market share of the 7900 XTX according to Steam, and the 4090 outsold both combined. This trend holds throughout the entire stack, with even previous-generation Nvidia GPUs heavily outselling AMD's most popular models: the 3080 Ti, at 0.84% market share, somehow exceeds the number of Steam Decks ("AMD Van Gogh") floating around at 0.81%... and even the ancient 1070, at 1.48%, still comfortably exceeds the RX 580, AMD's all-time best seller and most successful GPU ever (currently at 0.99%).

Steam stats are still the most accurate metrics we have regarding actual GPUs, other than maybe the GPU-Z metrics.
#78
AusWolf
fevgatosI'm not contradicting anything. The type of marketing you are referring to (which I already described in my very first post, btw) is only directed at people who are already aware of Nvidia, AMD, and technology in general.
fevgatosWhat marketing? GPUs aren't even a product that's really marketed anywhere.
You can't have it both ways.
#79
fevgatos
AusWolfYou can't have it both ways.
Uhm, okay bro
#80
AnotherReader
Dr. DroIf I may be perfectly honest, I don't think Nvidia feels threatened by the 7800 XT.

The RTX 4080 was never considered a very popular card due to its high cost, yet it still has more than twice the market share of the 7900 XTX according to Steam, and the 4090 outsold both combined. This trend holds throughout the entire stack, with even previous-generation Nvidia GPUs heavily outselling AMD's most popular models: the 3080 Ti, at 0.84% market share, somehow exceeds the number of Steam Decks ("AMD Van Gogh") floating around at 0.81%... and even the ancient 1070, at 1.48%, still comfortably exceeds the RX 580, AMD's all-time best seller and most successful GPU ever (currently at 0.99%).

Steam stats are still the most accurate metrics we have regarding actual GPUs, other than maybe the GPU-Z metrics.
I think GPU-Z metrics would skew towards enthusiasts, so they would be less likely to give an indication of the mass market. The most accurate metrics would come from retailers across a broad sampling of geographies; barring that, Amazon might be enough, as it's the one with the greatest reach. Unfortunately, that data source is not available, so we have to make do with Steam.
#81
rv8000
Dr. DroIf I may be perfectly honest, I don't think Nvidia feels threatened by the 7800 XT.

The RTX 4080 was never considered a very popular card due to its high cost, yet it still has more than twice the market share of the 7900 XTX according to Steam, and the 4090 outsold both combined. This trend holds throughout the entire stack, with even previous-generation Nvidia GPUs heavily outselling AMD's most popular models: the 3080 Ti, at 0.84% market share, somehow exceeds the number of Steam Decks ("AMD Van Gogh") floating around at 0.81%... and even the ancient 1070, at 1.48%, still comfortably exceeds the RX 580, AMD's all-time best seller and most successful GPU ever (currently at 0.99%).

Steam stats are still the most accurate metrics we have regarding actual GPUs, other than maybe the GPU-Z metrics.
I'll never understand people who bought a 4080. You get a slower card in rasterization for $200 more than a 7900XTX, and the gaping hole between the 4080 and 4090 is much worse than in previous gens, with a smaller price difference. If someone has $1,200-$1,400 to toss at a 4080, there's no doubt in my mind they have the cash to pony up for a 4090.

As far as feeling threatened goes, it's hard to care about reality when they're completely out of touch with the consumer (gamer-oriented) market; that's obviously the least of their concerns. It's clear AIBs realize how poor a value the 4070 was/is vs the 7800XT. In the grand scheme, the only parties that stand to win in this market are Nvidia and AMD anyway.
#82
fevgatos
rv8000I'll never understand people who bought a 4080. You get a slower card in rasterization for $200 more than a 7900XTX,
$200 extra is peanuts if it allows you to play games the 7900XTX doesn't, like the previously mentioned Cyberpunk.
#83
rv8000
fevgatos$200 extra is peanuts if it allows you to play games the 7900XTX doesn't, like the previously mentioned Cyberpunk.
Pretty sure you missed the second point, then: if $200 is peanuts, they should have bought a 4090.

No one's interested in your massive Nvidia bias anyway.
#84
fevgatos
rv8000Pretty sure you missed the second point, then: if $200 is peanuts, they should have bought a 4090.

No one's interested in your massive Nvidia bias anyway.
You just said there is no point in buying a 4080 over a 7900XTX, but I'm the one with the bias?

The delusion of some AMD fans... bonkers.
#85
ARF
AusWolfDo you have a source on this? This is getting interesting.
Yes.

Demand for graphics cards significantly increased during the pandemic as some people spent more time at home playing games, whereas others tried to mine Ethereum to get some cash. But it looks like now that the world has re-opened and Ethereum mining on GPUs is dead, demand for desktop discrete GPUs has dropped dramatically. In fact, shipments of discrete graphics cards hit a ~20-year low in Q3 2022, according to data from Jon Peddie Research.

www.tomshardware.com/news/sales-of-desktop-graphics-cards-hit-20-year-low

If you keep up with the chatter on PC building forums, you’ve probably sensed the current apathy toward discrete graphics cards. Now there’s data to back up that gut impression. A new report from analysts at Jon Peddie Research shows a nearly 13 percent drop in shipments in the first quarter of 2023 compared to the previous quarter—and almost a 40 percent drop year to year.



www.pcworld.com/article/1947496/oof-desktop-gpu-sales-are-down-almost-40-percent-year-to-year.html
#86
The Norwegian Drone Pilot
Again, even with the price cuts on the 'RTX 4070', it doesn't even come remotely close to what the 'RX 7800 XT' offers in both raw performance and value, when you consider that the 'RX 7800 XT' also comes with 'Starfield', worth 99 dollars. DLSS and RT on the NVIDIA cards are nice to have, but not '70-80% more expensive in cost per frame' nice to have.

The RTX 4070 might have gotten away with it if it had been like 25% (max) more in cost per frame over the RX 7800 XT, but not with 70-80%, LOL.

Here is a video about the price cuts and how the NVIDIA cards stand against AMD.
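The "cost per frame" being argued over here is just price divided by average FPS. A quick sketch with purely hypothetical prices and frame rates (placeholder numbers, not benchmark results) shows how such a premium is computed:

```python
# Cost per frame = price / average FPS.
# All figures below are hypothetical placeholders, not real benchmark data.
def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per frame of average performance."""
    return price_usd / avg_fps

rx_7800_xt = cost_per_frame(499.0, 100.0)  # assumed $499 MSRP, 100 FPS
rtx_4070 = cost_per_frame(549.0, 95.0)     # assumed $549 street price, 95 FPS

premium_pct = (rtx_4070 / rx_7800_xt - 1) * 100
print(f"RTX 4070 premium per frame: {premium_pct:.0f}%")
```

With these made-up numbers the premium works out to roughly 16%; the 70-80% figures quoted in the thread would require much larger gaps in price or performance (e.g. heavy ray-tracing scenarios).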

#87
AusWolf
I just got my code for Starfield (it took me 8 days and 2 emails with the store's customer service, but oh well), and it's the premium edition.

That means it's $100 (£90) off the price I paid, not $70 (£60), which makes the 7800 XT essentially a $400 GPU. Shame the promo only lasts until 30 Sep.
#88
las
AusWolfPS and Xbox games don't use the same graphics settings and resolutions as their PC versions. A month ago, I was under the assumption that a 6700-6750 XT or 3070 Ti would be enough for rock-solid 60 FPS at 1080p in every game for the next 3-5 years. Then Starfield came out. So, like I said: just in case. ;) Games may or may not need this much VRAM in the near future, and they may or may not run out of GPU horsepower first. But it's better to play it safe. I'm pretty sure similar conversations went on with the 30-series and all the 8 GB cards in it back in the day.
8GB is not the problem in Starfield. It is shitty optimization across the board. The game is heavily CPU-bound and doesn't even look good. I am playing it right now. It looks far better with DLAA, though. The FSR2 implementation in Starfield is garbage and part of the default presets, which is kinda funny. FSR is just horrible compared to DLSS, even when modded in. TechPowerUp tested this; go read.

Just another AMD-sponsored title where DLSS is not present. AMD hates when FSR is directly compared with DLSS. Luckily for RTX users, 9 out of 10 new games have DLSS/DLAA, and most of the time the remaining games get a mod on day one, like Starfield did. Because no RTX user wants to settle for inferior FSR when they can do DLSS/DLAA.

Besides, AMD GPUs don't even render the sun in Starfield -> www.pcgamer.com/in-starfield-the-sun-literally-doesnt-shine-on-amd-gpu-users/
Talk about lowering immersion. Planets look so boring and lifeless without the sun present.

Starfield is overrated and overhyped anyway. Glad I got it for free.

And guess what: the only games that had problems with 8GB cards have been AMD-sponsored titles -> www.amd.com/en/gaming/featured-games.html
Mostly rushed console ports like TLOU Remake.

Coincidence? Probably not :laugh: However, patches fixed the issues anyway.

The 4060 Ti 8GB performs just like the 16GB model in minimum fps at 4K -> www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/35.html

At the end, the review even states that there's no performance uplift from the 16GB version:
  • No significant performance gains from 16 GB VRAM
Meanwhile, most AMD users think they need 16GB for 1080p :roll:

So yeah, keep thinking VRAM will save you. It won't. The GPU will be the limiting factor long before VRAM becomes a problem 99% of the time.

AMD and AMD users talk about VRAM because they have nothing else to talk about. Sad but true.
#89
Eskimonster
lasStop the BS and look at reality. 12GB is plenty for 1440p gaming, and even 4K gaming.

A card like the 7800XT can't even utilize 16GB because the GPU is too weak. Future-proofing makes no sense when the GPU will be what stops you from maxing out games.

More VRAM does not help when the GPU is the limiting factor. And this is why the 3070 8GB still beats the 6700XT 12GB in 4K gaming in 2023 overall with ease -> www.techpowerup.com/review/amd-radeon-rx-7800-xt/32.html

And they launched at the same MSRP in 2020/2021, or 20 dollars apart, but the 3070 launched like 6 months before the 6700XT. AMD is always too late to the party, really.

A 4070 with DLSS enabled would destroy a 7800XT with ease. They perform within 5% of each other without it, and the 4070 uses 50 watts less and was released almost half a year earlier. Nothing about the 7800XT is impressive if you ask me. We had this kind of performance several years ago. However, it's a fine mid-range card; AMD just released it too late, because they already talk about RDNA4 in 2024, which will replace the 7800XT -> RDNA4 will have no high-end products, so Nvidia doesn't even have to respond before 2025.

The 4070 should have a slightly higher price than the 7800XT because of superior features and RT performance. 9 out of 10 people that buy a GPU today don't solely look at raster performance like it's 2015. AMD is missing killer features like DLSS/DLAA/DLDSR. No way AMD can ask a higher price when all they have is FSR, VSR, and mediocre RT performance and features in general.

AMD is always the cheaper option, with subpar features to follow and less optimized drivers overall. Nothing new here. Resale value is also much lower on AMD GPUs (less demand and AMD price drops over time make them almost unsellable after a few years).

You can keep rambling all you want; what you say is exactly what people said about the 6700XT because it had 12GB, and it aged like milk anyway because the GPU is not good enough to utilize it. The 3060 Ti 8GB aged just as well, was 79 dollars less on release, and has DLSS to save performance, while 6700XT users have identical raster perf but only FSR to help them when the card becomes too slow, and it already is considered slow in many new games.

I can't stop laughing when people think VRAM will make their GPU future-proof... Not a single time has this been the case. Even the 3090 24GB feels kinda slow for 4K gaming today, and people thought it would be able to max out games in 4K for many years because of its 24GB of VRAM... Yet the 4070 Ti 12GB beats it overall in 4K gaming today -> www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html

The 7900XT is also only 5 fps faster than the 4070 Ti in 4K minimum fps across all the games tested. Meaning 12GB is plenty, and the 4070 Ti can always enable DLSS or DLAA to make up for it, or to improve visuals further than the 7900XT is capable of.

AMD severely lacks proper AA like DLAA. Amazing, and the best AA solution hands down.
Well spoken
#90
lexluthermiester
EskimonsterWell spoken
Except that what they said is factually incorrect.
#91
kapone32
las8GB is not the problem in Starfield. It is shitty optimization across the board. The game is heavily CPU-bound and doesn't even look good. I am playing it right now. It looks far better with DLAA, though. The FSR2 implementation in Starfield is garbage and part of the default presets, which is kinda funny. FSR is just horrible compared to DLSS, even when modded in. TechPowerUp tested this; go read.

Just another AMD-sponsored title where DLSS is not present. AMD hates when FSR is directly compared with DLSS. Luckily for RTX users, 9 out of 10 new games have DLSS/DLAA, and most of the time the remaining games get a mod on day one, like Starfield did. Because no RTX user wants to settle for inferior FSR when they can do DLSS/DLAA.

Besides, AMD GPUs don't even render the sun in Starfield -> www.pcgamer.com/in-starfield-the-sun-literally-doesnt-shine-on-amd-gpu-users/
Talk about lowering immersion. Planets look so boring and lifeless without the sun present.

Starfield is overrated and overhyped anyway. Glad I got it for free.

And guess what: the only games that had problems with 8GB cards have been AMD-sponsored titles -> www.amd.com/en/gaming/featured-games.html
Mostly rushed console ports like TLOU Remake.

Coincidence? Probably not :laugh: However, patches fixed the issues anyway.

The 4060 Ti 8GB performs just like the 16GB model in minimum fps at 4K -> www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/35.html

At the end, the review even states that there's no performance uplift from the 16GB version:
  • No significant performance gains from 16 GB VRAM
Meanwhile, most AMD users think they need 16GB for 1080p :roll:

So yeah, keep thinking VRAM will save you. It won't. The GPU will be the limiting factor long before VRAM becomes a problem 99% of the time.

AMD and AMD users talk about VRAM because they have nothing else to talk about. Sad but true.
That 4060 Ti result you are referencing is only due to the weakness of the GPU. Please tell me you think a 6600XT is as fast as a 6700XT, or better yet a 6800XT. What you don't understand is that AMD targets and markets its cards for specific resolutions. The 20GB VRAM buffer on the 7900XT is one of the reasons it is a great GPU, regardless of the narrative.
#92
Eskimonster
lexluthermiesterExcept that what they said is factually incorrect.
I see this discussion Muh like
#93
AusWolf
EskimonsterI see this discussion Muh like
I hate such rankings. What do you rank them by? Price? Performance at 1080p? Performance at 4K? Power consumption? You can't just say one GPU is bad and the other one is good. There are too many factors that keep changing all the time.
#94
Eskimonster
AusWolfI hate such rankings. What do you rank them by? Price? Performance at 1080p? Performance at 4K? Power consumption? You can't just say one GPU is bad and the other one is good. There are too many factors that keep changing all the time.
That's exactly what they argue about in the beginning; there are so many things you really can't compare. It's more like an "I give up" rating.
#95
lexluthermiester
EskimonsterI see this discussion Muh like
There is but one flaw in your reply: you're citing Steve from Hardware Unboxed, someone known to have a lop-sided and screwy perspective on many things. Objectivity, impartiality, and logical thinking are NOT his bedfellows. His opinion is NOT to be taken seriously. And no, I didn't watch that video; it's not worth my time. I see Steve, I avoid. Hell, LinusTechTips is a better source of info...
#96
AusWolf
lexluthermiesterThere is but one flaw in your reply: you're citing Steve from Hardware Unboxed, someone known to have a lop-sided and screwy perspective on many things. Objectivity, impartiality, and logical thinking are NOT his bedfellows. His opinion is NOT to be taken seriously. And no, I didn't watch that video; it's not worth my time. I see Steve, I avoid. Hell, LinusTechTips is a better source of info...
Nobody is a good source of info these days. The best course of action (imo) is to look at review charts with games that you like, or want to play, and draw your own conclusions. Talk is just talk. Screw that.
#97
lexluthermiester
AusWolfNobody is a good source of info these days.
While I feel you on that point, we're all human, and everyone is going to have their preferences. However, my point was that Steve is completely off the rails with his expressions very often. I just can't watch his nonsense anymore, and I can't in good conscience recommend anyone take him seriously. His channel has become tat and his opinions dubious at best.
#98
AusWolf
lexluthermiesterWhile I feel you on that point, we're all human, and everyone is going to have their preferences. However, my point was that Steve is completely off the rails with his expressions very often. I just can't watch his nonsense anymore, and I can't in good conscience recommend anyone take him seriously. His channel has become tat and his opinions dubious at best.
I cannot recommend any YouTube channel for reliable, sensationalism-free information, to be honest. They make money from the views, which they only get if they provide "content" (I hate this word). The louder the better. The only thing I can recommend in this day and age is looking at the charts, looking up prices, and skipping the blah-blah.
#99
las
kapone32That 4060TI that you are referencing is only due to the weakness of the GPU. Please tell me you think a 6600XT is as fast as a 6700XT or better yet 6800XT. What you don't understand is AMD targets and markets it's cards for specific resolutions. The 20GB VRAM buffer on the 7900XT is one of the reasons it is a great GPU, regardless of the narrative.
The 20GB of VRAM is pointless because the GPU is not fast enough to utilize it anyway.

In 5 years, when 20GB of VRAM might be needed to max out demanding games, the 7900XT will be trash-bin-worthy :laugh:

A lot of VRAM never future-proofed any GPU. The GPU is the limiting factor 99% of the time.
#100
lexluthermiester
lasThe 20GB of VRAM is pointless because the GPU is not fast enough to utilize it anyway.
That is an opinion. Some apps can use it. And if devs wrote better draw-distance code, 16GB & 20GB would be amazing in most games.
lasIn 5 years, when 20GB of VRAM might be needed to max out demanding games, the 7900XT will be trash-bin-worthy :laugh:
You need to improve your math skills. 8GB is barely enough for 4K gaming right now. The moment the industry hits 8K displays for gaming, 20GB will not be enough.

But I digress... We seem to be getting a bit off topic...