
NVIDIA GeForce RTX 4070 Allegedly Launches on April 13

Seeing how the 4070 Ti is losing to the 3080 in some cases, the 4070 will be worse than the 3080 in most cases because of the gimped memory. That's called progress in Jensen land.
The 7900 XT is also losing to the 3080 in some cases, even though the XT launched at a price 15% higher than the 4070 Ti. But I guess that's called progress in Lisa land.

That must be a very specific use case, because no way that's on average with uncapped framerates.

I play with Vsync so I prefer a fixed clock with a manual undervolt. On my 3080, most games do 200-250 W depending on GPU utilization. I saw as much as 280 W in Metro Exodus EE with max usage.
But in Aliens: Fireteam Elite I saw over 300 W in some scenes. And that was on 1800 MHz @ 0.8 V, which is a crazy undervolt compared to the stock 1.05 V. With a power limit at stock voltage, performance would drop like crazy.

A fixed clock is so much better for a capped framerate. 1800 MHz @ 0.8 V and 1905 MHz @ 0.9 V was a difference of ~200 vs. ~250 W, even though the framerate was identical.

Of course all this is with the "prefer max performance" mode. Adaptive can save you more power, but it's horrible for frametimes.
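For context on why those numbers move the way they do: to a first approximation, a GPU's dynamic power scales with clock times voltage squared. Below is a minimal sketch of that rule of thumb using the clocks and voltages quoted above; the P ≈ f·V² model is my assumption, it ignores memory and leakage power, so treat the output as a ballpark.

```python
# Rough P ~ f * V^2 scaling model (an assumption, not measured data beyond
# the ~200 W baseline quoted above). Ignores memory, fan and leakage power.

def scaled_power(p_base_w, f_base_mhz, v_base, f_new_mhz, v_new):
    """Estimate core power at a new clock/voltage from a measured baseline."""
    return p_base_w * (f_new_mhz / f_base_mhz) * (v_new / v_base) ** 2

p_low = 200.0  # ~200 W observed at 1800 MHz / 0.80 V
p_high = scaled_power(p_low, 1800, 0.80, 1905, 0.90)
print(f"Estimated at 1905 MHz / 0.90 V: {p_high:.0f} W")  # ~268 W with this crude model
```

The crude model overshoots the observed ~250 W a bit, which is expected since part of the board power doesn't scale with core voltage, but it shows why a 0.1 V bump costs far more than the ~6% clock increase alone would suggest.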


If the 4070 is $600, I might consider side-grading my 3080. This was my first card with such ridiculous power consumption, and without the undervolt I would've gotten rid of it a long time ago.
It's not impossible. I have my 4090 capped at 320w and it has similar performance to running stock
 
It's not impossible. I have my 4090 capped at 320w and it has similar performance to running stock

That's not 55%. And the 4090 is a top end card that is underutilized in most situations. Find a game that can max out the power draw at stock settings and you will lose more than 5% with a 320 W limit.

[attached chart: power draw with RT]
 
Seeing how the 4070 Ti is losing to the 3080 in some cases, the 4070 will be worse than the 3080 in most cases because of the gimped memory. That's called progress in Jensen land.
I assume you made a typo with that "3080" & meant a 3080 Ti? Even so, the 4070 Ti easily surpasses the 3080 Ti & is neck and neck with the 3090 Ti despite the smaller VRAM & gimped memory bandwidth & memory bus.

The 7900 XT is also losing to the 3080 in some cases, even though the XT launched at a price 15% higher than the 4070 Ti. But I guess that's called progress in Lisa land.

WTF are you & Pumper smoking? Delusional puff puff? :kookoo:
 
I did. I'm not even losing 5%.

Can you post some data in a 4090 thread? I'm curious what the clocks and voltages are in those scenarios. Unless there's a huge difference in voltage, but very small in the clock speed, it doesn't make sense.
 
Can you post some data in a 4090 thread? I'm curious what the clocks and voltages are in those scenarios. Unless there's a huge difference in voltage, but very small in the clock speed, it doesn't make sense.
Where is the 4090 thread? :P
 
Could you run Superposition 1080p Extreme and post a screenshot of the result with the points and FPS visible...? I have never been that interested in using the PL slider; I prefer to adjust the voltage directly, which tames the wattage.

On the other hand, the kind of 160 W scenario you mentioned makes this interesting for laptops. With a TGP of 140 W and above, and as many CUDA cores as the 3072 in the "4060M", I see a possibility of pushing the GPU to 3000 MHz without trouble.

A 140 W TGP follows the logic of more CUDA cores = lower frequency, fewer CUDA cores = higher frequency, to stay within the 140 W limit. :rolleyes:
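To put that cores-vs-frequency logic into numbers, here is a toy model: if each CUDA core burns a roughly fixed amount of power per GHz at a given voltage, the clock a part can hold under a fixed TGP falls as the core count rises. The watts-per-core figure below is a made-up placeholder, purely for illustration.

```python
# Toy model of a fixed 140 W TGP shared across CUDA cores (my assumption,
# not a real spec): sustainable clock ~ TGP / (cores * watts per core per GHz).
WATTS_PER_CORE_PER_GHZ = 0.012  # hypothetical placeholder value

def sustainable_clock_ghz(tgp_w, n_cores):
    """Rough core clock a part could hold inside a given power budget."""
    return tgp_w / (n_cores * WATTS_PER_CORE_PER_GHZ)

for cores in (3072, 4608, 7680):
    print(f"{cores} cores: ~{sustainable_clock_ghz(140, cores):.2f} GHz under 140 W")
```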
I'll run that benchmark later today; here are some earlier findings in Fortnite with and without ray tracing.

With RT

Without RT
 
The article says that the 4070 will have a 192-bit data bus, GDDR6 and 12 GB of RAM. So other than making a mistake in the type of VRAM, GDDR6X vs. GDDR6, where am I wrong?
So, probably 384 GB/s of bandwidth for the non-Ti vs. 504 GB/s for the Ti?
OK, $600-$650 then to compensate for the cheaper memory? No. Nope. Still $650-$700.
Nvidia is moving everything up now that it can.
Was a simple typo. Fixed now to GDDR6X. Thanks.
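For reference, those bandwidth figures follow directly from the bus width and the per-pin data rate: divide the bus width by 8 to get bytes per transfer, then multiply by the effective data rate. A quick check, assuming 16 Gbps GDDR6 for the speculated non-Ti and the 21 Gbps GDDR6X used on the 4070 Ti:

```python
# Peak memory bandwidth (GB/s) = (bus width in bits / 8) * effective data rate (Gbps).

def memory_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(memory_bandwidth_gb_s(192, 16))  # 384.0 GB/s - 192-bit bus, 16 Gbps GDDR6
print(memory_bandwidth_gb_s(192, 21))  # 504.0 GB/s - 192-bit bus, 21 Gbps GDDR6X
```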
 
The 4070 will be priced too high for mainstream, and that's a guarantee. It sure as hell won't launch at the $499 (or the $549 inflation-adjusted equivalent) of the 3070. The GDDR6X will also be power hungry in a market segment that wants sub-250W cards.

The 4060 will be crippled by 8 GB of VRAM. The 3070 already is in several of today's games, and that number is growing rapidly.

Thanks, Nvidia, but you've already f*cked up and you haven't even launched the products yet.
I'm really hoping AMD doesn't drop the ball with the midrange parts this generation. Nvidia, left unchecked, are a plague upon PC gaming. I cannot tell you how many people I've said "buy a console" to in the last 3 years because of dumb pricing - when previously there were good options you could put in a reasonable sub-$1000 build.
 
I am hoping to see a price below 700 on the 4070.

Beyond that, the 4070 Ti at 900 is the choice.


Expecting the 4070 to score around 7750 points in Time Spy Extreme (graphics score); let's see how far off I am. ;D Just my guess, based solely on the rumoured 5888 CUDA cores. But that would be quite far from the 4070 Ti. The 3070 also happens to have 5888, so maybe it will be something different. If only one RAM configuration is coming, I would bet on 12 GB.
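For what it's worth, a rough way to sanity-check a guess like that is to scale a known card's graphics score linearly by shader count and clock. The sketch below does exactly that; the reference score and clocks in it are hypothetical placeholders, not real benchmark results.

```python
# Back-of-the-envelope scaling of a Time Spy Extreme graphics score by CUDA core
# count and boost clock. The reference values below are hypothetical placeholders.

def estimate_score(ref_score, ref_cores, ref_clock_ghz, cores, clock_ghz):
    """Naive linear scaling; ignores bandwidth, cache and architectural differences."""
    return ref_score * (cores / ref_cores) * (clock_ghz / ref_clock_ghz)

# Hypothetical 7680-core reference part scoring 10000 points at 2.6 GHz,
# scaled down to the rumoured 5888 cores at the same clock:
print(round(estimate_score(10000, 7680, 2.6, 5888, 2.6)))  # ~7667 with these made-up inputs
```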
 
The 4070 will be priced too high for mainstream, and that's a guarantee. It sure as hell won't launch at the $499 (or the $549 inflation-adjusted equivalent) of the 3070. The GDDR6X will also be power hungry in a market segment that wants sub-250W cards.

The 4060 will be crippled by 8 GB of VRAM. The 3070 already is in several of today's games, and that number is growing rapidly.

Thanks, Nvidia, but you've already f*cked up and you haven't even launched the products yet.
I'm really hoping AMD doesn't drop the ball with the midrange parts this generation. Nvidia, left unchecked, are a plague upon PC gaming. I cannot tell you how many people I've said "buy a console" to in the last 3 years because of dumb pricing - when previously there were good options you could put in a reasonable sub-$1000 build.
Nvidia is a plague, yet the 4070 Ti is the best raster performance / $ card of this generation. So how can those 2 statements be true at the same time?

And that's excluding the RT performance, DLSS 3, better power draw etc.

The only plague is AMD releasing inferior products for more money than Nvidia.
 
Nvidia is a plague, yet the 4070 Ti is the best raster performance / $ card of this generation. So how can those 2 statements be true at the same time?

And that's excluding the RT performance, DLSS 3, better power draw etc.

The only plague is AMD releasing inferior products for more money than Nvidia.

The xx70 Ti having higher FPS / $ than the xx80 & xx90 is not the accomplishment you pretend it to be.

What generation has that not been true?

"NVIDIA is a plague, yet they released more than one GPU this generation. So how can those 2 statement be true at the same time?"

These are nonsensical points strung together.
 
I'm waiting for a RTX 4070 Super.....

People should not settle for anything less than "super puper" this gen, given the price on those things.

Maybe even "super puper duper" is worth waiting for.

the 4070 Ti is the best raster performance / $ card of this generation
Sure, John:

[chart attachment]
 
Nvidia is a plague, yet the 4070 Ti is the best raster performance / $ card of this generation. So how can those 2 statements be true at the same time?
Uhhh, dude, where are you getting your info from, because it's utter BS.
Here on TPU we use TPU data. If you want to use Nvidia's own marketing numbers, go and argue over on Nvidia.com with the mad fanboys.

[chart attachment: TPU performance-per-dollar data]


Even ignoring the price adjustments, both AMD cards are better performance/$ than any Nvidia 40-series card, period.
Taking real-world pricing into account, it's not even funny. All next-gen cards are offensively priced, IMO, but when it comes to value, Nvidia are still losing, and they're losing hard.

In fact, if you ignore the hypothetical fantasy MSRP of the 4070 Ti there, what you have is a graph where every single AMD card is ahead of every single Nvidia card.
Also, both 3090 cards are ahead of the 4090, the 3080 is ahead of the 4080, and the 3070 Ti is ahead of the 4070 Ti. Nvidia and value? Please, give me two minutes to fetch you a clown costume.
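As an aside, the metric behind charts like that is simple: average relative performance divided by price, then ranked. A minimal sketch with entirely hypothetical performance indices and prices (not TPU's actual review data):

```python
# How a performance-per-dollar ranking is built. The indices and prices below
# are hypothetical placeholders, not TPU's (or anyone's) real data.

cards = {
    # name: (relative performance index, street price in USD)
    "card_a": (100, 800),
    "card_b": (115, 1000),
    "card_c": (90, 650),
}

ranking = sorted(
    ((name, perf / price) for name, (perf, price) in cards.items()),
    key=lambda item: item[1],
    reverse=True,
)

for name, value in ranking:
    print(f"{name}: {value:.3f} performance points per dollar")
```

Which card "wins" depends entirely on the price you plug in, which is exactly why the MSRP-versus-street-price argument below matters so much.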
 
The 7900 XT is also losing to the 3080 in some cases, even though the XT launched at a price 15% higher than the 4070 Ti. But I guess that's called progress in Lisa land.


It's not impossible. I have my 4090 capped at 320w and it has similar performance to running stock
Do you avoid the OP (the 4070) even in Nvidia threads? Wow, lol.

It's AMD this and my 4090 that again, are you serious?

On topic: there are no bad cards, they say; I beg to differ. I wouldn't buy day 1 - it's wise to check in with some reputable reviews from TPU first.

PS: you're still on ignore, but that shit was so funny I had to say hi.
 
Uhhh, dude, where are you getting your info from, because it's utter BS.
Here on TPU we use TPU data. If you want to use Nvidia's own marketing numbers, go and argue over on Nvidia.com with the mad fanboys.

View attachment 287831

Even ignoring the price adjustments, both AMD cards are better performance/$ than any Nvidia 40-series card, period.
Taking real-world pricing into account, it's not even funny. All next-gen cards are offensively priced, IMO, but when it comes to value, Nvidia are still losing, and they're losing hard.

In fact, if you ignore the hypothetical fantasy MSRP of the 4070 Ti there, what you have is a graph where every single AMD card is ahead of every single Nvidia card.
Also, both 3090 cards are ahead of the 4090, the 3080 is ahead of the 4080, and the 3070 Ti is ahead of the 4070 Ti. Nvidia and value? Please, give me two minutes to fetch you a clown costume.
Was using Techspot's numbers - actual prices in the EU at launch were 929 for the cheapest 4070 Ti and 1049 for the cheapest 7900 XT. Actually, even your own graph shows the 4070 Ti beating the 7900 XT :roll:
 
Was using Techspot's numbers - actual prices in the EU at launch were 929 for the cheapest 4070 Ti and 1049 for the cheapest 7900 XT. Actually, even your own graph shows the 4070 Ti beating the 7900 XT :roll:
No you weren't, or at least if you were you weren't reading them properly.

[chart attachment]


On that techspot review, you must have missed a very important paragraph:
"Assuming you can buy a RTX 4070 Ti for $800 -- which in the short term is more of a stretch than a given"
That also fails to excuse your claim that the 4070 Ti is the best performance/$ card this generation. Even Techspot has the 7900 XTX ahead, despite using an unrealistic price for the 4070 Ti.

Launch prices for the 4070 Ti did in fact turn out to be a scam. Very few base models were made at MSRP; in several countries you can find retailers who have MSRP models listed but with no customer reviews (because if they were in stock at all, it was in pathetically irrelevant quantities). It's why Nvidia has staggered the review embargo for the vanilla 4070 in April, to make sure more cards are made (and sold) at MSRP.

That's why any review site or channel worth a damn calls the 4070Ti an $850 card. Because that is realistically the lowest price they sell for.
 
No you weren't, or at least if you were you weren't reading them properly.

View attachment 287875

On that techspot review, you must have missed a very important paragraph:
"Assuming you can buy a RTX 4070 Ti for $800 -- which in the short term is more of a stretch than a given"
That also fails to excuse your claim that the 4070 Ti is the best performance/$ card this generation. Even Techspot has the 7900 XTX ahead, despite using an unrealistic price for the 4070 Ti.

Launch prices for the 4070 Ti did in fact turn out to be a scam. Very few base models were made at MSRP; in several countries you can find retailers who have MSRP models listed but with no customer reviews (because if they were in stock at all, it was in pathetically irrelevant quantities). It's why Nvidia has staggered the review embargo for the vanilla 4070 in April, to make sure more cards are made (and sold) at MSRP.

That's why any review site or channel worth a damn calls the 4070Ti an $850 card. Because that is realistically the lowest price they sell for.
But both the MSRP prices for the 7900 XT and the 4070 Ti weren't realistic. As I've said, actual sale prices in Europe were 929 for the 70 Ti and 1049 to 1069 for the 7900 XT, making the 4070 Ti the better card in price to performance. And that's taking only raster into account - no RT, no DLSS 3, no power consumption.

You are at a point now where you are arguing with the facts. I consider this trolling.
 
But both the MSRP prices for the 7900 XT and the 4070 Ti weren't realistic. As I've said, actual sale prices in Europe were 929 for the 70 Ti and 1049 to 1069 for the 7900 XT, making the 4070 Ti the better card in price to performance. And that's taking only raster into account - no RT, no DLSS 3, no power consumption.

You are at a point now where you are arguing with the facts. I consider this trolling.
If I take MY prices in Europe (France), the 4070 Ti starts at 955€ and the 7900 XT at 860€ (with a discount on Amazon; the old price was 970€). Now try to sell your Nvidia 4070 Ti to me.
See how much it can change depending on your country? That's why MSRP is used, and even then, the 4070 Ti is not the better choice for price/performance.

I know that you are paid by Nvidia to shill here, but as a human, try to understand that nothing is black or white; it's all shades of gray. Kinda lame to always have to say that.
 
But both the MSRP prices for the 7900 XT and the 4070 Ti weren't realistic. As I've said, actual sale prices in Europe were 929 for the 70 Ti and 1049 to 1069 for the 7900 XT, making the 4070 Ti the better card in price to performance. And that's taking only raster into account - no RT, no DLSS 3, no power consumption.

You are at a point now where you are arguing with the facts. I consider this trolling.
Oh come on, stop digging yourself deeper into a hole of denial.

What country in Europe are you talking about where the 4070Ti is cheaper than a 7900XT? Sure as f*ck isn't France.

You're using made-up pricing that doesn't reflect what anyone else is seeing here, and the reviews you're citing even include disclaimer paragraphs that shed doubt on the likelihood of the MSRP pricing they're using being real.
[image attachment]
 
Oh come on, stop digging yourself deeper into a hole of denial.

What country in Europe are you talking about where the 4070Ti is cheaper than a 7900XT? Sure as f*ck isn't France.

You're using made-up pricing that doesn't reflect what anyone else is seeing here, and the reviews you're citing even include disclaimer paragraphs that shed doubt on the likelihood of the MSRP pricing they're using being real.
View attachment 287897
Sites like Idealo and Geizhals track prices throughout Europe. Obviously the XT is cheaper now, since prices officially dropped a couple of weeks ago. Before the recent price drops, the XT was 15% more expensive than the 4070 Ti. I'm talking about products that were in stock, ready to order: the 4070 Ti was at 929 and the XT was at 1029 to 1049.

You can use Idealo, for example, and look at the price history; you'll see that until February the XT was over 1,000 euros while the 4070 Ti was below 950, just as I've stated repeatedly.

Now can you stop arguing with facts

If I take MY prices in Europe (France), the 4070 Ti starts at 955€ and the 7900 XT at 860€ (with a discount on Amazon; the old price was 970€). Now try to sell your Nvidia 4070 Ti to me.
See how much it can change depending on your country? That's why MSRP is used, and even then, the 4070 Ti is not the better choice for price/performance.

I know that you are paid by Nvidia to shill here, but as a human, try to understand that nothing is black or white; it's all shades of gray. Kinda lame to always have to say that.
I wish I was paid by Nvidia (or AMD). Sadly, that's not the case; I'm just reporting facts.
 
Sites like Idealo and Geizhals track prices throughout Europe. Obviously the XT is cheaper now, since prices officially dropped a couple of weeks ago. Before the recent price drops, the XT was 15% more expensive than the 4070 Ti. I'm talking about products that were in stock, ready to order: the 4070 Ti was at 929 and the XT was at 1029 to 1049.

You can use Idealo, for example, and look at the price history; you'll see that until February the XT was over 1,000 euros while the 4070 Ti was below 950, just as I've stated repeatedly.
Why on earth are you arguing with outdated pricing on a news article from THIS WEEK about a launch date in the future?

I guess that's the problem here, you're stuck in the past and your "facts" have expired. They are old, irrelevant data that you should have stopped using BEFORE this news article was ever posted - It's not difficult to understand for everyone here except you, apparently.

Other "facts" that are equally useless, just as an example, are that the 4080 12GB will cost $899. Time invalidates many facts, because they're only factually correct until something changes.
 
Why on earth are you arguing with outdated pricing on a news article from THIS WEEK about a launch date in the future?

I guess that's the problem here, you're stuck in the past and your "facts" have expired. They are old, irrelevant data that you should have stopped using BEFORE this news article was ever posted - It's not difficult to understand for everyone here except you, apparently.

Other "facts" that are equally useless, just as an example, are that the 4080 12GB will cost $899. Time invalidates many facts, because they're only factually correct until something changes.
Because launch prices / MSRPs are what products should be compared on if you're trying to decide which company is scamming people (which is what the post I was replying to was arguing). AMD's cards dropped in price because they were worse compared to Nvidia's products - therefore they weren't selling as much. So it's obvious to me that AMD tried to out-scam Nvidia with their pricing.

It's the same story with the CPUs as well. Today both the 7600X and the 7700X are priced decently, but at launch AMD tried to pull the heist of the decade.
 
Because launch prices / MSRPs are what products should be compared on if you're trying to decide which company is scamming people (which is what the post I was replying to was arguing). AMD's cards dropped in price because they were worse compared to Nvidia's products - therefore they weren't selling as much. So it's obvious to me that AMD tried to out-scam Nvidia with their pricing.

It's the same story with the CPUs as well. Today both the 7600X and the 7700X are priced decently, but at launch AMD tried to pull the heist of the decade.
In case it's not super-duper-turbo obvious, BOTH companies are scamming people.
Also, MSRPs have been meaningless since 2020. If you're relying on them still, you're a fool. Every hardware review site and channel has repeated this incessantly for the last three years. If you ignore that advice, it's entirely on you.

The pricing of the 7900 XT is just reflecting its real performance. It's 5/6ths of the 7900 XTX, so it should cost (at most) 5/6ths the price.
 