
NVIDIA GeForce RTX 4070 Could See Price Cuts to $549

Maybe FSR 3.0 works. I mean, without FSR 3.0 the RTX 4070 still enjoys a nice advantage because of Frame Generation: even if someone doesn't care about RT performance, CUDA, power consumption or whatever, FG, no matter how we see it, does give the RTX 4070 a very nice performance advantage, at least on paper, over Radeon cards and even RTX 3000 cards. But IF FSR 3.0 works, then there is no FG advantage. The RTX 4060 and RTX 4070 lose an advantage over Radeons and RTX 3000 cards; in fact, probably the main advantage Nvidia was pushing for the RTX 4000 series in games.
 
FG and DLAA gimmicks should be free. If they insist on pushing them, that's not our problem.

100% agree. As much as I like it in a couple of games, it isn't a feature people should be buying any GPU for.
 
100% agree. As much as I like it in a couple of games, it isn't a feature people should be buying any GPU for.
And that's exactly how every feature Nvidia sold me a card for has played out: PhysX, DLSS 1/2. To be fair, it's the same with AMD, just with a less intense sales pitch (TressFX, I say no more).
But a feature has scarcely ever been worth the effort, because it'll be superseded well before you feel you got your worth out of the GPU you bought for it.
And I have, many times.
 
And that's exactly how every feature Nvidia sold me a card for has played out: PhysX, DLSS 1/2. To be fair, it's the same with AMD, just with a less intense sales pitch (TressFX, I say no more).
But a feature has scarcely ever been worth the effort, because it'll be superseded well before you feel you got your worth out of the GPU you bought for it.
And I have, many times.

I like DLSS enough that an AMD alternative would have to be clearly better to sway me. Other than that, I don't care much about any other Nvidia feature.
 
I like DLSS enough that an AMD alternative would have to be clearly better to sway me. Other than that, I don't care much about any other Nvidia feature.
Yeah, I didn't.
DLSS 3 FG might be great, but I bought into 1, got 2 for free, then no 3.
So it's of limited use in the 600+ games I own!
It doesn't suit most games I play anyway, or, more succinctly, I choose other ways to gain frames if I need to, such is my revulsion for what a 1080 spin (Dirt Rally 2.0 and others) does with it on. And if you're not spinning out sometimes, are you sure you're trying hard enough?
I play online FPS too, in groups, so stable high FPS and high accuracy outweigh all other needs, and there FSR or DLSS isn't good enough; none are. And before it's said: Reflex plus native is again faster than Reflex plus DLSS.
It's all personal taste really, so I'm not arguing for my way; I'm again saying YDY (you do you).
 
"could" is a very loose term. I say let all of NoVideo's AIB partners suffer a little more and watch the stocks rot away when it's still more expensive than a 7800XT. (yes, AMD also gets the same treatment too.)
 
Yeah, I didn't.
DLSS 3 FG might be great, but I bought into 1, got 2 for free, then no 3.
So it's of limited use in the 600+ games I own!
It doesn't suit most games I play anyway, or, more succinctly, I choose other ways to gain frames if I need to, such is my revulsion for what a 1080 spin (Dirt Rally 2.0 and others) does with it on. And if you're not spinning out sometimes, are you sure you're trying hard enough?
I play online FPS too, in groups, so stable high FPS and high accuracy outweigh all other needs, and there FSR or DLSS isn't good enough; none are. And before it's said: Reflex plus native is again faster than Reflex plus DLSS.
It's all personal taste really, so I'm not arguing for my way; I'm again saying YDY (you do you).

Me neither. For sure, everyone should always do what's best for their hobby and grab the card that fits best with their use case.
 
Maybe FSR 3.0 works. I mean, without FSR 3.0 the RTX 4070 still enjoys a nice advantage because of Frame Generation: even if someone doesn't care about RT performance, CUDA, power consumption or whatever, FG, no matter how we see it, does give the RTX 4070 a very nice performance advantage, at least on paper, over Radeon cards and even RTX 3000 cards. But IF FSR 3.0 works, then there is no FG advantage. The RTX 4060 and RTX 4070 lose an advantage over Radeons and RTX 3000 cards; in fact, probably the main advantage Nvidia was pushing for the RTX 4000 series in games.

Agree 100%.
 
Dream on..

That's at least one sale down. You have to look at this talk from the perspective of it being a negotiation between a customer and a supplier.
You don't tell your customer "dream on", because there are other suppliers and you will lose sales :D

People are not happy at all, if you ask me.

And that's exactly how every feature Nvidia sold me a card for has played out: PhysX, DLSS 1/2. To be fair, it's the same with AMD, just with a less intense sales pitch (TressFX, I say no more).
But a feature has scarcely ever been worth the effort, because it'll be superseded well before you feel you got your worth out of the GPU you bought for it.
And I have, many times.
All computer graphics are fakery. They're just mathematical tricks to get you to think something is realistically portrayed.

Better results generally come from more sophisticated formulas. That means increased computational requirements.

Person A: "Hey, there's a new shading model called Gouraud. It looks better than flat shading."
Person B: "Why do I need that? I'm happy with flat shading."



Years later.

Person A: "Hey, there's an even better shading model called Phong. It's more realistic than Gouraud."
Person B: "Nah, I upgraded to Gouraud a couple of years ago. I'm fine with that."

A few more years pass.

Person A: "A mathematician by the name of Jim Blinn has altered Phong shading."
Person B: "I'll check it out. How do you spell his name?"
Person A: "B-L-I-N-N"
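
For anyone wondering what actually changed at each of those steps, here's a minimal sketch in plain Python (hypothetical vectors and a made-up shininess value, not any engine's real code) of just the specular term: classic Phong reflects the light direction about the surface normal and compares it with the view direction, while Blinn's variant swaps in a cheaper half-vector.

```python
# Illustrative sketch: Phong vs. Blinn-Phong specular term for one shaded point.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Hypothetical unit vectors: surface normal N, direction to light L, direction to viewer V.
N = normalize((0.0, 1.0, 0.0))
L = normalize((0.5, 1.0, 0.3))
V = normalize((-0.4, 1.0, 0.2))
shininess = 32  # made-up exponent controlling highlight tightness

# Phong: reflect L about N (R = 2(N.L)N - L), then compare R with the view direction.
R = tuple(2.0 * dot(N, L) * n - l for n, l in zip(N, L))
phong_spec = max(dot(R, V), 0.0) ** shininess

# Blinn-Phong: use the half-vector between L and V instead of the reflection.
H = normalize(tuple(l + v for l, v in zip(L, V)))
blinn_spec = max(dot(N, H), 0.0) ** shininess

print(f"Phong: {phong_spec:.4f}  Blinn-Phong: {blinn_spec:.4f}")
```

Same scene, slightly different formula, slightly different highlight; that's the whole evolution in miniature.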

DLSS, upscaling, frame generation, ray-trace reconstruction, all part of the evolution of computer graphics.

There's a reason why we don't see flat shading in computer games anymore.

Yes, there might not be a usage case for you today for DLSS 3 Frame Generation or DLSS 3.5 Ray Reconstruction. But someday there probably will be, for a usage case (e.g., a game title) that you care about. The problem is you just don't know when that will happen.

DLSS 1.0 was not embraced at launch. Now all three GPU manufacturers (Nvidia, AMD, Intel) provide this kind of upscaling as a tool for developers to tap into, often with great effect. Many now consider DLSS 2.0 to have superior results to conventional TAA.

For sure the technology is improving, often in software. And it's not just about who has better/more transistors. A lot of these implementations are heavily influenced by the quality of the developer tools used to harness this technology.
 
And that's exactly how every feature Nvidia sold me a card for has played out: PhysX, DLSS 1/2. To be fair, it's the same with AMD, just with a less intense sales pitch (TressFX, I say no more).
But a feature has scarcely ever been worth the effort, because it'll be superseded well before you feel you got your worth out of the GPU you bought for it.
And I have, many times.
That's why I'm an advocate of hardware-agnostic technologies and winning on pure performance. That makes me an AMD fan in some people's eyes, which is laughable considering that maybe 30% of my hardware arsenal is AMD; the rest is Intel/Nvidia. :roll:
 
That's why I'm an advocate of hardware-agnostic technologies and winning on pure performance. That makes me an AMD fan in some people's eyes, which is laughable considering that maybe 30% of my hardware arsenal is AMD; the rest is Intel/Nvidia. :roll:

There is no such thing as a totally hardware-agnostic technology in modern computing or consumer electronics.

The reason we have GPUs in the first place is that CPU cores aren't efficient at handling the mathematical calculations for 3D graphics. Many tasks and functions are now handled by specialized ASICs that were formerly handled by the CPU.

Want to see your CPU in action doing 3D graphics calculations? Just run Cinebench. Note the speed at which the images are generated. Imagine playing a game at that rendering speed.

If your phone tried to decode video just using CPU cores, the battery life would be minutes, not hours.

When you watch YouTube on your fancy computer, it is using a dedicated video decode block on your graphics card, not the fancy Ryzen 7800X3D CPU, and at a fraction of the power (and thus heat). If you forced it to do software decoding on the CPU, you'd see a big power spike and complain about the CPU fan being too loud.

At some point, someone came up with algorithms for ray tracing. Originally this was done in software on CPUs. Took forever, not useful for real-time graphics. So it was reserved for still images or a short film (if you had the budget and time) like early Pixar shorts.

At some point, someone said, "hey, let's build a circuit that will handle these calculations more efficiently." Today, we have smartphone SoCs with ray-tracing cores.

Someday in the not too distant future, we'll have some other form of differentiated silicon. MPEG-2 encoders used to be custom ASICs. Today they handle a wide variety of encoding schemes, the latest being AV1. Someday there will be something else that succeeds AV1 as the next generation. Performance will suck on today's encoding architecture and will be better with specially modified silicon to help speed things up.

A graphics card purchase is a singular event in time, but a usage case may pop up next month that wasn't on the radar last month. We saw this with the crypto mining craze. We also found out what happens when the crypto mining picture changes, leaving a bunch of cards utterly useless.

I remember buying a Sapphire Pulse Radeon RX 580 new for $180 (down from the original launch MSRP of $230). Six months later, during the height of the mining craze, that card was going for 3x what I paid for it.
 
The reason we have GPUs in the first place is that CPU cores aren't efficient at handling the mathematical calculations for 3D graphics. Many tasks and functions are now handled by specialized ASICs that were formerly handled by the CPU.

If your phone tried to decode video just using CPU cores, the battery life would be minutes, not hours.

There is no such thing as a totally hardware-agnostic technology in modern computing or consumer electronics.
DirectX, OpenGL, programmable shaders... there's a reason why 99% of game technologies run on every CPU and GPU.
 
DirectX, OpenGL, programmable shaders... there's a reason why 99% of game technologies run on every CPU and GPU.
Not sure how many DirectX games run on my iPhone.

Before OpenGL was a standard, it was proprietary IrisGL. It's not like OpenGL was instantly welcomed and adopted by everyone the moment the OpenGL ARB pressed the "publish" button. OpenGL wasn't always "just there" and it's not going to last forever either. Years ago Apple deprecated OpenGL (which was invented by SGI, a defunct 3D graphics company whose heyday was in the Nineties).

My computer (Mac mini M2 Pro) doesn't support OpenGL. My Mac mini 2018 (Intel CPU) might if I installed an old version of the operating system.

And DirectX is really a Microsoft standard that has been forced on the world due to their near-monopolistic practices and stranglehold on the PC industry (remember that anti-trust investigation 20 years ago?).

A few years ago Vulkan wasn't on anyone's radar. Today it's important. Someday it'll fall to the side, overtaken by more modern graphics technology that has developed to address the changing needs of the industry and its users.

There are basic concepts that span multiple architectures, but even within the products of a single company there isn't full consistency. As an AMD guy you should know that DirectX 12 isn't fully and evenly implemented on every single GPU, even within one generation.

And designing any computer architecture is a combination of hardware and software. The people who understand the hardware best will have the best software. So Nvidia has hardware engineers who work with software engineers, the latter writing drivers, APIs, etc. Apple has done this to great effect.

Remember that just because the industry picks a standard doesn't mean that it will be embraced by all forever and ever. How many DVI and VGA connectors does your RX 7800 XT have? Does it have a VirtualLink port (looks like USB-C)?
 
Not sure how many DirectX games run on my iPhone.

Before OpenGL was a standard, it was proprietary IrisGL. It's not like OpenGL was instantly welcomed and adopted by everyone the moment the OpenGL ARB pressed the "publish" button. OpenGL wasn't always "just there" and it's not going to last forever either. Years ago Apple deprecated OpenGL (which was invented by SGI, a defunct 3D graphics company whose heyday was in the Nineties).

My computer (Mac mini M2 Pro) doesn't support OpenGL. My Mac mini 2018 (Intel CPU) might if I installed an old version of the operating system.

And DirectX is really a Microsoft standard that has been forced on the world due to their near-monopolistic practices and stranglehold on the PC industry (remember that anti-trust investigation 20 years ago?).

A few years ago Vulkan wasn't on anyone's radar. Today it's important. Someday it'll fall to the side, overtaken by more modern graphics technology that has developed to address the changing needs of the industry and its users.

There are basic concepts that span multiple architectures, but even within the products of a single company there isn't full consistency. As an AMD guy you should know that DirectX 12 isn't fully and evenly implemented on every single GPU, even within one generation.

And designing any computer architecture is a combination of hardware and software. The people who understand the hardware best will have the best software. So Nvidia has hardware engineers who work with software engineers, the latter writing drivers, APIs, etc. Apple has done this to great effect.

Remember that just because the industry picks a standard doesn't mean that it will be embraced by all forever and ever. How many DVI and VGA connectors does your RX 7800 XT have? Does it have a VirtualLink port (looks like USB-C)?
Sure, things don't always (or rather, usually don't) start as universal, but there's some sort of standardisation along the way. Power connectors, car safety standards, there's lots of things that have been put into law, or at least some sort of general agreement. Companies have their own stuff, which get standardised, or die out, or take the Apple route (closed ecosystem with a solid fanbase) with time. I'm all for standardisation and all against the Apple approach (which Nvidia seems to be following lately). I like choice when I'm buying something, and I don't want to be forced to buy Nvidia because of DLSS, or AMD because of whatever.

I'm not an AMD guy. Only my main gaming rig is fully AMD at the moment, but I've got two HTPCs that are both Intel+Nvidia, and I've got lots of various parts lying around from every manufacturer. I generally prefer AMD's open source approach towards new technologies, but that doesn't mean I restrict my choices to only one brand (although prices seem to be doing that for me anyway).

I hope that makes sense. :)
 
Yep, $449 would be closer to the mark for a 12GB 3060 replacement.

We'd need a substantially more competitive market for that; the 7800 XT would likely have needed to launch at the same time as the 4070 at $399, and even then I doubt Nvidia would price that low.
 
Call me when it gets to $329.
 
No, even at 499 dollars the 4070 is a bad deal.
Your opinion. Clearly, not everyone agrees.

The ONLY things the 4070 does better than the RX 7800 XT are power usage (it's gonna be used in stationary desktops, so who really cares anyway?) and ray tracing. And that's it.
And there you go. Those two things, and a few others you left out, are very good reasons to go with a 4070.

I'm not saying the 7800 XT isn't a great card, because it is. I'm saying that it depends on what the user needs and wants out of their gaming experience.

Call me when it gets to $329.
Wait 2 years and buy it used.

People are not happy at all, if you ask me.
Those are whiners doing what they do best. The rest of us live in the real world.

Better results generally come from more sophisticated formulas. That means increased computational requirements.
Not always. Frequently, methods are developed to do the same work in better, more efficient ways.
 
There is no such thing as totally hardware-agnostic technologies in modern computing or consumer electronics.

The reason we have GPUs in the first place is that CPU cores aren't efficient at handling the mathematical calculations for 3D graphics. Many tasks and functions are now handled by specialized ASICs that were formerly handled by the CPU.

Want to see your CPU in action doing 3D graphics calculations? Just run Cinebench. Note the speed at which the images are generated. Imagine playing a game at that rendering speed.

Short answer: GPUs have far more floating-point execution units than CPUs. Long answer: a GPU is designed for highly parallel, lower-precision floating-point computation, such as graphics rendering.
GPUs (graphics processing units) are optimized for parallel processing, which allows them to perform many calculations at once. This is in contrast to CPUs (central processing units), which are typically optimized for sequential processing. Because of this, GPUs are able to perform many more floating-point operations per second (FLOPS) than CPUs. Additionally, GPUs have specialized hardware, such as thousands of simple cores and wide SIMD units, optimized for the kinds of calculations common in graphics, such as matrix operations. This further increases their FLOPS throughput.

GPU computing is faster than CPU computing because GPUs have thousands of processing cores, while CPUs have comparatively fewer cores.
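
As a rough, CPU-only illustration of that point (a sketch assuming Python with NumPy installed; it's an analogy, not GPU code), compare pushing the same arithmetic through a one-element-at-a-time loop versus a single vectorized call that feeds wide parallel execution units. A GPU takes the same idea much further, with thousands of cores working on pixels and vertices at once.

```python
# Scalar loop vs. vectorized math: a small taste of why parallel hardware wins.
import time
import numpy as np

n = 2_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)

# One element at a time, the "sequential processor" way.
t0 = time.perf_counter()
slow = [a[i] * b[i] + 1.0 for i in range(n)]
t_loop = time.perf_counter() - t0

# The whole array in one vectorized call, using wide SIMD units.
t0 = time.perf_counter()
fast = a * b + 1.0
t_vec = time.perf_counter() - t0

print(f"scalar loop: {t_loop:.3f}s  vectorized: {t_vec:.4f}s  ~{t_loop / t_vec:.0f}x faster")
```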

Those are whiners doing what they do best. The rest of us live in the real world.

Actually, it's the opposite. The "whiners" do live in the real world, while those who support unreal pricing have lost touch with the state of the world right now.
It's called stagflation, the worst kind, if you still remember what that is.
 
Actually, it's the opposite. The "whiners" do live in the real world, while those who support unreal pricing have lost touch with the state of the world right now.
It's called stagflation, the worst kind, if you still remember what that is.
There's a fanciful twist. Reality is accepting things the way they really are. Fantasy is wishing for what you want to be. Those people, as well as many here, are expressing their wishes, fantasies that have zero bearing on actual reality. And some of them are doing so with complaint as the context. Thus, whiners be whining.
 
There's a fanciful twist. Reality is accepting things the way they really are. Fantasy is wishing for what you want to be. Those people, as well as many here, are expressing their wishes, fantasies that have zero bearing on actual reality. And some of them are doing so with complaint as the context. Thus, whiners be whining.

No, this is simply some capitalists, Nvidia in this case, abusing their monopolistic position in the market.
The solution is to vote with your wallet and not buy (like the "whiners" actually do). That is already resulting in all-time-low graphics card shipments, and if this trend goes on, Nvidia will be forced to exit the graphics card market.
 