
Next Gen GPUs will be even more expensive

After all of the latest leaks, it is now pretty much confirmed that the 5090 will cost at least $2,500, with all the other models within a $100 margin of what I originally posted.
4090 was $1,600 MSRP and sells for $2,000+
5090 will be $1,900-1,999 MSRP and will sell for $2,500+
5080 I'm guessing around $1,300-1,400 MSRP
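For what it's worth, the street-price premium implied by those figures is fairly consistent across the two generations; a rough sketch (the 5090 MSRP midpoint and street price are the leaked/guessed numbers above, not confirmed):

```python
# Rough street-price premium over MSRP, using the figures quoted above.
# The 5090 numbers are leaked/guessed, not confirmed.
cards = {
    "4090": {"msrp": 1600, "street": 2000},  # launch MSRP vs typical street price
    "5090": {"msrp": 1950, "street": 2500},  # assumed midpoint of $1,900-1,999 / rumoured $2,500+
}

for name, p in cards.items():
    premium = p["street"] / p["msrp"] - 1
    print(f"{name}: {premium:.0%} over MSRP")
# 4090: 25% over MSRP
# 5090: 28% over MSRP
```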

BTW
4090s in my EU region are priced at €2,200-3,000+ right now (no FE cards, this is the EU)
4080... €1,300-2,000+
4080S... €1,100-2,000+ (most of them, though, are €1,100-1,300)
 
The cheapest RTX4090 from a reputable store where I live in Europe is $2000 before tax.

The RTX4080S is $850, which is about 40% of the flagship's price.
The RTX4070TiS is $685, a third of the price.

The 7900XTX is selling for $700, also about a third of the 4090.
The 7900XT is $580, less than a third.
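Putting those side by side, the ratios work out roughly as stated; a quick sketch using the local pre-tax prices quoted above (store prices obviously vary):

```python
# Local pre-tax prices quoted above; ratio of each card to the cheapest RTX 4090.
prices = {
    "RTX 4090":    2000,
    "RTX 4080S":    850,
    "RTX 4070TiS":  685,
    "RX 7900XTX":   700,
    "RX 7900XT":    580,
}

flagship = prices["RTX 4090"]
for card, price in prices.items():
    print(f"{card}: ${price} -> {price / flagship:.0%} of the 4090")
# roughly: 4080S 42%, 4070TiS 34%, 7900XTX 35%, 7900XT 29%
```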

Thank you.
 
The wealthy are to blame for the 4090 since they will pay whatever the asking price is, almost without exception.
I think that was the start of it (going back to the Titan X), but then crypto mining, and then the worst of all, AI, changed the profit play for Jensen. There was some recovery after the mining craze, but there will be no end to AI for years. We're all screwed. When they have their 2025 allocation sold out and it's still 2024, and they're charging a 30% premium for the allocation... lol, in a way we're lucky the 5090 is only going to be $2,500, for example. I don't know what the counter is; buy AMD, y'all?
 

It's amazing when you consider there are still no decent mass front-end use cases. OpenAI's revenue is only $3.7bn in 2024.

I tend to agree; no reason for Nvidia to reduce prices much from 40-series levels yet. It's up to AMD to offer value.

Which, in fairness, they're doing to some degree, with price cuts every couple of weeks for months now.
 
glad i got my 6950xt when i did!
 

400Ws - I just couldn’t do that

If my RTX 3080 (300W) already turns my room into a sauna, stepping up to 400W would be the death of me. For the next upgrade, I’ve already made peace with the absurdity of ~350W but come summer, you can bet I’ll be going all in on some hardcore undervolting and modding a giant fan to blast hot air straight out the window.
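If you want hard numbers on how much heat the card is actually dumping before deciding how far to undervolt, NVIDIA's NVML bindings can read the live power draw and power limit; a minimal sketch, assuming the nvidia-ml-py (pynvml) package is installed and the card is GPU 0:

```python
# Minimal sketch: read live GPU power draw, power limit and temperature via NVML.
# Assumes the nvidia-ml-py (pynvml) package is installed and an NVIDIA GPU at index 0.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000            # milliwatts -> watts
limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000
temp_c = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)

print(f"Drawing {draw_w:.0f} W of a {limit_w:.0f} W limit at {temp_c} C")
pynvml.nvmlShutdown()
```

Nearly all of that draw ends up as heat in the room, which is where the undervolting pays off.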
 
what makes ur card so hot?
 
Custom water for the win my man. My office stays chilly year round no matter how long I game(6800xt). Altho I do occasionally turn the Vornado desk fan on across the room :)
 
what makes ur card so hot?

Long gaming sessions with the GPU running at max utilization and temps reaching around 80°C naturally warm up my room. Not so bad in the winter, but in the summer it's a sweat engine at work. I live in the UK with no air conditioning or ventilation system. I guess it's normal for intensive workloads to generate intolerable heat over time, but we have been pushing the power envelope a little more aggressively of late, which translates into faster heat build-up.
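To put rough numbers on it, essentially all of the power the PC draws ends up as heat in the room, so long sessions add up fast. A back-of-the-envelope sketch (the session length and the non-GPU draw are assumptions, not measurements):

```python
# Back-of-the-envelope heat dumped into the room over one gaming session.
# Assumed figures: ~300 W GPU plus ~150 W for the rest of the system, 4-hour session.
gpu_w = 300
rest_of_system_w = 150
hours = 4

heat_kwh = (gpu_w + rest_of_system_w) * hours / 1000
print(f"~{heat_kwh:.1f} kWh of heat per session")  # ~1.8 kWh, roughly a small space heater running all evening
```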

Custom water for the win my man. My office stays chilly year round no matter how long I game(6800xt). Altho I do occasionally turn the Vornado desk fan on across the room :)

I'm alright for now. The 3080 is manageable, with a power profile comparable to the 6800 XT. When the summer heat kicks into high gear, I rely on a couple of standard room fans to keep things in check. One at the door to draw cooler air in and another by the window to push warm air out - does the job.
 
Since in '25 they should be going UDNA,
I feel like RDNA4 is like the VLIW4 of the HD 6000 cards, made to 'fix' the shortcomings of VLIW5 (mainly the tessellation problems); in RDNA's case it's the ray tracing performance, before we see the next big change.
 
If AMD can improve performance and reduce prices over the current generation, then even if performance is only 10% faster, prices 20% lower would make it great value.

An 8800XT that offers 10% faster performance than the 7900XT at $600 would be amazing. But they need to stack the price range from $200 to $600 with many great-value cards.

Considering they are using the same manufacturing process, the cheaper GDDR6 memory, and the new cards are supposed to be slightly faster with a smaller die size, I think a 20% reduction in prices should be in order.
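The value math is simple enough to sketch: the combination is worth more than either number alone suggests (the 10%/20% figures are the hypothetical ones above, not leaks):

```python
# Hypothetical next-gen value: +10% performance at -20% price versus the current card.
perf_gain = 1.10   # 10% faster
price_cut = 0.80   # 20% cheaper

value_gain = perf_gain / price_cut - 1
print(f"Performance per dollar improves by {value_gain:.1%}")  # ~37.5%
```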
 
Long gaming sessions with the GPU running at max utilization and temps reaching around 80°C naturally warm up my room. Not so bad in the winter, but in the summer it's a sweat engine at work. I live in the UK with no air conditioning or ventilation system. I guess it's normal for intensive workloads to generate intolerable heat over time, but we have been pushing the power envelope a little more aggressively of late, which translates into faster heat build-up.



I'm alright for now. The 3080 is manageable, with a power profile comparable to the 6800 XT. When the summer heat kicks into high gear, I rely on a couple of standard room fans to keep things in check. One at the door to draw cooler air in and another by the window to push warm air out - does the job.
Buy a room ac

I feel like RDNA4 is like the VLIW4 of the HD 6000 cards, made to 'fix' the shortcomings of VLIW5 (mainly the tessellation problems); in RDNA's case it's the ray tracing performance, before we see the next big change.
Only time will tell
 
even if performance is only 10% faster, prices 20% lower would make it great value
Why? To justify Nvidia lowering their prices so you can still buy an Nvidia GPU, just at a better price point? Nvidiots make me laugh; this is literally their go-to, and then they still overpay, like £1000 for a fucking 70-class GPU that is so gimped you need to buy the next gen by the time it comes out... Fuck Nvidia and their pricing. Let the morons pay £1500-£2500 for high-end GPUs whilst I will sit on my RX 6800 until I can get a decent uplift in performance for less than $400, and I am absolutely never paying $500-$1000 for a fricken GPU. We had Titans at that price back in the day; no way I will ever pay close to $1000 for a mid-tier GPU that will be outdone in 12 months' time.
 
If AMD can improve performance and reduce prices over the current generation, then even if performance is only 10% faster, prices 20% lower would make it great value.

An 8800XT that offers 10% faster performance than the 7900XT at $600 would be amazing. But they need to stack the price range from $200 to $600 with many great-value cards.
The way the 8800XT looks in spec leaks, I'm convinced the performance difference is massively overhyped.
It looks more like an equal to a vanilla 6800 but with half the FP64, so closer to the 7800XT but worse.
I'm tempted to try again and pick up a 7900XT just to see if I can finally get a card that isn't a complete dud.
By the time the 5000-series GPU market kicks in, we'll be seeing price hikes across the board, and that option will be out of sight AND out of reach.
Nvidiots make me laugh; this is literally their go-to, and then they still overpay, like £1000 for a fucking 70-class GPU that is so gimped you need to buy the next gen by the time it comes out...
Oh, just wait until you see the sales instructions for these cards. "xx70Ti cards and above are for professionals."
I am not making this shit up. I usually won't pay attention to performance leaks, but when retailers are being told HOW to sell, that's a red flag. MASSIVE. On fire.
In today, out next year; get out while you can.


I will sit on my RX 6800 until I can get a decent uplift in performance for less than $400
Based.



Price tracking is fubar but once in a while I see a 6800XT or 6900XT for $400-450ish and some refurbs have a block. It's easy and looks real tempting.
There's no reason a 1080p gamer can't use these cards. If you can't make it work, good chance you're another normie/tourist and have failed.
These are really good cards that were built well and aged well. The 7000 series not so much, but the aesthetics and performance of the upper-range cards still reign.
More importantly, their insane amount of VRAM. These cards are loaded with so much memory I can treat them like high-dollar workstation stuff.
I'm never going to see anywhere near the max performance of any of these cards but all the more reason to get one.

If you were on an older card and had to choose, would you still go with a vanilla 6800?
 
Long gaming sessions with the GPU running at max utilization and temps reaching around 80°C naturally warm up my room. Not so bad in the winter, but in the summer it's a sweat engine at work. I live in the UK with no air conditioning or ventilation system. I guess it's normal for intensive workloads to generate intolerable heat over time, but we have been pushing the power envelope a little more aggressively of late, which translates into faster heat build-up.



I'm alright for now. The 3080 is manageable, with a power profile comparable to the 6800 XT. When the summer heat kicks into high gear, I rely on a couple of standard room fans to keep things in check. One at the door to draw cooler air in and another by the window to push warm air out - does the job.
Invest in some huge blocks of dry ice, keep them in a separate freezer, and break off a cube to put in the case for the day.
 
I am not too convinced about the RX 8800 XT's raw power if we look at the current specs. But an 8800 GT comeback would be great ;)

The 8800 GT is one of the greatest GPUs ever released.

8800 Ultra ($829)
8800 GTX ($649)
8800 GT ($249)

The 8800 GT came damn close in performance for less than half (of the GTX) or about a third (of the Ultra) of the price! And it all happened within a one-year period.

AMD should choose this strategy because of its 12% market share :D
 
I think I am just gonna back down to 1440p and ride this 3080 Ti for all it's worth.
 
If it were me I would have undercut Nvidia just to move units and get them out there.

But that isn't what I went to school for, maybe I should have :rolleyes:

The only people who talk about them are their owners, who are a relatively small minority in the big picture, no offence.

And to experiment with one just to see what it is like is an expensive venture.

For the kind of money we are talking about, most people will just stick with what they know.
 
The wealthy are to blame for the 4090 since they will pay whatever the asking price is, almost without exception.

IMO you don't need a 4090. I've owned a 3090 and found it underwhelming, not because of the insane rendering power it had for its day, but because it was too much for the game developers to cater for.

Art assets and poly counts are optimised for the highest "reasonable" target audience, and that's been the consoles for the last 15+ years. Owning a more powerful PC will give you the luxury of running the console-equivalent "quality" settings at "performance" framerates, but most of the time when you increase the resolution you just see individual polygons where the developers didn't bother to add more detail because less than 1% of the audience would ever be able to experience it. Same deal with texture resolutions.

The other problem with having so much GPU power on tap is trying to run a game at many multiples of the framerate the developers originally targeted. I've seen occasional instances of hitching and hiccups trying to chase 240fps in a game that was originally written to run at 60. In most games where there are no issues at higher framerates, you still have the problem that sometimes there is just a non-GPU bottleneck, and no matter what hardware you have, you are going to drop momentarily to a much lower number. Trust me, when you're running at 240fps, suddenly dropping to 70fps because of some CPU or bandwidth issue is really jarring, but if you're bumbling along at a pedestrian 100ish fps, especially with VRR, you're not even going to notice it. The 1% and 0.1% benchmark scores suddenly become a whole lot more important to you at very high framerates.
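The frame-time arithmetic is why: the same 70fps floor is a much bigger hitch relative to a 240fps average than to a 100fps one. A quick sketch, using just the example numbers from the paragraph above:

```python
# Frame-time spike when dropping to a 70 fps floor from different average framerates.
def frame_time_ms(fps: float) -> float:
    return 1000 / fps

for avg_fps in (240, 100):
    spike_ms = frame_time_ms(70) - frame_time_ms(avg_fps)
    print(f"{avg_fps} fps average -> 70 fps drop: frame time jumps by {spike_ms:.1f} ms")
# 240 fps average -> 70 fps drop: frame time jumps by 10.1 ms
# 100 fps average -> 70 fps drop: frame time jumps by 4.3 ms
```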

If you have a lot of (but not unlimited) money then just buy a high-end GPU that offers performance well beyond the curve. Now is a terrible time to buy a high-end GPU with the next generation imminent but for the majority of 2024 a 4070 Ti Super or a 7900XT has been all the GPU you really need, and those are 1/3rd the cost of the stupidly-priced 4090.

Having the $2000 4090 doesn't really add anything significant over the $800 4070 Ti Super. Whether you have a 4090 or not, games capped at 60fps are still capped at 60fps, Skylines II will still run at 20fps, you're going to be getting very fluid framerates at max settings, and engine hiccups and other bottlenecks are still going to slow your game down with frame drops and inconsistencies that you notice occasionally - because the GPU has no say in those matters.
Well, I must say something about this too. You said that between the RTX 4090 and the 4070 Ti Super there are no more high-end GPUs. That's not true, because we do have the RTX 4080 Super and the Radeon RX 7900 XTX.

One or the other. The main difference between those cards is the ray tracing performance. The Radeon monster beats the 4080 in rasterization, and it actually costs less than 900 euros. The green devil's card still costs a grand, but there are some models from some brands that are still offered below 1,000 bucks.

For those who claim the red team's cards are energy eaters: the difference is around 50-80 watts more, and it can easily be reduced by undervolting the GPU. This can be done almost idiot-proof; just let the driver run its own tests.

It would check down to what voltage it could go. I did this on two of my Radeons and it worked perfectly. I still have the RX Vega 64 and the Radeon VII.
Greets
 
This might be true if you run at 1080/1440p/60hz (and don't look at new AAA releases), but up that to 4K/60hz and you can use all the GPU power you can get your hands on. Today things are a bit different, but for a while 4K monitors were more common and cost the same as, or less than, 1440p monitors, at least in my parts of the world.
Just because 4K is more common than it used to be a couple of years ago, it doesn't mean you need one. If you're on 1440p, or even a reasonably sized 1080p, upgrading is a want, not a need.
 

Consumer computers are driven by want. No one needs to play video games. No one needs X3D CPUs. The point is that it's easy to buy 4K monitors today (and for a while they were cheaper than 1440p), and depending on where you hang out, people will absolutely advise getting as many pixels as you can. I've seen a lot of that around here too. "4K is wonderful." It was more common a few years ago, though.
 
I think you misinterpreted my post a little.

I didn't say there was nothing between the 4070 Ti Super and the 4090, just that there's very little need to spend more than $800 for a 4070 Ti Super. As powerful as the 4090 is, the only real difference between it and the 4070 Ti Super is what level of DLSS you need to run for path-traced games. Take CP2077 RT Overdrive at 4K as the heaviest path-tracing example I'm aware of right now. The 4090 still needs DLSS and framegen to deliver playable results.



So the real question is whether you pay $2400 to run it at DLSS3 "balanced" on a 4090, or $800 to run it at DLSS3 "performance" on a 4070TiS. Both results are a bit blurry with laggy/crawly reflections and shadows, the only real difference is that the 3x more expensive 4090 is slightly less blurry.
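For reference, the internal render resolutions behind those two modes aren't actually that far apart; a quick sketch using the commonly cited per-axis DLSS scale factors (Quality about 0.667, Balanced about 0.58, Performance 0.5; these are the usual published ballpark figures, not something measured here):

```python
# Internal render resolution when DLSS upscales to 4K, using the commonly cited per-axis scale factors.
target_w, target_h = 3840, 2160
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

for mode, scale in modes.items():
    w, h = round(target_w * scale), round(target_h * scale)
    print(f"DLSS {mode}: ~{w}x{h} internal, upscaled to {target_w}x{target_h}")
# Quality ~2561x1441, Balanced ~2227x1253, Performance 1920x1080
```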

If you have $2400 to burn on a 4090 then by all means, burn that money. For everyone else, getting most of the performance for a small fraction of the price is a far better, and far more satisfying purchase.
 
Consumer computers are driven by want. No one needs to play video games. No one needs X3D CPUs. The point is that it's easy to buy 4K monitors today (and for a while they were cheaper than 1440p), and depending on where you hang out, people will absolutely advise getting as many pixels as you can. I've seen a lot of that around here too. "4K is wonderful." It was more common a few years ago, though.
I'm sure "4K is wonderful" if you have the money for a high-end GPU as well. My point is that you don't need 4K, and you don't need a high-end GPU, either. Playing new games on your 1080p monitor is exactly the same experience as it was 10 years ago. Therefore, crying about the price of the 4090 is stupid.

TLDR: Upgrading your GPU to play new games on your 1080p screen is a need. Upgrading that 1080p screen for better visuals is a want.
 
@AusWolf
Size at distance is what "needs" resolution, not games/visuals.

Anything above 24" will look horrible at FHD, and I'm 50+ without the best eyesight.
Maybe you're fine looking at the "screen door" effect from low PPI; I'm not, especially when sitting close, like at a desk (not couch/TV).

Or is your monitor 480p? Right..
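The pixel-density math backs that up; a quick sketch of some standard size/resolution combinations:

```python
# Pixel density (PPI) for common size/resolution combos; low PPI is what produces the "screen door" look up close.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

panels = [
    (1920, 1080, 24),  # 24" FHD
    (1920, 1080, 27),  # 27" FHD
    (2560, 1440, 27),  # 27" QHD
    (3840, 2160, 27),  # 27" 4K
]
for w, h, d in panels:
    print(f'{d}" {w}x{h}: {ppi(w, h, d):.0f} ppi')
# 24" FHD ~92 ppi, 27" FHD ~82 ppi, 27" QHD ~109 ppi, 27" 4K ~163 ppi
```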
 