
Next-gen GPUs will be even more expensive

Status
Not open for further replies.
AMD is just a parts supplier, like many others, for Xbox and PlayStation. If they could they would
I don't understand this. AMD is the only party that can single-handedly make an x86 APU. They could have forced this down MS/Sony's throats.
 
I don't understand this. AMD is the only party that can single-handedly make an x86 APU. They could have forced this down MS/Sony's throats.

Sure, but don't forget AMD probably needs those contracts just as badly; they need the money. Maybe not as much now as in the beginning, but still. It's not an unbalanced relationship: they need each other, and neither side can make demands.
 
I seriously hope this is fake! Usually Nvidia sets prices like an hour before its conferences, and these Chinese-market prices are steep. I hope we don't go that far north of the already bad 4000-series pricing, and that some miracle happens for AMD and Intel!
Hope dies last.
Last gen was pretty steep already, and there is no point lowering prices for those, except maybe for the 4060s.
 
That and you'd think after 8 years
In 8 years of RT "revolution"
Why do people keep saying 8 years (and "liking" it) in this thread? Turing launched in late 2018, 2024 - 2018 = 6 years. You don't need to stretch the truth to make the point land just as well, and for me stretching it makes it land worse, eh.

Personally I've enjoyed many a game with RT on with my 3080 since late 2020, but I do agree that I'd have hoped for further progress in making RT less of a perf hit on newer generations (exaggerating that hit doesn't do arguments favours either...), and of course price is an issue.

All of this is to some extent exacerbated by the expected 2-year release cadence being blown out by 1-2 quarters - normally we would have had new-gen products from both AMD and Nvidia by now, which would have had an effect on RDNA3/RTX 40 prices (hopefully downwards, but who even knows these days :rolleyes: )
 
AMD is just a parts supplier, like many others, for Xbox and PlayStation. If they could they would
What about all the "the way it's meant to be played" logos on PC games? AMD works with Sony and Microsoft on their custom chips just as much as Nvidia works with game devs on visual tech.
 
Why do people keep saying 8 years (and "liking" it) in this thread? Turing launched in late 2018, 2024 - 2018 = 6 years. You don't need to stretch the truth to make the point land just as well, and for me stretching it makes it land worse, eh.
Wow, yep, you're right 6 years. I must have been thinking Pascal by mistake. Genuine math fail on my part.

Yeah, I was a little shocked when I started up some newish games on my 3080 Ti; when they detected my hardware, they turned off all RT in the settings. In my mind I was like, gee, this was a $1200 card new (I bought it used).

I guess my biggest sticking point with regard to pricing is that since RT became a thing, prices have consistently increased: first with Turing, then again with Lovelace. Meanwhile the mid-range has gotten expensive, and the mainstream market is kind of stagnating on the Nvidia side. If you actually want any decent gen-on-gen increase, you basically have to pay up for the high-end.
 
There's also the fact that AMD has bigger dies for similar performance, e.g. ~530 mm² for the 7900 XT vs ~380 mm² for the 4080 Super.

Similar to the huge B580 die compared to the 4060's, etc. Much higher manufacturing cost for a similar result.

True, but the 7900 XT has a lot of area dedicated to facilitating the chiplet design. If I remember correctly, almost half of the MCDs and a portion of the GCD exist simply to connect the two. I don't remember the calculations exactly, but off the top of my head the 7900 XTX would've been a ~440 mm² die as a monolithic design. It would be faster, too.
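For what it's worth, that estimate can be put into a quick back-of-envelope. A minimal Python sketch, using the commonly reported Navi 31 die figures (~304 mm² GCD, six ~37.5 mm² MCDs) and a purely illustrative 15% interconnect-overhead share (that fraction is an assumption, not a measured value):

```python
# Back-of-envelope for a hypothetical monolithic Navi 31.
# Die areas are commonly reported figures; the 15% interconnect
# overhead share is an illustrative assumption, not measured data.
gcd_mm2 = 304.0            # graphics compute die
mcd_mm2 = 37.5             # each memory cache die
num_mcd = 6
interconnect_share = 0.15  # assumed fraction of area spent on chiplet links

chiplet_total = gcd_mm2 + num_mcd * mcd_mm2          # ~529 mm^2 as shipped
monolithic_est = chiplet_total * (1 - interconnect_share)
print(f"chiplet total: {chiplet_total:.0f} mm^2")
print(f"monolithic estimate: {monolithic_est:.0f} mm^2")
```

With these inputs the estimate lands around 450 mm², in the same ballpark as the ~440 mm² figure quoted above.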
 
Maybe they'll bring back SLI for ray tracing / path tracing? Two GPUs could scale better on path tracing than on raster. That would make Moore's law easier, too.
 
@tugrul_SIMD
At least in the past, getting a single GPU with the next bigger chip was still cheaper. Scaling varied a lot between games, and things like tearing hurt the visuals; no thanks. And that's ignoring the need for spacing between slots, or slim coolers, so as not to impede airflow/cooling.

In the mid 2000s I saw a lot of people buying parts who couldn't make sense of saving up (as a kid would) for the next faster card, instead of buying the smaller one now and another in a few months.
 
Maybe they'll bring back SLI for ray tracing / path tracing? Two GPUs could scale better on path tracing than on raster. That would make Moore's law easier, too.
Wonder what the power consumption would be with two enthusiast-level cards :rolleyes:
 
Blackwell is not using a more expensive node than Ada. This is why Nvidia needs to make chips bigger and/or increase TDP to get a decent perf bump this time.

I'll probably grab a 5090 anyway, but I expect the RTX 6000 series on 3 nm or better to deliver a true next-gen jump. 4000 to 5000 series is not going to be huge, outside of the 5090, which is going to be a beast and will probably cost 2000 dollars minimum. I don't really care; I can easily sell my 4090 for 1000 dollars after 2+ years.

I don't think the 5080 will be 1500 dollars; more like 1200 dollars, like the 4080 at release.

But yeah, AMD has nothing and Nvidia can do what they want. RDNA5 is AMD's next big hope, and sadly it's like 1-2 years away.

RDNA4 is not going to bring anything new to the table, and it's low-to-mid-range only. A mere RDNA3 bugfix with improved RT. The top card is like 500 dollars, yeah. It probably ends up around the 7800 XT / 7900 GRE.
Yeah, it's almost the end of the FinFET generations; maybe one more on it.
After that, GAAFET, the PS6 and all that.
Will mainstream products after the 2030s reach 1000 TFLOPS?
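The 1000 TFLOPS question lends itself to a rough compound-growth estimate. A hedged Python sketch, assuming ~20 TFLOPS FP32 for a mainstream card today and a sustained ~1.5x uplift every 2-year generation (both numbers are assumptions, not forecasts):

```python
import math

# Rough compound-growth estimate; both inputs are assumptions.
start_tflops = 20.0      # assumed mainstream FP32 throughput today
target_tflops = 1000.0
uplift_per_gen = 1.5     # assumed sustained per-generation FP32 uplift
years_per_gen = 2

gens = math.ceil(math.log(target_tflops / start_tflops, uplift_per_gen))
print(f"~{gens} generations, i.e. roughly {gens * years_per_gen} years out")
```

At these assumptions it takes about 10 generations, i.e. around two decades, so "after the 2030s" is at least in the right ballpark.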
 
What are we at now?

Last time I checked was with my old 4890, but 1 TF was a selling point I think, same with that big cap on it.
 
What are we at now?

Last time I checked was with my old 4890, but 1 TF was a selling point I think, same with that big cap on it.
Funny that most people didn't even know what that meant, but it was still a selling point. :D
 
What are we at now?

Last time I checked was with my old 4890, but 1 TF was a selling point I think, same with that big cap on it.

I think they were all part of the TeraScale architecture. AMD probably named it that way because they knew they were going to cross 1 TF with this family of GPUs, which the 4850 finally did.

All in all, they were a fantastic family of GPUs and generally better than Nvidia's, especially the 5xxx series, which served me well for ages.
 
@tugrul_SIMD
At least in the past, getting a single GPU with the next bigger chip was still cheaper. Scaling varied a lot between games, and things like tearing hurt the visuals; no thanks. And that's ignoring the need for spacing between slots, or slim coolers, so as not to impede airflow/cooling.

In the mid 2000s I saw a lot of people buying parts who couldn't make sense of saving up (as a kid would) for the next faster card, instead of buying the smaller one now and another in a few months.
Yep. I remember when I bought a Radeon HD 7770 just because I wanted to CrossFire it with another 7770 later. When the time came and I had the cash saved up, swapping the 7770 for a 7870 ended up being cheaper and faster, with double the VRAM (1 GB + 1 GB is still only 1 GB in the multi-GPU world), and with none of the tearing or other issues that come with the limited PCIe bandwidth (one card uses the full x16).
 
If the RTX 5070 Ti through RTX 5090 are for professionals, will they have high FP64 performance for GPGPU? What about AI performance and RT performance?
 
@DemonicRyzen666
Able to name a single thing that's factually incorrect in my post?
 
Sadly, that is just how tech works. Nvidia is way overpriced in South Africa. The 4090 is 40-45k. That is more than double the price of a 7900 XTX, which is 19.1k. Even 4080s are around 25-26k. Obviously the 4090 is the latest and greatest, but I am not willing to pay 100% more when I can do basically the same (without ray tracing and at a slightly lower res) for half the price. The leader in front sets the market. We have seen this with Intel, AMD and Nvidia, not to speak of Apple products that are way overpriced for what you get. Another sad part is that they are going to be in short supply. Scalpers are going to buy them and sell them for 2-3x the price, and people with deep pockets will buy them.
 
The leader in front sets the market.
Nope - consumers set the market. They just don't know it and keep swallowing for some strange, perverted reason.
 
If the RTX 5070 Ti through RTX 5090 are for professionals, will they have high FP64 performance for GPGPU?
FP64 has been reserved for the x100 chips for quite some time now. Given that it's mostly relevant to CFD/HPC, I believe most other professionals can live without it.
 
Nope - consumers set the market. They just don't know it and keep swallowing for some strange, perverted reason.
True
As long as we are paying $1000+ or $2000+ for GPUs, they will keep charging those prices and more.
 
Nope - consumers set the market. They just don't know it and keep swallowing for some strange, perverted reason.
LoL, you mean people who are willing to pay ridiculous prices because they apparently have loads of money.
Even though I can afford these cards from failvidia, I will never buy them ever again.
I watched how badly all the so-called reviewers reacted at the release of the 7900 XT; a very good card always gets the same treatment: it can't beat failvidia, so it must be garbage.
For me the biggest fail is that all the failvidia cards make objects glare like a sun, while I see more realistic graphics playing games on my AMD cards.
That bling factor is, as far as I can see, the main reason people buy failvidia; there is nothing realistic about that insane bling.
I haven't upgraded since then, as my silly old 6800 XT beats many newer models in the games I play, except in the new garbage games I don't even want to buy. There isn't one game released that isn't a copy of another game.
And since I am a strategy game lover, I am stuck on older games, as nothing I've seen in the last 6 years makes me want to buy it.
And I don't care about the garbage movies and stories they cook up either; I want to battle, so even though it still crashes often, I play a few battles of Ashes of the Singularity pretty much every day.
Yes, they are old games, because the new games aren't worth buying at all. A friend buys every game he thinks is great and has hundreds in his Steam account, like me, and never ever plays them.
So when I buy new strategy games, I look at how they are and whether they could be a game I want.
Well, as I said, currently nothing is worth my money, and since scalpers bought all the 7900 XT cards, those are sold at silly prices as well.
So as long as my GPU is still the best and fastest card for my games, I will stay on my 6800 XT, as it runs fantastically.
I might consider the new AMD GPU if it performs better in my strategy games; if not, I will skip it again.
Gladly, I have many friends who buy new cards almost every year, so I can see whether they outperform mine or give better graphics, but so far nothing has.
Failvidia is, of course, not a candidate.
 
This is what the RTX 5000 series should cost, in accordance with two decades of Nvidia's own previous pricing and taking inflation into account as well.

RTX 5090 $1200 (a Titan-type product that can carry a price premium for being the absolute best)
RTX 5080 $800 (considering there is no Ti product this generation or last, this is the absolute best for normal consumers, meaning people who still have to work 8 hours a day, 6 days a week to afford stuff. This leaves Nvidia with at least 10% profit over what it costs to bring it to market)
RTX 5070 Ti $600 (likely to sit closer to the 5080 than the 5070; it's a bridge between the two cards, and historically $600 is not an insane price for this tier)
RTX 5070 $450 (considering it's got only 12 GB of VRAM this is a very weak card; 12 GB can already be used up in several games at the highest settings at just 1440p, and historically the 70 series has not been more than the $400-490 range)
RTX 5060 Ti $300 (forget the little Ti bit; we get a Ti only so it can be sold at a premium, so in essence it's the real 5060 and should not sell for more than $300)
RTX 5060 $230 (for all intents and purposes this is an x050-series card; it will come with a measly 8 GB of VRAM and a 128-bit bus, which is way too low for 2025)
RTX 5050 $130 (in reality this is a GT 730 / GT 1030-type card, barebones entry level; it should not cost a penny over $150 and should probably go for $120)
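The "taking inflation into account" part can be made concrete. A small Python sketch comparing some historical x70-class launch MSRPs against rough cumulative CPI multipliers to late-2024 dollars (the MSRPs are the real launch prices; the multipliers are approximations, not official CPI data):

```python
# Inflation-adjusted view of x70-class launch prices.
# MSRPs are real launch prices; the CPI multipliers are rough
# approximations to late-2024 dollars, not official figures.
launches = {
    "GTX 970 (2014)":  (329, 1.33),  # (launch MSRP $, approx. CPI multiplier)
    "GTX 1070 (2016)": (379, 1.30),
    "RTX 2070 (2018)": (499, 1.24),
    "RTX 3070 (2020)": (499, 1.22),
    "RTX 4070 (2023)": (599, 1.04),
}

for name, (msrp, cpi) in launches.items():
    print(f"{name}: ${msrp} then, about ${msrp * cpi:.0f} now")
```

Even with inflation applied, none of the historical x70 launches lands anywhere near current x70-tier street pricing, which is the point the list above is making.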
 
Pay up suckas.

You want the big benchmarks?

You need to spend the big dollas.
 
This is what the RTX 5000 series should cost, in accordance with two decades of Nvidia's own previous pricing and taking inflation into account as well.

Yes, I had the same dream and then woke up with my feet uncovered, and here in the attic it's 10°C. Besides, any self-respecting business overcharges by at least 50%; 10% is like nothing for the likes of the leather man. GDDR7 is expensive, and you can't know the BOM at this point. Anybody can get a 5090 with just 50 monthly savings; cut the calories a little bit.
 