RTX 5090 ridiculous price!

I don't get why people are complaining(!). AIB 4090s cost 3k+ USD in Turkey, assuming you can find an "importer" you trust.
 
I can sell you AIB RTX 5090 for 10k USD in Istanbul airport, assuming you know I can be trusted.
 
When I see people, this time from one of the countries with the highest living standards in the world, whining about the price of a high-end consumer graphics card they could easily earn in a month doing any kind of job, and doing so every time a new generation is released, I want to organize a charity fund. Let's donate $1 each to buy them this card and ease their suffering.
Jokes aside, I'd buy Jensen another leather jacket for this drama piece, but a good jacket is pricey here, where monthly wages are still around $600.
 
There will always be inflated prices. My RTX 4090 is no different. But it seems like I got mine for a decent price, seeing everything that has happened.

Well, I paid 17,000 DKK with tax. That is around 2,370 USD in my country.

Without tax it was 12,750 DKK, or 1,777 USD. So yeah, a bit over the 1,600 USD MSRP.
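Just as a sanity check on those figures, here's a minimal sketch. The ~7.17 DKK/USD rate is an assumption back-calculated from the numbers above, not an official exchange rate; note the effective tax works out to ~33%, above Denmark's 25% VAT, so it presumably includes more than VAT alone.

```python
# Sanity check of the quoted RTX 4090 prices.
# DKK_PER_USD is an assumed rate, implied by the poster's own figures.
DKK_PER_USD = 7.17

with_tax_dkk = 17_000
without_tax_dkk = 12_750

print(round(with_tax_dkk / DKK_PER_USD))     # ≈ 2371 USD (post says ~2370)
print(round(without_tax_dkk / DKK_PER_USD))  # ≈ 1778 USD (post says ~1777)

# Effective tax on top of the pre-tax price, in percent:
print(round((with_tax_dkk / without_tax_dkk - 1) * 100))  # ≈ 33 %
```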

I got my Gigabyte GeForce RTX 4090 Gaming OC back in November 2022, so yeah, I was an early adopter of the RTX 4090.

And seeing prices now, and what the 5090 costs versus the rumored native performance uplift of 15-35% over the RTX 4090, I can't see a reason to upgrade.
 
Maybe people aren't complaining because 90% of potential buyers aren't much interested in gaming?
The price bakes in the ability to do much more than gaming, and the 32 GB frame buffer is also a strong point.

Why are Quadros priced 2-4x higher than x090s?
48 or 64 GB of frame buffer alone isn't worth multiple thousands of dollars.
It's what you can achieve with them, and by paying for them you get access to more advanced software; that's the second key factor besides the hardware itself.
Eh, tbh most of the 2-4x price tag is indeed due to the extra VRAM or licensing agreements with Nvidia. Otherwise, there's no "more advanced software" compared to your regular GeForce counterpart.
Oh, these will sell to gamers alright. The extreme specs don't necessarily translate to "it's a professional or AI GPU", it's just the next-generation's flagship product. It's not just the AI crowd. The latest Steam Hardware Survey places the RTX 4090 at a higher utilization rate than even the RX 580, which is by far the best selling GPU model of all time (thanks to cryptocurrency miners). List here includes iGPUs and mobile dGPUs:

[Attachment: Steam Hardware Survey GPU chart]
This may well be fake news, but I've heard that ~80% of the total 4090 supply was sold to AI farms.
So yeah, gamers will buy it, but it's far from the majority of the buyers, and far from the reason they're priced like that.
No matter the price, sheep will buy these and we can thank them for continuously increasing prices.

Wonder why practically no gamer bought a GTX Titan/Titan Black/Titan X/Xp for ~1000EUR but now it's totally okay to get a 2500EUR+ card every two years. Inflation isn't a valid answer.
Because back then you had a 780 Ti/980 Ti/1080 Ti/2080 Ti that delivered almost the same, or even higher, performance at a fraction of the cost. Same reason the 3090/3090 Ti didn't sell as much compared to the 3080/3080 Ti.
You don't have a similar thing happening with the 4000 and 5000 generations.
 
I don't blame tech YouTubers. I blame hardware becoming extremely advanced and difficult to manufacture, with ever higher standards of quality to be met.

Look at an 8800 GTX: plastic shroud, a blower, and a hunk of metal. Hell, here's mine, the fancy professional model, and that's really it.

[Attachment: photo of the card]

Compare that to a modern GPU, where instead of some plastic you have a die-cast metal frame, and instead of a cheap blower you've got four carefully designed fans. Back then there was no lighting, no fancy anything; even the chokes on a modern card are leaps and bounds ahead of those from the era when GPUs were perceived to be a lot cheaper.

I don't think many people are even accounting for these things. Meanwhile, it's not only that: software development and engineering costs have also gone through the roof, and the R&D budget is probably a thousand times that of the G80's...

So yeah, GPU prices will go up across every segment, but especially the flagship tier, which is where they have the margins to fund the whole venture.

It's a fair point. While everything is relative to its era, ultimately the machines and GPUs we have today are orders of magnitude more advanced than in the early 3D era, so it stands to reason they should be more expensive, both in manufacturing and in R&D. Same with game development. And that's not even factoring in inflation.

It's still weird, though. The £1,000 Titans we're talking about ultimately weren't that long ago, and back then it was kind of bonkers spending that much on a GPU, yet somehow now £2,000 is kind of normal :D
 
There will always be inflated prices. My RTX 4090 is no different. But it seems like I got mine for a decent price, seeing everything that has happened.

lol.
 
Maybe, but Nvidia has some famous blunders in its past; they can always fail upwards because of their position in this market. Even when they screw up pricing, it's mostly other people's problem, just like AMD is doing now:

AMD ain't done shooting themselves in the foot yet:

[Attachment: news screenshot]
 
March!?! oof.
 
March!?! oof.

The date wouldn't be bad if they'd, you know, actually talk about performance and pricing. If the cards looked as strong as the rumors and were priced well, people wouldn't mind waiting.

The fact that they don't really want to talk about either, and are instead waiting for the 70-series reviews, makes it look like they just want to price it like they always have. So business as usual, I guess; always late to the party...

I miss the 7970-era AMD that had the balls to go first and price a product... Those were exciting times. Or even the 290X-era AMD that came out with a product that beat the more expensive 780 and matched the Titan... I'm starting to feel old missing the good old days...
 
The date wouldn't be bad if they'd, you know, actually talk about performance and pricing. If the cards looked as strong as the rumors and were priced well, people wouldn't mind waiting.

The fact that they don't really want to talk about either, and are instead waiting for the 70-series reviews, makes it look like they just want to price it like they always have. So business as usual, I guess; always late to the party...

I miss the 7970-era AMD that had the balls to go first and price a product... Those were exciting times. Or even the 290X-era AMD that came out with a product that beat the more expensive 780 and matched the Titan... I'm starting to feel old missing the good old days...
Tahiti will always hold a fond place in my heart. An amazing chip value for enthusiasts...
 
Tahiti will always hold a fond place in my heart. An amazing chip value for enthusiasts...

I loved both the 7970 and the 290X... The 290X allowed me to skip Nvidia's 700 series, which is the only generation I've skipped since the 400 series... Probably skipping the 5000 series too, though, lol.
 
Tahiti will always hold a fond place in my heart. An amazing chip value for enthusiasts...
My 280X served me well, RIP.
 
Maybe people aren't complaining because 90% of potential buyers aren't much interested in gaming?
The price bakes in the ability to do much more than gaming, and the 32 GB frame buffer is also a strong point.

Why are Quadros priced 2-4x higher than x090s?
48 or 64 GB of frame buffer alone isn't worth multiple thousands of dollars.
It's what you can achieve with them, and by paying for them you get access to more advanced software; that's the second key factor besides the hardware itself.
What's the point, though? I find the 5090 too expensive for what it is, at least for gamers. I really do. And that's most likely why I'm not gonna buy it. Not the end of the world. But in a world where the competition can't even match my 2-year-old card, or better yet, not even my 4-year-old 3090 where it matters, yeah, prices are gonna be painful.
 
And from the next driver on (DLSS 4), you can use DLSS in any game, even unsupported ones, which works only on RTX video cards. That's logical and not a problem. Those GTX 1080 Ti or 1080 fanboys should accept that their cards are outdated lol... (no problem using them, you just have to understand that they are no longer "top" or even "great" lol)

I wouldn't be surprised if the Release 570 driver dropped support for pre-RTX video cards, with the possible exceptions of Volta (Titan V) and small Turing (GTX 16 series). The RTX 20 series GPUs are now 7 years old, and Pascal is 9 years old. The current driver branch supports as far back as first-generation Maxwell; the 750 Ti is now 11 years old.

Regardless, this isn't new: DLSS requires a GeForce RTX card to begin with. It doesn't work on earlier models, and that includes even DLSS 1.0 from way back.
Why the f should some "Titan" cards be an "exception"? That's stupid, and Nvidia makes things COMPLICATED on its own:
I have a GTX 700 series card, I go get the "latest" driver, then BOOM, the driver says "incompatible". I open GPU-Z, and WOW, MAGIC, my GPU die is a renamed 600-series die! WTF Nvidia? So all those "Titan"s must be buried the same way then!

And the GTX 16 series is already 3 generations old, too... :rolleyes:
 
Worse than the price is the availability; there are almost none in recent shipments. Some retailers got 20 RTX 5080s and zero 5090s. As with the 4090, Nvidia doesn't want to see the 5090 going down in price, so they aren't making a lot of them.
Most of them aren't for the public; you can find some in full prebuilt towers from Nvidia partners.
 
What's the point, though? I find the 5090 too expensive for what it is, at least for gamers. I really do. And that's most likely why I'm not gonna buy it. Not the end of the world. But in a world where the competition can't even match my 2-year-old card, or better yet, not even my 4-year-old 3090 where it matters, yeah, prices are gonna be painful.
I didn't find any graphs where the 7900 XTX wouldn't at least match the 3090's performance, other than Blender 4 rendering & Stable Diffusion. Are these the important things you are mentioning? Or is it just FSR?
 
Why the f should some "Titan" cards be an "exception"? That's stupid, and Nvidia makes things COMPLICATED on its own:
I have a GTX 700 series card, I go get the "latest" driver, then BOOM, the driver says "incompatible". I open GPU-Z, and WOW, MAGIC, my GPU die is a renamed 600-series die! WTF Nvidia? So all those "Titan"s must be buried the same way then!

And the GTX 16 series is already 3 generations old, too... :rolleyes:

It's not that it's an exception; it's that Volta is indeed newer than Pascal and has more in common with Turing than with Pascal. The fact that only a single $3,000 model with this architecture was ever released to consumers (the Titan V) could sway it either way: either it gets phased out and forgotten, or it's thrown an extra bone.
 
I didn't find any graphs where the 7900 XTX wouldn't at least match the 3090's performance, other than Blender 4 rendering & Stable Diffusion. Are these the important things you are mentioning? Or is it just FSR?
I was talking about the games where the 3090 already poops the bed: anything with a decent amount of RT. You know, the Cyberpunks and the like.
 
I paid 650 euros for my 3060 Ti during the "troubles", and I never imagined things would be this stupid for this long. I had no choice; even second-hand cards were insane back then. People now have a choice, and they decide to enable these prices.
 
I was talking about the games where the 3090 already poops the bed: anything with a decent amount of RT. You know, the Cyberpunks and the like.
Oh yeah, now that you mention it, average RT is higher on the 7900 XTX, BUT in CP2077, Hogwarts Legacy & "Ratchet & Clank" it is below the 3090... Interesting. Thanks for the info.
 
That's not where the money is going; that's a drop in the bucket. The one getting great margins is Nvidia, not the ones building or selling the card.
And all the R&D and price negotiations were relatively more costly for Nvidia back then, as a small niche company, than they are now, as one of the largest companies in the world, diluting costs across other segments that are even more profitable.
Not sure I agree with this. First of all, Nvidia does build and sell cards, which is what makes this fascinating, as it really does show up the AIBs for the liars that they are.

As Dr Dro stated, Nvidia funds the research and provides the software, firmware, etc. The AIBs, for the most part, are just slapping on a cooler (usually rehashed from the previous generation), branding the card, and running their own distribution and marketing channels.

Looking at Nvidia's UK site right now.

The 4080 Super FE is £950, i.e. MSRP. This is a fully built card, and the cooler is very likely more expensive to manufacture than the cheap plastic stuff AIBs use.
The Asus 4080 Super TUF is £1,100. Given this is only a TUF, there was likely an even higher-priced Asus card before it, but it's delisted from most places now.
Asus has a 4090 that's £900 more than a Gigabyte 4090: nearly 3,000 USD.

I bought my 3080 FE for £650 (MSRP) from Nvidia. At the time, EVGA and co. were making big profits selling direct to miners (for some reason people thought Nvidia was doing this, but it was the AIBs). The few cards that made it to retail were like the Wild West: a retailer would order, say, 500 of a SKU at price XX, then the AIB would relent and say "we can only give you 100 because we sold the rest to miners, and the price is now YY if you still want them", and there would be bidding wars for the stock. This was made very public by a UK retailer, who allowed a Founders Edition thread on their forums to help people buy FE cards from their main competitor (Nvidia's authorised retailer). I remember seeing a Gigabyte 3080 for nearly £1,500 retail when I paid £650 for my FE.

Nvidia also forced the retailer to make technical adjustments to try to keep scalpers out: one unit per address, one unit per payment card, no foreign addresses, no PO boxes. This was a total lifetime limit; you couldn't, e.g., order one and then later order another. No other retailer made the kind of effort Nvidia did. Nvidia also later mining-limited the chips sold to AIBs, as they knew the AIBs were taking the mickey. Because of this, FE cards were considerably easier to get than AIB cards, and they were way cheaper and better built to boot.
 
Even in Canada... I think the price will go up to 4,000 CAD... because we have something called scalpers.
 
It's not that it's an exception; it's that Volta is indeed newer than Pascal and has more in common with Turing than with Pascal. The fact that only a single $3,000 model with this architecture was ever released to consumers (the Titan V) could sway it either way: either it gets phased out and forgotten, or it's thrown an extra bone.
My thought is it should be RIP. $3,000 is the cost of a VERY GOOD PC in ANY timeframe. Whoever buys just a GPU for such a price FOR PERSONAL use must realize that it's just "bling bling".
 