
How much would you pay for RTX 4080?

How much would you pay for RTX 4080 at most?

  • Less than $400: 1,533 votes (12.9%)
  • $400: 522 votes (4.4%)
  • $500: 1,082 votes (9.1%)
  • $600: 1,715 votes (14.5%)
  • $700: 2,615 votes (22.1%)
  • $800: 2,569 votes (21.7%)
  • $900: 869 votes (7.3%)
  • $1000: 639 votes (5.4%)
  • $1100: 48 votes (0.4%)
  • $1200: 102 votes (0.9%)
  • $1300: 23 votes (0.2%)
  • $1400: 32 votes (0.3%)
  • More than $1400: 110 votes (0.9%)
  • Total voters: 11,859
  • Poll closed.
That makes no sense. Whether ray tracing exists or not doesn't mean non-RTX dGPUs would have been "uninvented". It's like saying to someone "You don't see the need for a PCIe 5.0 SSD? Go use a 1.44 MB floppy disk then", as if there's absolutely nothing in between those two extremes...
It's a comment on the ridiculousness of the notion of rejecting the new thing because the old thing is just fine. Yes, the old thing is fine for you; if you don't want progress, go back to the stone age. Like I mentioned earlier, as a game dev and graphics programmer, rasterized triangles are at their limit in terms of making a game look better for the amount of effort you put in (production-wise and runtime-wise). Ray tracing makes newer effects easier to implement and faster to run. It's hard to comment on "ray tracing" as such, because ray tracing has been done on shaders for many, many years for all sorts of effects. What we're really talking about is the hardware used to accelerate the most costly part of ray tracing: intersections with geometry. That's really all it is (on top of denoising algorithms and reconstruction, which is more AI-focused; AMD seems to be catching up there, but still isn't quite there).
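To make the hardware point concrete: the costly step being accelerated is the ray-geometry intersection test, which in software is typically the classic Möller–Trumbore ray-triangle test. A minimal sketch for illustration only (not any vendor's actual implementation, which runs this in fixed-function units alongside BVH traversal):

```python
# Möller–Trumbore ray-triangle intersection: the per-ray work that
# RT hardware accelerates (alongside BVH traversal).
def ray_triangle(orig, direction, v0, v1, v2, eps=1e-8):
    def sub(a, b):   return (a[0]-b[0], a[1]-b[1], a[2]-b[2])
    def cross(a, b): return (a[1]*b[2]-a[2]*b[1],
                             a[2]*b[0]-a[0]*b[2],
                             a[0]*b[1]-a[1]*b[0])
    def dot(a, b):   return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:            # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv
    if u < 0.0 or u > 1.0:        # outside barycentric range
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv          # distance along the ray
    return t if t > eps else None
```

A scene fires millions of these tests per frame, which is why moving them into dedicated silicon pays off.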

It's crazy to me that people DON'T want hardware to accelerate a pretty fundamental process on the GPU that ray tracing leans on heavily. If you don't want better-looking games, that's fine, but then why are you even commenting on high-end GPUs? This isn't even relevant to you. If you DO want better-looking games, trust me, this is the path of least resistance and it really opens the door to better things.

This is pixel shaders all over again. When pixel shaders came out with DX8.1/9, there were laggards who were just fine with a textured polygon and no shadows. Now look at games. 20 years from now, if we're still alive, people will be making the same argument again: "My ray tracing is just fine, I don't need no stinkin' neuro light field cortical signal injection".

You know, the more I see ray tracing (and I can use it), the more I think it's a complete waste of time, energy and money.

Pre-baked lighting is so good in many cases, in what are fast-moving games, that to me the difference is not worth the effort.

Plus, instead of pre-baked lighting giving us efficient and effective gaming, we're all supposed to buy cards using twice the eco credits (power) for three times the money credits, all while trying to save the earth by not using plastic bags, cars, or farting too much.

I think it's lunacy.

Ok, now do baked lighting for an open-world game. Implement a time-of-day system. Have the user move dynamic objects in a statically lit scene and have it look good. Yes, baked lighting has its place, and if you CAN use it you SHOULD. However, there are many use cases where it doesn't fit and we have to revert to dynamic lighting models, which look bad compared to the baked-lighting scenario (which a ray-traced scenario would look identical to).
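The time-of-day clash can be shown in miniature: a lightmap stores lighting computed for one fixed sun position, so once the sun moves, the baked values are stale, while a dynamic (or ray-traced) evaluation tracks the new direction. A toy sketch with a simple Lambert diffuse term, with purely illustrative numbers and no resemblance to any engine's code:

```python
import math

def lambert(normal, light_dir):
    """Simple diffuse term N.L, clamped to zero (toy lighting model)."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(d, 0.0)

up = (0.0, 1.0, 0.0)              # a flat, upward-facing surface

noon_sun = (0.0, 1.0, 0.0)        # sun directly overhead at bake time
baked = lambert(up, noon_sun)     # "lightmap" value frozen at noon

s = 1 / math.sqrt(2)
evening_sun = (s, s, 0.0)         # sun has dropped toward the horizon
dynamic = lambert(up, evening_sun)  # recomputed per frame

# The bake still claims full brightness; the dynamic result has dimmed.
print(baked, round(dynamic, 3))   # 1.0 0.707
```

Multiply that stale error across every surface and every hour of a day/night cycle and the bake stops being an option.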

You're not supposed to buy anything, if you want to be green, don't buy high-end computer hardware, just get something used and mid-range.
 
I'm pretty sure GTA V has shadows. Are they as good as RT? No. Did I say they were? No. Is it easy to not even notice when you're actually gaming?


Yes.

I mean, also, most games already had shadows; they're not new.


And you're allowed to like RT. I'm not the boss; I have RT, yet that's my opinion.

And let's be real about RT in use: most games don't do much with RT, and at best it still isn't much more than a gimmick. Fully simulated AI and physics, true 3D sound and full path tracing of all photons would be nice too, but who's got a fu£#@ render farm and power station?
 
I'm pretty sure GTA V has shadows. Are they as good as RT? No. Did I say they were? No. Is it easy to not even notice when you're actually gaming?


Yes.

I mean, also, most games already had shadows; they're not new.


And you're allowed to like RT. I'm not the boss; I have RT, yet that's my opinion.

A common misconception is that raytracing is only pretty shadows or reflections. True, so far, we've seen only heavily denoised soft shadows and accurate reflective surfaces, but the truth is, those are relatively low complexity graphics techniques afforded by raytracing.

For the first time in computer graphics, RT allows for a realistic and accurate simulation of atmospherics - complete with refraction, accurate subsurface scattering, depth of field, motion blur, caustics and barycentrics, light dispersion phenomena etc. - with mathematical precision. When Jensen Huang said it was the holy grail of computer graphics, he wasn't lying or trying to upsell the technology. It is the real deal. Without it, future generations of ultra-high-resolution graphics would not be feasible.
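For a concrete taste of one item on that list: refraction at a surface hit reduces to Snell's law, which a ray tracer solves at every intersection with a transparent medium. A minimal 2D sketch of just the angle computation, for illustration only:

```python
import math

def refract_angle(theta_in, n1, n2):
    """Snell's law: n1*sin(theta1) = n2*sin(theta2).
    Returns the refracted angle in radians, or None when there is
    no real solution (total internal reflection)."""
    s = n1 * math.sin(theta_in) / n2
    if abs(s) > 1.0:              # total internal reflection
        return None
    return math.asin(s)

# Air (n=1.0) into glass (n=1.5): the ray bends toward the normal.
t = refract_angle(math.radians(45), 1.0, 1.5)
print(round(math.degrees(t), 1))  # 28.1

# Glass back into air at 45 degrees exceeds the critical angle (~41.8),
# so the ray reflects internally instead.
print(refract_angle(math.radians(45), 1.5, 1.0))  # None
```

A full renderer does this per ray in 3D with vectors and Fresnel terms, but the physics it is reproducing exactly is this simple law.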

True, many of these techniques have been simulated with rasterized graphics relatively well, even if it took several generations of hardware and multiple iterations to do so - for example, NVIDIA's 2012 "A New Dawn" tech demo for Kepler showcased rasterized subsurface scattering, and that relatively simple scene with one fairy took everything the original GTX Titan had and more to pull off. A full scene with that fidelity is still extremely difficult to pull off in an AAA game today, a decade later. I will argue that raytracing is still taking its baby steps; it's an infant technology with almost limitless potential when combined with other advanced graphics techniques provided by DirectX 12 Ultimate such as variable rate shading, conservative rasterization, etc.


It's not even limited to graphics; sound can also be ray traced for unparalleled positional accuracy. For the time being, it's absolutely not essential if you are gaming on a budget, even though you should opt for a GPU that has at least basic support for the functionality. But I would not be so harsh on it. I personally believe in the technology, but it is definitely not useless just as it is not essential for today's games, especially if you're still playing GTA V, lol
 
It's a comment on the ridiculousness of the notion of rejecting the new thing because the old thing is just fine. Yes, the old thing is fine for you; if you don't want progress, go back to the stone age.
I'm pretty sure that's not what people are doing, they're more saying "the cost + performance penalty isn't ready for mainstream yet" and they're still not wrong. The bulk of major GPU improvements throughout history have typically been about improving what came before in a more efficient way, not doing something in the most inefficient way possible purely to repurpose a 'designed for commercial use' GPU for gamers. Eg, Hardware Transform & Lighting significantly sped up frame-rates vs the status quo (real-time shadows via CPU software rendering), in many of these cases it certainly wasn't a case of "pay double for half the frame-rates then call everyone else a luddite whilst mumbling about 'progress'". Saying "you have a binary choice of paying £2,000 GPU for RTX or go back to Intel iGPU's with absolutely no option in between" sounds even more of a ridiculous extremist mindset than the people you're trying to mock for being 'ridiculous' though. :rolleyes:

A common misconception is that raytracing is only pretty shadows or reflections. True, so far, we've seen only heavily denoised soft shadows and accurate reflective surfaces, but the truth is, those are relatively low complexity graphics techniques afforded by raytracing. For the first time in computer graphics, RT allows for a realistic and accurate simulation of atmospherics - complete with refraction, accurate subsurface scattering, depth of field, motion blur, caustics and barycentrics, light dispersion phenomena etc. - with mathematical precision. It's not even limited to graphics; sound can also be ray traced for unparalleled positional accuracy.
For some reason I'm reminded of Thief The Dark Project which came out 24 years ago with some seriously impressive audio physics propagation / 3D positional audio and more than a few people bought expensive sound cards in anticipation of that "being the future". Two decades later, no-one even makes an attempt (including the painfully bad 2014 sequel) because it's "too much effort" or the developer has been instructed by their publisher to spend limited valuable development time on "more important stuff" (micro-transactions, DLC-spam, etc). It sounds nice in an advert for selling expensive GPU's but I'll file this under "I'll believe it when I see it" on any serious mainstream level beyond a couple of tech-demo showcases for exactly the same reason AI, audio, etc, have hardly improved in +15 years for reasons other than lack of hardware...
 
Perhaps those who voted under $400 would consider buying after the warranty is over.
In 3 years you end up with no drivers, limited new features and support, a repaste and new thermal pads needed, etc.
So how much is a 980 worth today? $100-something on eBay?
 
$700 for me, that's with taxes. I don't feel the need to upgrade every gen, especially at those prices.... (though I am tempted)

RT is a boring improvement as of now, for me. It's a slight visual improvement that's not worth the "cost", when there are a lot of other areas where I'd rather see improvement. AI, as mentioned (UEBS 2 has one of the coolest advancements there, running the (basic) RTS unit AI on the graphics card and enabling million-unit battles); better AI in FPS games; RT sound would be way cooler than RT light.
 
I'm pretty sure that's not what people are doing, they're more saying "the cost + performance penalty isn't ready for mainstream yet" and they're still not wrong. The bulk of major GPU improvements throughout history have typically been about improving what came before in a more efficient way, not doing something in the most inefficient way possible purely to repurpose a 'designed for commercial use' GPU for gamers. Eg, Hardware Transform & Lighting significantly sped up frame-rates vs the status quo (real-time shadows via CPU software rendering), in many of these cases it certainly wasn't a case of "pay double for half the frame-rates then call everyone else a luddite whilst mumbling about 'progress'". Saying "you have a binary choice of paying £2,000 GPU for RTX or go back to Intel iGPU's with absolutely no option in between" sounds even more of a ridiculous extremist mindset than the people you're trying to mock for being 'ridiculous' though. :rolleyes:


For some reason I'm reminded of Thief The Dark Project which came out 24 years ago with some seriously impressive audio physics propagation / 3D positional audio and more than a few people bought expensive sound cards in anticipation of that "being the future". Two decades later, no-one even makes an attempt (including the painfully bad 2014 sequel) because it's "too much effort" or the developer has been instructed by their publisher to spend limited valuable development time on "more important stuff" (micro-transactions, DLC-spam, etc). It sounds nice in an advert for selling expensive GPU's but I'll file this under "I'll believe it when I see it" on any serious mainstream level beyond a couple of tech-demo showcases for exactly the same reason AI, audio, etc, have hardly improved in +15 years for reasons other than lack of hardware...
Yep, all without RT.

And with RT we usually only get one or two of the many features it could bring.

Pointless despite being the future.
@BSim500 I'll quote myself.
And let's be real about RT in use: most games don't do much with RT, and at best it still isn't much more than a gimmick. Fully simulated AI and physics, true 3D sound and full path tracing of all photons would be nice too, but who's got a fu£#@ render farm and power station?
 
there is really no need for anybody to move from the 3xxx series nvidia cards to the 4xxx series unless they want or need more than what the 3xxx series offers...

trog
 
there is really no need for anybody to move from the 3xxx series nvidia cards to the 4xxx series unless they want or need more than what the 3xxx series offers...

trog

At this point the RTX 40 is nothing more than new tiers above the already existing RTX 30.
RTX 4080 is actually RTX 3095 Ti
RTX 4090 is actually RTX 3099 Titan Ultra.

They will exist at the same time, nvidia doesn't want to replace and EOL the RTX 30 series.
 
At this point the RTX 40 is nothing more than new tiers above the already existing RTX 30.
RTX 4080 is actually RTX 3095 Ti
RTX 4090 is actually RTX 3099 Titan Ultra.

They will exist at the same time, nvidia doesn't want to replace and EOL the RTX 30 series.

i think making the 4xxx series an extension of the 3xxx series makes a lot of sense but that is just me others will think differently..

trog
 
The current prices in Germany, in euros:

Radeon RX 6400 - 152.89
Radeon RX 6500 XT - 199.00
Radeon RX 6600 - 259.00
Radeon RX 6600 XT - 339.00
Radeon RX 6650 XT - 329.00
Radeon RX 6700 XT - 399.00
Radeon RX 6750 XT - 455.00
Radeon RX 6800 - 499.00
Radeon RX 6800 XT - 648.00
Radeon RX 6900 XT - 699.00
Radeon RX 6950 XT - 880.00
And a little comparison from the latest Hardware Unboxed video (chart attachment not reproduced here).
 
Voted: $800 (as the absolute most)

....a coerced vote, induced by my repeated mentions over several months in several threads that I absolutely will not go beyond $800 for my next GPU upgrade.... as quoted, "not a single dime/cent more". I guess this was my way of finally putting a brake on impulsive hardware purchases that feed over-fed performance/eye-candy enthusiasm. Or just a method employed to be less of a profit casualty or prey.

The reason I mention the above: I don't value the 4080 at $800; for me it's a $500 card... unless production or bringing-to-shelf costs and poor Nvidia profit margins suggest otherwise. But I would fork out $800, what can you do... I wouldn't mind paying $300 on top if it achieves 4080-like execution, which is right up my alley for the targeted/desired performance. Please don't start mentioning 10-year-old previous-gen xx80 prices to contest the $500 mention... as if previous-generation GPUs were all utopia-romantically priced with consumers having the last laugh.

....finally, it's an inconsequential post without a mention of $1200 being an absolute (2x-4x fantasized but futile) stripped-butt-naked rip-off MSRP. Wouldn't touch it if you paid me (provided I get to keep the money :p).

-----------------------------

11.6% for less than $400 - I feel for you guys!

2.8% of all votes above $1000 - you feel for us guys!

How about the 2.8%'ers redirect those charitable donations from Nvidia's pockets to the 11.6%'ers instead, so eventually we can all average up closer to the $500-$800 region, lol... a stronger data point to make things easier for W1zzard, hehe
 
Nvidia's marketing guys aren't stupid: they know the 4xxx cards are overpriced, but they're doing it on purpose so that people keep buying 3xxx series cards to clear the post-mining stock and make more bucks from rich early adopters. Once 3xxx card stocks are gone, they'll probably lower 4xxx card prices. But it might take several months before this happens.

In other words, the market is still adjusting to mining and COVID. Even though those two causes are now gone, it'll still take several months to get back to normal, provided there are no new production or mining issues in the meantime.
 
Average prices of GPUs in my country today.....

rtx 4090 24gb = US$ 1900
rtx 4080 16gb = US$ 1500
rtx 3090 ti 24gb = US$ 1400
rtx 3090 24gb = US$ 1250
rtx 3080 ti 12gb = US$ 850
rtx 3080 10gb = US$ 650

with that performance.... note that those prices do not include a new 1000 W 80+ Gold PSU (US$ 200)....

so, after Radeon releases the RX 7900 XTX 24 GB and RX 7900 XT 20 GB next month, the price of the RTX 4080 16 GB should fall below US$ 1000-1200......
 
I'm pretty sure that's not what people are doing, they're more saying "the cost + performance penalty isn't ready for mainstream yet" and they're still not wrong. The bulk of major GPU improvements throughout history have typically been about improving what came before in a more efficient way, not doing something in the most inefficient way possible purely to repurpose a 'designed for commercial use' GPU for gamers. Eg, Hardware Transform & Lighting significantly sped up frame-rates vs the status quo (real-time shadows via CPU software rendering), in many of these cases it certainly wasn't a case of "pay double for half the frame-rates then call everyone else a luddite whilst mumbling about 'progress'". Saying "you have a binary choice of paying £2,000 GPU for RTX or go back to Intel iGPU's with absolutely no option in between" sounds even more of a ridiculous extremist mindset than the people you're trying to mock for being 'ridiculous' though. :rolleyes:


For some reason I'm reminded of Thief The Dark Project which came out 24 years ago with some seriously impressive audio physics propagation / 3D positional audio and more than a few people bought expensive sound cards in anticipation of that "being the future". Two decades later, no-one even makes an attempt (including the painfully bad 2014 sequel) because it's "too much effort" or the developer has been instructed by their publisher to spend limited valuable development time on "more important stuff" (micro-transactions, DLC-spam, etc). It sounds nice in an advert for selling expensive GPU's but I'll file this under "I'll believe it when I see it" on any serious mainstream level beyond a couple of tech-demo showcases for exactly the same reason AI, audio, etc, have hardly improved in +15 years for reasons other than lack of hardware...

Your examples work because they are cherry-picked. Let me cherry-pick some examples. The advent of shader programs: better-looking games, lower frame rates, and simply not possible on cards without it. You also have no idea what you're talking about with game audio; there are seriously impressive libraries and technologies used in games to make really awesome soundscapes (spoiler: I've developed games, AAA). All the audio tech of the past was accelerated by sound cards because CPUs were too slow. Now that sound is a tiny task for CPUs, we have no need for sound accelerators anymore; that tech never went away, it's here today. You're also wrong about the performance penalty: for the effects they enable, they give a performance BOOST compared to graphics cards without ray tracing hardware doing the SAME effects. The NET effect of ray tracing is that it takes more time and thus produces a lower frame rate... but this has always been the case with new tech in this regard, because you're rendering an effect that wasn't previously there. The same actually applies to hardware T&L: it accelerates that same EFFECT in hardware. It's essentially the same as ray tracing hardware in that regard, because if you get a card without dedicated ray tracing bits, it runs even SLOWER. I'll grant you that ray tracing has yet to produce the same shift in real-time graphics as shader programs did, though, but we're only really at the beginning here.
 
Perhaps those who voted under $400 would consider buying after the warranty is over.
In 3 years you end up with no drivers, limited new features and support, a repaste and new thermal pads needed, etc.
So how much is a 980 worth today? $100-something on eBay?

WTF is that stupid comparison? The 980 was released 7 years ago.
Look at the 3000 series' release date, the years since, and its current prices... (can't imagine if crypto were still high)
 
With all the controversy surrounding the RTX 4080 launch, we're wondering. How much is the card worth in your opinion? What's the highest you're willing to pay for it?

Although I admire:love:W1zzard's work in his reviews, his polls need a little refinement, I think, since they can be confusing (in the past I had an issue with another poll).

First of all, the most basic point: the amount I'm willing to pay varies over time, depending on my income at a given point.
Specifically, right now my finances are extremely tight, so I wouldn't pay more than $500-600 for an RTX 4080. But if my finances were similar to better (past) times, I wouldn't mind paying close to $1000 for the RTX 4080.
So the poll's title, "How much would you pay for RTX 4080?", is confusing.
Does it mean "how much would you be willing to pay at some point during the RTX 4080's life cycle, whenever your finances get optimal", or "how much would you be willing to pay right now, at the RTX 4080's launch"?
As I said, my answer would differ depending on the time of purchase, and since I believe my finances will have improved a few months from now, time is important for me in choosing an answer.

Secondly, W1zzard's quoted sentence is also confusing: how much I think the card is worth is different from how much I'd be willing to pay for it.
--I think the card is worth its asking price: TSMC has increased the wafer cost, so it's inevitable that prices will go up compared to the past.
As for the card itself, we are talking about huge generational leaps in all areas: raster performance, RT performance, a mind-blowing power-efficiency:love: improvement, and it uses the same cooler as the RTX 4090, which guarantees excellent thermals (thus longevity) and lower noise. Everything about this card seems excellent:love:
--But nevertheless, as I said, the highest amount I'm willing to pay differs: although I believe it's worth its price, I don't think I can afford a price over $1000.
 
It should be free; enough people don't know any better than a capitalist system. Well done, folks. We are now screwed.
 
Same price I paid for my 12GB 3080. $700.
 
A 4080 at $450 would be nice - 4080 Ti $600, 4090 $750, 4090 Ti $900.

Just remember the past: in 2010, high-end GPUs were $300-500, and the dual-GPU monsters from AMD and Nvidia were around $700.

We need to go back to the prices from before 2010. The companies will never be happy with the money they get; with every passing year they get more and more hungry, and here we are today!
 
A 4080 at $450 would be nice - 4080 Ti $600, 4090 $750, 4090 Ti $900.

Just remember the past: in 2010, high-end GPUs were $300-500, and the dual-GPU monsters from AMD and Nvidia were around $700.

We need to go back to the prices from before 2010. The companies will never be happy with the money they get; with every passing year they get more and more hungry, and here we are today!

lets not forget that a few years back we had SLI.. so high end would have been 500 x 2 i always used to run sli.. right up until i bought a 2080ti..

trog
 
I would pay no more than £500 for a graphics card. At the end of the day it's just a graphics card; it's only part of the PC.

I have two main gaming PCs (I move around a lot with work): one has an FE 3070 that cost me £400, and one has an RX 5600 XT that cost me, I think, £200. Both of these PCs have Ryzen 2700X CPUs. I'm not an AMD fanboi; neither am I into Nvidia or indeed Intel. I buy what suits at the time.

What I would say though, about Nvidia's prices: if we pay the prices they demand, they will take our money and put the prices up. If sales are down, prices will go down. £1000+ for a graphics card? No. It's not that I can't afford it, I can, but no. I won't pay those sorts of prices.
 
I would pay no more than £500 for a graphics card. At the end of the day it's just a graphics card; it's only part of the PC.

I have two main gaming PCs (I move around a lot with work): one has an FE 3070 that cost me £400, and one has an RX 5600 XT that cost me, I think, £200. Both of these PCs have Ryzen 2700X CPUs. I'm not an AMD fanboi; neither am I into Nvidia or indeed Intel. I buy what suits at the time.

What I would say though, about Nvidia's prices: if we pay the prices they demand, they will take our money and put the prices up. If sales are down, prices will go down. £1000+ for a graphics card? No. It's not that I can't afford it, I can, but no. I won't pay those sorts of prices.

Yep. I loved how the 3090 Ti pricing went down quite fast when very few people were stupid enough to buy them.
 