
RTX 3080 Users Report Crashes to Desktop While Gaming

At the end of the day, the first thing most people care about is perf/dollar, followed by perf/watt, then additional features. With Ampere, Nvidia has placed the most emphasis on perf/dollar while getting a small efficiency uplift on the side.
Actually, no. Perf/W and perf/$$$ are derived from the three things that matter: performance, price, and power draw.

Using derived measures, one can complain to no end if one really wants to find fault. Is the performance better than the previous generation? Then complain about perf/W. Is perf/W OK? Derive again and complain that the increase in perf/W is smaller than the previous generation's. And so on and so forth.

Despite the discussion on internet forums, the primary parameters of a video card are the three I named above, with the caveat that some do not care about power draw (for various reasons), while some (lucky bastards) can ignore the price altogether.
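To make "derived" concrete, here is a minimal sketch; the card names and numbers are invented for illustration, not review data:

```python
# Minimal sketch: perf/$ and perf/W are just ratios of the three primary
# parameters (performance, price, power draw). Numbers are made up.

def derived_metrics(perf, price_usd, power_w):
    """Return (perf per dollar, perf per watt) from the three primary figures."""
    return perf / price_usd, perf / power_w

# Hypothetical cards, not measured data: (relative perf, price USD, power W)
cards = {
    "previous gen": (100, 700, 250),
    "new gen": (160, 700, 320),
}

for name, (perf, price, power) in cards.items():
    ppd, ppw = derived_metrics(perf, price, power)
    print(f"{name}: {ppd:.3f} perf/$, {ppw:.3f} perf/W")
```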
 

Do you consider Performance, Price and Power separately or together? Because if you look at them separately, you are just a clueless buyer.
Highest performance? RTX 3090.
Lowest price? iGPU.
Lowest power? iGPU.
Now if you consider them all together, the 3080 is certainly an upgrade over the 2080 in every metric. And if you don't know how to reduce the power consumption to your liking, then you shouldn't be playing on PC; go grab a PS5.
 
Do you consider Performance, Price and Power separately or together?
Do you look at width, length and depth separately when purchasing a sofa?

The way I look at it is: about 200 W TDP (because I don't want a big PSU and fancy cooling) and a ~$250 budget. Then I go and look at what kind of horsepower I can get within those limits. If it's at least 20% faster than what I currently have, I may consider an upgrade.
As you can see, perf/W or perf/$$$ is only an indirect part of that equation.
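That filter is easy to write down as an explicit rule; a minimal sketch of the same decision, where the candidate list and its numbers are placeholders, not real cards:

```python
# Sketch of the upgrade rule above: cap TDP and price first, then only
# consider cards at least 20% faster than the current one. Candidates
# and their figures are placeholders.

MAX_TDP_W = 200
MAX_PRICE_USD = 250
MIN_UPLIFT = 1.20        # at least 20% faster than the current card
current_perf = 100       # relative performance of the card I own now

candidates = [
    # (name, relative perf, price USD, TDP W)
    ("card A", 115, 240, 180),
    ("card B", 135, 330, 220),
    ("card C", 125, 250, 200),
]

worth_a_look = [
    name for name, perf, price, tdp in candidates
    if tdp <= MAX_TDP_W and price <= MAX_PRICE_USD
    and perf >= MIN_UPLIFT * current_perf
]
print(worth_a_look)      # -> ['card C']
```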
 
Used to run my 980 Ti with one cable, though now I'm using two, as was recommended.
Considering a 980 Ti is fine power-wise (only 250 W TDP), it would be fine: NVIDIA has tested the 8-pin to go up to 175 W safely, which is why some of their Quadros rated for 250 W use only one 8-pin. Running higher-TDP cards (300 W+) on a single cable is too much, though, unless it is the 12-pin, which is rated for more.
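For a rough sense of the arithmetic: per the PCIe specifications the slot supplies up to 75 W, a 6-pin up to 75 W, and an 8-pin up to 150 W; the ~175 W per 8-pin above is the tested headroom the post attributes to NVIDIA, not a spec value. A small sketch adding up a spec-rated budget for a given cable setup:

```python
# Back-of-the-envelope board power budget from the PCI-SIG ratings:
# slot 75 W, 6-pin 75 W, 8-pin 150 W. (The ~175 W-per-8-pin figure in the
# post above is tested headroom, not the spec rating.)

SLOT_W = 75
CONNECTOR_W = {"6pin": 75, "8pin": 150}

def spec_power_budget(connectors):
    """Spec-rated power available to the card for a list of power connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(spec_power_budget(["8pin"]))          # 225 W -- tight for a 250 W card by spec
print(spec_power_budget(["8pin", "8pin"]))  # 375 W -- comfortable for 300 W+ cards
```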

Actually, no. Perf/W and perf/$$$ are derived from the three things that matter: performance, price, and power draw.

Using derived measures, one can complain to no end if one really wants to find fault. Is the performance better than the previous generation? Then complain about perf/W. Is perf/W OK? Derive again and complain that the increase in perf/W is smaller than the previous generation's. And so on and so forth.

Despite the discussion on internet forums, the primary parameters of a video card are the three I named above, with the caveat that some do not care about power draw (for various reasons), while some (lucky bastards) can ignore the price altogether.
Please read through this entire post

While I would like to agree with you, I can't, because there are some things you are incorrect about. Nguyen is correct in that most consumers buying GeForce cards care about performance/dollar first, which is why a lot of people saw value in AMD's 500 series even though its performance/watt was disgusting compared to the 10 series: it offered much better performance/dollar. Who SHOULD care about performance/watt over performance/dollar? Laptop, workstation, and server users, because all of those scenarios have specific constraints, such as limited power budgets, that push performance/watt up the priority list. Why should DESKTOP GAMERS care more about performance/dollar than performance/watt? Because 80% (might be really off on this one) of gamers don't use their graphics cards to make money, unlike content creation or mining. So performance/watt shouldn't matter much for the desktop gaming market (but if you absolutely must care, because maybe you are the 1% of exceptions, the 30 series has higher perf/watt than the 20 series).

TL;DR, generally speaking (a small ranking sketch follows this list):
Desktop gamers should care more about performance/dollar than performance/watt.
Laptop, workstation, and server users should care more about performance/watt than performance/dollar.
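As a toy illustration of that split, the same set of cards ranks differently depending on which derived metric you sort by; the cards and numbers below are invented, not benchmark data:

```python
# Toy ranking: hypothetical cards sorted by perf/$ (desktop gamer priority)
# versus perf/W (laptop/workstation/server priority). Numbers are invented.

cards = [
    # (name, relative perf, price USD, power W)
    ("card X", 100, 400, 220),
    ("card Y", 140, 700, 320),
    ("card Z", 120, 500, 200),
]

by_perf_per_dollar = sorted(cards, key=lambda c: c[1] / c[2], reverse=True)
by_perf_per_watt = sorted(cards, key=lambda c: c[1] / c[3], reverse=True)

print("perf/$ priority:", [name for name, *_ in by_perf_per_dollar])  # X, Z, Y
print("perf/W priority:", [name for name, *_ in by_perf_per_watt])    # Z, X, Y
```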

However, your next point is quite apparent. Yeah, people literally go to extreme ends to find some sort of fault with Intel, AMD, and NVIDIA. My theory on this (relating to gaming) is that people have a never-ending lust for more FPS. I target 1080p60; that's all I really care about. But a lot of other people want higher and higher FPS, higher than what their monitor can even display, and then they complained about tearing, and adaptive sync was made. I still target 1080p60, and I like to use V-Sync. Then a lot of gamers complained that they needed 1440p144, and once they got it, it wasn't enough; they needed higher refresh rates still, and now they need 1440p240. It has a quality cost, but apparently more FPS always equals better. Oh, by the way, I still target 1080p60 with V-Sync, and I get to enjoy no weird quality loss. And then a lot of people complained about CPU cores not being enough, and we went from only needing 4c4t to 8c16t (OK, 4c4t isn't really enough in 2020, but 4c8t is still holding up), so we had to go from requiring a $250 CPU to needing a $500 CPU or bust. Oh, by the way, I still target 1080p60; I am using a 3770K and a GTX 980 (got both used). Yes, an eight-year-old CPU can still keep up at 1080p60. I know, it's surprising.

Dang, this whole time it looks like most of us were just complaining for more and more, and not actually enjoying the video games that we built computers for... Yeah, the PC community in a nutshell. Honestly, don't target something stupid like 8K120 and you won't have heartbreak every damn time you go under 240 FPS. At 8K resolution. Geez.

Okay, don't get me wrong, there is some merit to these complaints, but most of this stuff is overhyped. It is literally to the point where people defend buying a 240 Hz monitor for Roblox. I am not even kidding about this. People get hyper-competitive in Roblox FPS games, using an FPS unlocker and getting, lo and behold, 300 f*cking FPS in a Lego game. Yeah, this community is weird as f*ck. Anyway, I'm going off on way too many tangents...

Yeah, performance, price, and power draw are important figures, but not in isolation. A card running at 150 W compared to a card running at 350 W means nothing if we don't know why it takes more power. You have to weigh those figures against each other to actually say which card is better or worse.
 
Can't find the thread dedicated to the issue with OLEDs and VRR:
 
EVGA confirms the issue with the capacitors

Recently there has been some discussion about the EVGA GeForce RTX 3080 series.

During our mass production QC testing we discovered that a full 6-POSCAP solution cannot pass real-world application testing. It took almost a week of R&D effort to find the cause, reduce the POSCAPs to 4, and add 20 MLCC caps prior to shipping production boards; this is why the EVGA GeForce RTX 3080 FTW3 series was delayed at launch. There were no 6-POSCAP production EVGA GeForce RTX 3080 FTW3 boards shipped.

But, due to the time crunch, some of the reviewers were sent a pre-production version with 6 POSCAPs; we are working with those reviewers directly to replace their boards with production versions.
The EVGA GeForce RTX 3080 XC3 series, with its 5 POSCAPs + 10 MLCC solution, matches the XC3 spec without issues.

Also note that we have updated the product pictures at EVGA.com to reflect the production components that shipped to gamers and enthusiasts since day 1 of product launch.
Once you receive the card you can compare for yourself; EVGA stands behind its products!

Thanks
EVGA

https://forums.evga.com/m/tm.aspx?m=3095238&p=1
 