
Changing GPU, staying in the world of FHD

Your PSU is a be quiet! Dark Power Pro 11 650W. A stock RTX 3070 Ti requires a 750 W PSU according to Nvidia's website, while a 3070 requires 650 W. An RTX 3070 Ti is therefore not a good upgrade option.
An RTX 3060 Ti requires 600 W and a 3060 requires 550 W. A 2060 Super requires a 550 W PSU, and a 2080 Super 650 W. source

An AMD RX 6600 requires a 450 W PSU, an RX 6600 XT requires 500 W, and an RX 6650 XT also requires 500 W.

read the fine print:
5 - Requirement is made based on PC configured with an Intel Core i9-10900K processor. A lower power rating may work depending on system configuration.

OP's system specs: (screenshot attachment)


pro tip: take the max power limit of the card's BIOS, divide by 0.8 (or multiply by 1.2), add that to what you already know (rest of the system), and call it a day. though *big chips* can have high transients, the new PCIe 5.0 power specs for PSUs - anything 450+ watts will be required to provide 200% for xxx milliseconds - will take care of that, besides being irrelevant here.

or maybe i'm a luddite for thinking a PSU should fit like a glove - platinum is an excuse to go overkill - 80% eff. at 10% load and all . . .
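The rule of thumb above can be sketched in a few lines of Python. The wattage figures in the example are illustrative assumptions, not published specs for any particular card:

```python
# A sketch of the sizing rule described above: take the card's maximum
# power limit from its BIOS, add ~20% margin (divide by 0.8, i.e. multiply
# by 1.2) to cover transients, then add the rest of the system's draw.

def suggested_psu_watts(gpu_power_limit_w: float,
                        rest_of_system_w: float,
                        margin: float = 0.20) -> float:
    """Ballpark PSU wattage: GPU power limit * (1 + margin) + rest of system."""
    return gpu_power_limit_w * (1 + margin) + rest_of_system_w

# Example: a ~220 W card in a system whose other components draw ~180 W.
print(suggested_psu_watts(220, 180))  # about 444 W
```

This lands well under typical vendor recommendations, which is exactly the poster's point: the official numbers bake in a worst-case CPU.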
 
He's just another stooge.
Any thread he posts in, it's the same thing:
AMD is Garbage and NVIDIA is the only choice.
The Internet has a bias towards AMD because they strong arm them.
Ray Tracing is the only thing that matters for Gaming.
NVIDIA is the Company that looks out for the Gamer.

I have had a few watching the FA Cup final so he caught me at the right time. I am really tired of people who preach the disinformation that is rife in modern society.
 
The Best Graphics Cards Shortlist (Tom's Hardware).

GPU | Performance Rank | DXR Rank | Value Rank – online (MSRP)
Nvidia GeForce RTX 3090 Ti | 1 – 132.4 fps | 1 – 84.4 fps | 13 – $2,000 ($1,999)
Nvidia GeForce RTX 3080 | 4 – 116.3 fps | 2 – 66.3 fps | 12 – $949 ($699)
AMD Radeon RX 6900 XT | 2 – 130.6 fps | 3 – 49.8 fps | 11 – $1,020 ($999)
AMD Radeon RX 6800 XT | 3 – 124.5 fps | 4 – 46.1 fps | 10 – $920 ($649)
AMD Radeon RX 6800 | 5 – 111.7 fps | 6 – 39.3 fps | 9 – $800 ($579)
Nvidia GeForce RTX 3060 Ti | 7 – 91.5 fps | 5 – 43.3 fps | 7 – $580 ($399) – better than the 6700 XT: lower MSRP and over 10 fps better in RT
AMD Radeon RX 6700 XT | 6 – 96.0 fps | 8 – 30.5 fps | 6 – $515 ($489)
Nvidia GeForce RTX 3060 | 9 – 70.2 fps | 7 – 32.3 fps | 5 – $390 ($329) – cheaper than the 6600 XT and close to 10 fps faster in RT
AMD Radeon RX 6600 XT | 8 – 78.2 fps | 9 – 23.6 fps | 4 – $410 ($379)
AMD Radeon RX 6600 | 10 – 66.7 fps | 11 – 19.7 fps | 3 – $325 ($329)
Nvidia GeForce RTX 3050 | 11 – 51.4 fps | 10 – 22.8 fps | 2 – $300 ($249)
AMD Radeon RX 6500 XT | 12 – 30.4 fps | 12 – 5.6 fps | 1 – $210 ($199)
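As a rough cross-check of the shortlist, raw raster fps per dollar (online price) can be computed from the table. Note this is not the same metric as Tom's Hardware's value rank, which evidently factors in more than fps per dollar:

```python
# Raster fps and online prices copied from the shortlist above.
# fps per dollar is a crude value metric; higher is better.
cards = {
    "RTX 3090 Ti": (132.4, 2000),
    "RTX 3080":    (116.3, 949),
    "RX 6900 XT":  (130.6, 1020),
    "RX 6800 XT":  (124.5, 920),
    "RX 6800":     (111.7, 800),
    "RTX 3060 Ti": (91.5,  580),
    "RX 6700 XT":  (96.0,  515),
    "RTX 3060":    (70.2,  390),
    "RX 6600 XT":  (78.2,  410),
    "RX 6600":     (66.7,  325),
    "RTX 3050":    (51.4,  300),
    "RX 6500 XT":  (30.4,  210),
}

# Sort ascending: worst fps/$ first, best last.
by_value = sorted(cards.items(), key=lambda kv: kv[1][0] / kv[1][1])
for name, (fps, price) in by_value:
    print(f"{name:12s} {fps / price:.3f} fps/$")
```

On this crude metric the RX 6600 comes out on top and the RTX 3090 Ti at the bottom, which broadly matches the shape of the value column even if individual ranks differ.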

Where I live.
Palit GeForce RTX 3060 Ti Dual LHR 8GB GDDR6 PCI-Express Graphics Card £469.99
PowerColor Radeon RX 6650 XT Red Devil 8GB GDDR6 PCI-Express Graphics Card £428.99
Palit GeForce RTX 3060 StormX LHR 12GB GDDR6 PCI-Express Graphics Card £399.95
PowerColor Radeon RX 6600 XT Fighter 8GB GDDR6 PCI-Express Graphics Card £359.99 down from £379.99
Palit GeForce RTX 2060 Dual 12288MB GDDR6 PCI-Express Graphics Card £299.99 down from £379.99
PowerColor Radeon RX 6600 Fighter 8GB GDDR6 PCI-Express Graphics Card £289.99 down from £338.99

Given that the RX 6600 is only beaten by the RX 6500 XT for garbage performance, all the bottom cards are good for is turning the monitor on. The RTX 2060 12GB beats the RX 6600 in both raster and RT, as shown in my post above, so there is no reason to get an RX 6600. On Tom's Hardware the 3060 is ranked above the RX 6600 XT, which is only a few pounds cheaper at the example shop, and only because of discounts. The RTX 3060 will be the faster card overall (note how AMD's very poor RT performance drags their GPUs' ranks down). As you can see, the 3060 Ti beats the 6650 XT, 6600 XT and 6700 XT for performance and costs just a little more than the RX 6650 XT. There is no reason to ever get an AMD card on a budget. The 3060 has more robust performance at the same price point as the 6600 XT, with far more robust ray tracing performance. Ray tracing performance is less affected by the CPU on Nvidia cards; raster performance is more affected by the old CPU on all cards.

On a budget, get an RTX 2060 12GB or an RTX 3060. If you want better performance and the budget will stretch, get an RTX 3060 Ti. Don't listen to the trolls: you won't get the top raster numbers with these cards, but Nvidia's RT performance is far better. With an RTX 3060 or RTX 3060 Ti you will be able to play RT games as if you had an RX 6700 XT or better from AMD. The ray tracing performance of the 3060 Ti is better than the RX 6800 and close to the 6800 XT. I would say an RTX 2060 12GB is better value than an RTX 3060.
In RT the cards rank as follows; note the poor AMD performance.

(chart: ray tracing benchmark ranking)


The RX 6600 XT is behind the RTX 2060 Super FE and RTX 3060 in ray tracing.

(benchmark chart)

The RTX 3060 beats the RX 6600 XT in raster. Below this point you might as well get an RTX 2060 Super card. Want more performance? Then get an RTX 3060 Ti.

read the fine print:
5 - Requirement is made based on PC configured with an Intel Core i9-10900K processor. A lower power rating may work depending on system configuration.
View attachment 247416
OP:
View attachment 247417

pro tip: take the max power limit of the card's BIOS, divide by 0.8 (or multiply by 1.2), add that to what you already know (rest of the system), and call it a day. though *big chips* can have high transients, the new PCIe 5.0 power specs for PSUs - anything 450+ watts will be required to provide 200% for xxx milliseconds - will take care of that, besides being irrelevant here.

or maybe i'm a luddite for thinking a PSU should fit like a glove - platinum is an excuse to go overkill - 80% eff. at 10% load and all . . .
The manufacturers state those power requirements. These and many other forums are full of people who thought they knew better; google the power spikes these GPUs can create. A PSU maker refers you to the power draw specs from Nvidia. Both AMD and Nvidia GPUs have power spikes many times their average power draw, which can cause weaker PSUs to shut down. So going with a quality PSU, rated at what AMD and Nvidia state you should have, is the best advice.
 
I have a 2060; you can play all RT games at 1080p. The big problem is that this will change fast, as the card is on the edge. I replaced it with an RTX 3080 Ti because the RT was better. Basically, as the RX 6000 series is cut down, RT performance suffers massively: an RX 6900 XT is basically an RTX 3070 Ti in RT performance, but with much faster raster. The RTX 3080 12GB is the best card for future-proofing and performance, but it is expensive. An AMD Ryzen 5 2600 system will lose a lot of the performance you see in benchmarks, and an AMD R9 Fury X lacks the DX12 Ultimate feature set.

I think a modest GPU upgrade, if necessary, is the way to go. I would be looking at a 3060 Ti, but nothing above an RTX 3070 Ti. Graphics cards will be replaced soon and PSUs will move to a new standard. Your CPU, motherboard and RAM will also need to be replaced eventually. The motherboard, if you're lucky, could take a 5800X3D, but that is a stopgap, and your RAM is only 3200 MT/s.

Your PSU is a be quiet! Dark Power Pro 11 650W. A stock RTX 3070 Ti requires a 750 W PSU according to Nvidia's website, while a 3070 requires 650 W. An RTX 3070 Ti is therefore not a good upgrade option.
An RTX 3060 Ti requires 600 W and a 3060 requires 550 W. A 2060 Super requires a 550 W PSU, and a 2080 Super 650 W. source

An AMD RX 6600 requires a 450 W PSU, an RX 6600 XT requires 500 W, and an RX 6650 XT also requires 500 W.

You have a power draw budget to live under as well. An RTX 3070 Ti is not a target upgrade and the 3070 is on the edge, as are the 2080/2080 Ti. The 3060 Ti is the best of the choices available. source The 3060 Ti is cheaper where I live than the 6700 XT, and some versions of the RX 6600 XT are priced the same as the RTX 3060 Ti. That's the card I would target. The 3060 is cheaper and faster in RT than the RX 6600 XT; what you likely lose in raster you gain in RT performance. Your local prices may vary, but the 3060/3060 Ti are better balanced between raster and RT performance, and they are within your PSU's power limit.

Remember you will have to account for the loss in performance due to the CPU and slower RAM. Both cards will be capped below their maximum raster performance by your CPU, which will make them more or less equal in raster. What will not be equal is RT performance. Nvidia is your only choice.

Trust me, if he can run an R9 Fury X on that setup, he can run a 3090 in the same power budget, easily. But you actually do have a point, 650 watt supplies are very small and as they age, they aren't going to provide the current a GPU like the Fury needs. It might just be the source of his black screens altogether.
 
Trust me, if he can run an R9 Fury X on that setup, he can run a 3090 in the same power budget, easily. But you actually do have a point, 650 watt supplies are very small and as they age, they aren't going to provide the current a GPU like the Fury needs. It might just be the source of his black screens altogether.
I found the PSU here; it's listed as 80 Plus Platinum, so it should be fine for a good long time. There is a review of the 1000 watt version here, and the Japanese caps are encouraging. PSU requirement here: suggested PSU 600 W.

Black screens are a common issue with the R9 Fury X and many AMD GPUs. Some people find that a full wipe of the drivers and a reinstall helps; others find that only replacing the GPU fixes it.
 
I'd say 6600, unless you want raytracing and/or DLSS. If so, then 2060.

Although, you said that your current card freezes your system. Are you sure your PSU isn't faulty?
 
I'd say 6600, unless you want raytracing and/or DLSS. If so, then 2060.

Although, you said that your current card freezes your system. Are you sure your PSU isn't faulty?
Freezing can be caused by an unstable system; my CPU cache, for example, will hard-freeze games if its clock is too high. He could run a stress test to see if the system is stable, check whether any components are overheating, and check whether the right voltages are reported in HWiNFO64. Apart from that, he would need a PSU tester, or another PSU to swap in, to test the PSU.
 
Guys, thank you for the time you are dedicating to me; it was not my intention to start a flame war.
As for the CPU, I plan to upgrade to a 5600X, as AMD recently unlocked the use of the 5000 series on X370 boards.

As for the video cards that have been proposed, they are close to 500 here. The only one I cannot actually find (from one of the last comments) is the 2060 Super; the 12GB 2060 is easier for me to get.
The power supply was bought about four years ago with the thought that it would not be a problem for at least the next 5 years.

Regarding my current video card, for now I am trying to increase the voltage at the medium-low frequencies, keeping it at 1000 mV (against the standard 900 mV). It seems stable for now, but I take nothing for granted.
Its current peak draw is about 300 W, which has never been a problem: I had a Vega 56, I tested an HD 5970 (two GPUs on the same card) and also two R9 280X in CrossFireX without ever having problems, so I had excluded the power supply as the source of the problem.

I am completing this answer as I read your comments.
As I said above, I have recently had other cards with similar consumption and I have never had these problems (the Vega sometimes hits 350 W).
The CPU tops out in the worst of the stress tests around 75°C, and in-game it rarely touches 60°C (80 W peak).
The video card does not exceed 65°C after an hour of stress testing; in-game it rarely reaches 60°C (300 W peak).
Both CPU and GPU voltages are correct in HWiNFO64.
Currently, the only other power supply I have that might be able to handle this card is an older-generation CX600 that suffered from high temperatures (with an A10-6800K);
in addition, the liquid cooler of the Fury X is not exactly the easiest thing to remove from the case.
 
Glad to hear you got it working.
 
Poor benchmark results.

You still get way more raw power for your cash with AMD cards, especially in this price range. I see you're suggesting them a 3060 Ti, and I suggest you actually go check the price of the 3060 Ti: it DOES NOT compete with the vanilla-flavor 6600. In both U.S. and EU markets, AMD blows Nvidia out of the water at this level. For instance, you can get a triple-fan 6700 XT for $500 any day right now. If you have a better-performing Nvidia card at that price level, talk about its "amazing benchmark results", because that card will eat the 3060 for breakfast, lunch and dinner on the poor, unrelated benchmarks you posted. People are basically telling you to lose your fanboyism, in case you can't read between the lines...
 
The manufacturers state those power requirements.
yes they do, and for what configuration. and they themselves also state:
5 - Requirement is made based on PC configured with an Intel Core i9-10900K processor. A lower power rating may work depending on system configuration.

These and many other forums are full of people who thought they knew better; google the power spikes these GPUs can create. A PSU maker refers you to the power draw specs from Nvidia.
and that link goes right back to the above (via seasonic!)
Both AMD and Nvidia GPUs have power spikes many times their average power draw, which can cause weaker PSUs to shut down. So going with a quality PSU, rated at what AMD and Nvidia state you should have, is the best advice.
not my first rodeo - i've seen 1500 watt power spikes from a 980 Ti Kingpin, long before transients became as prevalent as they are now. but that's not the point - what AMD/Nvidia or any PSU maker advises will be safe, but far from the "best"

:)
 
yes they do, and for what configuration. and they themselves also state:
5 - Requirement is made based on PC configured with an Intel Core i9-10900K processor. A lower power rating may work depending on system configuration.


and that link goes right back to the above (via seasonic!)

not my first rodeo - i've seen 1500 watt power spikes from a 980 Ti Kingpin, long before transients became as prevalent as they are now. but that's not the point - what AMD/Nvidia or any PSU maker advises will be safe, but far from the "best"

:)
and that link goes right back to the above (via seasonic!)
Yep, and you should also know it was their protection circuits tripping due to the power spikes. Also black screens and crashes. Why would the PSU manufacturer send you to Nvidia's website for the recommended PSU wattage? Maybe because you are meant to get the correct wattage, and that's where you find it? Maybe people need to research whether there are issues with a PSU? People got the recommended PSU and still had issues, so it's not safe to just state a lower-wattage PSU. All sorts of problems happen when you think you are too smart to follow the manufacturer's advice. Get the bare minimum and wonder why you have issues no one else has. Then you go to upgrade and have to replace the PSU because it lacks the watts for the brand-new GPUs that just came out. This happened to RTX 20 series gamers.

I got a decent 1000 watt PSU, a 10900K and an RTX 2080, then upgraded to an RTX 3080 Ti. I had zero problems. Never get just what you need now, and never get less than the manufacturer recommends. You need more than the average power draw to cover spikes and other loads, plus enough that you can upgrade the PC in future without having to run out and buy another PSU. The "just enough" mindset is at the core of most PSU issues: PCs have short-duration, very high loads, many times the components' long-term average. If your PSU can't take it, you shut down. Why risk all the strange issues people report on forums? The best quality component in a build should be the PSU, and it should be over-specced so that you have headroom.

Bottom line: this PSU (Seasonic Prime TX-850 Titanium) just can't handle the 3090. OCP keeps tripping randomly regardless of load and I have to reset the PSU at the switch to get my system back.

This is a known issue with certain PSUs and the 3090.

Should I go to a Seasonic Prime 1000 watt Platinum or another brand? I'd like to be able to OC a bit, but this 850 isn't cutting it because of the OCP protection issue.
 
Nvidia loves to gimp their hardware to make you buy the new stuff, but I guess that is not true.
We've tested that here on the forums: it's not.
 
yeah, i did, as they were happening. and i will point out that this has been on GA102 chips - completely irrelevant here.
Why would the PSU manufacturer send you to Nvidia's website for the recommended PSU wattage?
simple - liability, when something goes wrong, seasonic points the finger at nvidia.

Maybe because you are meant to get the correct wattage, and that's where you find it?
why yes, of course - for the stated system configuration.

Maybe people need to research whether there are issues with a PSU?
sure, though someone running a threadripper or dual-socket xeon setup - which can consume more power - would be just as clueless as before without reading the conditions


People got the recommended PSU and still had issues, so it's not safe to just state a lower-wattage PSU. All sorts of problems happen when you think you are too smart to follow the manufacturer's advice. Get the bare minimum and wonder why you have issues no one else has. Then you go to upgrade and have to replace the PSU because it lacks the watts for the brand-new GPUs that just came out. This happened to RTX 20 series gamers.
yes, and it was about time intel and PCI-SIG addressed the high transients with new (and improved) standards; that's been long overdue. not only that, but for years now the market has been flooded with GPUs that cannot be PCI-SIG certified because they exceed the 300 watt maximum of the standard power delivery (75 watts PCIe slot, 75 watts 6-pin and 150 watts 8-pin) - any card with two (let alone three!) 8-pin power connectors is over it.

what good are standards if they can be ignored?

and you want to tell me about recommendations?!?!

 
yeah, i did, as they were happening. and i will point out that this has been on GA102 chips - completely irrelevant here.

simple - liability, when something goes wrong, seasonic points the finger at nvidia.


why yes, of course - for the stated system configuration.


sure, though someone running a threadripper or dual-socket xeon setup - which can consume more power - would be just as clueless as before without reading the conditions



yes, and it was about time intel and PCI-SIG addressed the high transients with new (and improved) standards; that's been long overdue. not only that, but for years now the market has been flooded with GPUs that cannot be PCI-SIG certified because they exceed the 300 watt maximum of the standard power delivery (75 watts PCIe slot, 75 watts 6-pin and 150 watts 8-pin) - any card with two (let alone three!) 8-pin power connectors is over it.

what good are standards if they can be ignored?

and you want to tell me about recommendations?!?!

Black screen issues also plague AMD cards, and AMD cards would pull as much wattage as they could from the PCIe slot. source 2 In their testing, they found that the RX 480 they had received for review drew 86 W through the PCIe slot. That's 11 W above the maximum 75 W specification required to meet compliance.

Seasonic was following the standard.

Seasonic updated the statement with some explanation: https://knowledge.seasonic.com/article/20-focus-plus-and-gpu-potential-compatibility-issues

For AMD Vega 56/64: OCP triggered by the overwhelming transient current when pairing Focus Plus 550 with Vega. Solution: use higher rating PSUs for Vega.

For ASUS GTX970 STRIX: design flaw of this specific model graphics card. Solution: use PCIe power cables without filtering capacitors. source
Translation:

AMD's Vega 56/64 graphics cards have very high transient power consumption. The oscilloscope screenshot below shows the transient current when running two Vega 56 cards in CrossFire under a FurMark test: up to 102 A / 10 ms, which means the power supply must withstand a 1200 W peak. Even a single Vega 56 graphics card may have nearly 600 W of transient power consumption.
(oscilloscope screenshot from Seasonic showing the transient current)

So getting just what you need causes problems, because there is no headroom. If a PSU is meant to provide 650 watts, AMD tells you to get 650 watts, and the GPU alone can pull 600 watts in a spike, that can cause issues.
In this case, from the safety point of view, in order to protect other parts of the computer including the graphics card, the overcurrent protection threshold and trigger time of some FOCUS PLUS power supplies are set relatively sensitively. After the power supply takes protective measures, the computer may restart or shut down.

AMD officially recommends 650W/750W power supply for Vega 56/64. Basically, only users who use FOCUS PLUS 550 can possibly encounter such power overload problems. If the user's power supply is purchased before January 2018 (according to the serial number on the power sticker), please contact Seasonic Customer Service for after-sales service.

A power supply sold after January 2018 has the updated sensitivity preset of overcurrent protection, so users can use it with confidence.

If you are using a high-power water-cooled Vega graphics card or other high-end graphics cards, please purchase power supplies with higher power ratings to ensure that the computer works properly.

In rare cases, using FOCUS PLUS with an ASUS GTX970 STRIX graphics card may result in continual black screens; this is currently only present when paired with the ASUS GTX970 STRIX model. Using PCIe power cables without filtering capacitors can solve the problem. Users who encounter such problems can contact customer service to obtain replacement PCIe power cables.

We have been cooperating with major graphics card manufacturers to solve the problems caused by the increasing power consumption of graphics cards.
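The transient figure quoted above can be sanity-checked with P = V × I on the 12 V rail:

```python
# Seasonic's oscilloscope capture showed a 102 A transient on the 12 V
# rail for two Vega 56 cards in CrossFire. Power = voltage * current.

rail_voltage = 12.0        # ATX 12 V rail
transient_current = 102    # amps, from the capture above

peak_watts = rail_voltage * transient_current
print(peak_watts)  # 1224.0, matching the ~1200 W peak quoted
```

The same arithmetic puts a single card's ~50 A transient at roughly 600 W, consistent with the translation's claim.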
As someone (OldJim/Nbohr1more) recently pointed out, the RX 480 is pulling more power over the PCIe slot than the specification allows. This is bad -

  • Pulling more than 75 W from the PCIe slot can possibly damage or reduce the life of your motherboard.
  • According to the licensing contract for the spec, if AMD do not fix this within 3 months they will NOT be able to call the card a PCI Express card. If they do, they face not only litigation but a ban on the importation of the card as counterfeit.
  • AMD are stating inconsistent facts: the card was marketed as having a TDP of 150 W, now they're saying the GPU uses 110 W, yet a bunch of reviews have measured the card pulling up to 175 W and beyond.
Drawing more than 75 watts from the PCIe slot can lead to meltdowns; motherboard manufacturers now limit power draw.

Here is an example of a meltdown: 3x ATI Radeon HD 5870s which were mining. Too much power draw from the PCIe slots is the likely cause. Nvidia limits the power draw on the card for the 8-pin connectors and the PCIe slot; I am not sure about current AMD cards. AMD are rumored not to limit power draw in any way.
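The per-connector limits cited in this thread (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin) can be tallied with a small sketch; the example card configurations are hypothetical:

```python
# Official PCIe power delivery limits, as cited in this thread:
# 75 W from the slot, 75 W per 6-pin, 150 W per 8-pin aux connector.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def connector_budget(six_pins: int = 0, eight_pins: int = 0) -> int:
    """Total official power budget for a card with the given aux connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(connector_budget(eight_pins=1))   # 225 - a typical midrange card
print(connector_budget(eight_pins=2))   # 375 - over the 300 W ceiling cited above
```

A card's rated board power can be checked against this budget; a review measurement above it (like the RX 480's 86 W slot draw) means the excess is coming from somewhere out of spec.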

(photos of the melted PCIe slots)
 
i've been working with PCs and many other electronics (esp. audio), both as a hobby and for work, for over 25 years now - after previously being a certified marine engineer in the u.s. navy in 1982 - is there something you can show me that i don't know about?

i tell you standards are ignored, and you post examples.
 
i've been working with PCs and many other electronics (esp. audio), both as a hobby and for work, for over 25 years now - after previously being a certified marine engineer in the u.s. navy in 1982 - is there something you can show me that i don't know about?

i tell you standards are ignored, and you post examples.
You need to get your ego under control. I am a fully qualified Electrical and Electronic engineer and have done the courses for fault-finding to component level on PCBs and repairing them. Qualification-wise, who's higher? Who cares, it's an internet forum.

You stated
simple - liability, when something goes wrong, seasonic points the finger at nvidia.
I just told you that was wrong, the nice way. Seasonic just has to follow the ATX standards etc. Most modern desktop PC power supplies conform to the ATX specification, which includes form factor and voltage tolerances. So long as they follow all the standards, the PSU maker is covered from liability; any device a person connects to the PSU that causes operational issues by not following the standards is that device maker's fault.
 
You need to get your ego under control, I am a full Electrical and Electronic engineer and have done the courses for fault finding to component level on PCBs and repairing PCBs. Qualification wise who's higher, who cares it an internet forum.
and i am amazed you passed those online courses since you can't seem to read footnotes.

nice chat- we are done.
 
and i am amazed you passed those online courses since you can't seem to read footnotes.

nice chat- we are done.
See post above. I'm not playing the ego game with you. There is enough flaming and trolling to endure at the moment.
 
Based on the TechPowerUp testing, an RX 6600 will deliver about 130% of the basic performance of an RTX 3050, and about 108% of an RTX 2060 12GB. The 3050 has the advantage in ray tracing architecture, bringing it almost to the 6600's performance level. The 2060 12GB pulls ahead by a reasonable margin with RTX enabled. Based on this, it is reasonable to eliminate the 3050 from the competition.

Your X370 platform is limited to PCIe 3.0, which will result in a slight performance hit for the 3050 and 6600. This should not be over 5% loss, as they still are fairly low-end for the number of lanes available.

The 2060 12GB and the 3050 support DLSS, which you would likely need if enabling ray tracing. The 6600 supports RSR, an alternative that does not require game support, but is of lower quality. FSR is available on all three.

Future-proofing - the 2060 12GB may end driver support sooner, but has more VRAM capacity. This will likely be an advantage in the future as game studios take advantage of the GPU arms race to release much more hardware-intensive games in the future.

Bottom line: get the cheapest between the 2060 12GB or 6600, unless you plan on playing many RTX games. If so, I would recommend the 2060 12GB over either the 3050 or the 6600.
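The relative-performance figures above can be put on a common scale; the baseline of 100 is arbitrary and only the quoted ratios are used:

```python
# Normalizing the quoted ratios: the RX 6600 is ~130% of an RTX 3050
# and ~108% of an RTX 2060 12GB in basic (raster) performance.

rtx_3050 = 100.0                 # arbitrary baseline
rx_6600 = 1.30 * rtx_3050        # ~130% of the 3050
rtx_2060_12gb = rx_6600 / 1.08   # the 6600 is ~108% of the 2060 12GB

print(f"RTX 3050:      {rtx_3050:.0f}")
print(f"RX 6600:       {rx_6600:.0f}")
print(f"RTX 2060 12GB: {rtx_2060_12gb:.0f}")
```

On this scale the 2060 12GB sits about 20% above the 3050 in raster, before its RT advantage is even counted, which supports dropping the 3050.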
 
We've tested that here on the forums: it's not.
I am talking about my own experience. My first GPU was a GTS 450; about 2 months later I bought another one for SLI. Then I bought a 6800 ($99) and sold the pair to a friend. After about 3 months he called me complaining that nothing he did could make SLI work. When we investigated, Nvidia had removed SLI support for the GTS 450. As you know, we are talking about the heyday of multi-GPU, so that was not something they had to do.
 
Based on the TechPowerUp testing, an RX 6600 will deliver about 130% of the basic performance of an RTX 3050, and about 108% of an RTX 2060 12GB. The 3050 has the advantage in ray tracing architecture, bringing it almost to the 6600's performance level. The 2060 12GB pulls ahead by a reasonable margin with RTX enabled. Based on this, it is reasonable to eliminate the 3050 from the competition.

Your X370 platform is limited to PCIe 3.0, which will result in a slight performance hit for the 3050 and 6600. This should not be over 5% loss, as they still are fairly low-end for the number of lanes available.

The 2060 12GB and the 3050 support DLSS, which you would likely need if enabling ray tracing. The 6600 supports RSR, an alternative that does not require game support, but is of lower quality. FSR is available on all three.

Future-proofing - the 2060 12GB may end driver support sooner, but has more VRAM capacity. This will likely be an advantage in the future as game studios take advantage of the GPU arms race to release much more hardware-intensive games in the future.

Bottom line: get the cheapest between the 2060 12GB or 6600, unless you plan on playing many RTX games. If so, I would recommend the 2060 12GB over either the 3050 or the 6600.

The problem is new vs used GPUs; I'd say new for 99% of people.

Also, if the price of a 2060/Ti is close to a 6600/XT, I'd go with the 6600 XT, because Turing is from what, 2018? It's not worth $500 used. Also, good luck getting that card BNIB. The RTX 2000 series was known for memory troubles as well, and those cards are still floating around used, or possibly BNIB...

Also, DLSS/RT on a 2060 would actually be a crippling experience; you'd need at least a 2070 for it not to feel like garbage.

So the 6600 XT is a compelling choice, and I believe those functions can be turned off, like you could back in 99/2000.
 
Black screen issues also plague AMD cards, and AMD cards would pull as much wattage as they could from the PCIe slot. source 2 In their testing, they found that the RX 480 they had received for review drew 86 W through the PCIe slot. That's 11 W above the maximum 75 W specification required to meet compliance.

Seasonic was following the standard.



(oscilloscope screenshot from Seasonic showing the transient current)

So getting just what you need causes problems, because there is no headroom. If a PSU is meant to provide 650 watts, AMD tells you to get 650 watts, and the GPU alone can pull 600 watts in a spike, that can cause issues.


Drawing more than 75 watts from the PCIe slot can lead to meltdowns; motherboard manufacturers now limit power draw.

Here is an example of a meltdown: 3x ATI Radeon HD 5870s which were mining. Too much power draw from the PCIe slots is the likely cause. Nvidia limits the power draw on the card for the 8-pin connectors and the PCIe slot; I am not sure about current AMD cards. AMD are rumored not to limit power draw in any way.

(photos of the melted PCIe slots)
Your hatred for AMD graphics cards seems to be deep-rooted, yet here I am with the 6400 in my HTPC that I absolutely love. I also had a 5700 XT that was great, albeit a bit hot and with occasionally unstable drivers, and a 7970 about ten years ago that I loved as well. It was a power-hungry monster, but it never caused any issue.

If you ask around this forum, I'm sure you'll find lots of happy AMD owners. If the problems you hand-picked were that common, AMD would be in some serious pickle and their products wouldn't be so popular.

As your sources are from reddit and videocardz.com - the latter of which states that their sources were either using Furmark or an overclocked card (neither of which fits a normal use case in my opinion) - I dare say it's more a case of people not knowing how to build a PC (under-speccing PSUs and motherboards, using Furmark, pushing the GPU too far, etc.) than the GPU's fault at factory settings.
 
The problem is new vs used GPUs; I'd say new for 99% of people.

Also, if the price of a 2060/Ti is close to a 6600/XT, I'd go with the 6600 XT, because Turing is from what, 2018? It's not worth $500 used. Also, good luck getting that card BNIB. The RTX 2000 series was known for memory troubles as well, and those cards are still floating around used, or possibly BNIB...

Also, DLSS/RT on a 2060 would actually be a crippling experience; you'd need at least a 2070 for it not to feel like garbage.

So the 6600 XT is a compelling choice, and I believe those functions can be turned off, like you could back in 99/2000.
I was referring specifically to the 12GB version of the 2060, which launched in January of this year. A quick Newegg search turned up all new cards, so I am not sure that would be a problem.

I agree the 6600 would be a better idea unless he really wants ray tracing, as AMD takes a bigger hit to performance.
 