
AMD RDNA3 Graphics cards are unusable

Feel free to use secure boot and pure UEFI mode if you want, but I’ve tested it on every system I’ve had (current is z590 and 10850k) and prefer MBR. For the casuals and weirdos that claim no settings in the BIOS or Windows have any effect on gaming, all it would do is stop you from running a 4 TB boot drive, so who cares.
 
Are you confused, maybe? The alternative to UEFI isn’t MBR, it’s CSM. And yes, the partition table switch to GPT is a part of UEFI boot, but that’s a separate thing to talk about. The idea that MBR vs GPT would have any effect on “gaming” is nonsensical.
Oh, and running in CSM also makes you unable to use ReBAR which, I suspect, would have a far larger effect on gaming than whatever you are trying to achieve.
And yes, “casuals and weirdos” will continue claiming that, VBS on older CPUs aside, pretty much no Windows and BIOS snake oil tweaks over the years have achieved any actual benefit. And until you can prove otherwise, unequivocally, using actual measurements taken with LDAT, you will be ignored by the sane enthusiasts. Your tales of “clown cursor” and “mouse feel” are not proof, you understand.
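
For anyone who would rather check than argue: a minimal sketch below, assuming a Linux box with pciutils installed, that reports whether the machine actually booted via UEFI and whether any device advertises a Resizable BAR capability. Both checks are read-only; lspci may need root to show extended capabilities, and its wording varies between versions, so treat a "False" as "unknown" rather than gospel.

```python
# Minimal sketch: report boot mode (UEFI vs CSM/legacy) and whether any PCI
# device advertises a Resizable BAR capability. Linux-only; needs pciutils.
import os
import subprocess

def booted_via_uefi() -> bool:
    # /sys/firmware/efi only exists when the kernel was started by UEFI firmware.
    return os.path.isdir("/sys/firmware/efi")

def resizable_bar_listed() -> bool:
    # Recent pciutils prints a "Resizable BAR" capability for devices that expose it.
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    return "Resizable BAR" in out

if __name__ == "__main__":
    print("Booted via UEFI:", booted_via_uefi())
    print("Resizable BAR capability listed:", resizable_bar_listed())
```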
 
Anyone who isn’t trying to be annoying on purpose knows MBR means CSM on; otherwise you’d probably be running UEFI with GPT. I believe most pro players are currently avoiding Above 4G Decoding/ReBAR (I don’t like it either), just like most competitive players don’t use things like FreeSync either, which I also dislike, so another fail from the ultra-casual crowd trying to attack me.
 
I am sure they do. After all, it’s not like forcing ReBAR on in CS2 leads to measurably better lows… oh wait.
Without ReBAR: (screenshot IMG_1633.png)

With ReBAR: (screenshot IMG_1632.png)

But meh, I don’t want to derail the thread (whatever it is supposed to be about anyway), so feel free to continue ranting into the void.
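
For anyone who wants to reproduce that kind of comparison themselves: “1% lows” like the ones in those screenshots are usually computed from a frametime log. A minimal sketch, assuming a PresentMon-style CSV with a MsBetweenPresents column; the capture.csv file name and the column name are placeholders for whatever your capture tool writes.

```python
# Minimal sketch: average FPS and "1% low" FPS from a frametime log.
import csv
import statistics

def fps_stats(path: str, column: str = "MsBetweenPresents"):
    with open(path, newline="") as f:
        frametimes_ms = [float(row[column]) for row in csv.DictReader(f)]
    frametimes_ms.sort()
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    # "1% low" here = average FPS over the slowest 1% of frames; some tools
    # report the 99th-percentile frametime instead, which is close but not identical.
    slowest = frametimes_ms[-max(1, len(frametimes_ms) // 100):]
    low_1pct_fps = 1000.0 / statistics.mean(slowest)
    return avg_fps, low_1pct_fps

if __name__ == "__main__":
    avg, low = fps_stats("capture.csv")  # placeholder file name
    print(f"avg: {avg:.1f} FPS   1% low: {low:.1f} FPS")
```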
 
Pretty much no Windows and BIOS snake oil tweaks over the years have achieved any actual benefit.
Your post is the definition of “snake oil”, since a modern BIOS comes with things like PCIe clock gating, massive C-states, E-cores that are of little or even detrimental use for gaming, the list goes on, all turned on by default.
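
If you want to see what those default C-states are actually doing rather than argue about them, the Linux kernel exposes per-core idle-state residency under sysfs. A rough sketch, assuming the cpuidle driver is loaded; Linux-only, and the values are cumulative since boot.

```python
# Rough sketch: dump per-state idle (C-state) residency for one core from sysfs.
from pathlib import Path

def cstate_residency(cpu: int = 0):
    base = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpuidle")
    stats = []
    for state in sorted(base.glob("state*")):
        name = (state / "name").read_text().strip()
        usec = int((state / "time").read_text())  # time spent in this state, microseconds
        stats.append((name, usec / 1_000_000.0))
    return stats

if __name__ == "__main__":
    for name, seconds in cstate_residency(0):
        print(f"{name:>10}: {seconds:12.1f} s")
```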
 
Anyone who isn’t trying to be annoying on purpose knows MBR means CSM on. I believe most pro players are currently avoiding Above 4G Decoding/ReBAR (I don’t like it either), just like most competitive players don’t use things like FreeSync either, which I also dislike, so another fail from the ultra-casual crowd trying to attack me.
Well, that one has got me involved. ReBAR is always on here and doesn't contribute to issues; it improves FPS. Freesync is VRR. Freesync is why there is no tearing for most gamers today, and it makes 45-165 or now 45-240 Hz butter smooth. That argument is just foolish. Most pro gamers are outside the Freesync range anyway, not because they don't want to use it. There isn't a 120 Hz panel that doesn't support some type of sync tech. Your goalposts have moved about three times already. I don't have a 7800 XT, but I do have a 7600, and your claim about that card is just plain false. I even read some of your other posts on different forums, and I will leave it at that.
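
For context on why a 45-165 or 45-240 Hz range stays smooth even when FPS dips below the floor: AMD's Low Framerate Compensation roughly requires the panel's max refresh to be at least twice its min, and then the driver doubles frames below the floor. A trivial sketch of that rule of thumb; the example ranges are illustrative, not a monitor database.

```python
# Trivial sketch: LFC (frame doubling below the VRR floor) is only available
# when the panel's max refresh is at least ~2x its min refresh.
def lfc_capable(vrr_min_hz: float, vrr_max_hz: float) -> bool:
    return vrr_max_hz >= 2.0 * vrr_min_hz

for lo, hi in [(45, 165), (45, 240), (48, 75)]:
    print(f"{lo}-{hi} Hz: LFC {'available' if lfc_capable(lo, hi) else 'not available'}")
```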
 
So...the 5700 xt is a card that I own. There seems to be a lot of hate for it...as the 6700 xt basically invalidated it in almost no time flat. I run the latest drivers, and a 5600x processor. It runs plenty of stuff fine at 1080p...where this setup absolutely was meant to be. It's not magic; some games run better or worse depending upon the driver revision...but bemoaning everything under the sun "because AMD's drivers are known to be bad and bloated" is kind of unreasonable...even on what we can all agree is pretty meh hardware for (at the time) a fairly meh price.

Can you maybe decide to lose your crap over something that matters? This reads like you really hate AMD...and decided to hate-buy a GPU that you were already determined not to like. As such...maybe just pay the Nvidia tax and shut up about a card, focused on raster performance and price, being not so good at the thing it never advertised itself to do well.


So we are clear, I don't have to give AMD my personal details to update my GPU drivers. Intel doesn't make me jump through hoops to update anything. Nvidia is the only company where I have to pay my performance tax...and then kowtow to them to prove I should be able to download drivers for the product that I have....whose drivers at least partially detect the hardware and thus have proof I bought what I said I have. I will admit that going from a 5600x and 5700 xt to a 5700x and 3080 offers "buttery smooth" performance in some games. I'll also admit that for the $1000 and change extra it cost me, it damn well better be a great experience...but I'm still happy with my 5700 xt. For about $600 I got a whole system that could basically beat out any console...and it is still beating some of them out today. The definition of "unusable" being that it doesn't do everything perfectly...despite not burning through cash at a silly rate...is a sign that you are getting only what you pay for. It's not a universal experience that AMD cards suck...only that if you expect filet mignon at dollar store prices you're going to be disappointed...and even filet mignon can be critiqued when you compare it to Wagyu.
 
I am sure they do. After all, it’s not like forcing ReBAR on in CS2 leads to measurably better lows… oh
False correlation. Most people with non-ghetto rigs aren’t going to have a problem with “lows” in the first place. So you’re falsely claiming that this setting can’t possibly have any drawbacks and only brings benefits, for a problem most people don’t even have.

Competitive shooters are not ‘performance’-constrained games. They aren’t all made in UE5 (yet) and only pushing 58 FPS or something. I played with this setting when I first got my Z590 (didn’t have the setting beforehand) and didn’t like it. I started looking around and noticed a lot of high-profile players don’t like it and don’t use it either.

So this is yet another case of the casual noob crowd buying the snake oil from AMD (they seemed to be the ones who originally marketed it) just like they did for things like freesync, when nobody uses freesync either. Class dismissed.
 
LMAO, no one uses Freesync. That was the most idiotic comment I have seen in months. Keep your opinion to yourself and don't try to project it onto others. That is like saying most people don't go into Display settings and set their monitor's max refresh rate. In fact, thinking about your original post, that could be exactly what happened.
 
*also find it kind of funny how people in that post just see it as normal for a very new card like a 6800xt to be dying for no reason, when Nvidia cards that are 10+ years old are all over the place running like the day they were new.

I find most of your posts kind of standoffish, yet mildly entertaining enough to read through them (mostly) without completely skipping over. However, I am certainly surprised this thread keeps going. My confusion though has to do with your ability to lay out a claim that Nvidia cards are still running 10+ years like they were new whereas AMD cards are dying....hahaha, yeah, right.

My 3080Ti just upped and died on me after about 3 months of use. No reason whatsoever. Just stopped. I had the power limit set to 75%. Temps never exceeded 80C and my PSU is working just fine (currently running my replacement 3080Ti for 14 months without a hitch). Now, I understand that 3 months is almost at that 10+ year mark so I might be kind of quick to say that you're wrong.

Electronics are fickle at times. Sometimes you get a bad item. I've had DOA motherboards before. I've had bad RAM, bad HDDs, and I've even had a bad GPU. It happens. Blatant claims that AMD cards are just dying while Nvidia cards are rocking it 10+ years later are stupid.
 
The thread should go on. I want to see what other wisdom I can learn. It's entertaining.

This thread is more like a clickbait yellow-journalism article posted by wccftech or similar.

I know there are rumours that RDNA 3 (or some of its features like hardware shader prefetch, random/unstable clocks) is broken, but the only thing we can argue is that the overall performance has missed the original targets.
 
My 7900XT is much faster than my 6800XT so I am satisfied with the performance increase. In a strange juxtaposition it was even cheaper as well.
 
My confusion though has to do with your ability to lay out a claim that Nvidia cards are still running 10+ years like they were new whereas AMD cards are dying....hahaha, yeah, right.

My 3080Ti just upped and died on me after about 3 months of use. No reason whatsoever. Just stopped.
But this is cherry-picking: the worst Nvidia fab used in years + bad batches of Hynix RAM + the entire economy shutting down + a 350 W TDP (more in reality) when everyone knows to avoid these power-hog cards, and so on and so forth. If it was a 3060 Ti or something with non-Hynix RAM you probably would not be typing this.

If somebody just walked up and gave me a free 3070 + 3090 I would use the 3070 instead, so it’s just like, why do you do this to yourself on purpose? These “I love my 7900xt” people are going to be typing the same thing in a couple years.
 
False correlation. Most people with non-ghetto rigs aren’t going to have a problem with “lows” in the first place. So you’re falsely claiming that this setting can’t possibly have any drawbacks and only brings benefits, for a problem most people don’t even have.
What the actual hell am I reading. The testing I provided was carried out on a 5600X + 3070Ti system. Is it top of the line by modern standards? No. But it is absolutely a respectable mid-range system and more than enough for competitive CS at the highest level. If THAT system had measurable improvement… well.
But this is cherry-picking: the worst Nvidia fab used in years + bad batches of Hynix RAM + the entire economy shutting down + a 350 W TDP (more in reality) when everyone knows to avoid these power-hog cards, and so on and so forth. If it was a 3060 Ti or something with non-Hynix RAM you probably would not be typing this. If somebody just walked up and gave me a free 3070 + 3090 I would use the 3070 instead, so it’s just like, why do you do this to yourself on purpose?
WHAT Hynix RAM? The 3080Ti runs GDDR6X by Micron. You know, like ALL GDDR6X is.
 
And here I was thinking that I overdid it when I used to run DDU before every new driver release.
 
Yea, you’re right. I forgot he was talking about a GDDR6X RAM card. So it died from being a nuclear reactor and not bad RAM. Don’t buy meme nuclear reactor cards.
 
It's the 24.x.x drivers that are the problem; roll back to 23.x.x and the issues go away.
 
I'm not a big fan of the Blurbusters guy. He basically said something along the lines of the entire purpose of his forum being to make money (which he was not making enough of running it), complained that sections of the forum "only made him $10 a month", and would edit my posts if I said things like DSC feels like laggy garbage to play on, to preserve advertiser dollars for new DSC-compression monitors (who the hell does things like this?). But he would probably type out at least four pages ridiculing you for ignoring things like scanout rate in this comment (rough numbers after this post).

I'm sure Mark is wrong and you're perfectly right; he's just some no-name guy with no standing in the industry or anything. The same goes for the OCN mods that banned you for doing exactly this thread last year.

If there is any ultimate proof that TPU mods are nice, that patience is their virtue, and that they're very lenient, it's that this thread is still up and your account is still active.
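
For anyone wondering what "scanout rate" refers to in the jab above: a panel draws each refresh top-to-bottom over roughly one refresh interval, so the bottom of the screen is lit close to 1/refresh-rate later than the top. A rough sketch of the numbers, ignoring VBLANK, so the real active scanout is slightly shorter.

```python
# Rough sketch: scanout timing per refresh interval (VBLANK ignored).
for hz in (60, 144, 240, 360):
    interval_ms = 1000.0 / hz
    print(f"{hz:>3} Hz: full scanout ~{interval_ms:5.2f} ms, "
          f"mid-screen lit ~{interval_ms / 2:4.2f} ms after the top")
```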
 
But this is cherry-picking: the worst Nvidia fab used in years + bad batches of Hynix RAM + the entire economy shutting down + a 350 W TDP (more in reality) when everyone knows to avoid these power-hog cards, and so on and so forth. If it was a 3060 Ti or something with non-Hynix RAM you probably would not be typing this.

If somebody just walked up and gave me a free 3070 + 3090 I would use the 3070 instead, so it’s just like, why do you do this to yourself on purpose? These “I love my 7900xt” people are going to be typing the same thing in a couple years.
You can't have it both ways. Claim that Nvidia cards run like new for 10+ years, and then come back and say, "well....you know, there were some problems with some Nvidia cards....but, but, but, but, it was because of bad RAM!".

You come off sounding like a fool who's just backpedaling to keep believing their original claim even after admitting it wasn't right.
 
You know about the internet and server side lag, right? The thing in between your mouse in your hand and the cursor on the screen? Think about it.
 
LMAO, no one uses Freesync. That was the most idiotic comment I have seen in months. Keep your opinion to yourself and don't try to project it onto others. That is like saying most people don't go into Display settings and set their monitor's max refresh rate. In fact, thinking about your original post, that could be exactly what happened.
Eh? VRR in the Radeon settings IS Freesync.
 