
AMD Issues Official Statement Regarding RX 560 Silent Downgrade

Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
I do suck at that, don't I? LOL!

Regardless of whether the chicken or the egg came first, the point is NVIDIA distinguishes the cards on its website. So there is clearly a resource to find out; with AMD, there isn't. We know NVIDIA is the big bad wolf. It has no business being brought up here for the umpteenth time. Straw man arguments... deflections....

TPU posting makes me want to cry.

In the immortal words of Howard Stern's father, "Don't be stupid, you moron".

(that was not directed at you, Vya)
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
They were not launched concurrently. They weren't even announced on the same date. The 6GB version predates the 3GB one in every way.

https://www.anandtech.com/show/10580/nvidia-releases-geforce-gtx-1060-3gb

If the consumer had read reviews and announcements from the original launch, they could have easily walked into a store a month or two down the line and unknowingly bought a weaker 1060 under the assumption that it simply had less VRAM, since it is very common for cards to be sold like this. This could have played out in exactly the same way if we are talking about a 560.

If however we assume the buyer does have access to all available information, then in either case they could have easily looked at the specs and figured out something was amiss. And if they didn't, there is nothing else that could have prevented this from happening; both of those products should have had proper names.

This is why at the end of the day it is the same issue whether you are buying a 1060 or a 560.

Again, no, it isn't the same thing. A person walking into a store can easily tell which version of the GTX1060 they are buying. They cannot tell which version of the RX 560 they are buying. That is a very big difference. Yes, labeling both versions of the GTX1060 a GTX1060 isn't best practice, but someone buying the cards can instantly tell which version they are getting just by what is on the box. This can't be done with the RX 560, and that is what makes what AMD did much worse than what nVidia did. You seem to be trying to make it seem like I'm saying the GTX1060 issue isn't bad, but I'm not. I don't agree with nVidia there, but at least they defined the difference from the day the weaker version was launched and made it easy to distinguish between the two in stores. They also sent review samples to every reviewer that got GTX1060 6GB samples, so both would have reviews. Have we seen any reviews for the weaker RX 560? Did AMD bother to make sure the performance information and reviews were even available to the consumer?

You can't assume the buyer has access to all the information; the point is AMD and the AIBs have made it impossible to have all the information when buying the card. They don't put the shader count on the box. So someone walking into a store just sees an RX 560; they have no way of knowing which version they are getting. That is the problem! This is also why it is different from the GTX 1060.

If I, for some reason, can't look up the specs? No, I can't. And you conveniently provided zero explanation of why I should. Even more so, you said it's easy.

It is easy. I can look at the box of the card and instantly know which one is which. What is printed on the packaging is enough for me to determine which version I am getting. This is why it is easy. I don't have to buy the card, pray I got the good version, take it home, open the box, plug the card in, and run GPU-Z to figure out which GTX 1060 version I got. But you have to do all that to figure out which RX 560 you got. Do you still not see the difference?

Maybe putting it in bullet point form will help.

  • Steps to determine which version of GTX 1060 you are buying.
    1. Look at box for 3GB or 6GB.
    2. Done.
  • Steps to determine which version of RX 560 you are buying.
    1. Guess and buy an RX 560.
    2. Take card home.
    3. Unbox the card.
    4. Put the card in your computer.
    5. Install the drivers.
    6. Launch GPU-Z and pray you got a good version.
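Step 6 above is where the variants finally reveal themselves: on GCN hardware the compute-unit count that tools like GPU-Z or clinfo report maps directly to the shader count (64 stream processors per CU, so 14 CUs = 896 SPs and 16 CUs = 1024 SPs). A minimal sketch of that conversion, assuming clinfo-style output is available; the sample text below is hypothetical:

```python
import re

# GCN GPUs carry 64 stream processors per compute unit, so the
# "Max compute units" figure reported by OpenCL tools maps straight
# to the shader count: 14 CUs -> 896 SPs, 16 CUs -> 1024 SPs.
SP_PER_CU = 64

def shader_count(clinfo_output: str) -> int:
    """Pull 'Max compute units' out of clinfo-style text, convert CUs to SPs."""
    m = re.search(r"Max compute units\s+(\d+)", clinfo_output)
    if m is None:
        raise ValueError("no compute-unit line found")
    return int(m.group(1)) * SP_PER_CU

# Hypothetical clinfo excerpt for a cut-down RX 560:
sample = "  Device Name    gfx803\n  Max compute units    14\n"
print(shader_count(sample))  # 896 -> the weaker variant
```

Of course, the whole complaint stands: you can only run this after buying the card.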

The reason why your example is fundamentally flawed is that it relies on the assumption that one would already know one of them has fewer shaders. Because if they didn't, there is absolutely no way they could tell just by looking at the name, and quite frankly I can't see how you could possibly prove otherwise.

And the problem with that argument is that people that don't bother to look up specs and reviews are the exact same people that immediately assume cards with less memory are slower. And in this one instance, their assumption is actually correct.

If however you have some other explanation on why it is easier to tell which is which please say it.

See the bullet points above.

As for this, it's an issue if there's no way of telling pre-purchase what you're getting, and if the price is the same. But you can usually get some detailed data on the card you're looking at, unless you buy from disreputable retailers or random dudes on eBay.

And that is the issue. There is nothing on the packaging of the RX 560 to tell which version the card is. If you are buying online, most reputable retailers will list the card's specs pulled from the manufacturer's website, though you have to hope they got the specs right. However, if you just walk into the store, there's no way to tell. There's nothing on the box telling you.
 
Also something to consider: even if you see 896/1024 or whatever, who knows the 1024 version even exists if you pick up an 896-core 560? It's logical to think that is the only variant out there, no? I suppose you could pick up other boxes to compare, but... seriously. What non-enthusiast really knows the other exists, especially without looking it up? ;)

It's a cluster from all angles... AMD could shut me up if they simply made a proper entry for the 560D in their specifications. Then have them smack the AIBs around for ALSO not distinguishing it, and the problem is solved.

Amazing how far down the rabbit hole we travel when it really is such a simple friggin' fix. They could have updated their website DAYS ago, shortly after this hit. Instead, it's still there almost 3 days later...
 

newtekie1

Semi-Retired Folder
Also something to consider: even if you see 896/1024 or whatever, who knows the 1024 version even exists if you pick up an 896-core 560? It's logical to think that is the only variant out there, no? I suppose you could pick up other boxes to compare, but... seriously. What non-enthusiast really knows the other exists, especially without looking it up? ;)

It's a cluster from all angles... AMD could shut me up if they simply made a proper entry for the 560D in their specifications. Then have them smack the AIBs around for ALSO not distinguishing it, and the problem is solved.

Amazing how far down the rabbit hole we travel when it really is such a simple friggin' fix. They could have updated their website DAYS ago, shortly after this hit. Instead, it's still there almost 3 days later...

Honestly, they could just do what nVidia has done in the past. Call it the RX 560 896 Edition, or like you said, call it the 560D. Whatever, just do something to make it easy to tell, just by looking at the box in the store, which version you have in your hand. Or, hell, if the reason really was "to provide the market with more RX 500 series options," then call the damn card an RX 555. Bam, problem solved from day one, with no confusing name at all. It would be very easy to tell that an RX 555 is weaker than an RX 560. The consumer doesn't have to scratch their head to figure out whether the RX 560 or RX 560D is better, because slapping a letter on the end of the model doesn't really tell the consumer that.

And that is the thing: they are never going to update their website. It is obvious from their response that they don't intend to change the name of the weaker RX 560 or stop selling it. They are putting the blame on the AIBs for not making it easy to distinguish which version is which, and I bet the fix for that is going to be nothing more than burying the shader count in the small print on the back of the box.
 
Joined
Jul 15, 2006
Messages
977 (0.15/day)
Location
Malaysia
Processor AMD Ryzen 7 5700G
Motherboard Gigabyte B450M-S2H
Cooling Scythe Kotetsu Mark II
Memory 2 x 16GB SK Hynix OEM DDR4-3200 @ 3666 18-20-18-36
Video Card(s) Colorful RTX 2060 SUPER 8GB
Storage 250GB WD BLACK SN750 M.2 + 4TB WD Red Plus + 4TB WD Purple
Display(s) AOpen 27HC5R 27" 1080p 165Hz
Case COUGAR MX440 Mesh RGB
Audio Device(s) Creative X-Fi Titanium HD + Kurtzweil KS-40A bookshelf
Power Supply Corsair CX750M
Mouse Razer Deathadder Essential
Keyboard Cougar Attack2 Cherry MX Black
Software Windows 10 Pro 22H1 x64
  • Steps to determine which version of RX 560 you are buying.
    1. Guess and buy an RX 560.
    2. Take card home.
    3. Unbox the card.
    4. Put the card in your computer.
    5. Install the drivers.
    6. Launch GPU-Z and pray you got a good version.
The simplest method is to look at the naming, e.g. "Evo" in an ASUS card, or for a shadier company like PowerColor, look at the SKU numbers; some sites like Newegg mention the SP count. Speaking of PowerColor, they made the later R9 270 TurboDuo with 1024 SPs instead of 1280. I was very close to buying it before I found out. The RX 560 that I bought luckily could be unlocked without any problem; if not, I'd have been quite cross. AMD should apply the D suffix everywhere, not just for cards sold in China.
 

newtekie1

Semi-Retired Folder
The simplest method is to look at the naming, e.g. "Evo" in an ASUS card, or for a shadier company like PowerColor, look at the SKU numbers; some sites like Newegg mention the SP count. Speaking of PowerColor, they made the later R9 270 TurboDuo with 1024 SPs instead of 1280. I was very close to buying it before I found out. The RX 560 that I bought luckily could be unlocked without any problem; if not, I'd have been quite cross. AMD should apply the D suffix everywhere, not just for cards sold in China.


That relies on the AIB giving it a different name, and on hoping the AIB has the specs right on their website. Neither can be guaranteed. Plus, it still requires the buyer in the store to go beyond just looking at the packaging to figure out which version they are buying. They have to find the SKU, then reference it on the AIB's website. That's more work than the single step of just looking for 6GB or 3GB on the front of the box.
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
I'm not a huge fan of different variations of a card all being given the same model number, I definitely don't think that should be done. But comparing it to what AMD did with the RX 560 isn't a fair comparison. What AMD did is notably worse.

With the GT730, that was literally a bottom-of-the-barrel card. And they released all the versions of the card at the exact same time. So everyone knew they existed from day one, and you knew what to look for. If reviewers reviewed the card, they knew to look at the version and let people know which one they were reviewing. (Did any major site ever even review the GT730?) Plus, the target demographic wasn't gaming; if you tried to game on a GT730, I feel bad for you, because every version of that card was shit.

But AMD released a model on the market, then later released another, weaker card under the same name. This is very different from what nVidia did, and worse. It also isn't the first time they've released a card, let it be reviewed, then reduced the performance. They've been pulling that shit since at least 2008...
I disagree and agree with your post.

First, I don't think the money a consumer pays for a GT 730 has less value than what another pays for an RX 560, just as the money paid for an RX 560 doesn't have less value than what's paid for a GTX 1070, for example. I could also say, "Hey, the RX 560 is not as good as a GTX 1070, so scr3w those poor people buying an RX 560 for gaming." I am not trying to put words in your mouth here, just pointing out that this "it's a low-end card" excuse is, in fact, NOT an excuse.

The GT 730 is not the bottom of the barrel. You can play games with that card, just at 720p and medium/low settings in most cases. The bottom of the barrel is the integrated graphics in Intel processors, and many people who want to improve their gaming experience at least a little, or who don't have money / don't want to pay much for a graphics card, will go for these kinds of cards. And when three people buy three GT 730 cards and end up with three totally different cards, that's not something I can excuse. The fact that they were released the same day means nothing, really. You might know about these cards. I might know about these cards. Nvidia could have explained the three models in detail in its press release (I doubt it), but most people who go to a shop to buy a GT 730 don't really know what they are getting. And reviews? No, I haven't really seen any, at least not direct comparisons between the three different models. I would be putting benchmark charts here, not just the spec page, if there were any. It would have been much, much more fun and made what I am trying to say much more obvious.

I do agree that AMD creating a new, lower-performing model months AFTER the original was released is BAD. REALLY BAD. It reminds me why I am NOT buying Kingston SSDs (and microSD cards, for their pathetic 4K write speeds). But selling three totally different cards under the same name from day one, and adding one worse version of a model later, is the same thing in 9 out of 10 cases. Because 9 out of 10 consumers will never read reviews; 9 out of 10 consumers will buy the 4GB DDR3 64-bit version of the card, because the 64-bit GDDR5 version comes only with 2GB of memory, so it must be half as fast. By the way, all DDR3 models of low-end Nvidia cards almost always launch with 1800MHz DDR3 memory. Later models usually come with much slower memory. The GT 620 I have (bought at a really low price) came with DDR3 at 600MHz. I have even seen models with 533MHz DDR3. We are talking about the same memory bandwidth as a GeForce 210 with DDR3 memory.

Could you point to some examples of AMD doing the same in the past? I don't really remember any. Maybe you're confusing it with models like the HD 4830 and HD 5830 or something.
Do you want an example where AMD did something bad? The transition from HD 7770, HD 7750, and HD 7730 to R5 240, R7 250, and R7 250X. It was a downgrade.

But... we're reading an AMD topic here. What's next, a full list of all weird releases of every company ever?

Also - you can see in your little screenshot a full spec list, including shader counts, memory bus etc., and these details are 100% correct. Let's fast-forward to Intel's Skylake versus its Kaby Lake parts now, or give their new i3 quad-core parts the middle finger because they're actually last year's i5 stack... This is really quite normal. AMD has also been re-releasing Pitcairn unto infinity, re-rebranding a whole product stack from HD- to R9; hell, I could give you a dozen more examples from any hardware company.

Bottom line, your example makes no sense at all, and this isn't the place for it.

And this one's for you @RejZoR
http://www.tomshardware.com/news/MSI-GTX-660-670-overvolting-PowerEdition,18013.html


Nobody is clean
I am pretty sure that if this were about Nvidia, AMD's name wouldn't have been mentioned by anyone. Right?

Also, I can see all the specs in my little screenshot because I know that there are three different models (by the way, the memory speed for the DDR3 models is NOT guaranteed). Did you know about those different models? Would you ever have thought to go to Nvidia's site and double-check that there is only one version of the card on the market? Is the average consumer going to do it? Would the average consumer go to AMD's site and double-check whether there are in fact two models available under the same name? Are those cases really any different?

And by the way, did you really avoid reading the first two lines of my post so you'd have an excuse to make yours, or did you just miss them?
Let me help you with that
arrows.jpg
 

newtekie1

Semi-Retired Folder
Could you point to some examples of AMD doing the same in the past? I don't really remember any. Maybe you're confusing it with models like the HD 4830 and HD 5830 or something.
Do you want an example where AMD did something bad? The transition from HD 7770, HD 7750, and HD 7730 to R5 240, R7 250, and R7 250X. It was a downgrade.

I'm not really going to defend the GT730, because I don't agree with it. It easily isn't as bad as what AMD did with the RX 560, but it isn't good either.

If you want an example of AMD doing this in the past, look no further than the HD4850. It came out in GDDR3, GDDR4, and GDDR5 models, each one with different performance. Oh, and a few months after launch, after all the reviews were out of course, AMD released a driver that intentionally throttled performance when the card was under high load, because AMD discovered the reference cooler wasn't good enough and too many cards were dying after only a few months of use. Sneaky sneaky...
 
I'm not really going to defend the GT730, because I don't agree with it. It easily isn't as bad as what AMD did with the RX 560, but it isn't good either. But I will say the performance difference between the two DDR3 cards likely isn't much. Yes, the 64-bit memory bus of the Kepler version seems bad, but Kepler performed better with smaller memory bus widths than Fermi. I would venture to guess the Fermi and Kepler DDR3 cards performed about the same. But, I mean, no one benchmarked these cards. The target market really wasn't buying them for the performance.

And, yeah, the GT730 was bottom of the barrel. I don't care that it could run some games at 720p; the gaming experience was garbage on that card no matter what. And we're talking about a card that was something like $75 when it was released. That's bottom of the barrel. Hell, there were threads posted on this very forum asking why the GT730 even existed, because it wasn't fast enough over integrated graphics (it lost to AMD's APUs, IIRC).

If you want an example of AMD doing this in the past, look no further than the HD4850. It came out in GDDR3, GDDR4, and GDDR5 models, each one with different performance. Oh, and a few months after launch, after all the reviews were out of course, AMD released a driver that intentionally throttled performance when the card was under high load, because AMD discovered the reference cooler wasn't good enough and too many cards were dying after only a few months of use. Sneaky sneaky...

Neither of us is defending anything here. We both make it clear that both cases are BAD. My point was that people usually overreact about AMD and not so much about Nvidia. And AMD should know this fact better than anyone and realize that they should avoid doing "smart" things from a marketing perspective and just focus on doing really smart things.

For the consumer who gets a GT 730, what Nvidia is doing is much worse, because the difference in 3D performance between a card with 40GB/s of bandwidth (GDDR5) and a card with 14GB/s (DDR3, 64-bit) or much less (slower DDR3 memory) will be significant, to say the least. And until Nvidia finally gave the Fermi series that half-baked DX12 support, those Fermi GT 730 cards were inferior in features as well. AMD's case is also bad as a case of bait and switch, but the person who gets the 14-CU model will not suffer much performance loss. They still get something inferior, though, and that's not acceptable.
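The 40 vs. 14 GB/s figures above fall straight out of effective memory clock times bus width in bytes; a quick sketch of the arithmetic, assuming the common GT 730 configs of 64-bit GDDR5 at ~5000 MHz effective and 64-bit DDR3 at 1800 MHz effective:

```python
def mem_bandwidth_gbs(effective_mhz: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s: (transfers per second) x (bus width in bytes)."""
    return effective_mhz * 1e6 * bus_bits / 8 / 1e9

# 64-bit GDDR5 at ~5000 MHz effective vs 64-bit DDR3 at 1800 MHz effective:
print(mem_bandwidth_gbs(5000, 64))  # 40.0 GB/s
print(mem_bandwidth_gbs(1800, 64))  # 14.4 GB/s
```

Slower 533-600 MHz DDR3 parts drop this to roughly 4-5 GB/s on the same bus, which is where the GeForce 210 comparison comes from.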

I disagree about the gaming experience. The gaming experience is different when the visual quality is much different, but for people who enjoy the game and haven't been spoiled by mid/high-end graphics cards, the FUN they get from game A at 720p with medium settings will be the same as what someone else gets playing it at 4K with high settings. I greatly enjoyed Borderlands with a 9800GT as the main card at 720p with medium/high settings and the GT 620 as a PhysX card. It was huge fun, and while it would have looked much better at 1080p with a high-end card, the FUN would probably have been the same.
 
I'm not really going to defend the GT730, because I don't agree with it. It easily isn't as bad as what AMD did with the RX 560, but it isn't good either.
Isn't as bad? This is the same as your post mentioning that you need to look at more than the packaging, since you need to check the SKU and serial number to know exactly what you're getting.
 

newtekie1

Semi-Retired Folder
My point was that people usually overreact about AMD and not so about Nvidia.

Wait, what? When nVidia was renaming G92, people were losing their f'n minds. When AMD does it, people just give them a pass. Hell, people are still bringing up the 8800GT/9800GT rebrand to this day! I also don't remember anyone raising much of a stink back when they renamed the X1600 Pro to the X1300 XT, then renamed the X1300 XT to the X1650.

Isn't as bad? This is the same as your post mentioning that you need to look at more than the packaging, since you need to check the SKU and serial number to know exactly what you're getting.

It isn't as bad because nVidia didn't go back and change the card after it was released. They also made it easier to tell which version you were getting: 1GB DDR3 = Fermi, 2GB DDR3 = Kepler, 1GB GDDR5 = Kepler.
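That memory-config-to-GPU mapping is simple enough to express as a lookup table keyed on exactly what the box prints; a minimal sketch (the `gt730_gpu` helper name is illustrative, not anything official):

```python
# The GT 730 memory configurations described above, as a simple lookup.
# Keys are (memory size, memory type) as printed on the box.
GT730_VARIANTS = {
    ("1GB", "DDR3"):  "Fermi",
    ("2GB", "DDR3"):  "Kepler",
    ("1GB", "GDDR5"): "Kepler",
}

def gt730_gpu(size: str, mem_type: str) -> str:
    """Return the GPU generation for a given GT 730 memory config."""
    return GT730_VARIANTS[(size, mem_type)]

print(gt730_gpu("2GB", "DDR3"))  # Kepler
```

The point of the argument is that no such table is even possible for the two RX 560 variants: the packaging carries no distinguishing key.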

The AIBs not adhering to nVidia's specs is another issue entirely, and on the AIBs.
 
Joined
Oct 2, 2015
Messages
2,991 (0.96/day)
Location
Argentina
System Name Ciel
Processor AMD Ryzen R5 5600X
Motherboard Asus Tuf Gaming B550 Plus
Cooling ID-Cooling 224-XT Basic
Memory 2x 16GB Kingston Fury 3600MHz@3933MHz
Video Card(s) Gainward Ghost 3060 Ti 8GB + Sapphire Pulse RX 6600 8GB
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB
Display(s) AOC Q27G3XMN + Samsung S22F350
Case Cougar MX410 Mesh-G
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W
Mouse EVGA X15
Keyboard VSG Alnilam
Software Windows 11
Memory shouldn't be the way of telling one card from another; the AIB can solder any chip they want, with whatever bus width they want, since they already have the GPUs.
The chip maker has the responsibility of providing a clear naming scheme, so no free passes here.
 
Joined
Jul 5, 2013
Messages
25,559 (6.48/day)
Um, try again. The nVidia cards you listed are two identical cards with the same specs getting different names. That isn't exactly the same thing as what AMD is doing. Do you even have a grasp of what AMD did? Or are you just blindly defending them... and you're calling me a fanboy?
I use NVidia cards exclusively in my systems. So no, not an AMD fanboy. However, I do believe in calling a spade, a spade....
Even the GTX970 issue you brought up wasn't as bad as what AMD did. NVidia didn't go back and physically lower the specs of the GTX970, and hence reduce its performance, without telling anyone. The performance of the GTX970 was the same at all times; the card that reviewers got was the same card that the customer was buying. This is not the case with the AMD RX 560 issue. AMD has gone back and physically lowered the specs and performance of the card without really making it clear to the customer.
Um, I had a 970 and was one of the people who sent it back when this was discovered. NVidia was not up front about the fact that part of the memory on the card had only 32-bit access and would hamper overall performance when that section of memory was accessed. They hid those facts, then lied about them, then tried to play it off like it was no big deal. Very bad form and completely unacceptable. They learned a hard lesson from those events. AMD's handling of THIS situation was one of failed communication and seemingly honest mistakes, which they have moved quickly to correct. NVidia tried to downplay and minimize their actions and only offered a remedy due to public outrage. AMD has owned this problem and is actively correcting it. NVidia's shenanigans are just as bad, if not worse in some cases.

So yeah, you're right. These are different situations. NVidia was far less responsible and tried to make excuses for their deceptions. AMD is being responsible and owning the problem.
I'm siding with AMD's actions in this situation because they are correct and appropriate. NVidia's track record isn't good by comparison. That doesn't make me a fanboy one way or the other. It means that I look at facts for what they are, not for what I want them to be. Now use that same idealistic mirror and take a good look for yourself, at yourself.
 

newtekie1

Semi-Retired Folder
I use NVidia cards exclusively in my systems. So no, not an AMD fanboy. However, I do believe in calling a spade, a spade....

Um, I had a 970 and was one of the people who sent it back when this was discovered. NVidia was not up front about the fact that part of the memory on the card had only 32-bit access and would hamper overall performance when that section of memory was accessed. They hid those facts, then lied about them, then tried to play it off like it was no big deal. Very bad form and completely unacceptable. They learned a hard lesson from those events. AMD's handling of THIS situation was one of failed communication and seemingly honest mistakes, which they have moved quickly to correct. NVidia tried to downplay and minimize their actions and only offered a remedy due to public outrage. AMD has owned this problem and is actively correcting it. NVidia's shenanigans are just as bad, if not worse in some cases.

So yeah, you're right. These are different situations. NVidia was far less responsible and tried to make excuses for their deceptions. AMD is being responsible and owning the problem.
I'm siding with AMD's actions in this situation because they are correct and appropriate. NVidia's track record isn't good by comparison. That doesn't make me a fanboy one way or the other. It means that I look at facts for what they are, not for what I want them to be. Now use that same idealistic mirror and take a good look for yourself, at yourself.

See, that is where you and I disagree. In nVidia's case, they have never gone back and limited the performance of a card after it was on the market and reviewed; AMD did (HD4850). And while nVidia has released multiple versions of cards with the same name, they've never gone back several months after a card's launch and silently released a weaker version with no way to determine which version you are getting from the packaging. That is what makes AMD's actions worse than nVidia's. Them not disclosing how the memory functions was bad, but it doesn't change the performance of the card. The GTX970s didn't suddenly get slower the minute the news broke that the memory was segmented. They stayed exactly the same as they were the day they launched. NVidia didn't release GTX970s with non-segmented memory buses at first, then silently switch to versions with a segmented memory bus. The GTX970 you bought at launch was the same GTX970 you could buy the day they were discontinued.

And at the same time you're willing to give AMD a pass for silently releasing a weaker card under the same name, with no way for the consumer to tell from the packaging which version they are getting, calling it simply a miscommunication and a marketing mistake, but you're not willing to give nVidia the benefit of the doubt that the GTX970 was a miscommunication and a marketing mistake. You say nVidia is bad because they tried to make excuses, but AMD is doing the exact same thing. Their excuse is that they were just trying to give the AIBs more RX 500 series options. That's complete bullshit. Wanting more RX 500 options doesn't mean releasing cards under the same name! You want more options, you use more model names. And they are just trying to shift blame to the AIBs. You say nVidia only offered a remedy because of public outrage; well, what do you call what AMD is doing here? These cards have been on the market for months. Do you think AMD was going to do anything about them if this hadn't hit the news and there wasn't public outrage? Give me a break... You say AMD is actively correcting the problem; I don't see that happening. Their statement was that they are going to work with the AIBs to make the versions more clear. That isn't correcting the problem. The cards and their poor packaging are still out there on the shelves, and will be for months until new stock with new packaging comes in. They are literally doing the least amount of work possible to appease the angry people. They should be recalling the cards, renaming this weaker RX 560 to the RX 555, reflashing all the recalled cards to RX 555, repackaging them and then redistributing them. But instead they just said "we'll ask the AIBs to make their packaging more clear, if they want, we might even say please."

So, I do look at the facts. nVidia has done wrong; I'm not saying they haven't, and I didn't say the GTX970 issue wasn't bad. But what AMD has done with the RX 560 is worse, and their response is just as bad. That makes the RX 560 fiasco worse. Period. When you objectively look at the facts, that is the conclusion you reach.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
@newtekie1
You make valid points, and I can see your perspective. Let's agree to disagree on the finer points and agree that both companies have made mistakes and bad choices.

And I think that is the point I failed to make, thank you. I think when people see others saying company XYZ screwed up, they assume those people are also saying screw company XYZ, don't buy from them, you should buy from company ABC. For example, when I say what AMD did was worse than what nVidia did, they interpret that as me saying screw AMD, buy nVidia, but that isn't what I'm saying at all. AMD made a mistake; yes, it is worse than what nVidia has done, but not by a lot, and definitely not anywhere near bad enough to get me to boycott them. I mean, this isn't an EA level of F up here.
 
Last edited:
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Wait, what? When nVidia was renaming G92 people were losing their f'n minds. When AMD does it, people have just been giving them a pass. Hell, people are still bringing up the 8800GT/9800GT rebrand to this day! I also don't remember anyone raising much of a stink back when they renamed the X1600 Pro to the X1300 XT, then renamed the X1300 XT to the X1650.

It isn't as bad because nVidia didn't go back and change the card after it was released. They also made it easier to tell which version you were getting: 1GB DDR3 = Fermi, 2GB DDR3 = Kepler, 1GB DDR5 = Kepler.

The AIBs not adhering to nVidia's specs is another issue entirely, and on the AIBs.

Everyone knows that the tech press cut AMD no slack while doing the opposite with Intel and Nvidia. For years. There are plenty of examples, but if we are going to call black white and white black, there is no point in listing them. The tech press has treated AMD with less hostility over the last 12-18 months. Maybe they realised at some point that without AMD, the PC hardware market would end up extremely dull and most sites would have a problem staying open.

About that "1GB DDR3 = Fermi, 2GB DDR3 = Kepler, 1GB DDR5 = Kepler"...
What can I really say here? If you came to such a wrong conclusion, what should someone expect from another person with less knowledge of computer hardware? Even if we were assuming that all 1GB DDR3 cards were Fermi and all 2GB DDR3 cards were Kepler, the average consumer could never really know, or even understand, what that detail means. That's why companies have to offer ONE standard base model, from which the AIBs can create and offer whatever they like. When a company creates three different cards and puts the same name on their boxes, in my opinion, it is trying to scam people from day one, instead of waiting a few months and trying to scam them later.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Everyone knows that the tech press cut AMD no slack while doing the opposite with Intel and Nvidia. For years. There are plenty of examples, but if we are going to call black white and white black, there is no point in listing them. The tech press has treated AMD with less hostility over the last 12-18 months. Maybe they realised at some point that without AMD, the PC hardware market would end up extremely dull and most sites would have a problem staying open.

Last 12-18 months? What about when AMD rebranded the same card twice like two years before nVidia did it with G92 and no one batted an eye?

Everyone has been giving AMD the pass for years. They have only recently started really calling them on their crappy moves.

About that "1GB DDR3 = Fermi, 2GB DDR3 = Kepler, 1GB DDR5 = Kepler"...
What can I really say here? If you came to such a wrong conclusion, what should someone expect from another person with less knowledge of computer hardware? Even if we were assuming that all 1GB DDR3 cards were Fermi and all 2GB DDR3 cards were Kepler, the average consumer could never really know, or even understand, what that detail means. That's why companies have to offer ONE standard base model, from which the AIBs can create and offer whatever they like. When a company creates three different cards and puts the same name on their boxes, in my opinion, it is trying to scam people from day one, instead of waiting a few months and trying to scam them later.

What wrong conclusion? They made it so you can identify what card you are getting before you buy just by looking at the front of the box. Yes, you have to know the different versions exist. The GT730, I've said and I guess I have to say again, is still bad but the RX 560 is a worse situation.
 

OneMoar

There is Always Moar
Joined
Apr 9, 2010
Messages
8,744 (1.71/day)
Location
Rochester area
System Name RPC MK2.5
Processor Ryzen 5800x
Motherboard Gigabyte Aorus Pro V2
Cooling Enermax ETX-T50RGB
Memory CL16 BL2K16G36C16U4RL 3600 1:1 micron e-die
Video Card(s) GIGABYTE RTX 3070 Ti GAMING OC
Storage ADATA SX8200PRO NVME 512GB, Intel 545s 500GBSSD, ADATA SU800 SSD, 3TB Spinner
Display(s) LG Ultra Gear 32 1440p 165hz Dell 1440p 75hz
Case Phanteks P300 /w 300A front panel conversion
Audio Device(s) onboard
Power Supply SeaSonic Focus+ Platinum 750W
Mouse Kone burst Pro
Keyboard EVGA Z15
Software Windows 11 +startisallback
What is it about AMD that they can't go a quarter without a major screw-up?
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
Last 12-18 months? What about when AMD rebranded the same card twice like two years before nVidia did it with G92 and no one batted an eye?

Everyone has been giving AMD the pass for years. They have only recently started really calling them on their crappy moves.

As I said, if you are going to paint black white and white black, I see no reason to waste my time listing examples. As for ancient examples, I can also remember the GeForce 4 MX case: a card based on the GeForce 2 architecture, sold as part of the GeForce 4 series. As for the 8000/9000 series, that was the worst case, with cards dying after a few years of usage. Great cards (that 9600GT was a diamond), limited lifetime.

What wrong conclusion? They made it so you can identify what card you are getting before you buy just by looking at the front of the box. Yes, you have to know the different versions exist. The GT730, I've said and I guess I have to say again, is still bad but the RX 560 is a worse situation.

You came to a completely wrong conclusion about the three versions of the GT 730, even after looking at that screenshot I posted in my first post, and you insist on it even after being informed that you got it wrong. You still don't get how to differentiate the models. You can't differentiate them based on memory capacity. It's not as easy to differentiate the GT 730 versions as you might think. Maybe that's why you insist that the 560 fiasco is worse. And of course no one from that tech press that supposedly favors AMD did an article on the GT 730, or on other low end Nvidia cards with multiple versions. No one wrote an editorial, no one did a direct comparison, no one asked Nvidia to come up with a reply. Not an apology, a simple reply at least.

No matter how much memory is on the card, 1GB, 2GB or 4GB, the card can be any of the three versions. You have to know to combine the data bus and the type of memory to know which version the card is. For me this is easy; for you, I guess not. For the average consumer it's much worse than the RX 560 fiasco. Because if he buys the wrong RX 560, he will still get maybe 85-90% of the performance of the good version. That's not the case with the GT 730.

This is the case with GT 730.
GT 730 FPS issue (underperforming) | TechPowerUp Forums

Check the posts where they talk about the different versions, not just the first one, or the "0 seconds of thought, 10 second reply", "What did you expect, it's a slow card" type of posts.

@silentbogo 's post explains it better, if you think that my posts are just biased
https://www.techpowerup.com/forums/threads/gt-730-fps-issue-underperforming.220256/#post-3420815
to make it even more confusing, there were 3 variations:
1) Fermi-based card with 96 cores and 128-bit DDR3 (had that one and it sucks)
2) Kepler-based with 384 CUDA cores and 64-bit DDR3 (better, but limited mem bandwidth kills gaming performance).
3) Kepler-based with 384 cores and 64-bit GDDR5 (best budget card ever with 28W max TDP).

All three are still on the market, so an average Joe has to look carefully at what's in the box (the difference between *1 and *3 is tremendous).
I still have a Gainward GTX260 laying around (occasionally used as a space heater), but a 15-20% advantage over GT730 just does not make it a viable option any more.

In 3DMark'06 my old rig with Core2Duo E8400 and GTX260 was getting a bit less than ~16K marks. Same rig with stock GT730 (GK208, 2GB GDDR5) was pushing ~13.5K : 15% advantage. In games the difference is even less noticeable.

Basically, what I'm trying to say is that the newer GT730 is still a very capable card, but you've got to be realistic about its limitations.

I always suggest the GDDR5 version of the GT 730 to people searching for low end cards; AMD has nothing to offer there, really. But I always insist on cautioning them to never, ever be persuaded by others to get a DDR3 version. It would be their own fault if they did.
 
Last edited:

silentbogo

Moderator
Staff member
Joined
Nov 20, 2013
Messages
5,473 (1.44/day)
Location
Kyiv, Ukraine
System Name WS#1337
Processor Ryzen 7 3800X
Motherboard ASUS X570-PLUS TUF Gaming
Cooling Xigmatek Scylla 240mm AIO
Memory 4x8GB Samsung DDR4 ECC UDIMM
Video Card(s) Inno3D RTX 3070 Ti iChill
Storage ADATA Legend 2TB + ADATA SX8200 Pro 1TB
Display(s) Samsung U24E590D (4K/UHD)
Case ghetto CM Cosmos RC-1000
Audio Device(s) ALC1220
Power Supply SeaSonic SSR-550FX (80+ GOLD)
Mouse Logitech G603
Keyboard Modecom Volcano Blade (Kailh choc LP)
VR HMD Google dreamview headset(aka fancy cardboard)
Software Windows 11, Ubuntu 20.04 LTS
@silentbogo 's post explains it better, if you think that my posts are just biased
https://www.techpowerup.com/forums/threads/gt-730-fps-issue-underperforming.220256/#post-3420815
Well, this situation is a bit different. Even though Camp Green had similar misnomenclature with the GT730, it was the totally opposite situation.
Instead of "downgrading", they were slowly replacing old Fermi chips with Kepler (which is more of a "silent upgrade"). At least that was their intention, which did not account for OEMs hoarding older GPU ICs.
While the VRAM type and size were on the box the whole time, some OEMs failed to include how many CUDA cores the card has. The only distinct feature that gave away Kepler versus Fermi was the 64-bit bus width.
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
They are different cases, but the result is the same. Some customers don't get what they think they get.

If I am not mistaken, AMD is not replacing the better version; it's also adding a worse version. So it's not a downgrade. It's bad, but not what we would call a downgrade. If they are replacing all the good cards, or the majority of them, if their intention was to flood the market with 896-core cards while keeping the good GPUs with all 1024 functioning cores for their RX 660, then yes, it's really bad. Worse than I thought.

In the case of the GT 730, all the cards came out at the same time and all of them are in the market at full availability years after their release. I don't see any kind of replacement or silent upgrade really. Just a mess that will continue until the card's EOL.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
You came to a completely wrong conclusion about the three versions of the GT 730, even after looking at that screenshot I posted in my first post, and you insist on it even after being informed that you got it wrong. You still don't get how to differentiate the models. You can't differentiate them based on memory capacity. It's not as easy to differentiate the GT 730 versions as you might think. Maybe that's why you insist that the 560 fiasco is worse. And of course no one from that tech press that supposedly favors AMD did an article on the GT 730, or on other low end Nvidia cards with multiple versions. No one wrote an editorial, no one did a direct comparison, no one asked Nvidia to come up with a reply. Not an apology, a simple reply at least.

No matter how much memory is on the card, 1GB, 2GB or 4GB, the card can be any of the three versions. You have to know to combine the data bus and the type of memory to know which version the card is. For me this is easy; for you, I guess not. For the average consumer it's much worse than the RX 560 fiasco. Because if he buys the wrong RX 560, he will still get maybe 85-90% of the performance of the good version. That's not the case with the GT 730.

This is the case with GT 730.
GT 730 FPS issue (underperforming) | TechPowerUp Forums

Check the posts where they talk about the different versions, not just the first one, or the "0 seconds of thought, 10 second reply", "What did you expect, it's a slow card" type of posts.

@silentbogo 's post explains it better, if you think that my posts are just biased
https://www.techpowerup.com/forums/threads/gt-730-fps-issue-underperforming.220256/#post-3420815
I always suggest the GDDR5 version of the GT 730 to people searching for low end cards; AMD has nothing to offer there, really. But I always insist on cautioning them to never, ever be persuaded by others to get a DDR3 version. It would be their own fault if they did.

Again, nVidia made the specs clear enough that you can use just the memory amount and type to determine the card. Just look at your own screenshot. Notice how nVidia only lists one memory size for each of the 3 versions? DDR3 1GB = Fermi; that is the only one of the three versions that has 1GB of DDR3. DDR3 2GB = Kepler; that is the only version of the three that has 2GB of DDR3. And finally 1GB DDR5 = the better Kepler; that is the only one of the three that has 1GB of DDR5. These are the specs that nVidia put out, and they make it possible to determine which version you are getting just by looking at the front of the box. The problem is the AIBs didn't stick to nVidia's specs and released cards with different memory amounts. That is an issue, and it's why releasing 3 versions of the GT730 was bad. But AMD's RX 560 is a little worse, because AMD made no effort to provide a way to easily distinguish which version of the card you are getting just by looking at the front of the box.
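The decode rule being argued about here can be sketched as a small lookup table. This is purely a hypothetical illustration (the function and table names are my own); it encodes the three reference GT 730 configurations as described in this thread, and also shows where the rule breaks down once an AIB deviates from the reference memory sizes:

```python
# Hypothetical sketch: decoding a GT 730 variant from box-front specs,
# assuming AIBs stick to the three reference configurations discussed above.
REFERENCE_GT730_VARIANTS = {
    # (memory size in GB, memory type) -> (GPU, CUDA cores, bus width in bits)
    (1, "DDR3"):  ("Fermi GF108", 96, 128),
    (2, "DDR3"):  ("Kepler GK208", 384, 64),
    (1, "GDDR5"): ("Kepler GK208", 384, 64),
}

def identify_gt730(size_gb: int, mem_type: str):
    """Return the reference variant matching a box-front spec, or None when
    the combination isn't one of the three reference configurations
    (e.g. an AIB card with a non-reference memory amount)."""
    return REFERENCE_GT730_VARIANTS.get((size_gb, mem_type.upper()))

print(identify_gt730(1, "GDDR5"))  # the fast Kepler variant
print(identify_gt730(4, "DDR3"))   # None: an AIB deviation defeats the rule
```

The second call is the counterpoint raised later in the thread: a 4GB DDR3 card matches none of the reference entries, so the "memory size + type" rule only works for a buyer who already knows the reference lineup.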
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
nVidia made the specs clear enough that you can use just the memory amount and type to determine the card.

No it's not; you have to look it up beforehand to know that, and there is absolutely no reason to believe that someone would already know there are a million versions with a million memory types being sold under the same name. And if that's the case, then you can just as easily look up the specs of a 560 and know that you're buying something slower.

Stop spinning it around in a million ways and repeating the same thing over and over. Should a random person simply walk into a store, they would have no idea which of the GT 730s they are buying without doing some prior research, just as they wouldn't in the case of a 560. It's the same damn issue and it's equally bad.

Both AMD and Nvidia have been doing this forever and make no mistake they will keep doing it. Let's just agree that's how it is and stop trying to rank them on how one did something worse or better than the other.
 
Last edited: