
AMD Radeon RX 6400

Joined
Nov 17, 2016
Messages
152 (0.06/day)
You can say that we have already seen it on the 1060, yes, but the people who buy a 1060 and the people who buy a 6400 do not overlap. If you ever bought a 1060, your target now is at least a 3050 or 3060. Anyone who buys a 6400 as a gaming solution used to buy a GT 620, 730 or similar card. If such players still exist, they have something worth replacing, and the performance gain will be large.
Utter nonsense.

I bought a 7870 for about £120, and then a 1060 6GB for £200.

I today have a i5-12500 without gpu.

The 620 had 1/16 of the shaders of a 680, and when the 730 came out it wasn't even announced; it was just dumped on the market.


Nobody went out to buy a 730; they went out to buy one of the many affordable cards that were actually good for gaming. When I bought my 7870 I remember considering the GTX 750 Ti for a long time, but the 7870 was the same price and much better. There were any number of bad cards that you would be stupid to buy for gaming, and then from about £100 you just needed to decide how much GPU you wanted.

I would buy a 6400 today if it were cheaper. Why not a 6600/3050? Because it costs £300, so no way.

At 40% of the performance of a 6600, give me a 6400 for £120 and I would buy it right now.

The 6400 is absolutely a mainstream product, in that the market now where I am is: 1030 - big price gap - 6400, 1050 ti, 6500 xt, 1650 - big price gap - 3050, 6600

Since the price for a good GPU is much higher now, there is room in the market for more SKUs. Whereas the 730 release was just plopped out and irrelevant, the 6400 is priced between the 1050 Ti's and the 1060's release prices.

That makes it a gaming product in the same sense as the 1050 Ti. We can't say 'you bought a 1060, so you must buy a 3060' if the 3060 now costs twice what the 1060 did then. I thought the 1060 was too much to spend on a GPU, and I certainly would not have bought a 1070, but it made sense as a replacement for the 7870, being 2x+ faster, so OK, I bought the 1060 6GB for more money than I wanted to spend because it delivered good value; I stretched my budget. Today I would not pay more for less. Paying '70 money for '60 performance is crazy to me. I would much rather accept lower performance than burn $$$ in an inflated market. At least if you buy a cheap card you can game, and you can upgrade later when the market is good. But to pay $$$ now for two-year-old tech, and then in six months, oops, your 3080 priced like a 1090 is now a 4060? Ugh, no.
 
Joined
Feb 25, 2018
Messages
4 (0.00/day)
I bought a 7870 for about £120, and then a 1060 6GB for £200.
The choice may come down to the performance you want. The 1060 was cheaper than the 3060, but it may be more important for you to keep the same level of comfort, even if that means paying more money. That's why I rate the 6400 as a low-end product, and not just because of its price.

ps

In my country I ordered a 6400 for ~$265; there are no other prices. :)
 
Joined
Nov 17, 2016
Messages
152 (0.06/day)
I consider gaming to be a luxury/discretionary purchase, so I'm buying on value for money.

Right now on my IGP I could play games such as AOE4 at 1080p low. While I can't play AAA games such as Total War, I could game 24/7 playing indie games or whatever.

A low-end GPU would get me into AAA games at low settings, so there's big value in that, as opposed to not being able to play them at all. More money = more fps/detail, which maybe isn't as important as being able to play at all, but if it's 60 fps for £100 or 120 fps for £150, it makes sense to go for the more expensive card even if you have a 60 Hz monitor, because that card will need replacing later.

But if it's 60 fps for £200 or 120 fps for £375 due to unusual market conditions, the 60 fps card honestly makes more sense, because in a generation or two you should get twice the performance for half the money. So buy the cheapest card now rather than buying on fps/$ in an inflated market.
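
Putting rough numbers on that (purely illustrative, using only the hypothetical prices above):

```python
# Quick fps-per-pound comparison of the hypothetical prices above
# (illustrative only, not real market data).
def fps_per_pound(fps, price):
    return fps / price

print(fps_per_pound(60, 100), fps_per_pound(120, 150))  # 0.60 vs 0.80 -> faster card wins on value
print(fps_per_pound(60, 200), fps_per_pound(120, 375))  # 0.30 vs 0.32 -> nearly a wash, so spend less now
```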
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Utter nonsense.

I bought a 7870 for about £120, and then a 1060 6GB for £200.
[...]
I would much rather accept lower performance than burn $$$ in an inflated market. At least if you buy a cheap card you can game, and you can upgrade later when the market is good.
GPU pricing in the early 2010s was absolutely nuts though - I bought my old HD 6950 in 2011 for NOK 1356. In that case, that was also helped by a weak USD/strong Norwegian Krone, as the conversion rate at the time was around 1 USD = 5.7 NOK. At the moment it's around 1 USD = 9.4 NOK, which naturally drives up prices on imported goods with USD pricing to match. Still, accounting for inflation (1356 2011-NOK = 1731 2022-NOK) and exchange developments (9.4/5.7=1.65), that card would theoretically have cost me 2856 NOK today. That's a lot more, of course - but it was AMD's third highest GPU at the time (behind the 6970 and 6990), and had launched less than a year before (even if the 7000-series would be launching a few months later). For comparison, today 6400s and 1650s start at ~2200 NOK, 6500 XTs at ~2400, and 1650 Supers at ~2600. So, for the money that in 2011 got me a third-highest tier GPU, I could today get a slightly fancy third-from-the-bottom GPU instead. Which would, 11 years later, deliver slightly more than twice the performance according to TPU's database.

The big question is what, exactly, changed since then. Because that HD 6950 had an MSRP of $299 - NOK 1700 at the time, not accounting for the 25% Norwegian VAT (which would have made it 2130 NOK). So what exactly made it possible for me to buy it less than a year after launch for just 64% of MSRP? Clearly PCB costs are much higher today, with the signalling requirements of GDDR6 (vs. 5) and PCIe 4.0 (vs. 2.0). The massive wattages, fluctuating clocks and need for very tightly controlled voltages of current GPUs also push VRM costs quite a bit higher - even if the 6950 was a 200W GPU (yet mine had a 2-slot, single fan HIS cooler, and was pretty quiet!).
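
For anyone who wants to re-run that arithmetic, here it is spelled out (every figure is taken from the two paragraphs above, nothing new assumed):

```python
# Re-running the numbers above: 2011 purchase price, the stated inflation
# factor (1356 -> 1731 NOK) and the USD/NOK exchange-rate shift (5.7 -> 9.4).
price_2011_nok = 1356
inflation      = 1731 / 1356
fx_shift       = 9.4 / 5.7

print(round(price_2011_nok * inflation * fx_shift))    # ~2855 NOK, the "2856" figure above

msrp_usd = 299
msrp_nok_incl_vat = msrp_usd * 5.7 * 1.25              # 25% Norwegian VAT
print(round(msrp_nok_incl_vat))                        # ~2130 NOK
print(round(100 * price_2011_nok / msrp_nok_incl_vat)) # ~64% of MSRP incl. VAT
```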

So ... were margins just far better back then? Was production that much simpler, bringing down costs? I have no idea, to be honest. Even just four years later, my next upgrade, a Fury X, was NOK 7000 (at the time, 1USD = 8.2 NOK). Of course that was a stupidly priced flagship, and there were far better value cards further down the stack. Still, something clearly started happening in the mid-2010s bringing GPU prices much, much higher in my part of the world. Or did we all just start paying big bucks for the most expensive SKUs, incentivizing GPU makers to push average product pricing ever higher?
 
Joined
Nov 17, 2016
Messages
152 (0.06/day)
afaict what happened is:

1. silicon shortage (mobile phone prices are up this year, by around 20%, due to the cost of chipsets)
2. miners inflated the price of everything because GPU = coins

When I bought my 1060 it was late in the cycle, so it was cheaper than earlier.

That would always tend to happen.

Now a new 1050 Ti costs more than it did five years ago, but the 1030 is much less inflated.

To me, while there is a small effect from the silicon shortage, everything else is just miners.
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
afaict what happened is:

1. silicon shortage (mobile phone prices are up this year, by around 20%, due to the cost of chipsets)
2. miners inflated the price of everything because GPU = coins
[...]
To me, while there is a small effect from the silicon shortage, everything else is just miners.
That's definitely a part of it, but it doesn't explain how prices seemed to take off from the mid-2010s onwards - the first mining boom was around 2017-2018, after all, and the silicon shortage started in 2020. As with everything, it's likely a complex combination of factors. In addition to the two you point out, there's also
- increasing production costs due to higher complexity
- BOM cost increases due to more/more advanced components required
- increasing material costs in recent years
- shareholders demanding higher margins
- higher demand as PC gaming has gone from a relatively niche hobby to global phenomenon in the past 10-ish years
- the former development bringing in a lot more wealthy people, driving up demand for expensive hardware
- the fact that Nvidia was essentially uncontested in the upper ranges of performance for a few years
- the massive mindshare advantage Nvidia has, allowing them to push up prices due to the perception of being better (and AMD then following suit with pricing to not appear as a cheap option and because shareholders like that)
- AIB partners compensating for years of near-zero margins
- wild swings in the competitiveness of consoles - from outdated pre-2013, to low-end post-2013, to suddenly high-end post 2020

And probably a lot more. Still, I really, really hope we can see a return to a more sensible GPU pricing and market segmentation model some time soon. The current order just isn't tenable at all, and IMO PC gaming will shrink back to a niche quite rapidly if this continues.
 
Joined
May 8, 2021
Messages
1,978 (1.87/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Your arguments show something else. And if you find people countering your arguments "annoying", that's on you, not me.
It's on you for nitpicking when I'm clearly aware of the correct terminology.

MXM had its heyday around 2010, with ever-dwindling use since then. Also, please note that the 1060 launched 6 years ago. That's quite a while - and even back then it was exceedingly rare.
I didn't realize the 1060 was that old.

I'm not talking about this specific implementation, I'm talking about the rationale behind designing the Navi 24 die as weirdly as they did. I'm not saying it makes sense, I'm saying that this is the likely explanation.
That wasn't even close to clear, but okay.

.... do you think PCBs are designed in MSPaint?
You can basically take the already-designed 6400 PCB and erase the extra x16 part; that's how much design work it would take to make it.


... so now there aren't that many options? Because, to refresh your memory, this whole branch of the discussion started out with someone paraphrasing you, saying "You basically said that almost every modern graphics card except for the 6400 and 6500 XT has some kind of video encoder in it, which is true. How many more options do you need?" So ... is this bad because it's much worse than everything else, or is it bad because it's failing to provide a much-needed, missing function in this market segment? It's one or the other, as those options are mutually exclusive.
Yeah, basically there are many cards like the RX 6900 XT or RTX 3090 that mere mortals can't afford. There are only a handful of cards that are cheap enough and can decode/encode. And there always used to be plenty of such cards (for their time).


They might have become, but what you're quoting already accounts for that.
I remember 1050 Tis being sold out when the 1650 came out, and now 1050 Tis are back. I think they restarted production.

There's something wrong with the TPU DB entry there, as it doesn't align with TPU's own benchmark results - they show it ~matching the 1650, yet the database shows the 1650 as 22% faster. The entry is wrong, somehow.
:D

Also, it's kind of ...odd? to reference the database result rather than actual test results when you're in the comments thread for that review. Just saying.
I already showed another source where the 6400 matched the 1050 Ti. The test was done on a Gen 4 slot, too.

Which sh*t i7 (I've got two)? I've just watched a 4K Youtube video on my 4765T and integrated graphics (HD 4600) without any lag or dropped frames. It's a 9-year-old 35 W CPU, mind you. If this thing can do it, then anything above an i5 2500 can do it, and you can literally pick one up for pennies.
I would want to see you say the same if you had a Core 2 Quad, which is still okay for everyday tasks but might not cut it anymore for YT.


That "quite close" i3 is a lower-mid-tier CPU that was released 3 years before the Athlon X4 and still beats it.
You also quoted the i3-4330, not the 3220. Whatever, there's only about a 20% difference between them, and the Athlon's base architecture is Sandy Bridge-era old.


What performance? FM2 was never competitive against contemporary Intel CPUs.
It certainly was competitive in value comparisons. It clearly beat the Pentiums and could later rival i3s in perf/money. They were really reasonable for low-end system buyers who still wanted to run games. You could get an Athlon X4 for what a Pentium was going for; that was a no-brainer.


I didn't miss your point. I only stated that it's irrelevant.
But it's not. Weren't you the one who bitched about getting an older-core GT 710 just for decoding? In that case, you should know exactly why this stuff is important.


I agree about the 6600, but I don't agree about the nvidia ones you mentioned, as they are too expensive here in the UK. The 1050 Ti is selling at about the same price as the 6400, which is quite frankly a ripoff. Quadros aren't only expensive, but hard to find, too.
I overestimated the UK then. Used RX 580s are still quite "cheap" here.

-T600 at least in the US is selling for $250+, which is nearly $100 more than RX 6400
In Lithuania the RX 6400 goes for 195 EUR, the 6500 XT for 214 EUR, and the T600 for ~200 EUR. Obviously the 6500 XT would be a "good" deal for horsepower only, but if you want a full-featured GPU, then it's the T600 or a 1050 Ti. The only things you can get for 150 EUR are an RX 550 or a GTX 750 Ti.

-T2000 is equivalent to a GTX 1650. T1000 is downclocked from that, and T600 has fewer cores than that. T600 performs nearly the same as a 1050 Ti.
I only said that it's like a 1650 LE; in practice it's 1050 Ti performance, maybe a bit faster than a 1050 Ti. It may be a better deal than the 6400 due to having all the features and similar performance for the same price. BTW, the T2000 is so rare that the TPU database doesn't have it, only the mobile T2000. It practically doesn't exist; I haven't been able to find drivers either, so it might as well not exist at all. The T1000 does exist, and it's faster than the GTX 1650: it literally has the same GPU (a bit downclocked) but faster VRAM (which compensates for the slower GPU). On a hardware level the T1000 is closer to the GTX 1650 GDDR6.

-Where are you seeing that it's slower than the 1050 Ti? The page just a few before yours shows the RX 6400 being faster than the 1650 in most cases.
I already quoted where. It seems like it may occasionally perform poorly in some games, which skews the results, or it has some driver issues. Perhaps the 6400 is a bit faster, but it's worthless without decoding/encoding, ReLive, OC, and with only an x4 slot. On Gen 3 systems it's 1050 Ti performance for sure, which makes it unappealing to basically anyone with a somewhat recent system or older. For a Gen 2 system it's straight-up poo.
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
It's on you for nitpicking when I'm clearly aware of the correct terminology.
I'm not nitpicking, I'm pointing out that your core arguments - such as this somehow being similar in concept to an MXM adapter - make no sense, and seem to stem from a conflation of different understandings or uses of the same term.

Btw, does this mean you've realized how misguided that argument was, finally?
That wasn't even close to clear, but okay.
It really ought to be. There is no other logical direction for the discussion to go in: once it's established that the bottlenecks can't be put down to implementation issues (the die is fully enabled in the relevant aspects, PCIe and VRAM, and thus not bottlenecked by the implementation), the question shifts to "why is the die design unbalanced in this way?". This is exactly the confusion I've been trying to point out that you've been having.
You can basically take the already-designed 6400 PCB and erase the extra x16 part; that's how much design work it would take to make it.
Again: it's not difficult. But it takes more time than not doing so, and as it would be a new connector design, it would require new QC and likely trigger time-consuming further steps to ensure that nothing has been messed up by the changes to the connector (clearances, insertion force, friction against the connector, etc.). Thus, just leaving the copy-pasted x16 connector in there is the lowest effort, cheapest path forward.
Yeah, basically there are many cards like the RX 6900 XT or RTX 3090 that mere mortals can't afford. There are only a handful of cards that are cheap enough and can decode/encode. And there always used to be plenty of such cards (for their time).
That's quite a different stance than the comment that was originally responded to, triggering that whole mess. But at least we can agree on that much. I still don't think it's a major issue, but it's a bit of a let-down still. To me it just highlights that AMD designed this chip first and foremost for being paired with a 6000-series APU - but at least most of the same functionality can be had from any other APU or non-F Intel CPU.
I remember 1050 Tis being sold out when the 1650 came out, and now 1050 Tis are back. I think they restarted production.
It's entirely possible they did, but that would still involve essentially zero R&D, as all of that has already been done. All that would be needed would be some basic recalibration of various steps of the lithography process. Everything else is already done. And, of course, it's entirely possible that Nvidia had a stockpile of GP107 dice sitting in a warehouse somewhere. I'm not saying they did, but it wouldn't be all that surprising - there are always surplus stocks. Plus, GP107 is used for the MX350 GPU as well, which points towards some continuation of production, at least intermittently - that launched in 2020, after all.
?
I already showed another source where the 6400 matched the 1050 Ti. The test was done on a Gen 4 slot, too.
Doesn't matter, as your original source is contradicted by the review that data is supposedly based on. There is something wrong with the TPU database entry. Period. TPU's review shows the 6400 being within 1-3% of the GTX 1650, which is 25% faster than the 1050 Ti according to the same database. That would make the 6400 ~23% faster than the 1050 Ti from TPU's testing, despite the 1050 Ti not being present in the 6400 review itself. Hardware Unboxed's review showed the 6400 being 28% faster (or the 1050 Ti being 40% slower, depending on where you set your baseline for comparison). I trust both TPU and HWUB far more than your rando Youtuber, sorry.
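
To spell out how that ~23% figure falls out of the chained TPU numbers (only the percentages already quoted above, no new data):

```python
# Chain TPU's relative numbers: the 6400 sits 1-3% behind the GTX 1650,
# and the 1650 is listed as 25% faster than the 1050 Ti.
gtx1650_vs_1050ti = 1.25
for deficit in (0.01, 0.03):                      # 6400 is 1-3% behind the 1650
    rx6400_vs_1050ti = (1 - deficit) * gtx1650_vs_1050ti
    print(f"{(rx6400_vs_1050ti - 1) * 100:.0f}% faster than a 1050 Ti")
# -> roughly 21-24% faster, i.e. the ~23% figure above
```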

Edit: taking a look at that video, I definitely trust TPU and HWUB more. First off, there's also zero information about test methodology, test scenes, etc., so there's no way of ensuring the data is reliable. But there's also something else going on here, as ... well, the results are just too even. Like, every game is within a few FPS of each other. That just doesn't happen across a selection of games like that. No two GPUs from different manufacturers, across different architectures, scale that linearly unless they are bottlenecked elsewhere. This makes me think that either they failed to benchmark this properly, had some sort of system limitation, or the data was fudged somehow.

For example, their HZD results show the 6400 at 40/18fps avg/min vs. the 1050 Ti's 43/21 - at "1080p High". HWUB in HZD, at 1080p "Favor Quality" (which is likely what they mean by "High" in your video) get 49/41 (avg/1% low) for the 6400, and 38/33 for the 1050 Ti. They also test CoD:WZ, which ... well, it's a multiplayer title, you literally can't make a repeatable test sequence in a game like that. Unless the data is gathered over literal hours of gameplay, that data should be discarded, as there's no way of knowing if it's representative.

Their AC Valhalla results are also pretty dubious - at 1080p high they place the 1050 Ti at 33/16 fps and the 6400 at 36/18 fps. HWUB's ACV testing, at 1080p medium, has the 1050 Ti at 39/29 fps and the 6400 at 54/41 fps. Lows and 1% lows can't really be compared (lows can be random outliers; that's why you use 1% or 0.1% lows), but the difference here just doesn't make sense. They're not using the same settings or test scene, but even accounting for that massive amount of variability, it doesn't make sense for the 6400's performance to increase by 50% going from high to medium, while the 1050 Ti's performance only increases by 18% from the same change.

So, to summarize: we have two reviews from reputable sources showing significant performance gains, and one unknown one with some significant methodological and data-related red flags showing them to be equal. You're welcome to disagree (and I'd love to hear your arguments for doing so if that's the case!), but I choose to trust the two reviews from trustworthy sources.
 
Joined
Jan 14, 2019
Messages
9,725 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
I would want to see you say the same if you had a Core 2 Quad, which is still okay for everyday tasks but might not cut it anymore for YT.
Why would I have a Core 2 Quad when I can pick up a Sandy or Ivy i5 from any second-hand store or electronics recycling centre for literally a couple of pounds? Building my i7 4765T system cost me about £100. The whole system! And you don't need an i7 for Youtube, I just wanted the top of the line 35 W chip because why not.

The point is: there are countless options for building a low-spec PC for nearly free to watch Youtube. There is absolutely zero need to stick to a 10+ year-old / extremely weak CPU.

But it's not. Weren't you the one who bitched about getting an older-core GT 710 just for decoding? In that case, you should know exactly why this stuff is important.
Bitching? What the...? :twitch: I said that the 6400 can decode all the video formats that the 710 can, and H.265 on top of that. Does this look like bitching to you?

You also quoted the i3-4330, not the 3220. Whatever, there's only about a 20% difference between them, and the Athlon's base architecture is Sandy Bridge-era old.
Correct. The 4330 was released in 2013, the X4 845 in 2016. I don't give a damn about what architecture it is. The only thing that matters is that it's newer and slower and was selling within a similar price range.
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Building my i7 4765T system cost me about £100. The whole system! And you don't need an i7 for Youtube, I just wanted the top of the line 35 W chip because why not.
Hate to break it to you, but there's an i7-4785T as well ;)

(I'm also curious how you got that build so cheap, given how high motherboard prices are these days - but this is getting rather OT now)
 
Joined
Jan 14, 2019
Messages
9,725 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
I was expecting my Sapphire Pulse 6400 to arrive tomorrow, but thanks to a mistake with the courier, I got it today! :D I've already done some testing with the 1050 Ti, now I'm waiting for a few games to install on the HTPC with the 6400 in it. I'll post results as soon as I have them (hopefully within an hour or two).

Hate to break it to you, but there's an i7-4785T as well ;)

(I'm also curious how you got that build so cheap, given how high motherboard prices are these days - but this is getting rather OT now)
I know, but I couldn't find one at the time. :ohwell: As for the build, the motherboard was actually a gift from a friend. The CPU was quite expensive compared to what other contemporary parts sell for - making up around half of the budget. The other half was a cheap mini-ITX case, an even cheaper SFX PSU (just a basic 300 W unit) and 8 GB second-hand RAM. I admit that I had a SATA SSD laying around, so that was "free" too.

If you don't count gifts and spare parts, I'd still say that a build like this can be had for £150-200 (depending on your storage needs) if you know where to find the parts... and it's still overkill for Youtube. :D
 
Joined
May 21, 2009
Messages
264 (0.05/day)
Processor Intel Core i3 8350K @4.6Ghz
Motherboard ASUS Z370-P
Cooling Thermalright Macho Rev.C
Memory 12GB DDR4 2400Mhz
Video Card(s) Gigabyte Geforce GTX 1050 2GB DDR5 128bits
Storage 2TB + 2TB
Display(s) Samsung SyncMaster 794MB
Software Tubuntu 22.04 LTS x64 + Windows 8.1 x64
Benchmark Scores Cinebench R15 Single Thread: 215 points
In good news, the RX 6400 has arrived at Microcenter.


Now we just wait and see what happens, because right now the RX 6400 is around 10 to 15 USD cheaper than the GTX 1050 Ti and around 40 USD less than the GTX 1650.

:)
 
Joined
Feb 20, 2020
Messages
9,162 (6.11/day)
Location
Louisianna
System Name Ghetto Rigs z490|x99|Acer 17 Nitro 7840hs/ 5600c40-2x16/ 4060/ 1tb acer stock m.2/ 4tb sn850x
Processor 10900k w/Optimus Foundation | 5930k w/Black Noctua D15
Motherboard z490 Maximus XII Apex | x99 Sabertooth
Cooling oCool D5 res-combo/280 GTX/ Optimus Foundation/ gpu water block | Blk D15
Memory Trident-Z Royal 4000c16 2x16gb | Trident-Z 3200c14 4x8gb
Video Card(s) Titan Xp-water | evga 980ti gaming-w/ air
Storage 970evo+500gb & sn850x 4tb | 860 pro 256gb | Acer m.2 1tb/ sn850x 4tb| Many2.5" sata's ssd 3.5hdd's
Display(s) 1-AOC G2460PG 24"G-Sync 144Hz/ 2nd 1-ASUS VG248QE 24"/ 3rd LG 43" series
Case D450 | Cherry Entertainment center on Test bench
Audio Device(s) Built in Realtek x2 with 2-Insignia 2.0 sound bars & 1-LG sound bar
Power Supply EVGA 1000P2 with APC AX1500 | 850P2 with CyberPower-GX1325U
Mouse Redragon 901 Perdition x3
Keyboard G710+x3
Software Win-7 pro x3 and win-10 & 11pro x3
Benchmark Scores Are in the benchmark section
Hi,
Reminds me of the last TPU poll on the home page:

"Would you buy a 4 GB GPU in 2022?" (or maybe it was 2021).
I believe the responses were mostly "no", and maybe "hell no".
 
Joined
May 8, 2021
Messages
1,978 (1.87/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
I'm not nitpicking, I'm pointing out that your core arguments - such as this somehow being similar in concept to an MXM adapter - make no sense, and seem to stem from a conflation of different understandings or uses of the same term.

Btw, does this mean you've realized how misguided that argument was, finally?
No, it's you who doesn't understand.

Again: it's not difficult. But it takes more time than not doing so, and as it would be a new connector design, it would require new QC and likely trigger time-consuming further steps to ensure that nothing has been messed up by the changes to the connector (clearances, insertion force, friction against the connector, etc.). Thus, just leaving the copy-pasted x16 connector in there is the lowest effort, cheapest path forward.
Lowest effort? Maybe, but unlikely the cheapest. There's nothing else to do but shrink the connector, remove the traces to it, and remove the caps if the traces have them. That's all. It probably takes like 5 minutes in professional PCB layout software to do this.

That's quite a different stance than the comment that was originally responded to, triggering that whole mess. But at least we can agree on that much. I still don't think it's a major issue, but it's a bit of a let-down still. To me it just highlights that AMD designed this chip first and foremost for being paired with a 6000-series APU - but at least most of the same functionality can be had from any other APU or non-F Intel CPU.
When you consider the lack of decoding/encoding, the x4 PCIe link, no ReLive, no overclocking, and the downclocking, this whole deal just stinks. It also alienates some previously interested audiences, like people with old systems who just wanna watch videos without frame skipping.

It's entirely possible they did, but that would still involve essentially zero R&D, as all of that has already been done. All that would be needed would be some basic recalibration of various steps of the lithography process. Everything else is already done. And, of course, it's entirely possible that Nvidia had a stockpile of GP107 dice sitting in a warehouse somewhere. I'm not saying they did, but it wouldn't be all that surprising - there are always surplus stocks. Plus, GP107 is used for the MX350 GPU as well, which points towards some continuation of production, at least intermittently - that launched in 2020, after all.
Hypothetically, it would be possible to make a new card on older lithography that uses DDR4 or DDR5 memory with a wide bus (meaning more lower-capacity chips instead of a few bigger, faster ones). And to reduce R&D expenses, it could be a relaunched GTX 1060 or GTX 1070 GPU, but with reduced clock speed so that it's more efficient. If you look at how much cheaper less-than-cutting-edge nodes are, you realize you could basically make bigger dies on an older node for less than smaller ones on a new node. That would be an ideal cheap card to relaunch as a GPU-shortage special.
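
As a rough sketch of why the memory side is the hard part of that idea (the DDR4/DDR5 configurations below are hypothetical; the GTX 1060 figures are its actual launch spec of a 192-bit bus at 8 Gbps):

```python
# Peak memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(192, 8.0))   # GTX 1060, GDDR5:    192 GB/s
print(bandwidth_gb_s(128, 3.2))   # 128-bit DDR4-3200:  ~51 GB/s
print(bandwidth_gb_s(256, 3.2))   # 256-bit DDR4-3200: ~102 GB/s
print(bandwidth_gb_s(256, 4.8))   # 256-bit DDR5-4800: ~154 GB/s
# Even a 256-bit DDR5 bus lands well short of the GDDR5 card it would replace.
```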


You're welcome to disagree (and I'd love to hear your arguments for doing so if that's the case!), but I choose to trust the two reviews from trustworthy sources.
I won't; it's just unusual when so many sources have rather different data. There might still be some driver-related issues leading to inconsistent performance between different systems.

Why would I have a Core 2 Quad when I can pick up a Sandy or Ivy i5 from any second-hand store or electronics recycling centre for literally a couple of pounds? Building my i7 4765T system cost me about £100. The whole system! And you don't need an i7 for Youtube, I just wanted the top of the line 35 W chip because why not.

The point is: there are countless options for building a low-spec PC for nearly free to watch Youtube. There is absolutely zero need to stick to a 10+ year-old / extremely weak CPU.
Real-life example: my school had shitty Pentium D machines that were quite woeful. The IT guy put in some GT 610s so that they could run videos for the time being. It worked. All for like 40 EUR per machine instead of 100.

Hypothetical example: you get a free computer with a BD drive. You wanna play movies, but the GPU is too old to have decoding and the CPU isn't cutting it. You get the lowest-end card that decodes.

Bitching? What the...? :twitch: I said that the 6400 can decode all the video formats that the 710 can, and H.265 on top of that. Does this look like bitching to you?
Yet another completely misunderstood sentence. I said that you got a GT 710 for playing back videos, but when it arrived it turned out to be an older variant (Fermi?) and AFAIK it didn't decode or something. You bitched about that and later got a GT 1030, which was good enough.

Doesn't the RX 6400 look exactly like the same trap?

Correct. The 4330 was released in 2013, the X4 845 in 2016. I don't give a damn about what architecture it is. The only thing that matters is that it's newer and slower and was selling within a similar price range.
And I don't give a damn about the Athlon either; I use it as an EXAMPLE of something you may find in an older machine (performance-wise only). And it does play 1080p fine, but if you want more, you get a decoding-capable card.

BTW, that old i3 was not comparable to the Athlon price-wise. The Athlon was going for 80-90 EUR, while that i3 went for 130 EUR, and that's without normalizing for inflation. That was a very significant price increase for not really a lot more. And since I bought it late and for an already existing system that wasn't intended for anything in particular, yeah, it ended up clearly unplanned and not the most economically efficient. Due to buying parts late I got the Athlon for ~40 EUR, which was not a great deal, but an okay deal. And if you want to compare performance, don't use UserBenchmark. It's a meme page of utter incompetence, huge bias and shilling... Using Cinebench alone would be better than UserBenchmark as a reference for CPU performance.

I was able to find data about their Cinebench scores. In the R15 multicore test the Athlon X4 845 got 320 points at stock, while the i3-3220 got 294 points. You would think the Athlon is better, but it's not that simple. It shares two FPUs between 4 integer cores: as long as all cores are utilized it does have better multicore performance, but if not, it's quite weak. It also has no L3 cache at all. The i3, on the other hand, has two faster FPU/integer cores and uses HT to improve multicore performance. So the i3 is faster if you don't need 4 threads or can't utilize them all, but once you do, it performs worse overall than the Athlon. HT can also sometimes reduce performance if the scheduling in the software is poor.

The lack of L3 cache means the Athlon can stutter badly in games; there's a higher chance of feeling the downsides of a cache miss, and if your code doesn't fit in L2, you're gonna have a shitty time. The i3 also tends to stutter, but that's because software is made to use more than 2 cores, so the whole workload has to be rescheduled onto two cores, which leads to lag spikes if the code is complicated enough. HT isn't that efficient either; it's code-dependent and only helps if the cores aren't already saturated (i.e. if their instruction units and pipeline width aren't fully utilized). So their performance is very hard to predict and can be very inconsistent. Still, the i3 is better with older code, code that is difficult to multithread, or FPU-heavy code, while the Athlon is better at more recent, mostly integer code. The Athlon may also be a lot better off thanks to its more modern instruction set; the i3 may not be able to launch some software because only older instructions are available on it. The good old K10 architecture literally became obsolete not due to a lack of performance, but mostly because of how quickly software started to require newer versions of SSE or FMA.

In terms of efficiency it's a tie, because the Athlon X4 is not a desktop Bulldozer or Excavator part as such; it's a Carrizo part taken from laptops with jacked-up clock speeds, and it retained quite a lot of the efficiency advantage of its origins, while the i3 was just a decent part from the get-go. The Athlon can technically be overclocked, but its multiplier is locked, so you need to raise the base clock without changing the other speeds (PCIe bus, RAM, HyperTransport, RAM controller, legacy PCI, and anything else you might have). In practice it's too damn hard to do, and FM2+ boards don't have good clock separation like in the socket 754 days. The i3 is not tweakable at all. Still, there are people who have managed to reach 4.9 GHz with the Athlon, so YMMV. The Athlon also has a lot of room for undervolting; you can reduce the voltage by 0.2-0.3 V on them.
 
Joined
Jan 14, 2019
Messages
9,725 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
Okay fellas, I've got my very first first-hand experiences with the 6400. :)

Here's the comparison with the 1050 Ti that I promised:

1. The main system: Asus Tuf B560M-Wifi, Core i7 11700 with 200 W power limits, 32 GB 3200 MHz RAM (dual channel of course), and the 1050 Ti (Palit KalmX).
2. The "ultimate bottleneck" system: Asus Tuf A520M, Ryzen 3 3100, 16 GB 3200 MHz RAM in single channel for the ultimate bottleneck, and the 6400 (Sapphire Pulse) in PCI-e 3.0 (because of the A520).

Here are my results:

3DMark TimeSpy graphics score:
1050 Ti: 2253,
6400: 3681.
No comment here. What works, works.

Superposition 1080p Extreme:
1050 Ti: 1279,
6400: 2012.
No comment here, either.
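
Expressed as a percentage uplift (same scores as above, just divided), the synthetic gap works out to:

```python
# Relative uplift of the 6400 over the 1050 Ti in the two synthetic tests above.
scores = {
    "TimeSpy graphics": (2253, 3681),
    "Superposition 1080p Extreme": (1279, 2012),
}
for test, (gtx1050ti, rx6400) in scores.items():
    print(f"{test}: +{(rx6400 / gtx1050ti - 1) * 100:.0f}%")
# TimeSpy: +63%, Superposition: +57% - despite the 6400 sitting in the "bottleneck" system
```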

Cyberpunk 2077 (1080, low/med/high/ultra) average FPS:
1050 Ti: 29.54 / 24.03 / 18.57 / 14.89,
6400: 44.98 / 35.10 / 26.18 / 16.33.
Comment: The minimum FPS consistently followed the same trend as the average with the 1050 Ti, while it always stayed around 13-15 FPS with the 6400, regardless of the setting used. It only shows in the data, though - it did not affect the smoothness of the game. Another weirdness with the second system: the main menu kept freezing for a quarter of a second every half minute or so. I have played the game with the same CPU without issues before, and I don't know why it happened now. It might be the single-channel RAM, it might be the PCI-e 3.0 bus. Who knows? All in all, the game is enjoyable with the 6400 at low and medium settings, while the 1050 Ti struggles.

Red Dead Redemption 2 (1080, no advanced setting chosen, "overall quality" slider all the way to the left / all the way to the right):
1050 Ti: 50.87 / 29.69,
6400: 90.73 / 88.99.
Comment: It seems to me that the game chooses different "minimum" and "maximum" settings for you depending on your hardware, which is weird (though I'm not sure). Apart from this, both systems ran the game well on minimum, while the 1050 Ti struggled with the maximum setting (which may or may not be a higher graphical detail level than what the 6400 ran at because of the aforementioned weirdness).

Metro Exodus (1080, low/normal/high preset):
1050 Ti: 54.88 / 30.75 / 22.87,
6400: 20.34 / 19.19 / 15.74.
Comment: Something completely killed the 6400 system in this game. The benchmark looked unplayable on all settings. Again, I don't know if it's the single channel RAM, or the PCI-e 3.0. I might test this again later with a dual channel RAM config.

Mass Effect Andromeda (1080, max settings):
1050 Ti: between 20-30 FPS,
6400: between 25-50 FPS.
Comment: Both cards ran the game okay-ish, though it was a touch more enjoyable with the 6400. Asset loading seems to be an issue with both, as the FPS dropped when I was entering a new area or engaging a conversation with an NPC. It looks like a VRAM limit to me, although the 6400 was a tiny bit more affected, which again, might be due to the single-channel RAM, or the PCI-e 3.0 bus. I would still have the 6400 rather than the 1050 Ti for this game without a second thought.

Conclusion: If you're looking for a 6400 for gaming with an old system, the main thing you need to ask yourself is what games you want to play. It can give you a decent experience, or a totally shitty one depending on the answer.

Other stuff: This low profile Sapphire 6400 has a thick, flat heatpipe running through the length of the cooler, so the GPU runs as cool as the MSi one in the review while holding a relatively steady 2310 MHz, despite being a lot smaller. It also has idle fan stop, which is triggered at 50 °C. It will not be a gaming card, so all in all, I'm happy with the purchase. :)

Edit: The real test of the card will be its intended use: a 4K 60 Hz home theatre experience. :D
 
Joined
May 21, 2009
Messages
264 (0.05/day)
Processor Intel Core i3 8350K @4.6Ghz
Motherboard ASUS Z370-P
Cooling Thermalright Macho Rev.C
Memory 12GB DDR4 2400Mhz
Video Card(s) Gigabyte Geforce GTX 1050 2GB DDR5 128bits
Storage 2TB + 2TB
Display(s) Samsung SyncMaster 794MB
Software Tubuntu 22.04 LTS x64 + Windows 8.1 x64
Benchmark Scores Cinebench R15 Single Thread: 215 points
Edit: The real test of the card will be its intended use: a 4K 60 Hz home theatre experience. :D
With AV1 hardware acceleration?

:)
 
Joined
Jan 14, 2019
Messages
9,725 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
With AV1 hardware acceleration?

:)
I don't need it. The Ryzen 3 can handle it, no problem. :)

Real-life example: my school had shitty Pentium D machines that were quite woeful. The IT guy put in some GT 610s so that they could run videos for the time being. It worked. All for like 40 EUR per machine instead of 100.

Hypothetical example: you get a free computer with a BD drive. You wanna play movies, but the GPU is too old to have decoding and the CPU isn't cutting it. You get the lowest-end card that decodes.
Real-life example: a school not wanting to spend money, or not having the funding, is an entirely different matter altogether. I could mention countless examples of when we don't have basic necessities available at my workplace, even though I work for a company with multimillion-GBP revenue.

Hypothetical example: That computer is not suited for the task you want to use it for. It still doesn't mean that you can't pick up a Sandy i5 for a couple of quid and use that with the BD drive.

Yet another completely misunderstood sentence. I said that you got a GT 710 for playing back videos, but when it arrived it turned out to be an older variant (Fermi?) and AFAIK it didn't decode or something. You bitched about that and later got a GT 1030, which was good enough.

Doesn't the RX 6400 look exactly like the same trap?
Then what did I "bitch" about now? (I still hate this word.) I'm starting to lose track of all the things you think I said.

And I don't give a damn about the Athlon either; I use it as an EXAMPLE of something you may find in an older machine (performance-wise only). And it does play 1080p fine, but if you want more, you get a decoding-capable card.
... or you upgrade your CPU which is a lot cheaper, considering current GPU prices.

BTW, that old i3 was not comparable to the Athlon price-wise. The Athlon was going for 80-90 EUR, while that i3 went for 130 EUR, and that's without normalizing for inflation. [...]
Don't forget about the timescale. That i3 was selling for 130 EUR 3 years before the Athlon even appeared. In 2016, Intel was already pushing Skylake (or Kaby Lake? I'm not sure), so the 4330 was already used market territory by then.
 
Joined
May 8, 2021
Messages
1,978 (1.87/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TH Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Real-life example: a school not wanting to spend money, or not having the funding, is an entirely different matter altogether. I could mention countless examples of when we don't have basic necessities available at my workplace, even though I work for a company with multimillion-GBP revenue.
I thought the UK would know better than that, as it has quite a reputation for being a rich country with good overall welfare and century-old capitalism. I thought Lithuania might not provide that, because basically its whole commerce is only 30 years old, and being a rather "poor" (by EU standards) country with some sketchy bosses, it would be too interested in cost cutting. But it seems to be a universal thing worldwide, maybe with some exceptions like Germany, the Netherlands and Denmark being somewhat different.


Hypothetical example: That computer is not suited for the task you want to use it for. It still doesn't mean that you can't pick up a Sandy i5 for a couple of quid and use that with the BD drive.
Machines like that can totally play 1080p videos just fine in VLC, just not on Youtube. They don't lack decoding capabilities altogether, but they lack support for newer codecs, and Youtube's Javascript is too much for them. My Athlon 64 3200+ on socket 754 could indeed play back a fricking BD movie just fine, but Youtube at 360p is a complete no-go. Say what you want, but that's stupid. But to be fair, I used an ATi X800 Pro AGP card (the PCIe version used a newer core) with the Athlon, so it may have helped there. As far as I know that old card has the ATi AVIVO decoder and it can decode H264, VC-1, WMV-9 and MPEG-2. BD is probably H264. That old card even had streaming capabilities. Sure, it was at a potato 320x240 at 15 fps, but it was accelerated, and that was in 2004. In that one aspect it beats the RX 6400.


Then what did I "bitch" about now? (I still hate this word.) I'm starting to lose track of all the things you think I said.
You complained about GT 710

... or you upgrade your CPU which is a lot cheaper, considering current GPU prices.
I told you it was an example, I don't need your advice here. Maybe the CPU is cheaper to upgrade, but depending on your needs, perhaps a GT 1030 may be enough. At least it has VP9 and HEVC decoding; VP9 is common on YT, while H264 is used for <720p videos only. However, for Netflix you may need an AV1-capable card (AV1 support was experimental, I think). In that case, it is worth upgrading to an Alder Lake Pentium or i3.


Don't forget about the timescale. That i3 was selling for 130 EUR 3 years before the Athlon even appeared. In 2016, Intel was already pushing Skylake (or Kaby Lake? I'm not sure), so the 4330 was already used market territory by then.
You could have bought an Athlon X4 740 back then too; it was only 70 USD at launch. I don't see the value in the i3. For the i3's price, you could have got yourself an FX-8320; even the FX-6300 was cheaper. Now tell me about the i3's superior value. And way before the i3, you could have bought an FM1-based Athlon II X4 6xx-series chip for less, with basically the same performance as the later Bulldozer FM2 chips. The i3 was awfully overpriced. Even today an i3 doesn't cost 130 USD.
 
Joined
Mar 15, 2017
Messages
178 (0.07/day)
The review has been updated with GT1030 numbers .. I hate you guys .. testing at these FPS rates is such a shitshow


Oh my, I think I can physically feel the pain...

And speaking of pain -- I glanced over the last few pages of this thread and all I could think of was this eternally relevant comic:



comparison with the 1050 Ti

1. The main system: Core i7 11700, 32 GB 3200 MHz RAM (dual channel), 1050 Ti
2. The "ultimate bottleneck" system: Ryzen 3 3100, 16 GB 3200 MHz RAM in single channel, the 6400, PCI-e 3.0

I don't get it. Why would you use two wildly different PCs for comparing two different cards? Why cripple one card and not the other one? Why not just use the faster PC, bench both cards in succession, have some comparable data, then change the PCI-E link speed, rebench and investigate PCI-E 3.0 losses?
 
Joined
Jan 14, 2019
Messages
9,725 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
And speaking of pain -- I glanced over the last few pages of this thread and all I could think of was this eternally relevant comic:

I love this image! I'll take it if you don't mind. :D

I don't get it. Why would you use two wildly different PCs for comparing two different cards? Why cripple one card and not the other one? Why not just use the faster PC, bench both cards in succession, have some comparable data, then change the PCI-E link speed, rebench and investigate PCI-E 3.0 losses?
There was an argument earlier in this thread. One forum member said that the 1050 Ti is a much better value than the 6400 even at the same price; I think it was also said that the 1050 Ti was faster. I wanted to test this claim by giving the 1050 Ti a head start in a faster PC and crippling the 6400 as much as possible.

You might call my other reason laziness: my main PC has a 2070 in it, so I didn't have to reinstall the driver to test the 1050 Ti, and the 6400 is meant for my secondary rig anyway. :ohwell:

I thought the UK would know better than that, as it has quite a reputation for being a rich country with good overall welfare and century-old capitalism. I thought Lithuania might not provide that, because basically its whole commerce is only 30 years old, and being a rather "poor" (by EU standards) country with some sketchy bosses, it would be too interested in cost cutting. But it seems to be a universal thing worldwide, maybe with some exceptions like Germany, the Netherlands and Denmark being somewhat different.
It's not a UK vs Lithuania thing. It's more like a "companies don't get rich by spending money" kind of thing. But let's stop the off-topic here. ;)

Machines like that can totally play 1080p videos just fine in VLC, just not on Youtube. They don't lack decoding capabilities altogether, but they lack support for newer codecs, and Youtube's Javascript is too much for them. My Athlon 64 3200+ on socket 754 could indeed play back a fricking BD movie just fine, but Youtube at 360p is a complete no-go.
That's exactly why you need a new PC if you want to play Youtube in HD. My point stands that just because you have an Athlon 64 or Athlon X4 or whatever at hand, it doesn't mean that it's fit for your purpose. It was fine, but it's not anymore. Nothing stops you from walking into a computer recycling centre and paying 5-10 EUR for a Sandy i5. It's even (a lot) cheaper than buying a new graphics card, and it won't only give you HD videos, but the whole system will be faster as well. But if you'd rather pay a hundred EUR just for a fecking video codec, be my guest.

Say what you want, but that's stupid.
Is that your point? Closing ears, closing eyes, "lalala, I'm not listening because you're stupid"? Very mature indeed.

You complained about GT 710
I did not. I said that the 6400's decoder supports all the formats that the 710's does, plus H.265. This is not a complaint. This is a fact.

I told you it was an example, I don't need your advice here.
It's a counter-example, not advice.

Maybe a CPU is cheaper to upgrade, but depending on your needs, perhaps a GT 1030 may be enough. At least it has VP9 and HEVC decoding. VP9 is common on YT; H264 is used for <720p videos only. However, for Netflix you may need an AV1-capable card (AV1 support was experimental, I think). In that case, it is worth it to upgrade to an Alder Lake Pentium or i3.
The 6400 has VP9 and HEVC decode as well. I agree that the 1030 is enough for 99% of HTPC uses, as long as you don't need HDMI 2.1. It's only that the 1030 costs around £100, which is a terrible deal, imo.
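
To make the codec side of this concrete, here's a minimal sketch (my own illustration, not from the review) that asks ffprobe what codec a file's video stream uses and checks it against the decode formats mentioned for each card in this thread. It assumes ffprobe is installed and on the PATH; the file name is hypothetical, and the per-card sets are limited to the codecs discussed above rather than being exhaustive spec lists.

```python
# Sketch: check a video's codec with ffprobe and see which of the cards discussed
# here are commonly reported to decode it in hardware. Illustrative only.
import subprocess

# Codec sets limited to what this thread mentions -- not exhaustive.
HW_DECODE = {
    "GT 710":  {"h264"},
    "GT 1030": {"h264", "hevc", "vp9"},
    "RX 6400": {"h264", "hevc", "vp9"},
}

def video_codec(path: str) -> str:
    """Return the codec name of the first video stream, as reported by ffprobe."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=codec_name",
         "-of", "default=noprint_wrappers=1:nokey=1", path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    codec = video_codec("sample_video.mkv")  # hypothetical file name
    for card, formats in HW_DECODE.items():
        verdict = "hardware decode" if codec in formats else "CPU fallback"
        print(f"{card}: {codec} -> {verdict}")
```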

As for an Alder Lake upgrade, I played with the thought before buying the 6400, but a motherboard and a CPU would have cost me around £150-200, and then I would have ended up with a slower CPU than what I have now. An i3 with the same performance as my Ryzen 3 would have cost me £250 for the whole system. Not worth it. As for people coming from older systems, they might also need to buy some DDR4 RAM, which makes it even more expensive. If they have some DDR3 lying around, picking up an old i5 from a second-hand store for a couple of quid is still a much better deal.

But to be fair, I used an ATi X800 Pro AGP card (the PCIe version used a newer core) with the Athlon, so that may have helped there. As far as I know, that old card has the ATi AVIVO decoder and it can decode H264, VC-1, WMV-9 and MPEG-2. BD is probably just high-bitrate H264. That old card even had streaming capabilities. Sure, it was at a potato 320x240 at 15 fps, but it was accelerated, and that was in 2004. In that one aspect it beats the RX 6400.
Are you seriously comparing the 6400 to a top tier card from 20 years ago? Jesus...
 
Joined
Mar 15, 2017
Messages
178 (0.07/day)
I love this image! I'll take it if you don't mind. :D

Go for it. It's xkcd that should take the credit anyway. Just try not to do as is shown :)

There was an argument earlier in this forum.
You might call my other reason laziness.

Fair enough.

walking into a computer recycling centre and paying 5-10 EUR for a Sandy i5

I feel obliged to say this, because I saw you bring up this argument a few times:
You should feel blessed that you have that option! I just checked my local used ads (in Bulgaria) and the good old 2500K sells for anything from $30 to $50. This is in stark contrast to our purchasing power, which is half of yours or even lower. I hope you can understand that even old used stuff isn't exactly cheap for us. The used market doesn't make any sense sometimes, but it is what it is.

Be grateful for what you have and don't just assume everyone has your (good) options.

I'm just leaving this here as a thinking point. If anyone agrees to disagree, he is free to do so. I will not engage on the topic any further, as you guys already have a pretty heated debate.
 
Joined
Jan 14, 2019
Messages
9,725 (5.12/day)
Location
Midlands, UK
System Name Nebulon-B Mk. 4
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 7800 XT
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
Benchmark Scores Cinebench R23 single-core: 1,800, multi-core: 18,000. Superposition 1080p Extreme: 9,900.
I feel obliged to say this, because I saw you bring up this argument a few times:
You should feel blessed that you have that option! I just checked my local used ads (in Bulgaria) and the good old 2500K sells for anything from $30 to $50. This is in stark contrast to our purchasing power, which is half of yours or even lower. I hope you can understand that even old used stuff isn't exactly cheap for us. The used market doesn't make any sense sometimes, but it is what it is.

Be grateful for what you have and don't just assume everyone has your (good) options.

I'm just leaving this here as a thinking point. If anyone agrees to disagree, he is free to do so. I will not engage on the topic any further, as you guys already have a pretty heated debate.
That's sad. :( Does this make buying a new graphics card a better deal, though?
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I don't get it. Why would you use two wildly different PCs for comparing two different cards? Why cripple one card and not the other one? Why not just use the faster PC, bench both cards in succession to get some comparable data, then change the PCI-E link speed, rebench and investigate the PCI-E 3.0 losses?
I like this comparison for exactly that reason: people are up in arms about the 6400/6500 XT being bottlenecked, especially on slower PCIe standards, so this should represent a kind of worst-case-scenario comparison. And it still (mostly) soundly beats the 1050 Ti, despite people here claiming that it matches it on PCIe 4.0. Also, we have plenty of sources for 6400 testing on PCIe 4.0 and with a fast, current-gen CPU, including the review this thread is commenting on. We hardly need more of those, while what we do need is testing on older, slower platforms where the 6400 is more relevant as a potential upgrade. Of course, one could then argue that the 1050 Ti also should be tested on that. But ... does it ultimately matter? Judging by the test results, no. There's no way the 1050 Ti will perform better on a slower system, so that point is moot.
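
If anyone wants to put a number on the "PCIe 3.0 losses" part instead of eyeballing charts, the arithmetic is trivial. A minimal sketch below, with placeholder fps values rather than measured results:

```python
# Sketch: express the drop from PCIe 4.0 to PCIe 3.0 as a percentage.
# The fps values are placeholders, not measured results.
def pcie_loss_pct(fps_gen4: float, fps_gen3: float) -> float:
    """Performance lost when dropping the link speed, in percent."""
    return (fps_gen4 - fps_gen3) / fps_gen4 * 100

runs = {
    "Game A": (60.0, 55.0),   # (PCIe 4.0 fps, PCIe 3.0 fps)
    "Game B": (90.0, 88.0),
}

for game, (gen4, gen3) in runs.items():
    print(f"{game}: {pcie_loss_pct(gen4, gen3):.1f}% slower on PCIe 3.0")
```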
 
Joined
Mar 15, 2017
Messages
178 (0.07/day)
That's sad. :( Does this make buying a new graphics card a better deal, though?

OK, market analysis time. I'll keep it about the 6400 as to not wander off topic:

The most popular model I could find is the AsRock Challenger ITX. Prices are in the $200-260 range for it (standard 20% VAT included), most often around ~$210. Every shop I checked out claimed they have stock.

Saw several offers for the ASUS Dual going from $265 to $285.

Just for laughs I also got some offers from one of our bigger IT shops and they have:
ASUS Dual for $285
MSI Aero ITX for $390
They also claim they have no stock...

Then, for even bigger laughs, we have our biggest e-tailer with these wonderful offers:
AsRock Challenger ITX for $350
MSI Aero ITX for $435
ASUS Dual for $460
None of these are in stock, with a quoted delivery time of 8 days.

Now, the ~$210 price doesn't sound half bad once you find out that your average 1050 Ti model retails for around $230 or more. The 6400 is probably the most sensible card you can buy new over here right now.

The average net wage here is 1/3 to 1/4 of what you have over there. Unless you have disposable income - yes, it's a f****** expensive hobby.

I like this comparison for exactly that reason

Alright, I gotcha.
 
Joined
Dec 28, 2012
Messages
3,475 (0.85/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
The total lockdown of the RX 6400's clocks is terrible. There's no reason for that; the 6500 XT is unlocked, as are the older RX 550 and 560. AMD is really trying to cover up just how horribly gimped the 6500 XT is.

Now, if Nvidia did this with the 3050 or GT 1030, there'd be screeching from the rooftops. When AMD does it, crickets...
What point? This is a 1650-equivalent card with the same amount of VRAM and the same VRAM bandwidth. The 1650 does it with 128-bit GDDR5, the 6400 with 64-bit GDDR6. Only that low-profile 1650s go for £250-300 on eBay, while the 6400 costs £160 new. What's not to like?
Well, off the top of my head:

1. It's 3% SLOWER than the 1650 at 1080p. The 1650 launched 3 years ago at the same $159 price point. That is technically negative price/perf movement over 3 years (see the quick sketch after this list), an absolute embarrassment. RDNA 2 is much more capable than this.
2. It's still limited to 4 GB of VRAM; even at 1080p this is an issue for my much slower RX 560 in certain games. Paying 1650 prices for a GPU with the same limitation is just silly.
3. It's got the same PCIe limitation as the 6500 XT. Many of us are running PCIe Gen 3 systems, where the 6400 will lose additional performance, widening the gap with the now 3-year-old 1650.
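
A quick sketch of the arithmetic behind point 1, using only the figures quoted in this thread (same $159 price point, roughly 3% behind the 1650 at 1080p); this is my own illustration, not anything measured here:

```python
# Sketch: perf-per-dollar comparison using the figures quoted in this thread.
# Purely illustrative; the prices and the 3% delta are this thread's numbers.
gtx_1650 = {"price_usd": 159, "relative_perf": 1.00}   # 2019 card, same price point
rx_6400  = {"price_usd": 159, "relative_perf": 0.97}   # ~3% behind at 1080p

def perf_per_dollar(card: dict) -> float:
    return card["relative_perf"] / card["price_usd"]

change = (perf_per_dollar(rx_6400) / perf_per_dollar(gtx_1650) - 1) * 100
print(f"Price/perf movement over 3 years: {change:+.1f}%")   # about -3%
```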

Yeah, I know the 1650 is more expensive on eBay. I don't care. MSRP on the 6400 is WAY too high. What should have been released was a 6500 XT at this $159 price point, with 6 GB on a 96-bit bus and PCIe x8 connectivity, clocked more reasonably so as to stay within 75 W. The 6500 XT can easily do it; it's clocked way out of its efficiency curve to try and maintain more performance.

That's definitely a part of it, but it doesn't explain how prices seemed to take off in the early 2010s - the first mining boom was around 2017-2018, after all, and the silicon shortage started in 2020. As with everything, it's likely a complex combination of factors. In addition to the two you point out, there's also
- increasing production costs due to higher complexity
- BOM cost increases due to more/more advanced components required
- increasing material costs in recent years
- shareholders demanding higher margins
- higher demand as PC gaming has gone from a relatively niche hobby to global phenomenon in the past 10-ish years
- the former development bringing in a lot more wealthy people, driving up demand for expensive hardware
- the fact that Nvidia was essentially uncontested in the upper ranges of performance for a few years
- the massive mindshare advantage Nvidia has, allowing them to push up prices due to the perception of being better (and AMD then following suit with pricing to not appear as a cheap option and because shareholders like that)
- AIB partners compensating for years of near-zero margins
- wild swings in the competitiveness of consoles - from outdated pre-2013, to low-end post-2013, to suddenly high-end post 2020

And probably a lot more. Still, I really, really hope we can see a return to a more sensible GPU pricing and market segmentation model some time soon. The current order just isn't tenable at all, and IMO PC gaming will shrink back to a niche quite rapidly if this continues.
Slight correction: the first mining boom was in 2013-2014. That's what drove prices of the R9 290X through the roof; at the time, GCN was far and away superior to Kepler at mining. That, coupled with the Titan's success, is what helped push prices as high as they are now.
 