Tuesday, September 22nd 2020

AMD Radeon "Navy Flounder" Features 40CU, 192-bit GDDR6 Memory

AMD uses offbeat codenames such as "Great Horned Owl," "Sienna Cichlid," and "Navy Flounder" to identify sources of leaks internally. One such upcoming product, codenamed "Navy Flounder," is shaping up to be a possible successor to the RX 5500 XT, the company's segment-leading 1080p product. According to ROCm compute code fished out by stblr on Reddit, this GPU is configured with 40 compute units, a step up from the 22 on the RX 5500 XT, and is paired with a 192-bit wide GDDR6 memory interface.

Assuming the RDNA2 compute unit on next-gen Radeon RX graphics processors has the same number of stream processors per CU as RDNA1, we're looking at 2,560 stream processors for "Navy Flounder," compared to 5,120 (80 CU) on "Sienna Cichlid." The 192-bit wide memory interface gives AMD's product managers a high degree of segmentation for graphics cards under the $250 mark.
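Assuming each RDNA2 CU keeps RDNA1's 64 stream processors per CU (an assumption; AMD has confirmed nothing), the arithmetic is straightforward:

```python
# Rumored CU counts converted to stream processors, assuming RDNA2
# retains RDNA1's 64 stream processors per compute unit (unconfirmed).
SP_PER_CU = 64

def stream_processors(compute_units: int) -> int:
    """Stream processors implied by a CU count at 64 SP per CU."""
    return compute_units * SP_PER_CU

print(stream_processors(40))  # "Navy Flounder" rumor: 2560
print(stream_processors(80))  # "Sienna Cichlid" rumor: 5120
```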
Sources: VideoCardz, stblr (Reddit)

135 Comments on AMD Radeon "Navy Flounder" Features 40CU, 192-bit GDDR6 Memory

#76
sergionography
ValantarYou sound like you're arguing that perf/$ scales linearly across GPU lineups. This has never been the case. Except for the very low end, where value is always terrible, you always get far more bang for your buck in the $150-400 mid-range than anything above. I mean, just look at the data:

Results are similar at other resolutions, though more expensive GPUs "improve" at 4k. As for value increasing as you drop in price: just look at the 5600 XT compared to the 5700 XT in those charts. Same architecture, same generation, same die, yet the cheaper card delivers significantly higher perf/$ than the more expensive one.

As for your comparison: you're comparing across generations, so any value comparison is inherently skewed to the point of being invalid. If it weren't the case that you got more bang for your buck in a new generation, something would be very wrong. As it admittedly has been for a couple of generations now, and the 3080 hasn't fixed it either, just brought us back closer to where things should be. It is absolutely to be expected that all future GPUs from both manufacturers at lower points in their product stacks will deliver significantly better perf/$ than the 3080. That's the nature of premium products: you pay more for the privilege of having the best. $700 GPUs have never been a good value proposition.

We're literally a week post-launch. Demand has been crazy, scalpers with bots have run rampant, and everything is sold out. Give it a while to normalize before you comment on "real-world" prices. And while Nvidia previously pulled "here's the MSRP, and here's the somehow premium, but also baseline, FE card at a premium price" shenanigans alongside the near-nonexistence of cards at MSRP, to be fair to them they seem to have stopped doing that this time around.

Right. As if I ever said that. Maybe actually read my post? Nice straw man you've got there.

Again: accounted for, if you had bothered to actually read my post.

Let me refresh your memory:

What I'm saying here is that the SoC TDP accounting for only 50% of the PSU's rating (which might include PSU losses due to efficiency) sounds a bit low. I'm asking you to source a number that you're stating as fact. Remember, you said:

No source, no mention that this is speculation or even anything beyond established fact. Which is what I was asking you to provide.

Unsourced, based on rumors and speculation. The page says as much.

That is at least a correlation, but correlation does not mean that the rumor you are quoting as fact is actually true. This is what you call speculation.

And here is the core of the matter: you are repeating rumors and "what you think is likely" as if it is indisputable fact. It is of course entirely possible that the PS5 SoC has a TDP somewhere around 175W - but you don't have any actual proof of this. So please stop repeating rumors as if they are facts. That is a really bad habit.
Oh, I understood what you said; I was just pointing out that power rating and TDP are related but not the same thing. And I linked the PS4 to show the correlation that a TDP rating being 50% of the complete system power rating is pretty normal. And yes, we are speculating; that's what most of these comments are, so I am not sure why you are so riled up about it. Even my original post was mostly making educated guesses about what the lineup will likely look like. As for the PS5, most of the specs are basically out, so it's less of a rumor at this point. But we will wait and see.
#77
dragontamer5788
This has been going around Reddit, and basically sums up the rumors:



It's a bit of a meme and non-serious, but I think y'all get the gist. There are so many conflicting rumors on Navi that it's hard to take any of these "leaks" seriously.
#78
Valantar
Xex360You still didn't understand my point; I'm talking about the MSRP, not the current prices due to low supply. The official price is set higher by nVidia for all markets compared to the US and Canada, sometimes by stupid margins, even in Taiwan, so the price/performance of the card is very poor.
Uhm, that's the opposite of what you just said:
Xex360I'm not talking about official prices around the world
But okay, that makes sense to a certain degree. How much of this is down to currency fluctuations, different levels of VAT, and so on? Likely quite a lot. The economic situation around the world has also changed quite dramatically during the past decade, with the US dollar strengthening compared to a lot of currencies, which sadly tends to have knock-on effects for importing industries (i.e. the higher prices impact multiple levels of a business, driving up costs and prices even more). There are also signs indicating a change in attitude where the US has previously been the cash cow, with a lot of other countries being given relatively advantageous pricing for various reasons (such as EU prices with VAT coming very close to US prices without sales tax). This seems to be gone now across most industries. Sadly that means stuff gets more expensive for most of us. But it's also an across-the-board increase - Turing GPUs were certainly more expensive than any other series of GPUs here in Norway and Sweden, as were Pascal before them, etc.
sergionographyOh, I understood what you said; I was just pointing out that power rating and TDP are related but not the same thing. And I linked the PS4 to show the correlation that a TDP rating being 50% of the complete system power rating is pretty normal. And yes, we are speculating; that's what most of these comments are, so I am not sure why you are so riled up about it. Even my original post was mostly making educated guesses about what the lineup will likely look like. As for the PS5, most of the specs are basically out, so it's less of a rumor at this point. But we will wait and see.
All I did was ask for a source for the 175W number in your original post, as a) I hadn't heard it before, and b) it sounded quite low. IMO it still does, and as the number you stated as fact is an unsourced and unsubstantiated rumor I'll hold off until we see the actual power draw of the console. I don't think it'll be 350W by any means, but 175W at those clock speeds sounds too good to be true.
dragontamer5788This has been going around Reddit, and basically sums up the rumors:



It's a bit of a meme and non-serious, but I think y'all get the gist. There are so many conflicting rumors on Navi that it's hard to take any of these "leaks" seriously.
That looks about right. I think people are reading too much into a lot of things. It may well be that the 192/256-bit numbers are some sort of trickery (most leaks are controlled and on purpose, after all), it might indeed have HBM2e, or it might just have a plain GDDR6 bus. Or some other weirdness, like a cache. Until we have something more concrete on our hands, this is pretty useless. Interesting on paper, but useless without actual information.
#79
sergionography
ValantarAll I did was ask for a source for the 175W number in your original post, as a) I hadn't heard it before, and b) it sounded quite low. IMO it still does, and as the number you stated as fact is an unsourced and unsubstantiated rumor I'll hold off until we see the actual power draw of the console. I don't think it'll be 350W by any means, but 175W at those clock speeds sounds too good to be true.
Reasonable enough; it's only speculation, so nothing is concrete. I just personally think it's doable. Let's lay down some of the actual facts or published info that we can relate to:
RX 5700 has the same number of cores as the PS5 (2,304)
RX 5700 has a TDP of 225 W
RDNA2 is 50% more efficient than RDNA1, according to AMD

Now we can do our speculation/analysis:
If we do the math, we can safely assume that 2,304 RDNA2 shaders are 50% more efficient than RDNA1 shaders, which leaves us at 150 W at the same performance as the RX 5700; but in this case it is clocked around 10-15% higher, so that could account for the 175 W rather than 150 W. Also, having CPU cores increases the overall package power (but not by much, since they are clocked low and rely on SmartShift technology)

Other things to consider:
The clock speed for the PS5 is the turbo speed
TSMC 7 nm is more mature
AMD is more likely to clock chips higher and sell them as a higher-tier product than to clock them low at the performance-per-watt sweet spot (this is in relation to this article and my first comment)
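The estimate above can be sketched numerically; note that every input here is a rumor or an AMD marketing claim ("up to 50%" perf/W), not a measured figure:

```python
# Back-of-envelope sketch of the PS5 GPU power estimate discussed above.
# All inputs are rumored or marketing figures, not measurements.
RX5700_TDP_W = 225          # RX 5700 board TDP
RDNA2_PERF_PER_WATT = 1.5   # AMD's claimed RDNA1 -> RDNA2 gain
CLOCK_BUMP = 1.15           # rumored ~10-15% higher PS5 GPU clock

# Power for RX 5700-level performance, if the full 50% gain materializes:
iso_perf_w = RX5700_TDP_W / RDNA2_PERF_PER_WATT   # 150.0 W

# Naively (and optimistically) scale power linearly with clock:
ps5_gpu_estimate_w = iso_perf_w * CLOCK_BUMP

print(round(iso_perf_w), round(ps5_gpu_estimate_w))  # 150 172
```

The linear clock-to-power scaling in the last step is the optimistic part; real power draw grows faster than linearly with frequency.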
#80
Caring1
All these rumours surrounding Navy Flounder sound fishy to me. :p
#81
Aquinus
Resident Wat-man
Caring1All these rumours surrounding Navy Flounder sound fishy to me. :p
It sounds like the Kobayashi Maru has set sail for the promised land.
#82
Caring1
AquinusIt sounds like the Kobayashi Maru has set sail for the promised land.
I hope it doesn't hit a reef and flounder.
#83
dragontamer5788
Caring1I hope it doesn't hit a reef and flounder.
Nah, it finds a dragon, and Kobayashi and the Dragon become drinking buddies. Star Trek jokes are too old school. Time for the new jokes. :cool:

Oh right puns and stuff. Uhhh... food, fish, bottom feeding joke or something.... oh right. I hope my joke doesn't fall flat on anyone.
#84
Unregistered
ValantarBut okay, that makes sense to a certain degree. How much of this is down to currency fluctuations, different levels of VAT, and so on? Likely quite a lot. The economic situation around the world has also changed quite dramatically during the past decade, with the US dollar strengthening compared to a lot of currencies, which sadly tends to have knock-on effects for importing industries (i.e. the higher prices impact multiple levels of a business, driving up costs and prices even more). There are also signs indicating a change in attitude where the US has previously been the cash cow, with a lot of other countries being given relatively advantageous pricing for various reasons (such as EU prices with VAT coming very close to US prices without sales tax). This seems to be gone now across most industries. Sadly that means stuff gets more expensive for most of us. But it's also an across-the-board increase - Turing GPUs were certainly more expensive than any other series of GPUs here in Norway and Sweden, as were Pascal before them, etc.
While I agree with your logic, unfortunately it doesn't hold in all markets. I feel it's the fault of nVidia and its partners, given that comparable products don't have the same unjustified price hike, like products from Intel, AMD, and console/phone makers.
#85
Anymal
Caring1All these rumours surrounding Navy Flounder sound fishy to me. :p
Flounder:
verb (used without object) : to struggle with stumbling or plunging movements (usually followed by about, along, on, through, etc.)
CMO of AMD is on a slippery road

Second thought, maybe it is water cooled. Fury RX.
#86
Assimilator
BoboOOZRedgamingtech and Moore's Law is Dead leaked about the cache:


Textures are not required for intensive calculations; what a cache is used for is storing values that need to be accessed multiple times for multiple calculations.
It might be misinformation from AMD, but it's not completely crazy, especially if you listen to Mark Cerny's diatribe in the second video.
Yes yes, Moore's Law is Dead, that super impartial oracle who deems the 3080 "underwhelming" despite it beating the pants off everything else in the market, and says RDNA2 will crush it despite zero evidence. Definitely a reliable source.

I don't know about you, but I prefer my leaks not to come from the bleeding, distended rectum of an idiot fanboy.
#87
Camm
AssimilatorYes yes, Moore's Law is Dead, that super impartial oracle who deems the 3080 "underwhelming" despite it beating the pants off everything else in the market, and says RDNA2 will crush it despite zero evidence. Definitely a reliable source.

I don't know about you, but I prefer my leaks not to come from the bleeding, distended rectum of an idiot fanboy.
He was pretty reliable about leaks on the 3080 and 3090.

Also, Ampere IS underwhelming.

1\ Performance scales directly in line with power usage from a 2080 Ti.
2\ It draws buckets of power
3\ It fails when it boosts over 2Ghz
4\ It was a paper launch
5\ And it isn't even value for money. You can't get any of these cards for MSRP except Founders Editions, and there are so few Founders Editions that it's essentially irrelevant.

The only redeeming factor of Ampere at this point is that they are the fastest cards around. But make no mistake, this is Fermi all over again.
#88
BoboOOZ
AssimilatorYes yes, Moore's Law is Dead, that super impartial oracle who deems the 3080 "underwhelming" despite it beating the pants off everything else in the market, and says RDNA2 will crush it despite zero evidence. Definitely a reliable source.

I don't know about you, but I prefer my leaks not to come from the bleeding, distended rectum of an idiot fanboy.
You speak like somebody who didn't watch the video. Anyway, I set the time scroll to the point where he discusses the memory management overhaul; that's the point there.

As to the quality of the leaks, I'm quite satisfied with the quality of the leaks and analyses Tom puts out; I knew the performance of the Nvidia cards from him six months ago, and it was right on the money all along.

As for the underwhelming aspect, I'm personally not underwhelmed; it's pretty much what I was expecting with the 3000 series. I agree about the hype though: people who believe this is the biggest generational leap are people who have only known GPUs for four years. It's a decent generation that appears huge only because the generation before it was mediocre and expensive.
#89
Assimilator
CammHe was pretty reliable about leaks on the 3080 and 3090.

Also, Ampere IS underwhelming.

1\ Performance scales directly in line with power usage from a 2080 Ti.
2\ It draws buckets of power
3\ It fails when it boosts over 2Ghz
4\ It was a paper launch
5\ And it isn't even value for money. You can't get any of these cards for MSRP except Founders Editions, and there are so few Founders Editions that it's essentially irrelevant.

The only redeeming factor of Ampere at this point is that they are the fastest cards around. But make no mistake, this is Fermi all over again.
1. Wrong: www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/31.html "Around 300 W is only 10% more power draw than the RTX 2080 Ti—while achieving +25% gaming performance."
2. Undisputed - but it puts that power to good use.
3. Anecdotal and uncorroborated. Could easily be PSUs that are not up to scratch.
4. No it wasn't. High demand is not a paper launch. Buy yourself a dictionary.
5. Supply and demand is NVIDIA's fault now?
#90
Valantar
Xex360While I agree with your logic, unfortunately it doesn't hold in all markets. I feel it's the fault of nVidia and its partners, given that comparable products don't have the same unjustified price hike, like products from Intel, AMD, and console/phone makers.
Well, I guess things change differently depending on your location. Here in the Nordics, prices have increased significantly across the board over the past decade. That's mostly due to the NOK/SEK to USD conversion rate, which made a big jump around 2015 or so, but as I said also due to knock-on effects from this. The same applies to prices in EUR though, as the same USD price jump can be seen there. This largely accounts for the change in practices where previously USD MSRP w/o tax ~= EU MSRP w/tax, simply because the EUR (and closely linked currencies) used to be worth more relative to the USD. That means that GPUs, consoles, phones, whatever - they've all become noticeably more expensive.
sergionographyReasonable enough, it's only speculation so nothing is concrete. I just personally think it's doable. Lets lay down some of the actual facts or published info that we can relate to:
RX5700 has the same number of cores as the ps5 (2304)
RX5700 has a TDP of 225W
RDNA2 is 50% more efficient than RDNA1
according to AMD

now we can do our speculation/analysis:
If we do the math we can safely assume that 2304 RDNA2 shaders are 50% more efficient than RDNA1 shaders, so that leaves us with 150w at the same performance as rx5700, but in this case; it is clocked around 10-15% higher so that could account for the 175w rather than 150w. Also having cpu cores increases overall package (but not by much since it is clocked low and relies on smart shift technology)

Other things to consider:
Clockspeed for the ps5 is the turbo speed
TSMC 7nm is more mature
AMD is more likely to clock chips higher and sell them as a higher tier product than to clock them low at the sweet spot of performance/watt (this is in relation to this article and my first comment)
That is of course possible, but remember that power increases superlinearly as clock speeds increase, so a 10-15% increase in clocks never results in a 10-15% increase in power draw - something more along the lines of 25-35% is far more likely. Which is part of why I'm skeptical of this. Sony's rated clocks are as you say peak boost clocks, but they have promised that the console will run at or near those clocks for the vast majority of use cases.

That means that you're running a slightly overclocked 4900H or HS (the consoles have the same reduced cache sizes as Renoir IIRC, so let's be generous and say they manage 3.5GHz all-core at 40W) and an overclocked 5700 within the SoC TDP. That leaves X minus 40W for the GPU. Your numbers then mean they would be able to run an overclocked 5700 equivalent at just 135W.

If this was approached through a wide-and-slow, more-CUs-but-lower-clocks approach (like the XSX), I would be inclined to agree with you that it would be possible given the promised efficiency improvements (up to 50%, though "up to" makes that a very loose promise) and node improvements. But for a chip of the same width with clocks pushed likely as high as they are able to get them? We have plenty of data to go on for AMD GPU implementations like that (5700 XT vs 5700, RX 590 vs 580, etc.), and what that data shows us is that power consumption makes a very significant jump to reach those higher clocks. And while SmartShift will of course help some with power balancing, it won't have that much to work with given the likely 40-50W power envelope of the CPU. Even lightly threaded games are unlikely to drop below 30W of CPU power consumption after all, so even that gives the GPU just 155W to work with.

You're also too optimistic in thinking that 50% perf/W increase is across the board in all cases. The wording was - very deliberately, as this was an investor call - up to 50%. That likely means a worst case vs best case scenario comparison, so something like a 5700 XT compared to the 6000-series equivalent of a 5600 XT. The PS5 GPU with its high clocks does not meet the criteria for being a best case scenario for efficiency. Of course that was stated a while ago, and they might have managed more than 50% best-case-scenario improvements, but that still doesn't mean we're likely to get 50% improvement when clocks are pushed high.
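The clock-versus-power point is easy to illustrate with the usual dynamic-power approximation P ∝ f·V², keeping in mind that higher clocks typically require more voltage. The assumption below that voltage rises by half the relative frequency increase is purely illustrative, not a measured characteristic of any chip:

```python
# Toy model of dynamic power scaling: P ∝ f * V^2. Pushing frequency up
# usually requires extra voltage too; here we assume (illustratively)
# that voltage rises by half the relative frequency increase.
def relative_power(freq_ratio: float, volt_per_freq: float = 0.5) -> float:
    """Relative dynamic power after scaling the clock by freq_ratio."""
    volt_ratio = 1 + (freq_ratio - 1) * volt_per_freq
    return freq_ratio * volt_ratio ** 2

# Under these assumptions, a 15% clock bump costs roughly 33% more power,
# in line with the 25-35% range mentioned above:
print(f"{relative_power(1.15):.2f}")  # 1.33
```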
Caring1All these rumours surrounding Navy Flounder sound fishy to me. :p
AquinusIt sounds like the Kobayashi Maru has set sail for the promise land.
Caring1I hope it doesn't hit a reef and flounder.
dragontamer5788Oh right puns and stuff. Uhhh... food, fish, bottom feeding joke or something.... oh right. I hope my joke doesn't fall flat on anyone.
AnymalFlounder:
verb (used without object) : to struggle with stumbling or plunging movements (usually followed by about, along, on, through, etc.)
CMO of AMD is on a slippery road
All of those fish code names just made me think of Dr. Seuss.
#91
Camm
Assimilator1. Wrong: www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/31.html "Around 300 W is only 10% more power draw than the RTX 2080 Ti—while achieving +25% gaming performance."
2. Undisputed - but it puts that power to good use.
3. Anecdotal and uncorroborated. Could easily be PSUs that are not up to scratch.
4. No it wasn't. High demand is not a paper launch. Buy yourself a dictionary.
5. Supply and demand is NVIDIA's fault now?
1\ If you are going to quote TechPowerUp, at least use the right graph of performance per watt: www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/35.html. It's 3% more efficient on a 3080, and exactly the same on a 3090, in performance per watt compared to a 2080 Ti.
2\ Linear is linear; it's the fastest GPU, but it's also the most power hungry. Honestly, I expect more from a new architecture + node drop.
3\ Fairly well corroborated between multiple users and now a few in the tech press.
4\ Pull the other one. The tech press knew weeks beforehand that there would be limited availability and reported as such. Lo and behold.
5\ It's pretty obvious that there is fuck all margin between what Nvidia set the MSRP as and what it sells to AIBs for. Hence only Nvidia's cards are at MSRP and even the cheapest AIB cards are over MSRP.

It's somewhat funny: you blasted in claiming fanbois everywhere, but man, aren't you the biggest one of all.
#92
laszlo
Assimilator4. No it wasn't. High demand is not a paper launch. Buy yourself a dictionary.
5. Supply and demand is NVIDIA's fault now?
Unfortunately these points are NV's fault, period; they rushed the launch without having proper stock. It's not the first time a company has done this and had it backfire. Even though they cancelled the bot/scalper orders there's still no availability... It looks like a paper launch mostly because on eBay you can buy paper drawings of the cards... and those are in stock...
#93
Valantar
Camm1\ If you are going to quote TechPowerUp, at least use the right graph of performance per watt: www.techpowerup.com/review/nvidia-geforce-rtx-3080-founders-edition/35.html. It's 3% more efficient on a 3080, and exactly the same on a 3090, in performance per watt compared to a 2080 Ti.
2\ Linear is linear; it's the fastest GPU, but it's also the most power hungry. Honestly, I expect more from a new architecture + node drop.
3\ Fairly well corroborated between multiple users and now a few in the tech press.
4\ Pull the other one. The tech press knew weeks beforehand that there would be limited availability and reported as such. Lo and behold.
5\ It's pretty obvious that there is fuck all margin between what Nvidia set the MSRP as and what it sells to AIBs for. Hence only Nvidia's cards are at MSRP and even the cheapest AIB cards are over MSRP.

It's somewhat funny: you blasted in claiming fanbois everywhere, but man, aren't you the biggest one of all.
1 and 2: Power usage varies by resolution, and many games are power limited at 1080p for these GPUs, making power draw numbers below 1440p inaccurate. Besides, TPU bases all of their efficiency graphs on one power measurement (!) at 4k, meaning in CPU limited scenarios the actual power usage is likely to be notably lower than what is reported - but we can't know by how much, as this wasn't measured. There is an improvement, but it's most visible at 4k, and at lower resolutions (even when not CPU limited) it's not particularly impressive.
3: We still don't know what is causing this or what is happening. Might be a serious issue, might be a tiny firmware fix, might be PEBCAK.
4: Not a paper launch, but very high demand, also (dramatically) inflated by scalpers using bots to vacuum up inventory.
5: Margins are absolutely razor thin, though given just how dense and expensive the FE PCB is, it should be entirely possible for AIB partners to make custom PCB cards at decent quality and sell them at MSRP - they just won't have as good components as the FE. This is a bit shitty from Nvidia's side, but it's better than them setting a precedent for MSRP+$100 as a baseline price like they did previously.
CammHe was pretty reliable about leaks on the 3080 and 3090.
BoboOOZAs to the quality of the leaks, I'm quite satisfied with the quality of the leaks and analyses Tom puts out
That is beside the point, and simply not how you validate your sources. That channel has time and time again presented rumors and garbage data as fact or "leaks", information that has since been shown to have been entirely wrong. That some of what is presented is accurate in no way makes up for this. As the saying goes, even a stopped clock is right twice a day. MLID is a fundamentally untrustworthy source of information, and anything and everything presented there should be viewed as very dubious. Some of it might turn out to be true, but it's impossible to know what until we have confirmation from somewhere else. Until we see a significant stretch of time in which all the data presented turns out to be factually accurate, none of what is presented can be trusted to be true.
laszloUnfortunately these points are NV's fault, period; they rushed the launch without having proper stock. It's not the first time a company has done this and had it backfire. Even though they cancelled the bot/scalper orders there's still no availability... It looks like a paper launch mostly because on eBay you can buy paper drawings of the cards... and those are in stock...
See the GN video above. Stock levels are according to partners the same as or higher than previous launches. Nvidia certainly isn't blameless for the current situation, but this is not a paper launch.
#94
BoboOOZ
ValantarThat is beside the point, and simply not how you validate your sources. That channel has time and time again presented rumors and garbage data as fact or "leaks", information that has since been shown to have been entirely wrong. That some of what is presented is accurate in no way makes up for this. As the saying goes, even a stopped clock is right twice a day. MLID is a fundamentally untrustworthy source of information, and anything and everything presented there should be viewed as very dubious. Some of it might turn out to be true, but it's impossible to know what until we have confirmation from somewhere else. Until we see a significant stretch of time in which all the data presented turns out to be factually accurate, none of what is presented can be trusted to be true.
I completely disagree with you there, but that's way off-topic. I'm pretty well versed in analyzing information, and on many points Tom gets out relevant information better than others and before others, and you can always use your BS filter if you have one. I like his analyses and I'm comfortable with my ability to extract valid information from them.
In any case, the part about the memory architecture overhaul is simply an extract from Cerny's PS5 talk, so it's basically open-source information, not a leak.
#95
Valantar
BoboOOZI completely disagree with you there, but that's way off-topic. I'm pretty well versed in analyzing information, and on many points Tom gets out relevant information better than others and before others, and you can always use your BS filter if you have one. I like his analyses and I'm comfortable with my ability to extract valid information from them.
In any case, the part about the memory architecture overhaul is simply an extract from Cerny's PS5 talk, so it's basically open-source information, not a leak.
Again, that really shouldn't be how you evaluate your sources. A source that has repeatedly presented inaccurate information without sufficiently underscoring the speculative nature of this or without then retracting that information immediately upon it being disproved is inherently untrustworthy. It then doesn't matter if their analyses are generally decent or whatever else they might do: the basis on which they perform their work can't be trusted, thus the work itself can't be trusted. That carries over even into cases where the data they are working from is widely known to be accurate, as it demonstrates an attitude towards thorough and proper treatment of data that is severely lacking. Your "BS filter" doesn't matter, as unless you are prescient (in which case, why the need for watching stuff like that?) you can't know which parts of the basis for the analysis are accurate or not. Not to mention that even the necessity of a "BS filter" when watching any type of analysis seriously underscores the low quality of the analysis. "It's good once you filter out the bad bits" ... well, yeah, but so is drinking sewage.
#96
Assimilator
ValantarAgain, that really shouldn't be how you evaluate your sources. A source that has repeatedly presented inaccurate information without sufficiently underscoring the speculative nature of this or without then retracting that information immediately upon it being disproved is inherently untrustworthy. It then doesn't matter if their analyses are generally decent or whatever else they might do: the basis on which they perform their work can't be trusted, thus the work itself can't be trusted. That carries over even into cases where the data they are working from is widely known to be accurate, as it demonstrates an attitude towards thorough and proper treatment of data that is severely lacking. Your "BS filter" doesn't matter, as unless you are prescient (in which case, why the need for watching stuff like that?) you can't know which parts of the basis for the analysis are accurate or not. Not to mention that even the necessity of a "BS filter" when watching any type of analysis seriously underscores the low quality of the analysis. "It's good once you filter out the bad bits" ... well, yeah, but so is drinking sewage.
The thing is, with trash channels like MLID it's not even about filtering good from bad, because there is no good: it's regurgitated from legitimate sources like Gamers Nexus in order to make MLID appear legitimate, and he then abuses that apparent legitimacy to peddle his half-baked bullshit. The result: people who aren't good at discerning trash from quality believe it all and fall back on "but he was right regarding XXX (that he copied from a real source), so he must be right on YYY (nonsense that he crapped out)".

Melding your lies with the mainstream's truth in order to make your lies appear truthful is the oldest trick in the book when it comes to manipulating discourse and public opinion (see: Russia and US elections), and unfortunately most people choose news sources based on whether that source agrees with their worldview, rather than how trustworthy said source is. They also have a penchant for doubling down and defending "their" news source when the credibility of said source is brought into question (instead of holding it accountable), or handwaving the source's inaccuracy away with excuses such as "everyone gets it wrong now and then". Except the dodgy sources get it wrong time and time again.

Make no mistake though, MLID is laughing all the way to the bank with every cent of ad revenue he gets from every chump who watches his reddit clickbait videos. Anyone who wants to reward a liar for his lies, that's your business - but don't expect me to do the same.
#97
Unregistered
AssimilatorYes yes, Moore's Law is Dead, that super impartial oracle who deems the 3080 "underwhelming" despite it beating the pants off everything else in the market, and says RDNA2 will crush it despite zero evidence. Definitely a reliable source.

I don't know about you, but I prefer my leaks not to come from the bleeding, distended rectum of an idiot fanboy.
Ampere is underwhelming because it's expensive; the xx80 should've been $600 and not come with only 10 GB of VRAM. nVidia misled people by comparing it to Turing, which was a disaster price-wise.
But I totally agree his analyses are messy and biased. I'm still waiting for updated PS5 hardware...
#98
BoboOOZ
Wow, you guys are so sincere in your opinions, it almost convinces me to reconsider my sources.
The only problem is, your posts reek of bias and ad hominem arguments. So thank you, but no thank you, I value Tom's leaks much more than your opinions, so we'll agree to disagree on this one.
Anyway, back on topic: there are some rumors that AMD has overhauled their memory architecture, but nobody's sure about it, although the people who gave this information also supplied RedGamingTech with real photos of the cards. We'll know more about it in a few weeks.
#99
Assimilator
BoboOOZWow, you guys are so sincere in your opinions, it almost convinces me to reconsider my sources.
The only problem is, your posts reek of bias and ad hominem arguments. So thank you, but no thank you, I value Tom's leaks much more than your opinions, so we'll agree to disagree on this one.
AssimilatorThey also have a penchant for doubling down and defending "their" news source
You don't have to be that predictable, y'know?
#100
Valantar
BoboOOZThe only problem is, your posts reek of bias and ad hominem arguments.
A sign of low quality debate: when a person who has a history of explicitly stating their support for one actor in a market and why (in my case: I prefer AMD due to a desire to support the underdog, plus my objections to Nvidia's history of shady business practices), and who has in fact never bought a product from the competing company, is nonetheless accused of being biased in favor of that competing company.

I am, of course, biased against MLID, as well as RedGamingTech and all those other low quality rumor-mongering clickbait YouTubers. I have so far seen zero reason to trust their content to be anything more than entertainment masquerading as news.

As for the ad hominems: really? Where? Seriously, please show some quotes. As far as I'm aware I haven't commented on you personally whatsoever. (And no, saying your approach to source criticism is poor is not an ad hominem.)