Friday, June 4th 2021

AMD Breaks 30% CPU Market Share in Steam Hardware Survey

Today, Valve updated its Steam Hardware Survey with the latest information about the market share of different processors. The Steam Hardware Survey is a very good indicator of market movements, as it polls users spread across the millions of gaming systems that use Valve's Steam gaming platform. As Valve processes the information, it reports it back to the public in the form of market share for different processors. Today, in the Steam Hardware Survey for May 2021, we got some interesting data to look at. Most notably, AMD has gained 0.65 percentage points of CPU market share, increasing it from the previous 29.48% to 30.13%. This represents a major milestone for the company, which hadn't held more than 30% CPU market share in the Steam Survey in years.

As the Steam Survey also tracks the market share of graphics cards, we got to see a slight change there as well. As far as GPUs go, AMD now holds 16.2% of the market, a decrease from the previous 16.3%. For more details about the Steam Hardware Survey for May 2021, please check out Steam's website here.
Source: Steam Hardware Survey

104 Comments on AMD Breaks 30% CPU Market Share in Steam Hardware Survey

#76
TheLostSwede
ColddeckedI thought SeaMicro tech is what infinity fabric was based on?
Nah, IF is based on HyperTransport.
Posted on Reply
#77
The red spirit
Panther_SeraphinHence why Samsung/AMD are trying to add things like ray tracing etc. into phones to make them "serious gaming" platforms. What I can see is that if the partnership plays out well, Nintendo may want in on that sort of chipset for the Switch upgrade/replacement.
"serious" phone gaming is already dead. The hardware is there, but devs just don't publish anything truly worthy with controls that aren't an ass. I still remember the era when EA released the latest NFS games, Rockstar released GTAs, Gameloft released Modern Combat, and there were games like Angry Birds, Where's My Water, etc. Those games actually made mobile gaming truly great. There were Riptide GP, the Nova series, Shadowgun 2, Dead Trigger, Gangstar Rio... So many great games that had console-like quality and actually pushed the boundaries of Android. I don't know a single game like that in the current era for which you would actually want to buy a phone just to play games. Right now it's Candy Crush making all the dough and maybe Genshin Impact, but what effectively ended phones as an actual gaming platform is that devs quit making proper games for them. It doesn't matter if nVidia or RTG puts ray tracing on phones; the phones are dead, because they lack proper expensive titles that don't end up being hacked the next day. Fortnite, CoD and PubesG don't have a proper story and aren't proper games; they are multiplayer-only games, therefore they aren't full games and nobody would actually want to pay for those. They are just simple cash grabs with nothing on the table. Mobile gaming is so sad today that not only would I not want to pay for any of them, but somebody would have to pay me and beg me to play them. I swear to god, I would have more fun with some water, mud and a stick than with any current mobile game.

Hardware itself matters so little. I still remember that some crazy man ported the whole of Civilization 5 (and it wasn't a cut-down version either; you could do anything that you could on the desktop version, but on a phone) to the Nokia 5230 running Symbian S60v5. There were some others like Asphalt 6, Toon Warz, some Daytona USA clone, a Riptide GP clone on that. And what did that Nokia actually have? A 400MHz single core chip, a resistive touch screen (no multitouch), 128MB RAM, 70MB storage. So it was a complete potato and an ass to develop for, as it needed some optimizations to make games work without multitouch. But anyway, it still got some technically impressive titles despite all that. And now we have all the power, all the specs and no proper titles, and that's why mobile gaming is pretty much dead aside from the occasional P2W title.
TheLostSwedeWell, they sold an ATI business unit (in 2009, not 2008 btw) that wasn't performing particularly well at the time. The Imageon BU was actually doing very poorly for ATI and if you look at the link below, you'll see that they had very few design wins. Having only 50 device implementations in seven years isn't exactly what you'd call a thriving business, nor is 250 million units across some 30 different product SKUs and six years. That's less than 5 million units per device, which is plain terrible for something like this, even more so around that time, when phones were pretty much free if you got a contract and a lot of people had 2-3 devices. Some products, like the Tapwave Zodiac sold less than 200k units and it had a custom made GPU from ATI.
en.wikipedia.org/wiki/Imageon
In fact, it wasn't until 2008 that they had an SoC, rather than a discrete mobile graphics chip. So it's actually impossible to say what could've been, considering that they only had a single SKU with two different sub SKUs that was a complete solution that would've been suitable for phones.


And we know this how? If AMD hadn't sold what they saw as a costly ATI BU with very few customers, then maybe they would just have shuttered it altogether.
Did they sell it too cheap? Most likely, but the world wasn't going crazy for mergers back then as it has been doing the past few years.
Also, who's to say Qualcomm wouldn't have made their own GPU or continued to license technology from AMD?

It's fun to speculate, but that's all this is, speculation.
Well with that AMD sure sold their chance, even if it was small, and what they sold to Qualcomm eventually became Adreno, and today it's the GPU in the mobile space. It's crazy good and it sells very well. Even if Qualcomm had actually made their own GPU from zero, it might have taken them too long and they would have ended up like Mali. nVidia wasn't too bad with their Tegra stuff, but they ran way too hot, weren't exactly good for battery life and were pretty much gone after the X1. Had they actually developed something proper, they could have succeeded. Anyway, as a user of Tegra 3, which ran at 90C while running GTA Vice City, I surely didn't enjoy having my hands fried. At least nV made Tegra 3 chips, which were for a while the chips to get. But yeah, considering how "well" AMD was doing back then, maybe it's good that they didn't have mobile chips to care about. The FX and Phenom II era was the worst malaise era AMD ever had, and if not for Ryzen and console sales, they would surely have been bankrupt by now.
Posted on Reply
#78
PerfectWave
Seriously, LOL: "Steam Hardware Survey is a very good indicator of market movements". How can you write something like that for real?
Posted on Reply
#79
TheLostSwede
The red spiritWell with that AMD sure sold their chance, even if it was small, and what they sold to Qualcomm eventually became Adreno, and today it's the GPU in the mobile space. It's crazy good and it sells very well. Even if Qualcomm had actually made their own GPU from zero, it might have taken them too long and they would have ended up like Mali. nVidia wasn't too bad with their Tegra stuff, but they ran way too hot, weren't exactly good for battery life and were pretty much gone after the X1. Had they actually developed something proper, they could have succeeded. Anyway, as a user of Tegra 3, which ran at 90C while running GTA Vice City, I surely didn't enjoy having my hands fried. At least nV made Tegra 3 chips, which were for a while the chips to get. But yeah, considering how "well" AMD was doing back then, maybe it's good that they didn't have mobile chips to care about. The FX and Phenom II era was the worst malaise era AMD ever had, and if not for Ryzen and console sales, they would surely have been bankrupt by now.
Crazy good?
Not sure I agree with that. See the link for Qualcomm's latest top-of-the-range GPU performance. Make sure you read the comment below the benchmarks, as you'll see that the device in question is a lot like your old Tegra 3 device.
www.anandtech.com/show/16447/the-xiaomi-mi-11-review/3

Qualcomm is only better than the rest of the competition that runs Android.
ARM hasn't managed to come up with a really good GPU yet, and we've yet to see a full implementation from Imagination Technologies, as for some reason their GPUs only end up in low-end implementations.
Apple is rather surprisingly the market leader, after making their own GPU from scratch (or at least so they claim).

AMD would've been bankrupt now if it wasn't for the fact that they bought ATI.
Posted on Reply
#80
The red spirit
TheLostSwedeCrazy good?
Not sure I agree with that. See the link for Qualcomm's latest top-of-the-range GPU performance. Make sure you read the comment below the benchmarks, as you'll see that the device in question is a lot like your old Tegra 3 device.
www.anandtech.com/show/16447/the-xiaomi-mi-11-review/3
It's not even close to the 90s, it's only a mild 50s. By 90, I meant 90C hot, and it never errored out. It would likely melt first and then error out.

Anyway, the Nexus 7 was such garbage that without the marketing efforts at the time it would never be talked about. It had a shit ton of other problems. Most of them were just shoddy manufacturing and shoddy software support. Looking back at those years, if I could go back to those times I would never ever buy a Nexus 7 again. It's essentially the British Leyland of tablets.
TheLostSwedeQualcomm is only better than the rest of the competition that runs Android.
As if they are selling to someone else. They are clearly dominating, and if you want the best chips, it's Snapdragon only nowadays. The best midrange chip is a Snapdragon too.
TheLostSwedeAMD would've been bankrupt now if it wasn't for the fact that they bought ATI.
Who knows. RTG alone isn't great, it's the consoles and APUs that are their biggest success.
Posted on Reply
#81
TheLostSwede
The red spiritWho knows. RTG alone isn't great, it's the consoles and APUs that are their biggest success.
A business they only have because they bought ATI...
Posted on Reply
#82
Max(IT)
With better pricing, the situation would be even better…

Speaking about GPUs, AMD's offering is not so good and availability is even worse than Nvidia's, so the result is hardly a surprise.
TheDeeGeeIt did work when it didn't WHEA crash after an hour or so; it got to a good 70-75C during gaming, while my 11700 only reaches 50-55C.

Anyways, the motherboard was an Asus ROG Strix B550-E Gaming (piece of shit as well); tried various BIOS versions at the time, but it kept WHEAing. Also bought different memory as the HyperX Fury kit threw memtest errors all over the place, so I bought a G.Skill 3200 MHz kit instead. But even at non-XMP settings the system would WHEA crash.

The WHEA crashes said "cache hierarchy error" all the time. People told me that's simply a bad CPU, so I sent everything back to the store and they refunded everything.

This was back in January this year. So I thought everything over until last week: do I want to try AMD again? Or go with what I'm familiar with? So I ended up getting an 11700 to upgrade my 4770K and I haven't regretted it.

During my thought process I also figured out I don't need the best of the best anymore to play games.
70° for a 5800X is quite normal and nothing strange there.
50° on a 10700 means you are running it at the ridiculous Intel 65W power limit, which means performance is way lower than a 5800X's.
An unlocked 10700 (I'm running one in my son's PC) is not different from a 5800X.
KainXS5800X in particular runs hot for me, but 5900X and 5600X are great CPUs. It's to be expected that AMD is coming up, with how Intel is doing; they have flat-out better CPUs right now and they listen to the community. I'm surprised it's not higher tbh, but given the times, upgrading your computer might not be a good idea for people, with all the part shortages and scalping.
In my house now we have 3 computers, with 10700, 5800X and 5900X, so I can compare them.
5800X is quite hot, but not unreasonably so. Thermal density on that tiny 8-core chiplet is the reason, but it still runs below 80° in stress tests and around 72/73° maximum while gaming in the most demanding games.
10700 has lower temperatures only if you keep it at the ridiculous Intel long power limit of 65W, which can destroy performance. I unlocked it and it reaches 75° in stress tests, with much higher power consumption than the Ryzen.
5900X is the best in this matter: under stress tests it stays around 70° with PBO active. But it has strange behavior during some gaming, where the single core clock sometimes reaches almost 5 GHz, so there are thermal spikes around 74° (average temperature during gaming is well below 70°).
Posted on Reply
#83
Fouquin
TheLostSwedeSorry, but that's a drop in the ocean. In 2019 AMD shipped 79 million GPUs.
The real figure is higher than the extremely conservative 1% but the real figure also isn't publicly disclosed and I'm not about to spew that on a tech forum. Scale that figure up a couple times to be closer to the real figure. Keep in mind the kinds of chips Apple is buying are not those buckets of budget OEM bins that inflate AMD's reported shipping volume. They're taking up volume of the most expensive, largest dies first and then getting special consideration of chips like Navi 12 built to order. The statement stands on its own that they ship 'a lot' of chips to Apple.
Posted on Reply
#84
TheLostSwede
FouquinThe real figure is higher than the extremely conservative 1% but the real figure also isn't publicly disclosed and I'm not about to spew that on a tech forum. Scale that figure up a couple times to be closer to the real figure. Keep in mind the kinds of chips Apple is buying are not those buckets of budget OEM bins that inflate AMD's reported shipping volume. They're taking up volume of the most expensive, largest dies first and then getting special consideration of chips like Navi 12 built to order. The statement stands on its own that they ship 'a lot' of chips to Apple.
I'll admit that I overlooked the iMac, and as we're talking low-end to mainstream parts here, I guess Apple could do a few million units, but I still doubt they're anywhere close to being AMD's biggest GPU customer, or ever was, though that might change soon.
It's obviously nice to have steady business from a customer like Apple, but as with consoles, you're not looking at much margin in this kind of business.
What it likely does though is make the books look nice and it gives AMD a better negotiating position with everything from foundry partners to material makers, as they have more volume than the next guy, which means steady business for everyone else too.
Posted on Reply
#85
TheoneandonlyMrK
TheLostSwedeNah, IF is based on HyperTransport.
I think you're right on the hardware side, but some of SeaMicro's PCIe-based protocols were further developed and used. Could have it wrong though, I wouldn't bet on it.
Posted on Reply
#86
The red spirit
TheLostSwedeA business they only have because they bought ATI...
Well yes, but RTG itself hasn't truly made a graphics card so good that people would want it in quantities that they couldn't reasonably supply. APUs were only a success because they had any integrated graphics at all, so lots of those ended up in prebuilts. And with consoles AMD lucked out, because MS and Sony could have just asked nVidia to do that job. AMD has been incredibly lucky and it really wasn't because RTG was so good. No, RTG is just good enough to not be terrible, but it was AMD who put them on the map. AMD could have bought S3 if they wanted to and they might not have sucked either. S3 actually made some pretty okay entry-level (Delta)Chrome cards in the early 2000s, and their integrated GPUs in the 2010s were acceptable too. And that was just S3 alone. And AMD at the time overpaid a lot for ATi, so at the time their purchase was a huge expense and took a lot of time to pay off. There was also XGI with their rather nice Volari Duo V8 card. I have read the book Slingshot, written by Hector Ruiz, and there it was stated that AMD also needed an in-house chipset maker and they expected ATi to do that too. If I'm not stupid, they could have bought S3, XGI and SiS for the price of ATi and have got all they needed (chipsets, mobile shit, integrated graphics, beefy graphics), and some years later they could also have bought Matrox, where they would have got some expertise in how to make pro cards (which weren't that great from ATi alone; the hardware was there, but they sucked at feature and technology support, their driver quality was poop and they weren't competitive with nVidia). The only thing that pushed them to get ATi was likely to avoid nVidia purchasing them, as that would have been bad for them.
Posted on Reply
#87
TheLostSwede
The red spiritAnd with consoles AMD lucked out, because MS and Sony could have just asked nVidia to do that job.
Could they though? At the time, Nvidia didn't exactly have a CPU solution that would've been powerful enough to power the kind of consoles that both companies built around AMD's hardware. It's also a matter of cost and what we don't know here is if Nvidia would've offered a competitive price.
The red spiritAMD could have bought S3 if they wanted to and they might not have sucked either. S3 actually made some pretty okay entry-level (Delta)Chrome cards in the early 2000s, and their integrated GPUs in the 2010s were acceptable too. And that was just S3 alone. And AMD at the time overpaid a lot for ATi, so at the time their purchase was a huge expense and took a lot of time to pay off. There was also XGI with their rather nice Volari Duo V8 card. I have read the book Slingshot, written by Hector Ruiz, and there it was stated that AMD also needed an in-house chipset maker and they expected ATi to do that too. If I'm not stupid, they could have bought S3, XGI and SiS for the price of ATi and have got all they needed (chipsets, mobile shit, integrated graphics, beefy graphics), and some years later they could also have bought Matrox, where they would have got some expertise in how to make pro cards (which weren't that great from ATi alone; the hardware was there, but they sucked at feature and technology support, their driver quality was poop and they weren't competitive with nVidia). The only thing that pushed them to get ATi was likely to avoid nVidia purchasing them, as that would have been bad for them.
Ok, I think you're not quite on the right page here, or haven't really followed the industry for long enough.

S3 was bought by VIA in 2001, so clearly that wasn't on the table, as AMD only bought ATI in 2006. S3 is owned by HTC these days, for some silly reasons that I could explain, but it has mainly to do with the Taiwanese staff bonus system and some graphics patents and a lawsuit.
XGI was spun out of SiS and Trident, which had been making graphics chips since the late 80's in Trident's case.
Could AMD have bought XGI? Maybe, but unlikely as they were still part of SiS.
Could AMD have bought SiS? Possibly, but it's much harder to buy and integrate two culturally very different companies.
Could AMD have bought VIA? Possibly, but same as above, although I guess that wouldn't have been allowed, as VIA owns Centaur Technology which has an x86 license, so...

ATI was after all Canadian and would've been much easier culture wise to integrate with, vs. any Asian company.
After well over a decade of living in Taiwan, I still don't understand how or why things are the way they are here, as there are so many things that aren't done in a sensible and logical way, and that applies to everything from carpentry to marketing to product development and business management.
It's not all about money and it's also harder to buy a company in a different part of the world, due to vastly different regulation.

As for Matrox, they're still an independent company based in Canada and I doubt they would've been interested in an offer from AMD.

Also, I presume you've never used a graphics product from S3 or XGI, even less so SiS, if you think ATI was bad. Man, you have no idea...

But I'm glad you're confident AMD could've done better elsewhere, yet know so little about the industry and what has been going on. Some of us have actually worked in the industry and tested all these products and more that you most likely have never heard of. Did you know Micron Technology bought a rather decent graphics chip maker called Rendition back in 1998?
If we're going to speculate wildly here, AMD could've bought Imagination Technologies and gotten their PowerVR technology, or simply licensed their graphics technology, as Apple used to.
ATI did also buy a handful of different graphics chip makers, such as ArtX (they made the graphics chip for the GameCube and worked on integrated graphics with ALi, a Taiwanese chipset maker), BitBoys (out of Finland, who claimed to have some revolutionary stuff that never made a real appearance), Tseng Labs (which was Trident's main competitor) and some other smaller players.

Regardless, at the time that AMD bought ATI, there weren't many players left to choose from. I guess they could've bought 3Dlabs from Creative Labs, but maybe that deal was never on the table. Beyond that, I don't know of anyone else that was still in business around that time so...

Anyhow, the TL;DR is this: AMD didn't really have any other real choice than ATI, regardless of what you've read or think.
Posted on Reply
#88
TheinsanegamerN
medi01Just another reminder of how utterly inadequate Steam hardware review is.
Just another reminder: If Steam is so bad, feel free to find more reliable data from another source.
lasThis is good news, but yeah, no Radeon 6000 series at all is very weird ... Especially because Ampere is better for mining. Ampere has been present on the Steam HW Survey for months and months (since last year; all models are now present, except the 3080 Ti obviously)

Can TSMC deliver at all? Or what is wrong ... I guess Nvidia went with Samsung for this exact reason. TSMC is simply too busy fulfilling orders from a lot of different brands; Apple always gets priority and this won't change, and maybe they are also prioritizing console APUs over desktop chips, who knows

AMD missed a golden opportunity tho - it looks like Nvidia shipped 100:1 Ampere vs RDNA2
ChaitanyaUp until advent of M1 from Apple, AMD was supplying a lot of GPUs to Apple. Also I think right now AMD might be prioritising console GPUs over desktop GPUs.
TheLostSwedeConsoles don't have discrete GPUs...
Imsochobothey still consume a lot of wafers, which are the same wafers, and it all goes under the same financial group as graphics, so CPUs act more separately from semi-custom/graphics etc.
All of their Ryzen 3000s, 5000s, RX 6000s, PS5s, Xbones, etc. are made on the same 7nm process on the same allocations. AMD is likely prioritizing OEM shipments first; second comes their console contracts and, until recently, Apple GPUs. Leftover stock is what is used for DIY CPUs and GPUs, and if they have restrictive supply, they will prioritize the higher-margin CPUs.

Whatever flaws Samsung may have, Nvidia using them for the Ampere line turned into a major blessing for Nvidia.
Posted on Reply
#89
The red spirit
TheLostSwedeCould they though? At the time, Nvidia didn't exactly have a CPU solution that would've been powerful enough to power the kind of consoles that both companies built around AMD's hardware. It's also a matter of cost and what we don't know here is if Nvidia would've offered a competitive price.
They could have used any CPU. Considering how slow AMD's Jaguar cores were, it's possible that even VIA could have built a CPU good enough for the XOne.
TheLostSwedeOk, I think you're not quite on the right page here, or haven't really followed the industry for long enough.

S3 was bought by VIA in 2001, so clearly that wasn't on the table, as AMD only bought ATI in 2006. S3 is owned by HTC these days, for some silly reasons that I could explain, but it has mainly to do with the Taiwanese staff bonus system and some graphics patents and a lawsuit.
AMD did some things with VIA; after they acquired Cyrix, they made some Geodes and later sold that division to VIA. They weren't on bad terms, and VIA should have been quite cheap to acquire knowing how irrelevant they were in 2008.
TheLostSwedeCould AMD have bought VIA? Possibly, but same as above, although I guess that wouldn't have been allowed, as VIA owns Centaur Technology which has an x86 license, so...
Why is that a problem?
TheLostSwedeATI was after all Canadian and would've been much easier culture wise to integrate with, vs. any Asian company.
After well over a decade of living in Taiwan, I still don't understand how or why things are the way they are here, as there are so many things that aren't done in a sensible and logical way, and that applies to everything from carpentry to marketing to product development and business management.
It's not all about money and it's also harder to buy a company in a different part of the world, due to vastly different regulation.
From what I read in Slingshot and on the net, it seems that the ATi acquisition didn't go well; AMD was still separated from ATi for a long time and they were opposing each other.
TheLostSwedeAs for Matrox, they're still an independent company based in Canada and I doubt they would've been interested in an offer from AMD.
No, they use AMD cores and they are essentially dead. At least their graphics card business is completely dead.
TheLostSwedeAlso, I presume you've never used a graphics product from S3 or XGI, even less so SiS, if you think ATI was bad. Man, you have no idea...
I only said that AMD Pro drivers suck from my limited experience with the V3750. That thing BSODs the computer if it detects any video on a website. And from reviews I read, none of the FirePros supported things in those industries that clients cared about, and basically you should have just bought nVidia instead as ATi was useless. I'm no genius, but that sounds terrible.
TheLostSwedeBut I'm glad you're confident AMD could've done better elsewhere, yet know so little about the industry and what has been going on. Some of us have actually worked in the industry and tested all these products and more that you most likely have never heard of. Did you know Micron Technology bought a rather decent graphics chip maker called Rendition back in 1998?
I heard of them both, but didn't know about acquisition.
TheLostSwedeIf we're going to speculate wildly here, AMD could've bought Imagination Technologies and gotten their PowerVR technology, or simply licensed their graphics technology, as Apple used to.
I heard that Kyro cards were pretty decent too and had decent drivers.
TheLostSwedeATI did also buy a handful of different graphics chip makers, such as ArtX (they made the graphics chip for the GameCube and worked on integrated graphics with ALi, a Taiwanese chipset maker), BitBoys (out of Finland that claimed to have some revolutionary stuff that never made a real appearance), Tseng Labs (which was Tridents main competitor) and some other smaller players.
But what happened to 3DLabs?
TheLostSwedeRegardless, at the time that AMD bought ATI, there weren't many players left to chose from. I guess they could've bought 3Dlabs from Creative Labs, but maybe that deal was never on the table. Beyond that, I don't know of anyone else that was still in business around that time so...

Anyhow, the TLDR; is this, AMD didn't really have any other real choice than ATI, regardless of what you've read or think.
Eh, who knows, maybe they didn't have a better choice or maybe they did. It seems that AMD in that era were stressed out and barely had an idea what to do. For some reason they entered the malaise era of Phenom, then Phenom II and then FX, and it took Ryzen to truly get them back on the map. They still failed to sell graphics cards as well as they could. Let's be honest, the 7950 was great, but it didn't sell as well as the GTX 680. Only the RX 480 was truly something in sales. The 4870 was great too, but it was still more ATi than AMD. So it was likely already planned and pretty much thought out regardless of the acquisition.
Posted on Reply
#90
Jism
Why_Mestore.steampowered.com/hwsurvey/videocard/

From another link:

'AMD has failed to make similar headway in the graphics market, however, and the survey results showed that its share fell to 16.18% in May. That isn’t a drastic drop—AMD graphics cards have powered roughly 16% of survey respondents’ systems since late 2019—but it does highlight the company’s struggle in that segment.'
Who cares. The 16% isn't representative at all because AMD holds the console market as well, and there are more platforms than Steam alone really.
Posted on Reply
#91
bencrutz
ColddeckedI thought SeaMicro tech is what infinity fabric was based on?
IIRC, SeaMicro's Freedom Fabric was running on a PCIe x16 backplane connecting a lot of Intel Atom cards.
Posted on Reply
#92
mechtech
Why_Mestore.steampowered.com/hwsurvey/videocard/

From another link:

'AMD has failed to make similar headway in the graphics market, however, and the survey results showed that its share fell to 16.18% in May. That isn’t a drastic drop—AMD graphics cards have powered roughly 16% of survey respondents’ systems since late 2019—but it does highlight the company’s struggle in that segment.'
It highlights TSMC's struggle to keep up with worldwide orders from worldwide companies for sub-14nm process nodes, and not enough companies using Samsung or other foundries...
Posted on Reply
#93
Dave65
Panther_Seraphin3 years later they would have been the only entry into the market.

Hence why Samsung/AMD are trying to add things like ray tracing etc. into phones to make them "serious gaming" platforms. What I can see is that if the partnership plays out well, Nintendo may want in on that sort of chipset for the Switch upgrade/replacement.
I have a Samsung S21 Ultra and I play zero games on it. I doubt ray tracing would push me to start now...
Posted on Reply
#94
TheLostSwede
@The red spirit You really need to read up a lot more, as you are clearly missing a lot of things that are involved when companies buy each other, especially with regards to government regulation, anti-competitive rules etc.
Also, some nations simply don't allow foreign takeover of local companies, or larger than a certain percentage of foreign ownership.

Oh and Geode was National Semiconductor and Cyrix, AMD got involved three years later.
mechtechIt highlights TSMC's struggle to keep up with worldwide orders from worldwide companies for sub-14nm process nodes, and not enough companies using Samsung or other foundries...
I guess that's what happens when you're the best at what you're doing and everyone wants to make their products with you.
Posted on Reply
#95
Midland Dog
lasThis is good news, but yeah, no Radeon 6000 series at all is very weird ... Especially because Ampere is better for mining. Ampere has been present on the Steam HW Survey for months and months (since last year; all models are now present, except the 3080 Ti obviously)

Can TSMC deliver at all? Or what is wrong ... I guess Nvidia went with Samsung for this exact reason. TSMC is simply too busy fulfilling orders from a lot of different brands; Apple always gets priority and this won't change, and maybe they are also prioritizing console APUs over desktop chips, who knows

AMD missed a golden opportunity tho - it looks like Nvidia shipped 100:1 Ampere vs RDNA2
People will say that doesn't matter, but with the current pricing, that's a whole lot of money RDNA2 didn't make.
Posted on Reply
#96
mechtech
TheLostSwede@The red spirit You really need to read up a lot more, as you are clearly missing a lot of things that are involved when companies buy each other, especially with regards to government regulation, anti-competitive rules etc.
Also, some nations simply don't allow foreign takeover of local companies, or larger than a certain percentage of foreign ownership.

Oh and Geode was National Semiconductor and Cyrix, AMD got involved three years later.


I guess that's what happens when you're the best at what you're doing and everyone wants to make their products with you.
And/or no other options......................
Posted on Reply
#97
TheLostSwede
mechtechAnd/or no other options......................
Well, there are other options, just not as cutting edge...
Posted on Reply
#98
demian_vi
TheLostSwedeSays who? Do you actually know this, or are you making assumptions?
Also, do you know that the console SoCs are coming out of AMD's allocation at TSMC, or do Sony and Microsoft have their own allocation with TSMC?
The media is reporting it as AMD allocation, but since we don't have any business insight here and neither do they, we don't know how it works.
And why would AMD focus on Apple? They use a bunch of old hardware and there's no shortage on those nodes as such.
Sorry, but unless you have some actual facts here, this is just assumptions that you guys are making.

The point is, this is a Steam survey about PC hardware, it has nothing to do with consoles.


Define a lot. Those are all fairly low-volume products, with the possible exception being the MacBook Pro.
Everyone here is just assuming; no one actually knows what happens behind closed doors at AMD, TSMC, etc.
I don't think anyone has ever posted proof of what they are saying.
Posted on Reply
#99
HenrySomeone
TheDeeGeeIt did work when it didn't WHEA crash after an hour or so; it got to a good 70-75C during gaming, while my 11700 only reaches 50-55C.

Anyways, the motherboard was an Asus ROG Strix B550-E Gaming (piece of shit as well); tried various BIOS versions at the time, but it kept WHEAing. Also bought different memory as the HyperX Fury kit threw memtest errors all over the place, so I bought a G.Skill 3200 MHz kit instead. But even at non-XMP settings the system would WHEA crash.

The WHEA crashes said "cache hierarchy error" all the time. People told me that's simply a bad CPU, so I sent everything back to the store and they refunded everything.

This was back in January this year. So I thought everything over until last week: do I want to try AMD again? Or go with what I'm familiar with? So I ended up getting an 11700 to upgrade my 4770K and I haven't regretted it.

During my thought process I also figured out I don't need the best of the best anymore to play games.
An 11700 at current prices is a very good choice, but maybe it would be even better to wait a couple more months for Alder Lake, considering you already lasted 8 years with your 4770K...
Posted on Reply