
Microsoft Partners with AMD for Next-gen Xbox Hardware

Nvidia made an amazing chip for the PS3, and the Cell was amazing too, just hard to program. When The Last of Us came out on the PS3, it showed what good engineering could do on that hardware. Ever since then, everything has been AMD.
Yeah, well, that was the last time chips that weren't budget or ancient were used. IIRC the PS3 was very expensive to manufacture, but of course it sold very well, and they recouped the cost through volume, games, controllers, etc.

Of course, any manufacturer could do the same thing with "modern" consoles, but profit from day one comes first.
 
The original PS3 almost bankrupted Sony due to its extremely high specs at the time of release and the fact that they had to sell each unit at a loss. They were losing more than $300 on each 60 GB unit sold, and they quickly got to work butchering the console and simplifying its hardware. They removed almost everything that made the PS3 special: hardware-assisted backwards compatibility, USB ports, even the NAND storage the firmware used to live on was replaced by a simple NOR chip that contained just the bootloader. They simplified and cheapened the Blu-ray disc drive; so many things. If you look at the original CECHA model and the final super slim (CECH-4200) models, even the plastic the console's shell is made of is disgustingly cheap. That fancy motorized, single-lens disc drive with a DSD decoder chip the original had was replaced by the cheapest, trashiest toploader you could ever imagine... it's disgusting, but that's how the PS3 was made profitable. Every single new revision of the PS3, starting from the European launch models (CECHC, which already lacked the PS2 EE CPU), was butchered and had some part of it removed, cheapened, or simplified.

Despite my love for the original PS3 and PS2 models, their butchering over time was my wake-up call to become a PC diehard.
 

It was a lot easier to subsidize back in '05/'06: games cost about a tenth of what they do now to develop and weren't significantly cheaper at retail. Even Nintendo, whose development budgets are probably substantially lower, does not subsidize its console and charges an industry-first $80 for its games, well, the first since the SNES/N64 days, lol.

I mean, in 2025 money, a $50 game in 2005 was about $80 once you factor in inflation, again with substantially lower development costs.

We will never see $1,000 worth of hardware subsidized down to $599/$699 again.
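Rough math on the inflation point, as a quick sketch; the roughly 2.4% average annual US inflation rate here is an assumed ballpark for illustration, not an official CPI figure:

```python
# Back-of-the-envelope inflation check. The 2.4% average annual rate is an
# assumption for illustration, not an official CPI figure.
annual_rate = 0.024
years = 2025 - 2005
price_2005 = 50.0

price_in_2025_dollars = price_2005 * (1 + annual_rate) ** years
print(f"${price_2005:.0f} in 2005 is roughly ${price_in_2025_dollars:.0f} in 2025 dollars")
# -> roughly $80, which is about where new AAA game prices have landed
```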
 
I was thinking this time they could try Nvidia
This console generation was already very expensive; the Series X is very overpriced and costs more than the PS5 in most places. Putting an Nvidia chip in there would make things a lot worse.
 

This... logic about subsidies and ballooning development costs is something I have a problem with.

The cost of everything is going up... but the cost of game development should not be, and arguably isn't, for the same reason that chicken is not a high-cost meat and the same reason OPEC exists.


That all sounds crazy, but hear me out. You're AMD, and you are looking at consoles. You look at this as an overhead situation, where you can plow the entire overhead of new chip design into console chip sales... because 20 million units at a $2 net profit is $40 million cleared. The games industry has been experiencing this for years... and yet somehow mega-publishers have managed to build dozens of layers of non-value-added structure into their organizations and remain stupidly profitable. Imagine you as a developer add value with code, and you have a boss. That boss reports to a section head. That section head reports to a division head. That division head reports to the CEO... and you have four non-value-add people. That would suck if they made the same amount as a code monkey... but CEOs make literally hundreds of times the salary.
Combine the overhead with the prevalence of tools and the ability to telework. No need for a studio in Los Angeles. You can create a little enclave of code monkeys in a C-tier city where a 50k salary is livable, instead of a place where even the janitor needs 200k a year to make ends meet. Likewise, you buy a relatively cheap license to Unreal and have an engine that people already know, so you don't have to do a lot of teaching, driving costs down. There's inflation for everybody, though... so that is a real concern if you aren't selling 20 million copies of what you used to sell 2 million of.
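To put rough numbers on the volume argument (these are made-up illustrative figures, not AMD's actual design costs or margins):

```python
# Illustrative sketch of the semi-custom console chip economics described above.
# All figures are hypothetical placeholders.
design_overhead = 250_000_000       # assumed one-time chip-design cost, $
units           = 20_000_000        # assumed lifetime console volume
net_profit_per_unit = 2.00          # the $2/unit net profit from the post

overhead_per_unit = design_overhead / units
print(f"Design overhead amortized per chip: ${overhead_per_unit:.2f}")
print(f"Profit cleared on top: ${units * net_profit_per_unit:,.0f}")
# -> $12.50 of design overhead baked into each chip, and $40,000,000 cleared on top
```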


Those paying attention can see that inflation and overhead are the only two things truly driving cost... and Clair Obscur: Expedition 33 demonstrates that pretty remarkably. Excellent game, financial success, not costing $80, and it does all of this simply by ditching dead-weight management... which somehow magically makes the game's price work. Hmmm. I think I hit the nail on the head: the argument is really about overhead costs being silly high. I think, just maybe, that this is the actual reason MS is divesting from Xbox: they know their money lies in services and in releasing hardware where they can drive pricing through huge volumes... and it's maybe why they are partnering with AMD to release a platform that will push their profit much further than trying to prop up the bloated and dying AAA beast.



So we can end on a joke: Tom Petty knew all this years ago. He nearly named an album after the price he thought it should sell for, because the executives wanted to charge a dollar more than most records, thinking they could get people to pay the premium. Despite this history, the games industry executives are repeating all the crappiness of the music and TV industries. Surprise... if you do the same thing, you get the same result. Apparently too much Bolivian marching powder up the nose rots the brain and prevents basic pattern recognition.
 

Yeah, that's a whole separate issue, but even Sony, the exclusives king for a long time, is struggling to get games out the door anymore, with each generation getting much more expensive on top of games having much longer development times. Other than Insomniac, their studios have been slow as molasses; five years in, we still haven't seen a PS5-exclusive God of War or Naughty Dog game, which is wild to me.
 
I'm hoping AMD and Microsoft bring CPU performance up to par; that's been the bottleneck in games since the previous gen.
RDNA 5, 12 Zen 6 cores, a minimum of 24 GB of VRAM, and more mature upscaling should give next-gen consoles plenty of hardware and software headroom.
Any of AMD's CCD/IO-die-based mobile solutions, which are essentially soldered, power-limited desktop chips, have worse battery life than even Intel ARL, let alone LNL. Their actually efficient APUs are monolithic, like the desktop APUs.
Not sure why you would ever care much about the battery life of monster gaming laptops with chiplet-based CPUs and 330-400 W power bricks, let alone compare such a CPU to Lunar Lake... Pure nonsense. As the review below shows, there are more than a dozen categories to care about and assess such laptops against. The new Schenker with an X3D CPU is overall ahead of five other monster gaming laptops with Arrow Lake in this chart, all with a 5090. I can see people shouting: "But, but, I want to game on battery in a park or on a plane." Fine, take the 3-kilogram monster to a park and enjoy 15 extra minutes of Cyberpunk. Let's leave it there...

This console generation was already very expensive; the Series X is very overpriced and costs more than the PS5 in most places. Putting an Nvidia chip in there would make things a lot worse.
There is no doubt about this. We will live to see the cost of laptops with the N1X; they will be more expensive than the already outrageous initial prices of Qualcomm laptops last year. According to PassMark, Qualcomm built a 0.5% laptop share with the X Elite, only for it to drop to 0.3% in recent months.
 
It's worth noting here that you've picked the single TPU-tested card from the RDNA4 lineup capable of that efficiency, across both models and samples tested. The 9070 XT and 9060 XT are both considerably worse. I'd wager that if a 5070 Ti (the closest power budget to the 9070 XT) or even a 5080 were limited to 233 W to match the PowerColor Hellhound 9070, it would easily retain an efficiency advantage.

For example, I wouldn't claim Ampere was massively more efficient than RDNA2 just because the A2000 exists. It just shows that virtually all cards are pushed beyond their efficiency sweet spot, perhaps by quite a lot, and that significantly power-limiting a given GPU boosts its efficiency considerably.
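As a rough way to frame the sweet-spot point, here is a toy sketch; the ~300 W stock figure and the cube-root performance-vs-power rule of thumb are assumptions for illustration, not measured review data:

```python
# Toy model of why power-limiting moves a card toward its efficiency sweet spot.
# The cube-root scaling is a crude rule of thumb near the top of the V/F curve,
# not measured GPU behaviour.
def relative_performance(power_fraction: float) -> float:
    return power_fraction ** (1 / 3)

stock_power = 300  # assumed stock board power of a ~300 W class card
for limit in (300, 233):
    frac = limit / stock_power
    perf = relative_performance(frac)
    efficiency = perf / frac
    print(f"{limit} W: ~{perf:.0%} performance, ~{efficiency:.2f}x relative perf-per-watt")
# Cutting ~22% of the power budget costs well under 10% of performance in this
# toy model, so perf-per-watt rises noticeably -- which is why one heavily
# power-limited SKU isn't representative of a whole lineup.
```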
 
On top of that, there is no way in hell Nvidia would do it without decent margins, which is the whole reason they dropped out after the PS3, whose GPU wasn't very good versus the ATI/AMD-equipped X360 anyway.

The Switch only works because Nintendo can use very low-end hardware that is nearly half a decade old; the SoC was literally ready in 2022, and it's on a Samsung process nobody else is using.
2021
 

Yeah, that's the rumor: it was taped out in 2021 but not finalized until 2022, and sent to developers in 2023.

The actual architecture launched in 2020, so either way it's half a decade old at this point.

Nobody actually knows the exact year Nintendo received finalized hardware, though; it's just rumors.
 
AMD should have leveraged its position as the sole x86-based hardware supplier to consoles with game developers at least a decade ago.
 
This console generation was already very expensive; the Series X is very overpriced and costs more than the PS5 in most places. Putting an Nvidia chip in there would make things a lot worse.
Probably, but seeing that the average PC user is happy to pay a higher price for an Nvidia-branded GPU, that could also happen with consoles: people opting for an Nvidia-based Xbox, expecting better performance and more features from such a console. Now, the average console buyer might know little to nothing about Nvidia, but the average PC user who also wants a console, not to mention tech sites, could see/market it as an important advantage.
 
This collaboration will involve co-engineering silicon "across a range of devices."
Microsoft is making it crystal clear that its next-gen Xbox platform will focus on multiple devices and won't be tied to its own store for games.

Those two sentences are interesting; I wonder how things will work out in the end.
 
Interesting. I wonder how well a PC-like Xbox with a big APU would do; with entry-level PC gaming getting so expensive, it could be a good time for a standalone PC box, assuming they can give it ample VRAM and closer to mid-range performance. I have my doubts, with silicon getting so expensive.
Don't we already have this device today? We know how it does. The new one will do a little better and be a little more expensive. Or a lot more. We also know they'll be priced on the edge of their performance capability, with a bit more margin on top. Consoles are no longer sold at cost.

Can't say the world has changed due to this information... Both Sony and MS are just pooping out lazy device iterations now. There's nothing setting them apart other than a spec upgrade. In the meantime, the unique selling points with regard to the gaming catalogue are shrinking too. Might as well get a PC, or a Deck, or whatever suits you best, or multiple devices, which a lot of people already have.

I am, as I have always been, curious as to what the long-term goal for Microsoft really is here. The only world in which you can float a product line like this with almost no USPs is if you corner the market or own the market, and they're far away from that too. But perhaps in the tiny minds of the mainstream crowd there is enough business to keep going. Not sure the Xbox division's bottom line agrees with that, though; it doesn't seem to make a profit yet. I mean... did they REALLY think they would own a notable part of the market and keep yanking enough profit out of a multi-billion-dollar acquisition to make more Call of Duties and the odd DOOM and be done with it?!

Probably, but looking that the average PC user is happy to pay a higher price for an Nvidia branded GPU, that could also happen in consoles.
Not a snowball's chance in hell, but it's OK to entertain the thought. :) It's not like any console gamer even halfway cares or even knows there's an AMD chip inside it now. They press ON and play games.
 

I'm hoping it's something that mostly ditches Windows, with its own streamlined UI, but can natively run games from other game stores.

They did make $5.7 billion last quarter and over $20 billion for fiscal year 2024 from the Xbox division alone, which was up nearly 30% from the previous year, most likely due to Call of Duty.

Apparently they were the top publisher on PlayStation last quarter as well, which is kind of random.

Not a snowball's chance in hell, but it's OK to entertain the thought. :) It's not like any console gamer even halfway cares or even knows there's an AMD chip inside it now. They press ON and play games.

Maybe if it had a big Nvidia logo; that works well on laptops, lol.
 
Again, no, it's not.

Speaking of learning to read:
https://en.m.wikipedia.org/wiki/Tegra, look at Thor, which I mentioned.


Thor uses Blackwell GPU, much newer CPU cores, and released this year.
For AI, not gaming, which, again, was my point.

Tegra Thor is an SoC designed for autonomous cars, as per your Wikipedia link. If you look at its Neoverse CPU cores, they're not even the right ARM architecture for gaming. Why would Microsoft want that for a games console?

I'm sure Nvidia could use a 2560-core Blackwell GPU in a console if they wanted to, but it's the CPU side of the equation where Nvidia are a long way behind. Also, I'm not sure an RTX 5050-tier GPU is really next-gen material; it would be slower than a Series X, and arguably only of use for a next-gen Xbox if they split the tiers again with a very low-end Series S replacement.
 
They need to at least double the CPU and GPU performance, double the RAM, and increase its speed. It should be possible to release that at $849 in a year or two at a small loss.
 
For AI, not gaming, which, again, was my point.

Tegra Thor is an SoC designed for autonomous cars, as per your Wikipedia link. If you look at its Neoverse CPU cores, they're not even the right ARM architecture for gaming. Why would Microsoft want that for a games console?

I'm sure Nvidia could use a 2560-core Blackwell GPU in a console if they wanted to, but it's the CPU side of the equation where Nvidia are a long way behind. Also, I'm not sure an RTX 5050-tier GPU is really next-gen material; it would be slower than a Series X, and arguably only of use for a next-gen Xbox if they split the tiers again with a very low-end Series S replacement.
Speaking of learning to read, which you implied several times was a thing I should do: you may notice, if you check again, what was written:
It's not like Nvidia give a shit about gaming any more.

The Switch 2's Nvidia CPU is unimpressive as hell. Qualcomm put out better offerings in midrange phones, at half the power draw, too. Don't get me wrong, the Switch 2 will be incredibly successful, but that's despite Nvidia, not because of them.

I'm not versed on why Nintendo chose another Nvidia design, but presumably a big part of the decision was compatibility with the Switch 1's architecture to allow the Switch 2 to play Switch 1 games. Without that requirement, just about anything else would be cheaper and better.
Interesting that you're pretending this is by design, and not due to the simple fact that it's an ancient chip. Wow, shocker, a six-year-old architecture on a similarly old process node doesn't perform well against contemporary designs, what a twist.
Please learn to read. My post started with "because Nvidia aren't playing".
Switch 2's Tegra is the best Nvidia have to offer; it's relevant to my point, but it's not my entire point.
Reading comprehension, that's your job at TPU, isn't it?

"For AI, not gaming."
You should tell NVIDIA this, they don't seem to differentiate, and it seems to be going fairly well for them, considering their largest competitor with 1/10th their marketshare finally got around to copying them a few years late, as is their habit, to change the much lauded and marketed opensource runs on anything FSR 1-3 to FSR 4, which uses AI and dedicated hardware similar to Tensor cores, and runs on only their latest generation onwards (similar to RTX launch vs GTX, seven years ago), because of course that's the only way to actually compete. Also, you pretending there's a signficant difference between a drive platform and a gaming platform, when the very SoC used in the previous Switch was literally used for both, is misleading, whether intentionally or otherwise.

Looking at the ARM CPU in the Thor SoC, it's simply a v9 architecture with some high-bandwidth interconnects, paired with a Blackwell GPU. A slightly tweaked version (such as the X-series chips Nintendo has been using) would be excellent as a gaming SoC.

If you look closely, you may notice that the majority of X1 applications were not "gaming" but phones, dev boards, Shield TV upscaling, Drive applications (wow, it seems like SoCs designed to be put into cars are also great for gaming, and the other way around, yay CUDA), etc. You pretending that an SoC can only be used for one thing is disingenuous.

"Switch 2's Tegra is the best Nvidia have to offer"
False. We've been over this, and you stating that a Drive chip would be unsuitable purely because "it's designed for autonomous cars" is hilarious, since we both know the Tegra used for the Switch 1 was also used as a Drive platform, and being good at one thing means it's good at the other. Perks of CUDA baby. Again, nothing stopping Nintendo, besides profit margins, to request a semi custom variant of an off the shelf solution, like they've done before. Nope, purely a financial decision to go with ancient architectures on an ancient node on the platform (mobile) that would literally benefit most from more efficient newer tech.
"the Switch 2 will be incredibly successful but that's despite Nvidia, not because of them."
False, there was literally nothing stopping Nintendo from using, modifying, or requesting a newer platform, but, as anyone who can read and analyse history will know, Nintento has a very consistent track record of using massively outdated hardware in their consoles, for cost reasons. Again, this is you trying to conflate Nintendo's choice of using a very old chip, with NVIDIA's apparent inability to make something better (which they can and have). The Switch 1 could barely maintain 30 FPS. The Switch 2 has DLSS as a crutch but it's still a two gen old GPU architecture, and something like a 5 gen old CPU architecture, on a process node that was obsolete and compared badly against competing nodes on release in 2020. Lets not play the game of pretend, these are standard ARM architecture cores (the same as used in phones, fridges, cars, etc) with some connectivity tacked on, paired with standard CUDA cores, but simply very old versions of them. There would have been zero issues if newer generations were used, the only thing that would change are performance targets developers could aim for, and Nintendo's profit margin. You can see this in the rest of the component choices, small battery based on old chemistry, non OLED screen with ghosting issues, non hall effect/TMR sticks, etc. etc.
"it would be slower than a Series X, and arguably only of use for a next-Gen Xbox if they split the tiers again with a very low-end Series S replacement."
Even if, IF, MS didn't do another custom design like the Steam Deck APU which was reused many times for other devices, one of MS mandates to developers was that every single game released on the Xbox platform had to run on the Series S. This was one of the reasons why the Xbox platform has been unpopular to develop for, but also means that the equivalent PC GPU to the Series S would be something like a 5500XT or 1650 Super, raw horsepower that a theoretical RTX 5050 easily outperforms based on core count, without even taking into account DLSS. Again, something you are likely aware of, I'm sure. Beyond that, the Series X uses an RDNA2 GPU roughly equivalent to a 2070 S or a 6600XT, again, based on core counts, RTX 5050 would be a similar performance level to both of these to even before using AI magic and newer featuresets. But again, this is you talking about Nintendo, then switching to Xbox when it turns out you were wrong about Nintendo only having one option for the SoC. And again, this is assuming MS can't afford or wouldn't pay for a semi custom solution, which both NVIDIA and AMD have a long history of making, and MS has a long history of using in both their computers and consoles.
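For what the raw-horsepower comparison looks like on paper, here is a quick sketch; the Series S/X shader counts and clocks are the published specs, the RTX 5050 boost clock is my assumption, and FP32 TFLOPS isn't directly comparable across RDNA 2 and Blackwell anyway:

```python
# Paper-spec sketch of the core-count argument. Cross-architecture TFLOPS comparisons
# are rough at best (different IPC, dual-issue behaviour, upscalers, etc.).
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000  # 2 FP32 ops per shader per clock

gpus = {
    "Xbox Series S (RDNA2, 20 CU)": (1280, 1.565),
    "Xbox Series X (RDNA2, 52 CU)": (3328, 1.825),
    "RTX 5050 (Blackwell, assumed ~2.5 GHz boost)": (2560, 2.5),
}
for name, (shaders, clock) in gpus.items():
    print(f"{name}: ~{fp32_tflops(shaders, clock):.1f} TFLOPS FP32")
# Even on these crude numbers, the 2560-core part clears the Series S baseline by a
# wide margin, which is the point being made about the Series S mandate.
```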
 
Strix Halo has superb efficiency with the updated interconnect, and that is not a monolith.
Exactly, "the updated interconnect" that has been available for some years now; AMD chose the cheaper interconnect tech because it was "good enough". As said, with Zen 6 they're finally paying up for the better tech. The rest of their mobile chips, which are simply soldered, power-limited desktop parts (whether CPU or APU), were likewise "good enough", despite having efficiency issues and thus poor battery life. Note the difference between the expensive packaging used for Strix Halo (seen in hard-to-find $2,000+ devices with zero upgradability due to everything being soldered), similar in appearance to something like an ARL or Apple M-series chip with its tiles placed as close to each other as possible:
[image: Strix Halo package shot]

And the cheap packaging used for the 9955X3D, with lower-density interconnects forcing further physical separation, leading to the latency issues X3D tries to mitigate and to the IF drawing idle power that cannot be lowered to what competing monolithic chips achieve:

[image: 9955X3D package shot]
 
I'm glad nGreedia didn't get the contract... It would basically double the cost of the console, have proprietary features that would make it hard for MS/Sony to move away from nGreedia in the future, and be full of nasty DRM like Nintendo.
 
It is amazing how an article about AMD always ends up having the Nvidia fans commenting heavily on anything but. How did the Switch 2 come into this conversation?
 
The next Xbox just needs to have Windows on it, where you can switch between gaming mode and desktop mode, just like the Steam Deck.
 