
Microsoft Could Refresh Xbox One Design with Polaris Based SoC

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,355 (7.68/day)
Location
Hyderabad, India
Microsoft could make the Xbox One "slimmer" and more energy-efficient by leveraging AMD's upcoming 4th-generation Graphics CoreNext architecture, codenamed "Polaris." An ominous-sounding "Xbox One Polaris," found in the footnotes of AMD's "Polaris" press deck, hints at a revision of the Xbox One built around a "Polaris"-based SoC. Microsoft could leverage the architecture's energy-efficiency improvements, and possibly upcoming processes such as 14 nm FinFET, to bring down power draw, thermal requirements, and possibly cost.



 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
Xbox ONE.HALF ;) Though I don't understand how they could fit it into the existing model without changing its performance characteristics. After all, Polaris is significantly better than the existing GPU.
 

iO

Joined
Jul 18, 2012
Messages
526 (0.12/day)
Location
Germany
These are just Netflix's internal names for their apps: the Xbox One version is called Polaris, and the PS4 version Sirius. Nothing to do with AMD's new arch.
Just another BS rumour from one of those misterXmedia lunatics...
 
Joined
Apr 2, 2011
Messages
2,657 (0.56/day)
...interesting...

Isn't this pretty much assumed? The 360 had so many processor changes that it wasn't funny (though having to cripple the later systems in order to match performance on the initial systems did warrant a chuckle).


As much as I hate to say this, hasn't the PS4 pretty much already won this generation of consoles? MS rode the Kinect requirement into the ground, their games library is somewhat lackluster (once you remove all the multi-platform games), and MS's cash cow Halo is starting to draw criticism even from die-hard players.


Interesting that, despite both consoles being built on AMD SoCs, a similar revision hasn't been teased out of Sony yet.
 
Joined
Oct 30, 2008
Messages
1,758 (0.31/day)
Real heat isn't coming from Sony. It's Nintendo. With rumors building that Nintendo is finally getting serious, plus the fact that all the makers lowballed their hardware... I wouldn't be surprised. This may be the first console war where we have developers upgrading hardware before its life is up. Which isn't really new; we saw hardware add-on expansions and upgrades as far back as the Genesis era. But back then, the machines were built with all sorts of tricks and expansion options. Modern consoles, being mostly highly customized PCs, aren't really developed with the same mindset. This is the effect of appliance computers, of which Apple is currently the champion.

Will be interesting to see how things play out. If Nintendo highballs, which it doesn't have to do much to do, then we might see the next gen sooner. Think about it: a Zen + Polaris based console going up against these current paperweights... lol.
 
Joined
Apr 2, 2011
Messages
2,657 (0.56/day)
Real heat isn't coming from Sony. It's Nintendo. [...]

?

I'm working under the assumption that the successor to the Wii U is what you're talking about. I'm also working under the assumption that the "under-specified" nature of the PS4 and Xbox One is a reference to the fact that both Sony and MS tried to launch their consoles at near break-even pricing, rather than have a console on the market for 5+ years before it turned a profit through decreasing production costs.

Under the above two assumptions, how does Nintendo fit in? Their third-party support started to dry up in the GameCube generation, and has been functionally non-existent in the current one. If they decided to compete with MS and Sony, they'd have to release a console that not only made money, but was also as easy to port to as the functionally similar PS4 and Xbox One. Similarly, Nintendo would have to churn out first-party software at an insane rate to make its market penetration substantial enough to warrant publishers' development costs. They'd have to do all of this while still being price-competitive (which means insane specifications aren't likely). How exactly does this make Nintendo a force that MS needs to compete with?



Now, I can see Nintendo actually being capable of a genuine "year of Nintendo." They've got enough financial resources, enough old IP, and more than enough know-how to bring themselves back onto the market in a big way, but why? Nintendo basically has a lock on handheld gaming. They can sell a respectable (if not substantial) amount of hardware on their first-party production alone. Nintendo has demonstrated that they understand the market is overdue for a massive shift, and they're sitting back and waiting for MS or Sony to overplay their hands and let a publisher drag them down. It's a quintessentially Japanese tradition to believe that doing it best is enough, as their management has time and again demonstrated. Nintendo doesn't have to be the biggest, but they have to make the best games. That mentality doesn't lend itself to getting involved in petty squabbles amongst "upstart" developers. I honestly believe Nintendo is insulating themselves against another games crash, which is why they are silent. They're more than content to let Sony and MS waste their efforts competing, while toxic business practices bring the game market down around them.
 
Joined
Nov 10, 2006
Messages
4,665 (0.73/day)
Location
Washington, US
This may be the first console war where we have developers upgrading hardware before its life is up.
:confused:

 
Joined
Mar 9, 2012
Messages
98 (0.02/day)
As far as I can remember, there has never been a CPU or GPU architectural change in a console's lifecycle; only die-shrinks or some other minor changes, to bring down production costs, power consumption, etc.

This is a very, VERY suspicious rumor and looks quite baseless, and yet more and more "tech" sites are repeating it, even though they should know better.

The 360 had so many processor changes that it wasn't funny (though having to cripple the later systems in order to match performance on the initial systems did warrant a chuckle).

Would you care to elaborate? What do you mean "cripple"? All those CPU and GPU changes were basically die-shrinks. AFAIK, they all were the exact same designs with the exact same specs, simply built on different nodes and different chip combinations.

Perhaps you meant this: http://arstechnica.com/gaming/2010/08/microsoft-beats-intel-amd-to-market-with-cpugpu-combo-chip/

The CPU and the GPU were the same. The only issue was the FSB. Yeah, I see why it might look like crippling, but it's quite a small one and had to be done to maintain absolute compatibility.
 
Joined
Apr 2, 2011
Messages
2,657 (0.56/day)
As far as I can remember, there has never been a CPU or GPU architectural change in a console's lifecycle [...]

How exactly did you come to that conclusion?

https://en.wikipedia.org/wiki/Xbox_360_technical_specifications

The 360 started out with a separate 90 nm CPU and GPU. Over the course of several revisions it got down to a single-chip 45 nm design, with the CPU and GPU being on different process nodes for a substantial number of revisions. Your linked article does an excellent job of showing how they radically redesigned the silicon, from a dual-chip system into an SoC. They had to cripple FSB speeds to maintain parity among the models on offer (http://blog.gsmarena.com/heres-what’s-inside-the-latest-xbox-360-slim-gaming-console/), because the architecture had changed enough that the new design would otherwise perform better than the old one.

I'm not sure exactly what's powering your logic here, so I need some assistance from you. They changed the structure of the chips, they altered how memory was accessed, and they moved to a process that allowed them to do all of this while decreasing the die area by roughly 75%. That, in my book, is completely redesigning the architecture to allow an SoC to perform at exactly the same level as a dual-chip system. Just because they didn't add performance doesn't mean they didn't redesign their silicon.

If that isn't correct, then I don't exactly know what you'd call a redesign. Simply adopting a new GPU architecture would require as much effort to maintain parity (as well as determining the right level of crippling) as MS put into making sure their SoC solution matched the performance of their dual-chip systems (those that managed to avoid the fiery RROD deaths that killed so many units from the first couple of hardware generations). I can't find the link to the article now, but at the time I remember a hardware teardown even noting the vastly increased trace lengths on some of the new 360's interconnects, designed to induce enough latency to match the older models' performance.
 
Joined
Mar 9, 2012
Messages
98 (0.02/day)
How exactly did you come to that conclusion? [...]

You're right about the chips/packages themselves, but I specifically wrote CPU and GPU, not chip or SoC. The design changed from a chip-design standpoint, but the CPU and GPU components remained the same from an architectural standpoint. That was my point. It's not like they used a newer-generation CPU or GPU.

The only part of that SoC that was entirely different was the new FSB-replacement block, which was necessary since the interconnects inside an SoC have much lower latency than between two separate chips.

All this brings us back to the main discussion: introducing a major architectural change, like the one the article is suggesting, is very doubtful. How could they make a completely new GPU architecture behave exactly the same as an old one?
 
Joined
Apr 2, 2011
Messages
2,657 (0.56/day)
You're right about the chips/packages themselves, but I specifically wrote CPU and GPU, not chip or SoC. [...]

I have one question. What exactly qualifies as an architectural change to you?

My qualifier is anything requiring a substantial redesign of silicon structure or layout. This ranges from something as simple as changing how memory is accessed (HBM versus GDDR5, or in the case of the Xbox 360 the shared memory access), to something as substantial as redesigning the processor's pipeline. Under that definition, the integration of the GPU and CPU, along with a substantial change in memory access, is a redesign of the 360's architecture.

I can accept that, under a different and much narrower definition, there has never been an architectural change to a console. I reject a definition that limited, but can respect the logic behind it.
 
Joined
Oct 1, 2010
Messages
2,361 (0.48/day)
Location
Marlow, ENGLAND
I would consider an architecture change to be something like going from one generation of CPU/GPU to the next. Progressive integration is nothing new. Several microcomputer manufacturers back in the 80s repeatedly streamlined the components in their systems to decrease price and improve reliability, going from five chips in a system to two or three by designing multi-role chips that did the job of what used to be several. The base architecture was still the same; they weren't using different CPUs or graphics chips, they just integrated them into a single package.
 
Joined
Mar 9, 2012
Messages
98 (0.02/day)
I have one question. What exactly qualifies as an architectural change to you?

CPU and GPU architectures are not subjective things for us to pick and choose and define for ourselves as we like. The architectures are defined and even named, in most cases, by their creators.

A simple example of NVIDIA architectures: ..., Tesla, Fermi, Kepler, Maxwell 1st gen, Maxwell 2nd gen, ... (If I'm not mistaken, there were also some architectural changes within the Fermi and Kepler families, going from the 1xx line to 11x for the former and from 1xx to 2xx for the latter, so I suppose those could be listed separately too. The Tesla architecture also changed once or twice in its lifetime.)

My qualifier is anything requiring a substantial redesign of silicon structure or layout. [...]

Did you miss the part where I separated the chip/SoC design from the CPU and the GPU? I even agreed with you that the chip itself is different from before.

I am specifically talking about the architectures of the CPU and GPU blocks. As far as we know, those two blocks in the newer 45nm SoCs are exactly the same as the blocks in the older 2-chip setups. Only the nodes are different.

Yes, the "architecture" of the 360 itself changed, but the architecture of the main computing components and blocks didn't.

Red_Machine explained it quite nicely in his comment.
 
Joined
Apr 2, 2011
Messages
2,657 (0.56/day)
CPU and GPU architectures are not subjective things for us to pick and choose and define for ourselves as we like. [...]

So, here's my problem. Intel.

They function on a tick-tock cycle: theoretically they release a "new" microarchitecture, then the following cycle they die-shrink it. The shrink hasn't often coincided with any real changes to the functional blocks of the processor, but rather than stay general, let's focus on Sandy Bridge and Ivy Bridge. Intel lists these as separate microarchitectures, despite them being functionally the same structure on the CPU side. The only real changes between the two are the die shrink and the peripherals that are included. To prove my point, take a gander at the two pictures:
http://www.guru3d.com/articles-pages/core-i7-3960x-processor-amp-msi-x79a-gd65-review,2.html
http://www.guru3d.com/articles-pages/core-i7-4820k-processor-review,2.html


So I ask again, how do you define an architecture change? In Intel's case, something as simple as changing the interconnect possibilities qualifies. Hell, I'd even be willing to admit I was wrong if Intel only called them an architecture change and you wanted to argue microarchitecture. Problem is, in Intel's book IB and SB are separate microarchitectures. Is that not exactly what the integration of the Xbox 360 did? More than simply slapping the two chips onto a single die, they offered a new way to access memory. They didn't just die-shrink; they created an entirely new chunk of silicon. This new silicon was designed only to match the performance of the old, but we've accepted that from Intel since SB.

I'm going to have to go with how Intel labels this stuff. Arguing against them is pointless, even if you don't accept their liberal definitions. They've written the book on consumer CPUs by nearly obliterating their competition.
 
Joined
Mar 9, 2012
Messages
98 (0.02/day)
So, here's my problem. Intel. [...]

http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/2
http://www.anandtech.com/show/4830/intels-ivy-bridge-architecture-exposed/2

Read those. It does not have the "same" structure. Ivy Bridge was not a simple die shrink with just some changes to the interconnects. Even though it is based on Sandy Bridge and is VERY similar to it, nearly identical, it's not the same. They wouldn't have bothered to label it a different architecture if it were exactly the same. They do exaggerate, though, for changes that are rather small; they could perhaps have called it Sandy Bridge 2nd gen.

It is not exactly what the integration of the Xbox 360 did, because there were changes to Ivy Bridge's pipeline itself. Quite minor ones, but still. As I pointed out before, the CPU and GPU blocks of the chips used in 360s remained the same, as far as we know.
 
Joined
Apr 2, 2011
Messages
2,657 (0.56/day)
Read those. It does not have the "same" structure. [...]

OK, I'm done discussing. You're willing to say that SB to IB had very minor changes, just like the very minor changes present in the 360. Despite agreeing to that, you're unwilling to acquiesce that the minor changes in one instance can be linked to the minor changes in the other. I don't believe we can have a fruitful discussion beyond that without ripping apart a couple of 360s and examining the chips, and I'm not willing to put forward that much effort. If you'd like the last word, you're welcome to it.
 
Joined
Mar 9, 2012
Messages
98 (0.02/day)
OK, I'm done discussing. [...]

Because the changes are not the same. I clearly explained why and provided you with links to read. The changes to Intel CPUs are at the computational level, no matter how small they are.

On the other hand, changing the memory or the interconnects between the different blocks in a chip/SoC does not change the blocks themselves. It's quite obvious. Unless you can find a paper that says the CPU and GPU blocks of the newer 45 nm 360s are different, even in the smallest way, their architecture/microarchitecture didn't change.

It would defy console logic for MS to willingly change the GPU and CPU of the 360. The FSB situation was an unfortunate side effect that was unavoidable unless they stayed with a two-chip design, which they didn't, because they wanted to reduce manufacturing cost first and foremost.

I'm not saying you're wrong, but you're mixing chip/SoC design with CPU and GPU design.

Yes, the new SoC itself can perhaps be called a new chip "architecture," but that's not what the word is used for. It's simply a new package, containing the same computational blocks as before.
 