
"Full Die Shot" Analysis of Nintendo Switch 2 SoC Indicates Samsung 8 nm Production Origins

Samsung 8nm, lol, no way. This is why the RTX 4000 series was so much more power efficient than the 3000 series: Nvidia moved away from that trash lithography to TSMC 4N.
 
Idk, but I thought Nintendo's strength was its efficient OS? And it only needs to render at 1080p on its internal display.

But I don't agree with the pricing. I thought they sold them cheap to attract people, but charging 50% more than the older Switch without cutting-edge hardware is absurd. Their games are more expensive too. Idk where they're going with this strategy.
 
I wonder how this compares with the latest SoCs from mobile phones, and also with AMD and Nvidia video cards? Is there such a comparison, I wonder...
 
This. But at the same time, if I go to a company and ask for a custom chip, why would I accept getting whatever leftovers they want to give me?
Unless Nintendo got this for no upfront cost and no down payment, this was a bad deal for Nintendo.
The chip really doesn't fit the requirements of the end product.
Wasn't the rumor always that Nintendo was buying spare EoL hardware for pennies from Nvidia? I know that was assumed for Switch 1, and with Thor solutions now out from Nvidia, it's the perfect time to pass off their ancient garbage to Nintendo. The dock has extra cooling for the extra power, and in handheld mode I'm assuming they're still going to target ~720p-1080p upscaled, with VRR to fall back on to help 30-45 fps feel better.
 
Wasn't the rumor always that Nintendo was buying spare EoL hardware for pennies from Nvidia? I know that was assumed for Switch 1, and with Thor solutions now out from Nvidia, it's the perfect time to pass off their ancient garbage to Nintendo. The dock has extra cooling for the extra power, and in handheld mode I'm assuming they're still going to target ~720p-1080p upscaled, with VRR to fall back on to help 30-45 fps feel better.
At one point the rumour was that they were going to use that upcoming, now potentially delayed, MTK+Nvidia chip, but clearly that wasn't the case.
It's just sad to see what they ended up with, as it's a chip that draws more power than it should, based on the target application.
Admittedly it would've been hard for Nintendo to get a similar GPU from anywhere else, but not impossible.
I have no idea what the rumours were with regards to cost, but based on the final retail price of the Switch 2, it's clearly more expensive than it really should be.
I hope it can run 1080p without upscaling, but I guess we'll find out soon enough.
 
Why are people even surprised at Nintendo pushing last last last gen hardware to its loyal customers? It's not like they won't pay for it anyway.
Insert AMD, Intel, Nvidia, etc.; they all do it.
 
Well, maybe they could pick something a year old for the tech enthusiasts, but then it would be a dud product costing $1000 to buy?
It only needs to be good enough for its intended purpose; it doesn't need to run the latest AAA at 4K 100 fps.
I think it is already overpriced as it is.
And yet most modern games don't hit above 4K 60 fps. I want the NS2 to hit a consistent 1080p 120, but I doubt that'll happen either.

8nm node in 2025 for a 10-year device, huh? Honestly, I was on the fence about this anyway with their $95 controller replacement, and the Metroid Prime 4 video doesn't even look that good; the terrain in the gameplay video from last month looks like a PS3 game. Fk it, cancelling my pre-order.
Whoa there, but Samus now has psychic abilities to open doors. That's a game changer! But in all honesty, I hope it's a good game.
 
Insert AMD, Intel, Nvidia, etc.; they all do it.
Not on halo products.

If it were the case, we'd be getting latest gen Ryzen/Radeon/GeForce/Arc on Samsung 8nm.

The last product on that node I can recall is the RTX 3000 series.
 
The A78's replacement, which by now is the A725, had three predecessors, the A710, A715 and A720, all of which would have been excellent choices, as they're all ARMv9 based cores, whereas the A78 is using the older ARMv8 instruction set.

The Cortex A710, A715, A720 & A725 are not optimized for implementation on a 10 nm/8 nm node; they are optimized for 5 nm and lower. The A78 is already a bit too powerful for 10 nm/8 nm, not to mention that Nintendo wants to waste as little power budget as possible on the CPU.

It would most likely have cost Nintendo little to nothing to have gone with the newer Arm cores, but it seems like Nvidia might not have a license for them.

If by "cost them little to nothing" you meant potentially billions of dollars, then yes you are correct.

Even ignoring the fact that these newer cores won't work as advertised on 10 nm/8 nm, you still have to pay NVIDIA millions of dollars to redesign the chip, pay ARM more licensing fees, and increase the BoM cost with a bigger die and lower yields, etc.

On top of that, it's odd that there are no little cores at all, since a pair of A55 cores would've been able to handle background tasks and whatnot, while freeing up the A78 cores for the important stuff. It just doesn't look like a good chip for a battery-powered device.

Little cores are useless for a gaming device, plus there is a reason why even mobile SoCs are moving away from little cores.
 
The Cortex A710, A715, A720 & A725 are not optimized for implementation on a 10 nm/8 nm node; they are optimized for 5 nm and lower. The A78 is already a bit too powerful for 10 nm/8 nm, not to mention that Nintendo wants to waste as little power budget as possible on the CPU.
If they don't want to waste power budget on the CPU, they clearly screwed up big-time then, as they went with eight A78 cores on an old, power hungry node...
If by "cost them little to nothing" you meant potentially billions of dollars, then yes you are correct.
What are you on about?
Even ignoring the fact that these newer cores won't work as advertised on 10 nm/8 nm, you still have to pay NVIDIA millions of dollars to redesign the chip, pay ARM more licensing fees, and increase the BoM cost with a bigger die and lower yields, etc.
Well, read my comment whatever way you want, but nowhere did I state that they had to use an old node, nor Nvidia as a partner. But OK, you make up some shit to make me look bad.
Little cores are useless for a gaming device, plus there is a reason why even mobile SoCs are moving away from little cores.
I guess you didn't bother reading what I wrote then? Did I mention they were for gaming? You do understand how these SoCs work, I presume? And if it was so bad to have these lower-power cores, why did Intel move to three tiers of them in their more recent mobile chips?
You always have a boatload of background tasks, which these low-power cores can deal with, while the big, powerful cores focus on the big tasks, like gaming.

I presume you work for Nintendo or Nvidia and was involved in picking this chip?
 
I wonder how much nintendo paid nvidia for this old chip?
 
Why are people even surprised at Nintendo pushing last last last gen hardware to its loyal customers? It's not like they won't pay for it anyway.
Maybe Nintendo/JHH is underestimating comrade DJT o_O
 
If they don't want to waste power budget on the CPU, they clearly screwed up big-time then, as they went with eight A78 cores on an old, power hungry node...
It's called balancing cost, power, and performance.
What are you on about?

Well, read my comment whatever way you want, but nowhere did I state that they had to use an old node, nor Nvidia as a partner. But OK, you make up some shit to make me look bad.

By changing from NVIDIA they'd have to either give up backward compatibility or resort to emulation, which will also potentially cost them millions (if not billions) of dollars.
No need to be defensive; nobody is trying to make you look bad, lol. We are in a tech forum, not a beauty pageant.

I guess you didn't bother reading what I wrote then? Did I mention they were for gaming? You do understand how these SoCs work, I presume? And if it was so bad to have these lower-power cores, why did Intel move to three tiers of them in their more recent mobile chips?
You always have a boatload of background tasks, which these low-power cores can deal with, while the big, powerful cores focus on the big tasks, like gaming.
Little cores on x86 are there to fix the fatal issue with x86 performance cores, which is efficiency at no/very low load; ARM cores don't have that problem.
Every use case has different requirements.

All consoles reserve 1-2 cores for the system, so developers can't access all of the cores.
The Switch 1 SoC had little cores that Nintendo didn't want/use, and for the Switch 2 they won't waste precious die space and millions in development costs for no benefit whatsoever.
I presume you work for Nintendo or Nvidia and was involved in picking this chip?
No just some knowledge and common sense.
 
It's called balancing cost, power, and performance.
Nothing balanced about this chip though.
By changing from NVIDIA they'd have to either give up backward compatibility or resort to emulation, which will also potentially cost them millions (if not billions) of dollars.
No need to be defensive; nobody is trying to make you look bad, lol. We are in a tech forum, not a beauty pageant.
Eh? I think you got your numbers off by a fair few zeroes here.
Also, what emulation? The thing runs Android, it would just be tuning for a different GPU.
Little cores on x86 are there to fix the fatal issue with x86 performance cores, which is efficiency at no/very low load; ARM cores don't have that problem.
Every use case has different requirements.
Yet the small Arm cores handle background tasks, which this thing will have as well. Plus, you'd save battery when you're not in a game, just like the little cores are the ones mostly used on your phone when you're not doing something CPU-intensive. Most of my "use time" is on the small cores at low frequencies. This is since the last reboot a couple of weeks ago though, so it's not super accurate, as I haven't run anything taxing since then. On a mobile device of any kind, you want this kind of behaviour, so you don't waste battery power on mundane tasks.
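To put rough numbers on that (the battery capacity and power draws below are my own illustrative assumptions, not measured figures), a little-core setup mostly pays off outside of gaming:

```python
# Toy estimate of non-gaming battery runtime with background work on
# big A78-class cores vs. hypothetical A55-class little cores.
# All wattages and the battery capacity are assumed, illustrative values.
def idle_runtime_hours(battery_wh, cpu_background_w, rest_of_system_w):
    """Hours of runtime when only background tasks are running."""
    return battery_wh / (cpu_background_w + rest_of_system_w)

BATTERY_WH = 19.0   # roughly Switch-class battery (assumption)
OTHER_W = 1.0       # screen-off housekeeping: RAM, radios, etc. (assumption)

big_cores = idle_runtime_hours(BATTERY_WH, 1.5, OTHER_W)     # A78s doing background work
little_cores = idle_runtime_hours(BATTERY_WH, 0.5, OTHER_W)  # A55-class cores instead
print(round(big_cores, 1), round(little_cores, 1))  # 7.6 12.7
```

Under those assumptions the gap is big in standby but shrinks to noise once a game is pulling ~10 W, which is roughly the shape of both sides of this argument.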


All consoles reserve 1-2 cores for the system, so developers can't access all of the cores.
Proof of this?
The Switch 1 SoC had little cores that Nintendo didn't want/use, and for the Switch 2 they won't waste precious die space and millions in development costs for no benefit whatsoever.
That only shows that Nintendo doesn't understand mobile devices well.
Also, Nintendo clearly spent ZERO dollars on development costs, since they took a bodge job from Nvidia that was never intended for mobile devices.
Just because a company is bad at hardware, doesn't mean this is the right way to do things.
No just some knowledge and common sense.
I agree to disagree on this.
How many devices have you been involved in developing? I've been part of at least a dozen or so, nothing as fancy as this, but mostly Arm Cortex-something devices.

Considering the SoC is likely to cost US$50-100 on its own (based on what Nvidia's developer boards cost), it's really a terrible choice of chip, both in terms of power efficiency and performance, notwithstanding the GPU. But I guess that's how it goes if you don't want to spend any money on hardware development.
 
Nothing balanced about this chip though.
I'm not talking about the chip itself; I'm talking about Nintendo choosing which chip to use based on cost, and then how to allocate the power budget between the CPU, GPU, etc.

And BTW, the chip itself is pretty well balanced for its intended purpose; for a ~10 W mobile device it has an oversized GPU, which is still a good thing.
Eh? I think you got your numbers off by a fair few zeroes here.
Also, what emulation? The thing runs Android, it would just be tuning for a different GPU.
The Switch 2 runs Android? That's news to me, and to Nintendo as well.
"Tuning for a different GPU" doesn't work, buddy. This is not a PC; the whole binary for the game would change, and it would take a lot of development time for developers to port the games to a completely different GPU, or Nintendo would have to develop an emulator.
Yet the small Arm cores handle background tasks, which this thing will have as well. Plus, you'd save battery when you're not in a game, just like the little cores are the ones mostly used on your phone when you're not doing something CPU-intensive. Most of my "use time" is on the small cores at low frequencies. This is since the last reboot a couple of weeks ago though, so it's not super accurate, as I haven't run anything taxing since then. On a mobile device of any kind, you want this kind of behaviour, so you don't waste battery power on mundane tasks.
Spending 1 W instead of 1.5 W won't do much for a gaming device.
For a phone it works, because you're trying to make the phone last at least a day on battery, versus two hours on a gaming device.

Completely different targets; that's why little cores are a waste on a gaming device.
Proof of this?
That only shows that Nintendo doesn't understand mobile devices well.
Lol yes I'm pretty sure all the engineers there don't and you do, they should really hire you.
Also, Nintendo clearly spent ZERO dollars on development costs, since they took a bodge job from Nvidia that was never intended for mobile devices.
Just because a company is bad at hardware, doesn't mean this is the right way to do things.
Spent zero dollars on development costs? Buddy, stop making shit up. The T239 is a semi-custom chip made for Nintendo from the original T234; NVIDIA isn't a charity that'll do this for free for Nintendo.

I agree to disagree on this.
How many devices have you been involved in developing? I've been part of at least a dozen or so, nothing as fancy as this, but mostly Arm Cortex-something devices.

Considering the SoC is likely to cost US$50-100 on its own (based on what Nvidia's developer boards cost), it's really a terrible choice of chip, both in terms of power efficiency and performance, notwithstanding the GPU. But I guess that's how it goes if you don't want to spend any money on hardware development.

If you bothered to click the links in the OP, you'd see that they estimated the cost of the SoC at $21.517.
 
The Switch 2 runs Android? That's news to me, and to Nintendo as well.
"Tuning for a different GPU" doesn't work, buddy. This is not a PC; the whole binary for the game would change, and it would take a lot of development time for developers to port the games to a completely different GPU, or Nintendo would have to develop an emulator.
Ok, some of it is heavily Android derived at least, like the graphics driver, so it would be very easy to tune for a different GPU, as they use a Linux-ish driver.
Components derived from Android code include the Stagefright multimedia framework, as well as components of the graphics stack, including the display server (derived from SurfaceFlinger) and the graphics driver (which seems to be derived from Nvidia's proprietary Linux driver).
There's no such thing as half cores.
Lol yes I'm pretty sure all the engineers there don't and you do, they should really hire you.
You know jack shit about who I am or what I have worked with, so maybe they should hire me; you wouldn't know anyhow.
Spent zero dollars on development costs? Buddy, stop making shit up. The T239 is a semi-custom chip made for Nintendo from the original T234; NVIDIA isn't a charity that'll do this for free for Nintendo.
So what you're saying is that you agree, since Nvidia did all the work, cool.
You clearly don't know how this industry works.
Yes, Nintendo spent nothing on the development cost, and Nvidia spent as little as possible as well, since they re-used what they had with minimal effort.
The cost Nintendo is paying is the cost of the chips Nvidia sells to them, not the development of said chip.
Based on your flawed logic, all the graphics card makers pay Nvidia up front for the company to make GPUs for them, which is not the case.
If you bothered to click the links in the OP then you'll see that they estimated the cost of the SoC at $21.517.
Estimated by some random person in xina, sure... That's not a reliable source, and even if that was the cost to Nvidia, that is not what Nintendo would pay them. Or do you pay cost price for everything you buy? I can tell you've never sourced components either.
 
Ok, some of it is heavily Android derived at least, like the graphics driver, so it would be very easy to tune for a different GPU, as they use a Linux-ish driver.
Lol, just because they took a few components from AOSP, it suddenly runs on Android? What great logic.
I told you they should hire you, as everything is so easy for you; only a tune and it'll work, lol.
There's no such thing as half cores, but ok dude.
Lol what?

13 threads out of the 16 total are accessible to developers; 13 threads = 6.5 cores. Is this hard for you to understand?
So what you're saying is that you agree, since Nvidia did all the work, cool.
You clearly don't know how this industry works.
Of course NVIDIA will do the chip design work; they are the company that made the SoC, after all.
R&D, cutting the GPU down by a quarter, completely redesigning the CPU portion, not to mention the other things, are all "minimal work" to you, lol.
You clearly don't know how this industry works.
Yes, you are clearly the one who does, with your $100-per-SoC estimate, lol.
Yes, Nintendo spent nothing on the development cost, and Nvidia spent as little as possible as well, since they re-used what they had with minimal effort.
The cost Nintendo is paying is the cost of the chips Nvidia sells to them, not the development of said chip.
Based on your flawed logic, all the graphics card makers pay Nvidia up front for the company to make GPUs for them, which is not the case.
You don't know the details of the deal between NVIDIA and Nintendo; in the end, the cost of development will be paid by Nintendo one way or another.
Estimated by some random person in xina, sure... That's not a reliable source, and even if that was the cost to Nvidia, that is not what Nintendo would pay them. Or do you pay cost price for everything you buy? I can tell you've never sourced components either.
A random person in xina? Lol, the whole die analysis was done by KurnalSalts, and his report on the T239 is ~100 pages.

It depends on the deal with NVIDIA: whether they paid a one-time fee for the tech and have full control of manufacturing (most likely), or whether they are purchasing it per unit. At a cost of $21.5, Nintendo will for sure end up paying <$50 per SoC in the end.

FYI, in order for the Switch 2 to be anywhere near profitable at the retail price of $450, the BoM has to be <$200.

Just the thought of you thinking the SoC alone can cost $100 is beyond laughable.
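For what it's worth, that <$200 BoM ceiling is easy to sanity-check with simple margin arithmetic; the retailer cut and assembly cost below are my own assumptions, not known figures:

```python
# Per-unit hardware margin sketch for a $450 console.
# Retailer/distribution cut and assembly+logistics cost are assumptions.
def hardware_margin(retail_price, retailer_cut, bom, assembly_logistics):
    net_to_maker = retail_price * (1 - retailer_cut)  # what the console maker receives
    return round(net_to_maker - (bom + assembly_logistics), 2)

# $450 retail, ~20% retail/distribution cut, $200 BoM, $40 assembly+shipping
print(hardware_margin(450, 0.20, 200, 40))  # 120.0
# Push the BoM to $280 and the hardware margin is nearly gone:
print(hardware_margin(450, 0.20, 280, 40))  # 40.0
```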
 
It's called balancing cost, power, and performance.
Power and performance were both seriously screwed up by using an old-everything design on an even older node.
In a battery-powered device, every little bit matters for getting good battery life, and we're not talking about little bits here, but glacial erratics.

And low cost and Nvidia isn't a likely combination, the way Jensen always wants to charge an arm, both legs and half the internal organs.
Or the part in question was the result of a journey to the "oldest sediments" in the dumpster.

All consoles reserve 1-2 cores for the system, so developers can't access all of the cores.
The "tube-fed" console situation isn't relevant for a battery-powered mobile device.
In desk consoles, it doesn't matter if the cores dedicated to the OS and background tasks are full-size cores.
Unless their idling is as bad as some Intel NetBurst...

13 threads out of the 16 total are accessible to developers; 13 threads = 6.5 cores. Is this hard for you to understand?
An SMT thread isn't half of a core.
And no developer would use that thread for anything important, precisely because it lacks exclusive access to a core and thus guaranteed processing resources.
 
Why the shock and surprise? Nothing about this is new. The original Switch was based on Maxwell, which was already similarly old by the time it released in 2017. Desktop and mobile versions of Ampere were also built on Samsung 8N, so it stands to reason they would use it here, too. The current Nvidia 4N node used by Ada and Blackwell is dedicated to things of higher importance than a Switch console; using this node keeps cost down and availability up. It's a no-brainer.

I actually think this is the smartest choice for Nintendo, and one that the PC market should largely follow. There is no need to use the latest nodes on gaming GPUs: you can either have cheap GPUs at the prices you're demanding ($100-600 range), or you can have latest-node, bleeding-edge hardware.

AMD should be building some cards on RDNA 2's 7 nm, and Nvidia should also be making some Samsung 8N cards to flood the market with. Restart production on the RTX 3080 12 GB and 3080 Ti, and make a few 3090 Ti cards too while at it. AMD is still using GDDR6, so there should be healthy stock; resume production on Navi 21 cards as well. Make these available for a nice price and literally all is well. Would you say no to a $599 3090 Ti? Or a $499 6950 XT? I wouldn't.
 
Power and performance were both seriously screwed up by using an old-everything design on an even older node.
In a battery-powered device, every little bit matters for getting good battery life, and we're not talking about little bits here, but glacial erratics.

And low cost and Nvidia isn't a likely combination, the way Jensen always wants to charge an arm, both legs and half the internal organs.
Or the part in question was the result of a journey to the "oldest sediments" in the dumpster.
And what's the alternative process they could use?
Samsung 7nm & 6nm are short-lived nodes and were only marginally better.
Samsung 4nm would cost them ~3X, and they wouldn't be able to undercut the PS5 and XSX on price while using that node.
TSMC N4 is more expensive than Samsung 4nm and is overbooked.

The only real alternative is TSMC N6, which would double the cost in exchange for better performance and efficiency, assuming TSMC has a spare ~10,000 wafers per month of capacity, which is highly unlikely, not to mention the need for a full chip redesign.

NVIDIA knew when they made the deal with Nintendo that it would be a low profit margin; otherwise Nintendo wouldn't be able to sell Switch consoles under $200.
The "tube-fed" console situation isn't relevant for a battery-powered mobile device.
In desk consoles, it doesn't matter if the cores dedicated to the OS and background tasks are full-size cores.
Unless their idling is as bad as some Intel NetBurst...
Reserving cores is better than wasting silicon die area on useless cores that'll only be used for background tasks.
Again, there's a reason why Nintendo didn't use the small cores on the original Switch even though they exist on the die.
An SMT thread isn't half of a core.
And no developer would use that thread for anything important, precisely because it lacks exclusive access to a core and thus guaranteed processing resources.
Nothing is guaranteed on a console; almost everything is shared, including the memory. The developers still have access to that extra thread, which they can tap into wherever they see fit.
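On the wafer-cost tangent, the jump from an 8 nm-class wafer to an N6-class one can be sketched with the classic dies-per-wafer approximation; the die area, wafer prices, and yield below are illustrative assumptions, not foundry quotes:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic approximation: gross area term minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_die(wafer_cost, die_area_mm2, yield_rate):
    """Wafer cost spread over the good dies only."""
    return wafer_cost / (dies_per_wafer(die_area_mm2) * yield_rate)

# ~200 mm^2 die (T239-ish) at an assumed 80% yield:
# an assumed $5,000 8 nm-class wafer vs an assumed $10,000 N6-class wafer
print(round(cost_per_die(5_000, 200, 0.8), 2))   # 20.42
print(round(cost_per_die(10_000, 200, 0.8), 2))  # 40.85
```

Doubling the wafer price roughly doubles the per-die cost (a denser node would claw some of that back with a smaller die), which is the shape of the "double the cost" argument above.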
 
I am so glad I bought a new 512 GB LCD Steam Deck for $328. This is a piece of crap.
 
Lol, just because they took a few components from AOSP, it suddenly runs on Android? What great logic.
I told you they should hire you, as everything is so easy for you; only a tune and it'll work, lol.
Ok, so I was wrong there, but they use standard drivers, so your complaint about it being hard to change to a different GPU is still moot.
Lol what?

13 threads out of the 16 total are accessible to developers; 13 threads = 6.5 cores. Is this hard for you to understand?
And SMT is so good in games... So good, in fact, that a lot of motherboards now have a mode to turn it off for gaming...
Of course NVIDIA will do the chip design work; they are the company that made the SoC, after all.
R&D, cutting the GPU down by a quarter, completely redesigning the CPU portion, not to mention the other things, are all "minimal work" to you, lol.
So, following your logic, you're saying that every time Nvidia does a cheaper version of a GPU, it costs them hundreds of millions or even billions of dollars?
Yes, this is minimal work; clearly you don't know much about chip design either.
Once you have a working design, it's quite easy today to make variants of that chip, hence why Nvidia has a whole bunch of these Arm chips with just minor differences. If it were as complex and expensive as you suggest, it wouldn't happen. The same goes for all chips out there; otherwise we would only ever see one version. Yes, it costs a good chunk of cash to tape out a chip, but nowhere near the numbers you're suggesting.
Yes, you are clearly the one who does, with your $100-per-SoC estimate, lol.
You need to learn to read. I said US$50-100; that's a pretty big range.
Considering that the Jetson Orin Nano 8GB costs US$249, with 8 GB of RAM, an Ethernet chip, a microSD card slot, plus some power regulation components, it's not hard to draw the conclusion that the SoC itself is somewhere in the $50 region, maybe a tad more for Nintendo, since Nvidia wants to make up for the customisation costs.
But yeah, you clearly know best, you who have never made a single device...
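Spelling out that Jetson-based bound as back-of-the-envelope arithmetic (every figure except the $249 kit price is an assumption on my part):

```python
# Bound the SoC cost by subtracting assumed board-level costs and margin
# from the Jetson Orin Nano 8GB developer-kit price. All the per-item
# costs below are guesses for illustration, not known prices.
JETSON_KIT_PRICE = 249.0  # Nvidia's listed dev-kit price

assumed_costs = {
    "8 GB LPDDR5": 30.0,
    "Ethernet PHY and connectors": 10.0,
    "PCB, power regulation, microSD slot": 25.0,
    "Nvidia's margin on the kit": 100.0,
}

soc_cost_bound = JETSON_KIT_PRICE - sum(assumed_costs.values())
print(soc_cost_bound)  # 84.0
```

Shuffle the assumed margin up or down and the residue lands anywhere in that $50-100 window, which is all the original estimate claimed.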
You don't know the details of the deal between NVIDIA and Nintendo, in the end the cost of development will be paid by Nintendo one way or another.
No, I don't, but this is commonly how it works; no sensible company wants to pay up front for the development cost of hardware. Yes, Nintendo most likely had to pay an NRE fee, as that is an industry norm, but nothing more than that. I obviously don't know what Nvidia would charge in terms of an NRE fee for something like this, but it's not hundreds of millions or billions as you're suggesting; I even doubt it's in the tens of millions, since they already had a chip design that they just modified slightly.
A random person in xina? Lol, the whole die analysis was done by KurnalSalts, and his report on the T239 is ~100 pages.
Good for whoever that is.
It depends on the deal with NVIDIA: whether they paid a one-time fee for the tech and have full control of manufacturing (most likely), or whether they are purchasing it per unit. At a cost of $21.5, Nintendo will for sure end up paying <$50 per SoC in the end.
Highly unlikely, as Nintendo wouldn't want to 1. have to deal with the fabs, and 2. the chips are obviously marked Nvidia, not Nintendo.
This is not like the Xbox chips, which are branded Xbox.
FYI, in order for the Switch 2 to be anywhere near profitable at the retail price of $450, the BoM has to be <$200.
Consoles aren't profitable? I thought that was common knowledge?
They're sold at cost, at best.
Just the thought of you thinking the SoC alone can cost $100 is beyond laughable.
Again, I said US$50-100, not $100, but you keep using the higher end of the range.
See comment above about cost.

It's funny how you're so sure that I'm wrong, no matter what, yet you've not come up with any proof that I am.
 
Ok, so I was wrong there, but they use standard drivers, so your complaint about it being hard to change to a different GPU is still moot.
So these components are a graphics driver now? Lol, you don't even know what they are.
Here's a hint for you: the Switch runs a custom FreeBSD-based OS.
And SMT is so good in games... So good, in fact, that a lot of motherboards now have a mode to turn it off for gaming...
Just because it can sometimes be bad in PC games, it must be bad on consoles as well? Got your logic, lol.
It's so bad that Sony hurt their own console by keeping it enabled. Sony should really compete with Nintendo to hire you to fix their console.
So, following your logic, you're saying that every time Nvidia does a cheaper version of a GPU, it costs them hundreds of millions or even billions of dollars?
Yes, this is minimal work; clearly you don't know much about chip design either.
Once you have a working design, it's quite easy today to make variants of that chip, hence why Nvidia has a whole bunch of these Arm chips with just minor differences. If it were as complex and expensive as you suggest, it wouldn't happen. The same goes for all chips out there; otherwise we would only ever see one version. Yes, it costs a good chunk of cash to tape out a chip, but nowhere near the numbers you're suggesting.
Yes, that's why NVIDIA, with their tens of thousands of engineers, take over a month to design a smaller GPU out of a bigger one, when it should be just minimal work.
That's why AMD pulled out of making big-die GPUs even though they were making >$200 profit per GPU.
BTW, the variants you're talking about in the ARM chips are called binning; it's the same die.
You need to learn to read. I said US$50-100; that's a pretty big range.
Considering that the Jetson Orin Nano 8GB costs US$249, with 8 GB of RAM, an Ethernet chip, a microSD card slot, plus some power regulation components, it's not hard to draw the conclusion that the SoC itself is somewhere in the $50 region, maybe a tad more for Nintendo, since Nvidia wants to make up for the customisation costs.
But yeah, you clearly know best, you who have never made a single device...
Lol, no, you know better, with your Android Switch OS.
No, I don't, but this is commonly how it works; no sensible company wants to pay up front for the development cost of hardware. Yes, Nintendo most likely had to pay an NRE fee, as that is an industry norm, but nothing more than that. I obviously don't know what Nvidia would charge in terms of an NRE fee for something like this, but it's not hundreds of millions or billions as you're suggesting; I even doubt it's in the tens of millions, since they already had a chip design that they just modified slightly.
Lol, Sony and Microsoft both do pay upfront.
And no on the cost part as well: the cost of designing the T239 was about $150 and a cup of coffee, since it's minimal work, after all.
Consoles aren't profitable? I thought that was common knowledge?
They're sold at cost, at best.
Nintendo always sells hardware for profit, unlike Sony and Microsoft who rely more on software and services.
Again, I said US$50-100, not $100, but you keep using the higher end of the range.
See comment above about cost.

It's funny how you're so sure that I'm wrong, no matter what, yet you've not come up with any proof that I am.
No, there wasn't any proof of anything in any of my replies; I must have posted some random shit then. I blame my 6.5-core PS5, my Android-running Switch, my pre-ordered "minimal work" Switch 2, and the random guy on the internet with his die shots and analysis for that.
 