System Name | Jaspe |
---|---|
Processor | Ryzen 1500X |
Motherboard | Asus ROG Strix X370-F Gaming |
Cooling | Stock |
Memory | 16GB Corsair 3000MHz |
Video Card(s) | EVGA GTS 450 |
Storage | Crucial M500 |
Display(s) | Philips 1080 24" |
Case | NZXT |
Audio Device(s) | Onboard |
Power Supply | Enermax 425W |
Software | Windows 10 Pro |
> So a five year old Arm core...

Nothing new, typical Nintendo.
Processor | AMD Ryzen 7 5700G |
---|---|
Motherboard | Gigabyte B450M S2H |
Cooling | Scythe Kotetsu Mark II |
Memory | 2 x 16GB SK Hynix CJR OEM DDR4-3200 @ 4000 20-22-20-48 |
Video Card(s) | Colorful RTX 2060 SUPER 8GB GDDR6 |
Storage | 250GB WD BLACK SN750 M.2 + 4TB WD Red Plus + 4TB WD Purple |
Display(s) | AOpen 27HC5R 27" 1080p 165Hz curved VA |
Case | AIGO Darkflash C285 |
Audio Device(s) | Creative SoundBlaster Z + Kurtzweil KS-40A bookshelf / Sennheiser HD555 |
Power Supply | Great Wall GW-EPS1000DA 1kW |
Mouse | Razer Deathadder Essential |
Keyboard | Cougar Attack2 Cherry MX Black |
Software | Windows 10 Pro x64 22H2 |
Processor | Intel® Core™ i7-13700K |
---|---|
Motherboard | Gigabyte Z790 Aorus Elite AX |
Cooling | Noctua NH-D15 |
Memory | 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5 |
Video Card(s) | KUROUTOSHIKOU RTX 5080 GALAKURO |
Storage | 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD |
Display(s) | Acer Predator X34 3440x1440@100Hz G-Sync |
Case | NZXT PHANTOM410-BK |
Audio Device(s) | Creative X-Fi Titanium PCIe |
Power Supply | Corsair 850W |
Mouse | Logitech Hero G502 SE |
Software | Windows 11 Pro - 64bit |
Benchmark Scores | 30FPS in NFS:Rivals |
Processor | 5900x |
---|---|
Motherboard | MSI MEG UNIFY |
Cooling | Arctic Liquid Freezer 2 360mm |
Memory | 4x8GB 3600c16 Ballistix |
Video Card(s) | EVGA 3080 FTW3 Ultra |
Storage | 1TB SX8200 Pro, 2TB SanDisk Ultra 3D, 6TB WD Red Pro |
Display(s) | Acer XV272U |
Case | Fractal Design Meshify 2 |
Power Supply | Corsair RM850x |
Mouse | Logitech G502 Hero |
Keyboard | Ducky One 2 |
> This. But at the same time, if I go to a company and ask for a custom chip, why would I accept getting whatever leftovers they want to give me?
> Unless Nintendo got this for no up-front cost and no down payment, this was a bad deal for Nintendo.
> The chip really doesn't fit the requirements of the end product.

Wasn't the rumor always that Nintendo was buying spare EoL hardware for pennies from Nvidia? I know that was assumed for the Switch 1, and with Thor solutions now out from Nvidia, it's the perfect time to pass off their ancient garbage to Nintendo. The dock has extra cooling for the extra power, and in handheld I'm assuming they're still going to target ~720p-1080p upscaled, with VRR to fall back on to help 30-45 fps feel better.
System Name | Overlord Mk MLI |
---|---|
Processor | AMD Ryzen 7 7800X3D |
Motherboard | Gigabyte X670E Aorus Master |
Cooling | Noctua NH-D15 SE with offsets |
Memory | 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68 |
Video Card(s) | Gainward GeForce RTX 4080 Phantom GS |
Storage | 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000 |
Display(s) | Acer XV272K LVbmiipruzx 4K@160Hz |
Case | Fractal Design Torrent Compact |
Audio Device(s) | Corsair Virtuoso SE |
Power Supply | be quiet! Pure Power 12 M 850 W |
Mouse | Logitech G502 Lightspeed |
Keyboard | Corsair K70 Max |
Software | Windows 10 Pro |
Benchmark Scores | https://valid.x86.fr/yfsd9w |
> Wasn't the rumor always that Nintendo was buying spare EoL hardware for pennies from Nvidia? I know that was assumed for the Switch 1, and with Thor solutions now out from Nvidia, it's the perfect time to pass off their ancient garbage to Nintendo. The dock has extra cooling for the extra power, and in handheld I'm assuming they're still going to target ~720p-1080p upscaled, with VRR to fall back on to help 30-45 fps feel better.

At one point the rumour was that they were going to use that upcoming, now potentially delayed, MTK+Nvidia chip, but clearly that wasn't the case.
System Name | H7 Flow 2024 |
---|---|
Processor | AMD 5800X3D |
Motherboard | Asus X570 Tough Gaming |
Cooling | Custom liquid |
Memory | 32 GB DDR4 |
Video Card(s) | Intel ARC A750 |
Storage | Crucial P5 Plus 2TB. |
Display(s) | AOC 24" FreeSync 1 ms 75Hz |
Mouse | Lenovo |
Keyboard | Eweadn Mechanical |
Software | W11 Pro 64 bit |
> Why are people even surprised at Nintendo pushing last-last-last-gen hardware to its loyal customers? It's not like they won't pay for it anyway.

Insert AMD, Intel, Nvidia, etc.; they all do it.
> Well, maybe they could pick something a year old for the tech enthusiasts, but it would be a dud product costing $1,000 to buy.
> It only needs to be good enough for its intended purpose; it doesn't need to run the latest AAA at 4K 100 fps.
> I think it is already overpriced as it is.

And yet most modern games don't hit above 4K 60 fps. I want the NS2 to hit a consistent 1080p 120, but I doubt that'll happen as well.
> 8 nm node in 2025 for a 10-year device, huh? Honestly, I was on the fence about this anyway with their $95 controller replacement, and the Metroid Prime 4 video doesn't even look that good; the terrain in the gameplay video from last month looks like a PS3 game. Fk it, cancelling my pre-order.

Whoa there, but Samus now has psychic abilities to open doors. That's a game changer! But in all honesty, I hope it's a good game.
System Name | "Lots of people name their swords. Lots of cunts." |
---|---|
Processor | R7 7800X3D |
Motherboard | ASRock B650M PG Riptide |
Cooling | Wraith Max + 2x Noctua Redux NF-P14r + 3x NF-P12r |
Memory | 2x 16GB ADATA XPG Lancer Blade DDR5-6000 C30 |
Video Card(s) | Sapphire Pulse RX 9070 XT |
Storage | ADATA Legend 970 2TB PCIe 5.0 |
Display(s) | Dell 32" S3222DGM - 1440p 165Hz / P2422H 1080p 60Hz |
Case | HYTE Y40 |
Audio Device(s) | Microsoft Xbox TLL-00008 |
Power Supply | Cooler Master MWE 750 V2 |
Mouse | Alienware AW320M |
Keyboard | Alienware AW510K |
Software | W11 Pro |
> Insert AMD, Intel, Nvidia, etc.; they all do it.

Not on halo products.
Processor | AMD Ryzen 9 7950X |
---|---|
Motherboard | ASUS ROG Crosshair X670E Hero |
Cooling | NZXT Kraken Z73 RGB |
Memory | 64GB G.Skill Trident Z RGB DDR5 6000MT/s CL30 |
Video Card(s) | Zotac NVIDIA RTX 4090 Trinity |
Storage | 2TB Samsung 990 Pro M.2 SSD + 15.36TB Micron 9300 Pro U.2 SSD |
Display(s) | LG C9 UHD 120Hz OLED TV |
Case | Lian Li O11D XL ROG |
Audio Device(s) | Denon AVR + 5.1 Setup |
Power Supply | Corsair HX1000i |
Mouse | Logitech G903 |
Keyboard | Thermaltake Level 20 RGB w/ Cherry MX Blue keys |
The A78's replacement, which by now is the A725, had three predecessors, the A710, A715 and A720, all of which would have been excellent choices, as they're all ARMv9-based cores, whereas the A78 uses the older ARMv8 instruction set.
It would most likely have cost Nintendo little to nothing to go with the newer Arm cores, but it seems Nvidia might not have a license for them.
On top of that, it's odd that there are no little cores at all, since a pair of A55 cores would have been able to handle background tasks and whatnot, freeing up the A78 cores for the important stuff. It just doesn't look like a good chip for a battery-powered device.
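The big/little split described above can be sketched as a toy dispatcher: light background work goes to the little cores, heavy work to the big ones. This is a sketch only; the core names, the 0.2 load threshold, and the example tasks are all made-up illustrations, not how a real scheduler (such as Linux's energy-aware scheduling) actually decides placement.

```python
# Toy big.LITTLE task placement sketch (illustrative assumptions only).
LITTLE = ["A55-0", "A55-1"]           # hypothetical little cores
BIG = [f"A78-{i}" for i in range(8)]  # the eight big cores

def place_tasks(tasks):
    """Map (name, load) pairs to cores: light work goes to the little
    cores, heavy work to the big cores, round-robin within each pool."""
    placement = {}
    idx = {"little": 0, "big": 0}
    for name, load in tasks:
        if load < 0.2:                # "background" level of CPU demand
            pool, key = LITTLE, "little"
        else:
            pool, key = BIG, "big"
        placement[name] = pool[idx[key] % len(pool)]
        idx[key] += 1
    return placement

tasks = [("wifi-scan", 0.05), ("telemetry", 0.02), ("game-render", 0.9)]
print(place_tasks(tasks))
# {'wifi-scan': 'A55-0', 'telemetry': 'A55-1', 'game-render': 'A78-0'}
```

The point of the argument: with only big cores, the "wifi-scan" and "telemetry" rows above would occupy power-hungry A78s instead.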
> Cortex A710, A715, A720 & A725 are not optimized for implementation on a 10 nm/8 nm node; they are optimized for 5 nm and lower. The A78 is already a bit too powerful for 10 nm/8 nm, not to mention Nintendo wanting to waste as little power budget as possible on the CPU.

If they don't want to waste power budget on the CPU, they clearly screwed up big time then, as they went with eight A78 cores on an old, power-hungry node...
> If by "cost them little to nothing" you meant potentially billions of dollars, then yes, you are correct.

What are you on about?

> Even ignoring the fact that these newer cores won't work as advertised on 10 nm/8 nm, you still have to pay NVIDIA millions of dollars to redesign the chip, pay ARM more licensing fees, and increase the BoM cost with a bigger die and lower yields... etc.

Well, you read my comment whatever way you want, but nowhere did I state that they had to use an old node, nor Nvidia as partner, but OK, you make up some shit to make me look bad.

> Little cores are useless for a gaming device, plus there is a reason why even mobile SoCs are moving away from little cores.

I guess you didn't bother reading what I wrote then? Did I mention they were for gaming? You do understand how these SoCs work, I presume? And if it was so bad to have these lower-power cores, why did Intel move to three levels of them in their more recent mobile chips?
Processor | AMD Ryzen 5700X |
---|---|
Motherboard | MSI X470 |
Cooling | Assassin King 120 |
Memory | 16GB DDR4 |
Video Card(s) | 6700XT |
Storage | SK Hynix Platinum |
> Why are people even surprised at Nintendo pushing last-last-last-gen hardware to its loyal customers? It's not like they won't pay for it anyway.

Maybe Nintendo/JHH is underestimating comrade DJT.
> I wonder how much Nintendo paid Nvidia for this old chip?

A lot less than this, for sure.
> If they don't want to waste power budget on the CPU, they clearly screwed up big time then, as they went with eight A78 cores on an old, power-hungry node...

It's called balancing cost, power and performance.

> What are you on about?

> Well, you read my comment whatever way you want, but nowhere did I state that they had to use an old node, nor Nvidia as partner, but OK, you make up some shit to make me look bad.

> I guess you didn't bother reading what I wrote then? Did I mention they were for gaming? You do understand how these SoCs work, I presume? And if it was so bad to have these lower-power cores, why did Intel move to three levels of them in their more recent mobile chips?
> You always have a boatload of background tasks, which these low-power cores can deal with, while the big, powerful cores focus on the big tasks, like gaming.

Little cores on x86 are there to fix the fatal issue with x86 performance cores, which is efficiency at no/very low load; ARM cores don't have that problem.

> I presume you work for Nintendo or Nvidia and were involved in picking this chip?

No, just some knowledge and common sense.
> It's called balancing cost, power and performance.

Nothing balanced about this chip, though.

> By changing NVIDIA, they have to either give up backward compatibility or resort to emulation, which will also potentially cost them millions (if not billions) of dollars.
> No need to be defensive; nobody is trying to make you look bad lol, we are in a tech forum, not a beauty pageant.

Eh? I think you got your numbers off by a fair few zeroes here.

> Little cores on x86 are there to fix the fatal issue with x86 performance cores, which is efficiency at no/very low load; ARM cores don't have that problem.
> Every use case scenario has different requirements.

Yet the small Arm cores handle background tasks, which this thing will have as well. Plus, you'd save on battery when you're not in a game, just like the little cores are the ones mostly used on your phone when you're not doing something CPU-intensive. Most of my "use time" is on the small cores at low frequencies; this is since the last reboot a couple of weeks ago, so it's not super accurate, as I haven't run anything taxing since then. On a mobile device of any kind, you want this kind of behaviour, so you don't waste battery power on mundane tasks.

> All consoles reserve 1-2 cores for the system, as the developers can't access the full cores.

Proof of this?

> Switch 1's SoC had little cores that Nintendo didn't want/use, and for Switch 2 they won't waste precious die space and millions on development costs for no benefit whatsoever.

That only shows that Nintendo doesn't understand mobile devices well.

> No, just some knowledge and common sense.

I agree to disagree on this.
> Nothing balanced about this chip, though.

I'm not talking about the chip itself; I'm talking about Nintendo choosing which chip to use based on cost, and then how to allocate the power budget ratio between the CPU, GPU, etc.

> Eh? I think you got your numbers off by a fair few zeroes here.
> Also, what emulation? The thing runs Android; it would just be tuning for a different GPU.

The Switch 2 runs Android? That's news to me and to Nintendo as well.

> Yet the small Arm cores handle background tasks, which this thing will have as well. Plus, you'd save on battery when you're not in a game, just like the little cores are the ones mostly used on your phone when you're not doing something CPU-intensive. On a mobile device of any kind, you want this kind of behaviour, so you don't waste battery power on mundane tasks.

Spending 1 W instead of 1.5 W won't do much for a gaming device.

> Proof of this?

> That only shows that Nintendo doesn't understand mobile devices well.

Lol, yes, I'm pretty sure all the engineers there don't and you do; they should really hire you.

> Also, Nintendo clearly spent ZERO dollars on development costs, since they took a bodge job from Nvidia that was never intended for mobile devices.
> Just because a company is bad at hardware doesn't mean this is the right way to do things.

Spent zero dollars on development costs? Buddy, stop making shit up; the T239 is a semi-custom chip made for Nintendo from the original T234. NVIDIA isn't a charity that'll do this for free for Nintendo.

> I agree to disagree on this.
> How many devices have you been involved in developing? I've at least been part of a dozen or so, nothing as fancy as this, but mostly Arm Cortex-something devices.
> Considering the SoC is likely to cost US$50-100 on its own (based on what Nvidia's developer boards cost), it's really a terrible choice of chip, both in terms of power efficiency and performance, notwithstanding the GPU. But I guess that's how it goes if you don't want to spend any money on the hardware development.
> The Switch 2 runs Android? That's news to me and to Nintendo as well.
> "Tuning for a different GPU" doesn't work, buddy; this is not a PC. The whole binary code for the game will change, and it'll need a lot of development time by developers to port the games again to a completely different GPU, or Nintendo has to develop an emulator.

OK, some of it is heavily Android-derived at least, like the graphics driver, so it would be very easy to tune for a different GPU, as they use a Linux-ish driver.

"Components derived from Android code include the Stagefright multimedia framework, as well as components of the graphics stack[5] including the display server (derived from SurfaceFlinger) and the graphics driver (which seems to be derived from Nvidia's proprietary Linux driver)."

There's no such thing as half cores.

> Lol, yes, I'm pretty sure all the engineers there don't and you do; they should really hire you.

You know jack shit about who I am or what I have worked with, so maybe they should hire me; you wouldn't know anyhow.

> Spent zero dollars on development costs? Buddy, stop making shit up; the T239 is a semi-custom chip made for Nintendo from the original T234. NVIDIA isn't a charity that'll do this for free for Nintendo.

So what you're saying is that you agree, since Nvidia did all the work. Cool.

> If you bothered to click the links in the OP, then you'll see that they estimated the cost of the SoC at $21.517.

Estimated by some random person in xina, sure... That's not a reliable source, and even if that was the cost to Nvidia, that is not what Nintendo would pay them. Or do you pay cost price for everything you buy? I can tell you've never sourced components either.
> OK, some of it is heavily Android-derived at least, like the graphics driver, so it would be very easy to tune for a different GPU, as they use a Linux-ish driver.

Lol, just because they took a few components from AOSP, it suddenly runs on Android? What great logic.

> There's no such thing as half cores, but ok dude.

Lol what?

> So what you're saying is that you agree, since Nvidia did all the work. Cool.

Of course NVIDIA will do the chip-designing work; they are the company that made the SoC, after all.

> You clearly don't know how this industry works.

Yes, you are clearly the one that does, with your $100-per-SoC estimate lol.

> Yes, Nintendo spent nothing on the development cost, and Nvidia spent as little as possible as well, since they re-used what they had with minimal effort.
> The cost Nintendo is paying is the cost of the chips Nvidia sells to them, not the development of said chip.
> Based on your flawed logic, all the graphics card makers are paying Nvidia up front for the company to make GPUs for them, which is not the case.

You don't know the details of the deal between NVIDIA and Nintendo; in the end, the cost of development will be paid by Nintendo one way or another.

> Estimated by some random person in xina, sure... That's not a reliable source, and even if that was the cost to Nvidia, that is not what Nintendo would pay them. Or do you pay cost price for everything you buy? I can tell you've never sourced components either.

Random person in xina? Lol, the whole die analysis was made by KurnalSalts, and his report is ~100 pages on the T239.
Processor | i7 4770K |
---|---|
Motherboard | Asus Z87-Expert |
Cooling | Noctua NH-U12S, &case fans all controlled by Aquaero 6 |
Memory | 2x8GB TeamGroup Xtreem LV 2133MHz |
Video Card(s) | Vega 64 |
Storage | Samsung 840 Pro + 2x 5GB WD Red@RAID1 |
Display(s) | Dell U3014 |
Case | Lian Li PC-A71B |
Audio Device(s) | Sound Blaster ZxR, Objective2 (2x), AKG K702&712, Beyerdynamic DT990 |
Power Supply | Seasonic Prime Titanium 650 (+Eaton 5P 1550 as "backup power") |
Mouse | Logitech G700 |
Keyboard | Logitech G810 |
> It's called balancing cost, power and performance.

Power and performance were both seriously screwed up by using an old-everything design on an even older node.

> All consoles reserve 1-2 cores for the system, as the developers can't access the full cores.

A "tube fed" console situation isn't relevant for a battery-powered mobile device.

> 13 threads out of the 16 total are accessible to developers; 13 threads = 6.5 cores. Is this hard for you to understand?

An SMT thread isn't half of a core.
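The "threads ÷ 2 = cores" arithmetic being disputed here can be sanity-checked with a rough throughput model: an SMT sibling typically adds only a fraction of a full core's throughput, not another 50%. The 25% yield figure below is an assumed, illustrative number; real SMT gains vary widely by workload.

```python
# Rough SMT throughput model (SMT_YIELD is an assumption, not a measurement).
SMT_YIELD = 0.25  # extra throughput a second thread adds on one core

def effective_cores(full_cores: int, extra_smt_threads: int) -> float:
    """Core-equivalents of `full_cores` dedicated cores plus
    `extra_smt_threads` SMT siblings sharing already-counted cores."""
    return full_cores + extra_smt_threads * SMT_YIELD

# 6 full cores + 1 extra SMT thread: ~6.25 core-equivalents under this
# model, not the 6.5 a naive "13 threads / 2" count suggests.
print(effective_cores(6, 1))
# 6.25
```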
Processor | 13th Gen Intel Core i9-13900KS |
---|---|
Motherboard | ASUS ROG Maximus Z790 Apex Encore |
Cooling | Pichau Lunara ARGB 360 + Honeywell PTM7950 |
Memory | 32 GB G.Skill Trident Z5 RGB @ 7600 MT/s |
Video Card(s) | Palit GameRock OC GeForce RTX 5090 32 GB |
Storage | 500 GB WD Black SN750 + 4x 300 GB WD VelociRaptor WD3000HLFS HDDs |
Display(s) | 55-inch LG G3 OLED |
Case | Cooler Master MasterFrame 700 benchtable |
Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
Mouse | Microsoft Classic IntelliMouse |
Keyboard | IBM Model M type 1391405 |
Software | Windows 10 Pro 22H2 |
Benchmark Scores | I pulled a Qiqi~ |
> Power and performance were both seriously screwed up by using an old-everything design on an even older node.
> In a battery-powered device, every little bit matters for getting good battery life, and we're not talking about little bits here, but glacial erratics.
> And low cost and Nvidia isn't a likely combination, the way Jensen always wants to charge an arm, both legs and half the internal organs.
> Or the part in question was the result of a journey to the "oldest sediments" in the dumpster.

And what's the alternative process they can use?

> A "tube fed" console situation isn't relevant for a battery-powered mobile device.
> In desk consoles, it doesn't matter if the cores dedicated to the OS and background tasks are full-size cores.
> Unless their idling is as bad as some Intel NetBurst...

Reserving cores is better than wasting silicon die on useless cores that'll only be used for background tasks.

> An SMT thread isn't half of a core.
> And no developer would use a thread lacking exclusive access to a core for anything important, precisely because of the lack of guaranteed processing resources.

Nothing is guaranteed on a console; almost everything is shared, including the memory. The developers still have access to that extra thread, which they can tap into wherever they see fit.
System Name | Firestarter |
---|---|
Processor | 7950X |
Motherboard | X670E Steel Legend |
Cooling | LF 2 420 |
Memory | 4x16 G.Skill X5 6000@CL36 |
Video Card(s) | RTX Gigabutt 4090 Gaming OC |
Storage | SSDS: OS: 2TB P41 Plat, 4TB SN850X, 1TB SN770. Raid 5 HDDS: 4x4TB WD Red Nas 2.0 HDDs, 1TB ext HDD. |
Display(s) | 42C3PUA, some dinky TN 10.1 inch display. |
Case | Fractal Torrent |
Audio Device(s) | PC38X |
Power Supply | GF3 TT Premium 850W |
Mouse | Razer Basilisk V3 Pro |
Keyboard | Steel Series Apex Pro |
VR HMD | Pimax Crystal with Index controllers |
> Lol, just because they took a few components from AOSP, it suddenly runs on Android? What great logic.
> I told you they should hire you, as everything is so easy to you; only a tune and it'll work lol.

OK, so I was wrong here, but they use standard drivers, so your complaint about it being hard to change to a different GPU is still moot.

> Lol what?
> 13 threads out of the 16 total are accessible to developers; 13 threads = 6.5 cores. Is this hard for you to understand?

And SMT is so good in games... So good, in fact, that there's a mode on a lot of motherboards to turn it off for gaming now...

> Of course NVIDIA will do the chip-designing work; they are the company that made the SoC, after all.
> R&D, cutting the GPU by 1/4, and completely redesigning the CPU portion, not to mention the other things, are all "minimal work" to you, lol.

So following your logic, you're saying that every time Nvidia does a cheaper version of a GPU, it costs them hundreds of millions or even billions of dollars?

> Yes, you are clearly the one that does, with your $100-per-SoC estimate lol.

You need to learn to read; I said US$50-100, which is a pretty big range.

> You don't know the details of the deal between NVIDIA and Nintendo; in the end, the cost of development will be paid by Nintendo one way or another.

No, I don't, but this is commonly how it works; no sensible company wants to pay up front for the development cost of hardware. Yes, Nintendo most likely had to pay an NRE fee, as that is an industry norm, but nothing more than that. I obviously don't know what Nvidia would charge as an NRE fee for something like this, but it's not hundreds of millions or billions as you're suggesting; I doubt it's even in the tens of millions, since they already had a chip design that they just modified slightly.

> Random person in xina? Lol, the whole die analysis was made by KurnalSalts, and his report is ~100 pages on the T239.

Good for whoever that is.

> Depends on the deal with NVIDIA, whether they paid a one-time fee for the tech and have full control of manufacturing (most likely) or they are purchasing it per unit; at a cost of $21.5, Nintendo will for sure end up paying <$50 per SoC in the end.

Highly unlikely, as Nintendo wouldn't want to 1. have to deal with the fabs, and 2. the chips are obviously marked Nvidia, not Nintendo.

> FYI, in order for the Switch 2 to be anywhere near profitable at the retail price of $450, the BoM has to be <$200.

Consoles aren't profitable? I thought that was a common fact?

> Just the thought of you thinking the SoC alone can cost $100 is beyond laughable.

Again, I said US$50-100, not $100, but you keep using the higher price in the range.
> OK, so I was wrong here, but they use standard drivers, so your complaint about it being hard to change to a different GPU is still moot.

So these components are a graphics driver now? Lol, you don't even know what these are.

> And SMT is so good in games... So good, in fact, that there's a mode on a lot of motherboards to turn it off for gaming now...

Just because it can be bad sometimes in PC games, it must be bad on consoles as well? Got your logic lol.

> So following your logic, you're saying that every time Nvidia does a cheaper version of a GPU, it costs them hundreds of millions or even billions of dollars?
> Yes, this is minimal work; clearly you don't know much about chip design either.
> Once you have a working design, it's quite easy today to make variants of that chip, hence why Nvidia has a whole bunch of these Arm chips with just minor differences. If it was as complex and expensive as you suggest, it wouldn't happen. Same with all chips out there; we would only ever see one version. Yes, it costs a good chunk of cash to tape out a chip, but nowhere near the numbers you're suggesting.

Yes, that's why NVIDIA, with their tens of thousands of engineers, take over a month to design a smaller GPU out of the bigger one, when it should be just minimal work.

> You need to learn to read; I said US$50-100, which is a pretty big range.
> Considering that the Jetson Orin Nano 8GB costs US$249, with 8 GB of RAM, an Ethernet chip and a micro SD card slot, plus some power regulation components, it's not hard to draw the conclusion that the SoC itself is somewhere in the $50 region, maybe a tad more for Nintendo, since Nvidia wants to make up for the customisation costs.
> But yeah, you clearly know best, who has never made a single device...

Lol, no, you know better, with the Android Switch OS.

> No, I don't, but this is commonly how it works; no sensible company wants to pay up front for the development cost of hardware. Yes, Nintendo most likely had to pay an NRE fee, as that is an industry norm, but nothing more than that.

Lol, Sony and Microsoft both do pay upfront.

> Consoles aren't profitable? I thought that was a common fact?

Nintendo always sells hardware for profit, unlike Sony and Microsoft, who rely more on software and services.
They're sold at cost at best.

> Again, I said US$50-100, not $100, but you keep using the higher price in the range.
> See comment above about cost.
> It's funny how you're so sure that I'm wrong, no matter what, yet you've not come up with any proof that I'm wrong.

No, there wasn't any proof of anything in any of my replies; I must've posted some random shit then. I blame my 6.5-core PS5, my Android-running Switch, my preordered minimal-work Switch 2, and the random guy on the internet's die shots and analysis for that.