
AMD to Power Next-Generation NES

btarunr

Editor & Senior Moderator
Nintendo is working on a next-generation gaming console to succeed even the fairly recent Wii U. The company is reacting to the plummeting competitiveness of its current console against the likes of the PlayStation 4 and the Xbox One. Reports suggest that Nintendo would course-correct the direction in which it took its game console business with the Wii, and could come up with a system that's focused on serious gaming even as it retains its original "fun" quotient. In that sense, the console could be more NES-like than Wii-like.

Nintendo could ring up AMD for the chip that will drive its next console. It's not clear if AMD will supply a fully-integrated SoC that combines its own x86 CPU cores with its GCN graphics processor, or simply supply the GPU component for an SoC that combines components from various other manufacturers. The Wii U uses IBM's CPU cores with AMD's GPU, combined onto a single chip. There's no word on when Nintendo plans to announce the new console, but one can expect a lot more news in 2015-16.

View at TechPowerUp Main Site
 
But is that a smart decision? In a way it is, because anyone not interested in tennis and the other aerobics offered on the Wii didn't even consider the Wii at all. But then again, when they bring themselves back to the same level as the Xbox and PS, they'll lose the drastic difference they had with the Wii. Meaning they'll face direct competition again. I hope they have some tricks up their sleeve, and that those tricks won't be moronic exclusives but something else.
 
But is that a smart decision? In a way it is, because anyone not interested in tennis and the other aerobics offered on the Wii didn't even consider the Wii at all. But then again, when they bring themselves back to the same level as the Xbox and PS, they'll lose the drastic difference they had with the Wii. Meaning they'll face direct competition again. I hope they have some tricks up their sleeve, and that those tricks won't be moronic exclusives but something else.
One of the reasons the Wii was a winner was because it was cheaper than the PS3 and X360.
If they bump up the hardware, they'll have to bump up the price, and that didn't work out very well for the Wii U
 
Liking this news a great deal. As a gamer who basically has Nintendo in their blood (I started gaming at the age of 2 with the original Super Mario Bros.), I was quite disappointed with what the Wii U had to offer, which is why I still have yet to own one - and I've owned/own most every Nintendo system ever made. Even the original Wii had games that made the system worthwhile, at least to me. But when I look at what the Wii U has to offer, I just kinda shrug; yes, it has Smash Bros., but so does the 3DS. It also has Mario Kart 8 (long-time fan of Mario Kart, btw) and the usual Mario games, but what about those "must have" Zelda or (especially) Metroid games? What about decent 3rd-party support? The Wii U is kind of a dud in both cases. At any rate, I would love to see the big N become more competitive again.

*sigh* Am I the only one here who longs for the good ole' days of console gaming, when you and your friends would constantly argue over the SNES and Genesis and which one was better? For the record, I had my flag in both camps, as I owned both. Great times. :toast:
 
There's no reason to use PowerPC. It's not going to be cheaper (and definitely not more powerful; the PowerPC cores in the last consoles were really anemic even at release).
 
I'd want a 4-core Jaguar @ >2 GHz + >1024 GCN cores and 4 GB of RAM for games. Nintendo has been notorious for using proven (old) tech, so I don't think they'd use the latest designs from AMD. The OS can run on an ARM core with a dedicated pool of RAM or something.

I really hope they don't ditch the Wii-mote, they just need to include the classic controller with every console.

Loving the Wii U so far; this year had good releases, but I still play Wii games more than Wii U ones and see no real use for the tablet pad, although the remote play is neat. Had Nintendo ditched the tablet pad and sold the Wii U for $50 less, we could be looking at a different landscape.
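
For rough context on the wishlist above, here's a back-of-the-envelope sketch of theoretical GCN shader throughput. The 0.8 GHz clock for the hypothetical Nintendo part is purely an assumption for illustration; the PS4 and Xbox One figures use their published shader counts and GPU clocks.

[CODE]
# Rough napkin math, not an official spec: a GCN GPU's theoretical FP32 throughput
# is shaders x 2 FLOPs/clock (FMA) x clock speed. The 0.8 GHz clock for the
# hypothetical Nintendo part is an assumption; PS4/Xbox One use published figures.
def gcn_tflops(shaders: int, clock_ghz: float) -> float:
    """Theoretical single-precision TFLOPS for a GCN GPU."""
    return shaders * 2 * clock_ghz / 1000.0

configs = {
    "Wishlist (1024 shaders @ 0.8 GHz, assumed clock)": (1024, 0.8),
    "PS4 (1152 shaders @ 0.8 GHz)": (1152, 0.8),
    "Xbox One (768 shaders @ 0.853 GHz)": (768, 0.853),
}

for name, (shaders, clock) in configs.items():
    print(f"{name}: ~{gcn_tflops(shaders, clock):.2f} TFLOPS")
# Wishlist ~1.64, PS4 ~1.84, Xbox One ~1.31
[/CODE]

So a 1024-shader part at a conservative clock would land a little below the PS4 and a little above the Xbox One in raw throughput, which fits the "proven, not cutting-edge" approach the post describes.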
 
One of the reasons the Wii was a winner was because it was cheaper than the PS3 and X360.
If they bump up the hardware, they'll have to bump up the price, and that didn't work out very well for the Wii U

What good is it for a console to be cheaper if it has no games you're interested in? That's one of the reasons why I never even considered the Wii myself. But I had brief moments of wanting a PS3 or the X360...
 
They need to expand their studios and output more games, and not just games with Mario. A new console is going to mean nothing if they can't deliver reasons to own the system, especially considering they are going to be this late to the current gen, and the Wii U isn't exactly able to compete with the current-gen consoles in either performance or 3rd-party support.
 
All they need to do is keep making the SNES, it is the best console ever, after all. :)
 
Sounds like they totally gave up on general x86 CPUs.
 
Put a PS4-style APU in the new NES and they're ready to go.
 
Nintendo has a longer history with ATi than many can remember.
The GameCube used an ATi graphics chip (Flipper), and the Wii later used a derivative of the same chip (Hollywood).

Personally, I understand Nintendo. Choosing AMD for their console is going to make games easier to develop for.

Unfortunately, AMD's sluggish progress on lithography and efficiency makes me cringe when incredible solutions exist from Intel on the CPU side and NVIDIA on the GPU side.
 
One of the reasons the Wii was a winner was because it was cheaper than the PS3 and X360.

True, but it was also a winner because at that time they beat Microsoft and Sony to the market, who both then had major delays, and thus Nintendo had a product out for that crucial end of the year holiday sales.
 
Down with backwards compatibility! Who needs that! Booooooooo!

Meanwhile, I'll play original NES games on my computer...and SNES, and Sega, and PlayStation, and N64, and Xbox, and so on, and so forth.
 
Well, seeing as my roommates already have a PS4 and an Xbox One... and I've got my gaming PC... I might end up getting one... depends on what else is out at the time.
 
I'd want a 4-core Jaguar @ >2 GHz + >1024 GCN cores and 4 GB of RAM for games. Nintendo has been notorious for using proven (old) tech, so I don't think they'd use the latest designs from AMD. The OS can run on an ARM core with a dedicated pool of RAM or something.

I really hope they don't ditch the Wii-mote, they just need to include the classic controller with every console.

Loving the Wii U so far; this year had good releases, but I still play Wii games more than Wii U ones and see no real use for the tablet pad, although the remote play is neat. Had Nintendo ditched the tablet pad and sold the Wii U for $50 less, we could be looking at a different landscape.


I actually like the controller; it works great for letting a kid play without interrupting my TV or game time, and it lets kids play competitively or co-op much more easily. I think the failure is the general stagnation of the gaming landscape: few games other than point-and-click, follow-the-storyline games or shoot-em-up, rinse-and-repeat games have been made recently.

My only gripe is the occasional freeze with USB storage I have experienced, and a second pad would make it even better. Use the power of the console, and it really does have enough, to let kids play on two pads against and with each other in games. Hell, I would put it in the van if they could do that.
 
The AMD APU makes sense in terms of cost savings and a unified memory architecture. Off the shelf parts are already capable of impressive visuals without the custom design route taken by Microsoft and Sony. The x86 architecture also provides a reliable future and good backwards compatibility down the road.

I also wouldn't be surprised if Intel stepped into the console/APU space to provide a competing solution.
 
The AMD APU makes sense in terms of cost savings and a unified memory architecture. Off the shelf parts are already capable of impressive visuals without the custom design route taken by Microsoft and Sony. The x86 architecture also provides a reliable future and good backwards compatibility down the road.

I also wouldn't be surprised if Intel stepped into the console/APU space to provide a competing solution.

Unless Intel were to sell them at cost to continue their ill-gotten monopoly, I doubt it will happen. I doubt even those assholes have the clout to pay off m$, Sony or Nintendo to use their crap (Intel graphics would be a very hard sell regardless).

Plus, it's probably not worth it for them. They can make better margins elsewhere.
 
One of the reasons the Wii was a winner was because it was cheaper than the PS3 and X360.

It was also a winner because they kept their console costs low and therefore profits high. Reusing older hardware by bumping up clock speed was a brilliant move to get the hardware out the door and shortcut software development time.
 
Nintendo has a longer history with ATi than many can remember.
The GameCube used an ATi graphics chip (Flipper), and the Wii later used a derivative of the same chip (Hollywood).

Personally, I understand Nintendo. Choosing AMD for their console is going to make games easier to develop for.

Unfortunately, AMD's sluggish progress on lithography and efficiency makes me cringe when incredible solutions exist from Intel on the CPU side and NVIDIA on the GPU side.

Actually, they don't. NVIDIA may have the better GPU at the moment, but they don't have any CPU capability. Intel, on the other hand, does, but they produce rubbish GPUs. AMD is really the only one that produces the best of both worlds in the form of APUs: cheap and powerful units that are perfect for consoles, which need low cost and high performance. That performance can be delivered more easily since console games are programmed closer to the metal and waste less power on compatibility. That's basically why last-gen GPUs run new games as fast as the latest desktop GPUs run the same games.

That may change slightly with Direct3D 12, though, but still...
 
Unless Intel were to sell them at cost to continue their ill-gotten monopoly, I doubt it will happen. I doubt even those assholes have the clout to pay off m$, Sony or Nintendo to use their crap (Intel graphics would be a very hard sell regardless).

Plus, it's probably not worth it for them. They can make better margins elsewhere.

Your comments would have been accurate 10 years ago. Since that time ARM has become a significant threat and Intel is moving towards protecting their market share more than their margins.

The Intel graphics have also improved quite a bit since then. You might be surprised where they are in a couple of years.
 
Your comments would have been accurate 10 years ago. Since that time ARM has become a significant threat and Intel is moving towards protecting their market share more than their margins.

The Intel graphics have also improved quite a bit since then. You might be surprised where they are in a couple of years.

I remember Larrabee o_O

I guess they think they'd have to spend more than $2 billion, like the Larrabee disaster, to make a worthwhile GPU. Of course, they have the cash...
 
Nintendo has a longer history with ATi than many can remember.
The GameCube used an ATi graphics chip (Flipper), and the Wii later used a derivative of the same chip (Hollywood).

Personally, I understand Nintendo. Choosing AMD for their console is going to make games easier to develop for.

Unfortunately, AMD's sluggish progress on lithography and efficiency makes me cringe when incredible solutions exist from Intel on the CPU side and NVIDIA on the GPU side.

I wouldn't define Intel solutions as "Incredible", and Nvidia is not far far away from AMD on the GPU side.
Why would you cringe when AMD is the only one to have a proper APU solution? AMD has decent offerings, far from the "disaster" you describe.
 
I'll just paste my thoughts from NPU:

I think it's pretty simple what they should do. Hear me out.

Say a console generation lasts 7-8 years. I figure the other two will aim for 2020.
Let's say the end goal of this generation is for the Xbox One to still run titles at 720p/30 fps, while the next gen will be aimed at 4K (in ideal situations, similar to this gen and 1080p).

If Nintendo were to build something roughly 5x faster than the Xbox One, it could run those same XB1 titles at 1080p/60 for the length of time those consoles are in competition. In turn, if 3-4 years after that Sony/Microsoft were to release another generation aimed at 4K (roughly 9-10x faster than the Xbox One), even if cross-platform titles were 1080p/60 on those systems, they could still be 720p/30 on the then-older Nintendo console.

Make sense? Basically pull a Samsung (in reference to Galaxy vs Apple phones) and split the generation. They could claim a good couple to a few years as top dog, while maintaining relevance as production gets cheaper and even newer consoles arrive.

In my estimation this would require something similar to a 290X at around 1100 MHz for its GPU (which is what everyone probably cares about). While that may sound asinine for a console at this moment, especially as part of an APU, the power/size benefits of 14 nm (be it GlobalFoundries or TSMC 16nm+), as well as advances that should be available through stacked DRAM (HMC), should all come to fruition by late 2015 to early-ish 2016. If Nintendo were to jump reasonably quickly on those techs to make a competitive product, I bet they could have something out the door by 2017... and hence the aforementioned scenario occurs.
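
As a quick sanity check on the 5x and 9-10x figures above, here's the pixel-rate arithmetic behind them. It assumes required GPU performance scales roughly linearly with pixels per second, which real games only loosely follow; the resolutions and framerates are the ones cited in the post.

[CODE]
# Back-of-the-envelope scaling: assume required GPU performance grows roughly
# linearly with pixels per second. Real games only loosely follow this.
def pixel_rate(width: int, height: int, fps: int) -> int:
    return width * height * fps

xb1_floor   = pixel_rate(1280, 720, 30)    # Xbox One worst case cited above: 720p/30
nintendo    = pixel_rate(1920, 1080, 60)   # proposed mid-cycle target: 1080p/60
next_gen_4k = pixel_rate(3840, 2160, 30)   # later 4K generation at the same 30 fps floor

print(nintendo / xb1_floor)     # 4.5 -> "roughly 5x faster than the Xbox One"
print(next_gen_4k / xb1_floor)  # 9.0 -> "roughly 9-10x faster"
[/CODE]

So 1080p/60 is about 4.5x the pixel throughput of 720p/30, and 4K at the same framerate floor is about 9x, which is where the post's multipliers come from.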
 
I wouldn't define Intel solutions as "Incredible", and Nvidia is not far far away from AMD on the GPU side.
Why would you cringe when AMD is the only one to have a proper APU solution? AMD has decent offerings, far from the "disaster" you describe.

You wouldn't define the ability to get 2C/4T Haswell + iGPU at under 5 W as incredible? I would. And no, NVIDIA isn't far far away; one far is enough.

AMD has decent offerings, and decent has brought enough potato-like trouble to both current-gen consoles, with an array of devs unable to deliver either a 1080p image and/or 60 FPS in most games.
No, no thanks. Hopefully Nintendo will use more powerful tech in the 20 nm gen. Maybe not.
 