Friday, November 16th 2012

AMD Powers Brilliant HD Game and Video Performance for Nintendo's Wii U

AMD is proud to support Nintendo's newly launched Wii U home console as the supplier of custom AMD Radeon HD GPU technology. As announced at E3 in 2011, the custom-for-Nintendo AMD graphics processor enables Wii U to provide exciting, immersive game play, brilliant HD video and game graphics and new forms of interaction for consumers. Since 2001, AMD technology has been included in more than 118 million Nintendo Wii and GameCube hardware units around the world.1

The AMD GPU will help bring Nintendo's popular franchises into HD for the first time with new innovative game-play experiences, and the new Wii U GamePad controller, which creates a second window into the game world. "Wii U and its GamePad controller offer completely new and unexpected game-play and entertainment experiences," said Genyo Takeda, General Manager, Integrated Research & Development Division, Nintendo Co., Ltd. "We chose AMD to support our HD gaming efforts with its best-in-class graphics capabilities, and we're proud to call them a technology partner."

"Our relationship with Nintendo is the next exciting chapter in the long AMD history of supplying the game console market with our elite graphics expertise," said Saeid Moshkelani, corporate vice president and general manager, Semi-Custom Business Unit, AMD. "Working so closely with Nintendo to create the ideal custom graphics processor for Wii U is another example of how AMD stands for giving consumers the best video entertainment and gaming experience -- whether that's a next-generation console, desktop and notebook PC, big screen HDTV or tablet."

The new Wii U is available just in time for the Holidays along with other AMD-powered notebooks, tablets and desktops, rounding out tech-savvy consumers' holiday wish list. According to the Consumer Electronics Association, three in four gift-giving adults plan to buy a consumer-electronics product as a gift. AMD-powered products provide superior computing experiences for consumers -- from more brilliant graphics when gaming to longer battery life when on the go -- ultimately letting people do more every day.

49 Comments on AMD Powers Brilliant HD Game and Video Performance for Nintendo's Wii U

#26
seronx
jihadjoe: Yes there is someone in AMD stealing their moneys... the CPU division.
The CPU division is paid for; it is the GPU division that's stealing from the CPU division. If you have noticed, everyone of importance has been booted out on the GPU side, while the majority of the CPU and SoC engineers from 2007-2009 are still at AMD. Most of the oldies from before 2001 at AMD and before 2006 at ATI are pretty much gone and have been replaced by younger engineers.

AMD is going to sell VLIW4/VLIW5 patents, documentation, etc. and the ATI name (and brand) for $1.5-$3b (not final price) if you don't know what I'm getting at.
Posted on Reply
#27
Ikaruga
Vinska: It's not the size that matters, but how you use it... badum-tish!
Did it ever occur to you that he was talking about the size, and I was talking about bad programmers?

A huge percentage of the games you will see on the Wii U will be multi-platform titles released on the Xbox 360 and the PS3, where devs had to do most of the work on the "powerful" CPUs. What I meant is that porting those games to the Wii U (where the GPU is the strong component) is going to give a hard time to devs without enough experience or coding skills. And even with good programmers, you sometimes need a different approach (for example, I think Rage would need to use "GPU transcoding" on the Wii U, similar to what it has on the PC).
Posted on Reply
#28
librin.so.1
@Ikaruga

Yes, I am fully aware of this and I do know what they were talking about. If You rolled a 1 on "spot a joke" skill check, well, that post of mine was this. ;)
Posted on Reply
#29
Ikaruga
Vinska: @Ikaruga

Yes, I am fully aware of this and I do know what they were talking about. If You rolled a 1 on "spot a joke" skill check, well, that post of mine was this. ;)
ok, sorry :toast:
Posted on Reply
#30
M3T4LM4N222
AMD denied the rumors that they're looking for a buyer... even though their assets have decreased, their liabilities have shrunk at a quicker rate. They are actually in a better financial position than they were in 2008. In 2008 they had $7,672 million in assets and $7,545 million in debt. That margin has since widened: they now have $4,954 million in assets and $3,364 million in debt, probably due to the large success of Phenom II and the 5xxx/6xxx/7xxx series.
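For what it's worth, here's a quick sanity check of that margin claim, using only the figures above (a rough back-of-the-envelope sketch in Python; all values in millions of USD as quoted, not audited numbers):

# Asset/liability margin from the figures quoted above (millions of USD).
balance = {
    2008: {"assets": 7_672, "liabilities": 7_545},
    2012: {"assets": 4_954, "liabilities": 3_364},
}

for year, b in balance.items():
    margin = b["assets"] - b["liabilities"]
    print(f"{year}: {b['assets']}M assets - {b['liabilities']}M liabilities = {margin}M margin")

# 2008: 7672M assets - 7545M liabilities = 127M margin
# 2012: 4954M assets - 3364M liabilities = 1590M margin

So the cushion between assets and liabilities did widen, from roughly $0.1B to roughly $1.6B.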


AMD's long-term debt has gone from $5.5B in 2008 to $2B in 2012. They are nowhere near bankrupt or in any major financial trouble.


AMD is also powering every next-gen console (Wii U, PS4, Xbox) with either Trinity or some custom GPU, and since consoles sell so well, they'll probably make a decent amount of money off that.
Posted on Reply
#31
Isenstaedt
jihadjoe: Yes there is someone in AMD stealing their moneys... the CPU division.
So true. I wonder if AMD would still be alive if they hadn't bought ATi.
Posted on Reply
#32
xenocide
Not sure why so many people are shocked by the size of the SoC; the newest revisions of the 360 have almost the exact same thing in them, and I'd imagine the PS3's most recent hardware revision is similar.

I'm still waiting for details on the GPU. The CPU is basically a revised version of the CPU in the 360, with maybe one more core. I've heard conflicting reports of it being a tri-core in some places, and a quad-core in others. If it's a tri-core then I'm pretty sure it's quite literally the same CPU as the 360, since it's made by the same people and even sports the same clock speed. The GPU has my interest, because the Wii's GPU was depressingly bad--even for its time.

I am interested to see how the Wii U does, but I have no interest in buying one. I loved Nintendo for quite some time, but I can't justify buying a system just hoping it features a good Zelda and Metroid game. The last Metroid game (for Wii) was pretty lackluster in my books, and even the last couple of Zelda games haven't been all that wonderful (not since Majora's Mask imo). I can already tell you this system is going to feature Nintendo throwing developers under the bus. Every feature they've kind of skirted around so far has ended in them saying it will depend on the developers. VoIP in game? The developer has to make sure they set it up the correct way. Using the tablet with the TV switched to another setting? The developer's responsibility. Usefulness of the tablet as a whole? Up to the developer. Honestly, I love Nintendo games on Nintendo systems, but that's about it. I just don't have the money to throw down on a "Zelda/Metroid console".
Posted on Reply
#33
Mussels
Freshwater Moderator
TheMailMan78: I played one a few weeks ago and I gotta say it smokes the PS3 or 360 in image quality. Every game I played had AA and was smooth. I'll be getting one when the price comes down a lil'.
wait, really?

were the graphics simplified to achieve that level of smoothness?
Posted on Reply
#34
Xzibit
TheMailMan78: I played one a few weeks ago and I gotta say it smokes the PS3 or 360 in image quality. Every game I played had AA and was smooth. I'll be getting one when the price comes down a lil'.
$349.99 not too tempting

$299.99 tempting for Nintendo exclusives

$249.99 I'm sold

Maybe after E3 next year, when any news of the other two offerings forces the price down. A Zelda game would persuade me more.
Posted on Reply
#35
Ikaruga
xenocide: The CPU is basically a revised version of the CPU in the 360, with maybe one more core. I've heard conflicting reports of it being a tri-core in some places, and a quad-core in others. If it's a tri-core then I'm pretty sure it's quite literally the same CPU as the 360, since it's made by the same people and even sports the same clock speed. The GPU has my interest, because the Wii's GPU was depressingly bad--even for its time.
Link to source pls?
I just can't believe what you said. Nintendo confirmed backwards compatibility with Wii titles. Emulating the Hollywood won't be a problem on any VLIW5 GPU, but the CPU has to be the same architecture, or something different but a lot faster.
I think it's more likely that they either added the original Broadway plus two new modern cores, or enhanced it by making it multi-core and/or adding out-of-order execution. Don't forget that 99% of the rumors leaking from devs talk about large caches and/or eDRAM on the die, so the die is quite small indeed once you subtract the silicon needed for those.

There are about 3-4 different speculations out there which could be quite plausible, but "Nintendo using the Xenon" was never one of them. The bottom line is that there is absolutely no credible info about the CPU out there that can be confirmed atm, so if you know anything the rest of the Internet doesn't, please share ;)
Posted on Reply
#36
FordGT90Concept
"I go fast!1!11!1!"
natr0n: You would think AMD would be making tons of $$$ out of this deal, yet they are looking for a buyer.
Because ATI was only about 10% the size of AMD when AMD bought ATI out. That was during AMD's heyday too (not long after Athlon X2 processors came out). AMD has shrunk a lot since then, and ATI has never been, and likely never will be, big enough to sustain AMD.


Remember, the Wii was basically Nintendo's last-ditch effort at console gaming. That's why it was built with such low-end hardware: Nintendo couldn't afford to lose money on every console sold like Sony and Microsoft did. They now have the resources to fiscally compete with Sony and Microsoft, so they focused on where the Wii was weakest: hardware.

It shows in the price tag too: $300 debut price versus $250 for the Wii.
Posted on Reply
#38
Mussels
Freshwater Moderator
Ikaruga: Wii U Teardown (if anyone is interested)
dammit, why a video? I WANT PICTURES
Posted on Reply
#39
Ikaruga
Mussels: dammit, why a video? I WANT PICTURES
I already linked one earlier :P You can see that they took pictures, so they will probably publish them later somewhere on their site.

At least we know that they are using one of the speed bins of the K4W4G1646B (probably 1600 MHz, but that's just my guess).
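If that guess is right, the bandwidth math is easy to do. Here's a rough sketch in Python; the assumption that there are four of those 16-bit chips on a combined 64-bit bus is mine, not something Nintendo has confirmed:

# Back-of-the-envelope main-memory estimate.
# Assumed (not confirmed): four K4W4G1646B chips (4 Gbit, x16 each)
# on a 64-bit bus at the DDR3-1600 speed bin (1600 MT/s).
chips = 4
bits_per_chip = 16
transfers_per_sec = 1600e6

bus_width_bits = chips * bits_per_chip               # 64-bit bus
bandwidth_gb_s = transfers_per_sec * bus_width_bits / 8 / 1e9
capacity_gb = chips * 4 / 8                          # 4 Gbit per chip

print(f"Capacity: {capacity_gb:.0f} GB")             # ~2 GB
print(f"Peak bandwidth: {bandwidth_gb_s:.1f} GB/s")  # ~12.8 GB/s

That would line up with the 2GB figure mentioned in the thread, and roughly 12.8 GB/s of peak bandwidth if it really is the 1600 bin.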
Posted on Reply
#40
TheMailMan78
Big Member
Mussels: wait, really?

were the graphics simplified to achieve that level of smoothness?
That's the thing... not at all. What I saw firsthand reminded me of PC-level graphics. Now, if they had BF3 running on it, I could give you much more of a description, but what they did have running left me with a smile. It felt like a real console with awesome graphics, not a watered-down PC trying to act like a console.
Posted on Reply
#42
KainXS
Besides the rumors, no one knows, and we won't know anything until someone gets a clear shot of the actual CPU/GPU package. All Nintendo has commented on is an IBM CPU and an AMD GPU; that's all, no real details beyond what we know from pictures. The GPU has 2GB of GDDR3 shared.
Posted on Reply
#43
xenocide
Ikaruga: Link to source pls?
I just can't believe what you said. Nintendo confirmed backwards compatibility with Wii titles.
If only the Xbox CPU and Wii CPU were made by the same company, largely using the same instruction sets. Maybe developed by a very large, 100-year-old company that's big and blue and all over the console industry.

I'm not saying it's the exact same hardware, only that it's very similar. It's not unreasonable to think a revised version of the Xenon could support a majority of the functionality that atrocious Wii CPU had. Worst case scenario, the thing was so weak they could probably just emulate anything it ran...
Posted on Reply
#44
TRWOV
I think it has been mentioned several times that the Wii U CPU is a PowerPC variant, which would mean Gekko again. I'd think it's a tri-core at least, as it has been mentioned to be a "multi-core processor".
Posted on Reply
#45
cdawall
where the hell are my stars
seronx: AMD is going to sell VLIW4/VLIW5 patents, documentation, etc. and the ATI name (and brand) for $1.5-$3b (not final price) if you don't know what I'm getting at.
I feel like you have been sniffing the stuff that came out of the Will It Blend mixer... particularly a blend of an Intel CPU and an nVidia GPU.
Posted on Reply
#46
drdeathx
seronx: The CPU division is paid for; it is the GPU division that's stealing from the CPU division. If you have noticed, everyone of importance has been booted out on the GPU side, while the majority of the CPU and SoC engineers from 2007-2009 are still at AMD. Most of the oldies from before 2001 at AMD and before 2006 at ATI are pretty much gone and have been replaced by younger engineers.

AMD is going to sell VLIW4/VLIW5 patents, documentation, etc. and the ATI name (and brand) for $1.5-$3b (not final price) if you don't know what I'm getting at.
Who says the CPU division is paid for? :laugh:

AMD's net loss is not from just the GPU division.
Posted on Reply
#47
xenocide
TRWOV: I think it has been mentioned several times that the Wii U CPU is a PowerPC variant, which would mean Gekko again. I'd think it's a tri-core at least, as it has been mentioned to be a "multi-core processor".
The most detailed description they gave was "a high performance multi-core CPU made by IBM based on the PowerPC design". They also mentioned it would have a hefty amount of eDRAM, and that it shared some of the tech that went into IBM Watson (which is pretty god damn vague).
Posted on Reply
#48
Ikaruga
xenocide: The most detailed description they gave was "a high performance multi-core CPU made by IBM based on the PowerPC design". They also mentioned it would have a hefty amount of eDRAM, and that it shared some of the tech that went into IBM Watson (which is pretty god damn vague).
It's probably very cheap hardware, since they also have to include the controller in the price, and consoles are not made for enthusiasts, so only the content matters, not the hardware. If Nintendo can somehow convince third parties to develop for the hardware, they will achieve their goal.
That being said, however slow the hardware is, this guy claims that his sister was also playing the same game on the controller screen while he played at 60fps in HD (1080i?) on the TV, which is probably an efficiency world record at a 30-40W-ish power draw.
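Purely for fun, here's what that claim works out to in raw pixel throughput per watt (a rough sketch in Python; the 854x480 GamePad resolution, the ~35 W figure, and 1080i on the TV are my assumptions for the math, not confirmed specs):

# Pixel throughput per watt for the scenario described above (assumed figures).
tv_pixels_per_sec = 1920 * 540 * 60     # 1080i: one 540-line field per refresh
pad_pixels_per_sec = 854 * 480 * 60     # GamePad screen at 60 fps (assumed)
power_watts = 35                        # rough console power draw (assumed)

total = tv_pixels_per_sec + pad_pixels_per_sec
print(f"Total: {total / 1e6:.1f} Mpixels/s")                     # ~86.8
print(f"Per watt: {total / power_watts / 1e6:.2f} Mpixels/s/W")  # ~2.5

Not a rigorous benchmark by any means, but it gives a sense of why the power draw is impressive.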
Posted on Reply