
NEW! Wii-U Will Feature ATI GPU

The 360 was not based on the Radeon HD 2K line (AMD was still releasing the X1800s when it launched); rather, the Radeon HD 2K line was based on the 360's GPU development. It's an X19xx GPU with modifications, an alpha version of the HD GPUs in a way.

Pretty accurate. It was an X19xx with some of the features that didn't appear on the PC market until the HD 2xxx series. What's more, the original 360 dev kits actually featured an X8xx-ish GPU. My theory is that this explains the huge improvement in visuals over time since launch: launch titles were developed on X800s, but later titles were developed on X1900s.

An important thing to remember is that Nintendo loves running nearly silent systems. I would not be surprised if the Wii U's CPU and GPU are underclocked quite a bit to accomplish this.
 
The only difference between a console GPU and a PC GPU in today's hardware, from both camps, is the eDRAM, which can provide more bandwidth than GDDR5 (the Xbox 360's eDRAM, for instance, is commonly cited at 256GB/s internally).

The software factor is not a hardware issue, so it shouldn't count when comparing hardware. There are games well programmed for PC that could be a lot better if they weren't being ported to current consoles: Crysis 3 - CryEngine3 Tech Trailer - YouTube

The thing is that even when those effects are used, there doesn't seem to be much difference between generations. Knowing that future consoles will have all that power to render only 720p graphics, I hope we see more visual improvement. Of course, PC gamers will be screwed and will have to CrossFire/SLI to render to higher-resolution buffers and handle the extra shading, texturing and rendering load, if they get that hardware by the time the new consoles appear. (A rough sense of that extra load is sketched below.)
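As a back-of-envelope illustration, here's the raw pixel-count difference between render targets, assuming shading cost scales roughly linearly with pixels rendered (it ignores geometry, bandwidth and post-processing differences):

```python
# Relative pixel shading load by output resolution.
# Assumes cost scales roughly linearly with pixel count, ignoring
# geometry, bandwidth and post-processing differences.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "1440p": (2560, 1440)}
base = 1280 * 720
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels, {w * h / base:.2f}x the 720p load")
```

By that crude measure, 1080p is already 2.25x the shading work of 720p, which is why a console targeting 720p can punch above its apparent weight.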
 
Well, it's nice to know that Nintendo decided not to put even older hardware in the Wii U, even though what they chose is two generations old.
 
All of these are rumours. They could even be using an equivalent to a Radeon HD 7750: 28nm process, 512 shaders, 32 TMUs and 16 ROPs, 72GB/s of bandwidth and a TDP of 55W, with an 800MHz core clock and 1125MHz memory. To the end user it cost $109 at launch (February 2012); for Nintendo it would be around $50 by now.
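That 72GB/s figure checks out, by the way: GDDR5 moves four bits per pin per clock, so bandwidth is just memory clock x 4 x bus width. A quick sketch, assuming the HD 7750's stock 128-bit bus:

```python
# GDDR5 bandwidth check for the HD 7750 numbers above.
# GDDR5 transfers 4 bits per pin per clock (quad data rate).
mem_clock_hz = 1125e6     # 1125 MHz memory clock, from the post
bus_width_bits = 128      # HD 7750 stock bus width
bandwidth = mem_clock_hz * 4 * bus_width_bits / 8   # bytes per second
print(f"{bandwidth / 1e9:.0f} GB/s")                # -> 72 GB/s
```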

The new super-slim PS3 is said to be using 32nm for the CPU and 28nm for the GPU.

But, given that they say they're using a 40nm low-power process, and the 400-shader figure is the only spec marked as a rumour, maybe they're using the equivalent of a Mobility Radeon HD 5870, but with higher memory speed to achieve those 75GB/s. The Mobility HD 5870 has a TDP of exactly 50W. The core has 800 shaders, 40 TMUs and 16 ROPs (800:40:16) at 700MHz. It's funny, because AMD codenamed this GPU Broadway XT, a name similar to the Wii's CPU, Broadway.
This would make the GPU in the Wii U similar in performance to a desktop HD 4870, but with the hardware features of Direct3D 11 and OpenGL 4 GPUs. If you add the eDRAM, then you can get some extra features for free, like anti-aliasing. Of course, some numbers and architectural components/efficiency can vary due to Nintendo's specific requirements (they mention many modern features have been incorporated), but I think that's the ballpark: a mobile Radeon HD 5000.
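For what it's worth, hitting 75GB/s on the Mobility 5870's 128-bit GDDR5 bus would take roughly this memory clock (a sketch, assuming the bus width stays at 128-bit):

```python
# What GDDR5 clock would a 128-bit bus need to hit 75 GB/s?
target_bw = 75e9          # bytes per second, the rumoured figure
bus_width_bits = 128      # Mobility HD 5870 stock bus width
clock_hz = target_bw / (4 * bus_width_bits / 8)   # GDDR5: 4 bits/pin/clock
print(f"~{clock_hz / 1e6:.0f} MHz")               # ~1172 MHz vs ~1000 MHz stock
```

So "higher memory speed" would mean bumping the memory clock by roughly 17%, which is not an unreasonable tweak for a semi-custom part.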

Hell, we could even say that the Wii U has an HD 7970M equivalent: it has a TDP of 75W when running at 850MHz, so it could run at 640MHz with half the bus width to achieve a 50W TDP and still have a core with 1280 shaders, 80 TMUs and 32 ROPs. (A rough check of that downclock follows.)
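That 50W figure is plausible on paper: dynamic power scales roughly with frequency times voltage squared, so the downclock plus a modest voltage drop gets most of the way there. A crude sketch; the 10% voltage reduction is my assumption, and the halved memory bus would shave off a bit more:

```python
# Crude estimate of the HD 7970M downclock above.
# Dynamic power ~ frequency * voltage^2; leakage and the memory
# bus are ignored, and the voltage drop is a guess.
tdp_stock = 75.0                 # W at 850 MHz, from the post
f_ratio = 640.0 / 850.0          # proposed downclock
v_ratio = 0.90                   # assumed 10% voltage reduction
tdp_down = tdp_stock * f_ratio * v_ratio ** 2
print(f"~{tdp_down:.0f} W before the halved bus width")   # ~46 W
```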

It's like taking a GTX 460 and substituting it for a GTX 260 (more or less; the GTX 260 was 55nm while the GTX 460 was 40nm, hence Nintendo had to use a low-power process by going with a mobile GPU, since at the end of the day you won't be overclocking the console). The gain is advanced tessellation capabilities, multithreading and all of that, but with the same performance as a GTX 260, or only a small penalty if a game leans heavily on the new features.
And knowing that the target resolution is 720p, that's a very good decision to make. Current ports can be rendered at 1080p, future games using the extra capabilities can be rendered at 720p, and games for the PS4 and Xbox 720 could be scaled down without any major problems (there are no critical changes in GPU architecture on the horizon, save for performance improvements).

Edit: Added the thought of a 7970M-derived possibility.
 
The Wii U is using a 45nm CPU despite the exact same company being able to produce a 32nm version; I doubt they would splurge on a newer 28nm GPU.
 