Discussion in 'News' started by Cristian_25H, Nov 16, 2012.
It's not the size that matters, but how you use it...badum-tish!
The CPU division is paid for; it's the CPU division that has been stealing from the GPU division. If you've noticed, everyone of importance has been booted out on the GPU side, while the majority of the CPU and SoC engineers hired from 2007-2009 are still at AMD. Most of the old guard from before 2001 at AMD and before 2006 at ATI are pretty much gone and have been replaced by younger engineers.
AMD is going to sell the VLIW4/VLIW5 patents/documentation/etc. and the ATI name (and brand) for $1.5-3B (not a final price), if you don't see what I'm getting at.
Did it ever occur to you that he was talking about the size, and I was talking about bad programmers?
A huge percentage of the games you'll see on the Wii U will be multi-platform titles released on the Xbox 360 and the PS3, where devs had to do most of the work on the "powerful" CPUs. What I meant is that porting those games to the Wii U (where the GPU is the strong component) is going to give a hard time to devs without enough experience or coding skills. And even with good programmers, you're going to need a different approach sometimes (for example, I think Rage would need to use "GPU transcoding" on the Wii U, similar to what it has on the PC).
Yes, I am fully aware of this and I do know what they were talking about. If you rolled a 1 on your "spot a joke" skill check, well, that's what my post was.
AMD denied the rumors that they're looking for a buyer... and even though their assets have decreased, their liabilities have shrunk at a quicker rate. They are actually in a better financial position than they were in 2008. In 2008 they had 7,672 million in assets and 7,545 million in debt. That margin has since widened: they now have 4,954 million in assets and 3,364 million in debt. Probably due to the large success of the Phenom II and the 5xxx/6xxx/7xxx series.
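Just running the numbers quoted in that post (my own back-of-envelope arithmetic, not anything from AMD's actual filings):

```python
# Balance-sheet figures quoted above, in $ millions.
balance = {
    2008: {"assets": 7672, "debt": 7545},
    2012: {"assets": 4954, "debt": 3364},
}

for year, b in sorted(balance.items()):
    cushion = b["assets"] - b["debt"]
    print(f"{year}: {b['assets']}M assets - {b['debt']}M liabilities = {cushion}M cushion")
# 2008: a razor-thin 127M cushion
# 2012: a 1,590M cushion, despite the smaller balance sheet
```

So the poster's point holds on these numbers: assets shrank by about a third, but the gap between assets and liabilities grew more than tenfold.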
AMD's long-term debt has gone from 5.5B in 2008 to 2B in 2012. They are nowhere near being bankrupt or in any major financial trouble.
AMD is also powering every next-gen console (Wii U, PS4, Xbox) with either Trinity or some custom GPU and since consoles sell so well they'll probably make a decent amount of money off that.
So true. I wonder if AMD would still be alive if they hadn't bought ATI.
Not sure why so many people are shocked by the size of the SoC, the newest revisions of the 360 have almost the exact same thing in them, and I'd imagine the PS3's most recent hardware revision is similar.
I'm still waiting for details on the GPU. The CPU is basically a revised version of the CPU in the 360, with maybe one more core. I've heard conflicting reports of it being a tri-core in some places, and a quad-core in others. If it's a tri-core then I'm pretty sure it's quite literally the same CPU as the 360, since it's made by the same people and even sports the same clock speed. The GPU has my interest, because the Wii's GPU was depressingly bad--even for its time.
I am interested to see how the Wii U does, but I have no interest in buying one. I loved Nintendo for quite some time, but I can't justify buying a system just hoping it features a good Zelda and Metroid game. The last Metroid game (for Wii) was pretty lackluster in my books, and even the last couple of Zelda games haven't been all that wonderful (nothing since Majora's Mask imo). I can already tell you this system is going to feature Nintendo throwing developers under the bus. Every feature they've kind of skirted around so far has ended in them saying it will depend on the developers. VoIP in game? The developer has to make sure they set it up the correct way. Using the tablet with the TV switched to another setting? The developer's responsibility. Usefulness of the tablet as a whole? Up to the developer. Honestly, I love Nintendo games on Nintendo systems, but that's about it. I just don't have the money to throw down on a "Zelda/Metroid console".
Were the graphics simplified to achieve that level of smoothness?
$349.99 not too tempting
$299.99 tempting for Nintendo exclusives
$249.99 I'm sold
Maybe after E3 next year, when any news of the other two offerings forces the price down. A Zelda game would persuade me more.
Link to source pls?
I just can't believe what you said. Nintendo confirmed backwards compatibility with Wii titles. Emulating the Hollywood won't be a problem on any VLIW5 GPU, but the CPU has to be the same architecture, or something different but a lot faster.
I think it's more likely that they either added the original Broadway plus two new modern cores, or enhanced it by making it multi-core and/or adding out-of-order execution. Don't forget that 99% of the rumors leaking from devs mention a large cache and/or eDRAM on the die, so the size is quite small indeed once you subtract the silicon needed for those.
There are about 3-4 different speculations out there that could be quite plausible, but "Nintendo using the Xenon" was never one of them. The bottom line is that there is absolutely no credible info about the CPU out there that can be confirmed atm, so if you know anything the rest of the Internet doesn't, please share.
Because ATI was only about 10% the size of AMD when AMD bought ATI out. That was during AMD's heyday too (not long after the Athlon X2 processors came out). AMD has shrunk a lot since then, and ATI has never been, and likely never will be, big enough to sustain AMD.
Remember, the Wii was basically Nintendo's last ditch effort at console gaming. That's why it was built with such low-end hardware: Nintendo couldn't afford to lose money on every console sold like Sony and Microsoft did. They now have the resources to fiscally compete with Sony and Microsoft so they focused on where the Wii was weakest: hardware.
It shows in the price tag too: $300 debut price versus $250 for the Wii.
Wii U teardown (if anyone is interested):
dammit, why a video? I WANT PICTURES
I already linked one earlier. You can see that they took pictures, so they'll probably publish them later somewhere on their site.
At least we know that they're using one of the speed bins of the K4W4G1646B (probably the 1600 MHz one, but that's just my guess).
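If it really is the 1600 MT/s bin, and assuming four x16 chips making up a 64-bit bus (the K4W4G1646B is a 4Gb x16 part, so four of them would give the 2GB people are reporting; that chip count is my assumption, not confirmed), the peak bandwidth works out to:

```python
# Back-of-envelope peak bandwidth for the rumored Wii U main memory.
# Assumptions: DDR3-1600 speed bin, four x16 chips = 64-bit bus.
transfers_per_sec = 1600e6   # DDR3-1600 = 1600 million transfers/s per pin
bus_width_bits = 4 * 16      # four x16 chips

peak_bw_gb_s = transfers_per_sec * bus_width_bits / 8 / 1e9
print(f"Peak bandwidth: {peak_bw_gb_s:.1f} GB/s")  # 12.8 GB/s
```

That's modest next to the 360's GDDR3 + eDRAM setup, which would fit the rumors about a big eDRAM pool on the die doing the heavy lifting.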
That's the thing... not at all. What I saw first-hand reminded me of PC-level graphics. Now if they'd had BF3 running on it I could give you much more of a description, but what they did have running left me with a smile. It felt like a real console with awesome graphics, not a watered-down PC trying to act like a console.
What is the GPU?
Besides the rumors, no one knows, and we won't know anything until someone gets a clear shot of the actual CPU/GPU package. All Nintendo has commented on is an IBM CPU and an AMD GPU; no real details beyond what we know from pictures. It has 2GB of DDR3, shared.
If only the Xbox CPU and Wii CPU were made by the same company largely using the same instruction sets. Maybe developed by a very large 100 year old company that's big and blue and all over the console industry.
I'm not saying it's the exact same hardware, only that it's very similar. It's not unreasonable to think a revised version of the Xenon could support a majority of the functionality that atrocious Wii CPU had. Worst case scenario, the thing was so weak they could probably just emulate anything it ran...
I think it has been mentioned several times that the Wii U CPU is a PowerPC variant which would mean Gekko again. I'd think it's a tri-core at least as it has been mentioned to be a "multi-core processor".
I feel like you have been sniffing the stuff that came out of the will it blend mixer...particularly a blend of an Intel CPU and nVidia GPU.
Who says the CPU division is paid for?
AMD's net loss is not from just the GPU division.
The most detailed description they gave was "a high-performance multi-core CPU made by IBM based on the PowerPC design". They also mentioned it would have a hefty amount of eDRAM, and that it shares some of the tech that went into IBM's Watson (which is pretty goddamn vague).
It's probably very cheap hardware, since they also have to include the controller in the price, and consoles aren't made for enthusiasts; only the content matters, not the hardware. If Nintendo can somehow convince third parties to develop for the hardware, they'll achieve their goal.
That being said, however slow the hardware is, this guy claims his sister was also playing the same game on the controller screen while he played at 60fps in HD (1080i?) on the TV, which is probably an efficiency world record at a 30-40W-ish power draw.