Friday, May 24th 2013

Xbox One Chip Slower Than PlayStation 4

After bagging chip supply deals for all three new-generation consoles -- Xbox One, PlayStation 4, and Wii U -- things are looking up for AMD. While the Wii U uses older-generation hardware, the Xbox One and PlayStation 4 use the very latest AMD has to offer: the "Jaguar" 64-bit x86 CPU micro-architecture, and the Graphics Core Next GPU architecture. The chips that run the two consoles have a lot in common, but also a few less-than-subtle differences.

The PlayStation 4 chip, which came to light this February, is truly an engineer's fantasy. It combines eight "Jaguar" 64-bit x86 cores clocked at 1.60 GHz with a fairly well-specced Radeon GPU, which features 1,152 stream processors, 32 ROPs, and a 256-bit wide unified GDDR5 memory interface clocked at an effective 5.50 GHz. At these speeds, the system gets a memory bandwidth of 176 GB/s. Memory is handled as UMA (unified memory architecture): there is no partition between system and graphics memory. The two are treated as allocations within the same 8 GB pool, and either can use up a majority of it.

The Xbox One chip is a slightly different beast. It uses the same eight "Jaguar" 1.60 GHz cores, but a slightly smaller Radeon GPU that packs 768 stream processors, and a quad-channel DDR3-2133 memory interface that offers a memory bandwidth of 68.3 GB/s and holds 8 GB of memory. Memory is shared between the two subsystems in a similar way to the PlayStation 4, with one small difference: the Xbox One chip adds a large 32 MB embedded SRAM (ESRAM) cache, which operates at 102 GB/s, but at considerably lower latency than GDDR5. This cache cushions data transfers for the GPU. Microsoft engineers are spinning this as "200 GB/s of memory bandwidth," by somehow adding up the bandwidths of the various memory pools in the system.
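The bandwidth figures above follow directly from bus width and effective transfer rate. As a quick sanity check, here is a small Python sketch using the transfer rates quoted in this article:

```python
# Sanity check of the quoted memory bandwidth figures.
# bandwidth (GB/s) = effective transfer rate (GT/s) x bus width (bytes)

def bandwidth_gbs(transfer_rate_gts, bus_width_bits):
    return transfer_rate_gts * (bus_width_bits / 8)

# PlayStation 4: 256-bit GDDR5 at an effective 5.5 GT/s
ps4 = bandwidth_gbs(5.5, 256)     # 176.0 GB/s

# Xbox One: quad-channel (4 x 64-bit = 256-bit) DDR3-2133, i.e. 2.133 GT/s
xbox = bandwidth_gbs(2.133, 256)  # ~68.3 GB/s

print(f"PS4 GDDR5 bandwidth:     {ps4:.1f} GB/s")
print(f"Xbox One DDR3 bandwidth: {xbox:.1f} GB/s")
```

Both results match the figures given above, which lends credence to the specs, if not to Microsoft's "200 GB/s" arithmetic.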

The two consoles also differ in software. While the PlayStation 4 runs a Unix-derived operating system with the OpenGL 4.2 API, the Xbox One uses software developers are more familiar with -- an operating system based on a 64-bit Windows NT 6.x kernel, running the DirectX 11 API. Despite these differences, the chips in the two consoles should greatly reduce multi-platform production costs for game studios, as the two consoles together have a lot in common with the PC.

Source: Heise.de

148 Comments on Xbox One Chip Slower Than PlayStation 4

#1
Aquinus
Resident Wat-man
by: NinkobEi
Now let us compare that to console sales in 2012
Ever think that is because no one wants to buy games for a console that's over 7 years old? This is fallout from Sony and MS waiting as long as possible before revamping the console yet again. I wouldn't buy any games for the 360 for the simple fact that they're expensive and it doesn't look nearly as good. When the PS4/Xbox One comes out I seriously think those numbers will change again.

by: NinkobEi
I bet some of us TPU'ers could build a budget PC sub-600 that is on par with the Xbone. This will be doubly true in a couple of years.
This was true of the 360 as well, but it didn't change a whole lot. Also, you have to keep in mind that the Xbox and PS are closed systems. They don't have nearly as much overhead as regular PCs do. I would think this to be true of any hardware, including a computer you built 2 years ago and go to replace now. Technology is always getting faster, and if you wait you will reap the benefits.

I don't think MS would be doing this if they didn't think they could make some money off of it.
Posted on Reply
#2
Mussels
Moderprator
by: AsRock
Sales are bound to drop due to the supposed/planned release of the new consoles. And for people like myself who only play a few classy games, it's more than $60 you're paying with an Xbox, as you've got to tag on the monthly fee on top, and that would be like paying $100+. That's why they need to get the new models out the door ASAP.

So if I get one of the newer systems it would have to not have a monthly fee.

It would be nice to know what the power usage of these newer ones is too, and how much better they are.
my quad core APU laptop can idle at 11W.


I'd expect the range to go between 10 W and 65 W, at an educated guess (I don't know what things like Kinect will do to power).
Posted on Reply
#3
Dent1
by: Jstn7477
I will agree that some games do play fine at 60 FPS (which is what the more poorly optimized games tend to go down to on my machine on some occasions despite none of my CPU cores or GPU being maxed out), but if you play multiplayer FPS games competitively e.g. Team Fortress 2, Counter-Strike, Quake, etc. there is quite a difference in smoothness between 60Hz and 100-120Hz. Some people have hung onto their CRTs for years and play at stupid resolutions like 1024*768 and 100Hz for these particular games (if they have yet to purchase a 120Hz 1080p LCD) because your local frame rate determines how many snapshots are sent back to the server. Again, many single player games (especially slower paced ones) at 60 Hz play nicely, but in multiplayer FPS games where people tweak the hell out of their net settings, reduce their interp settings and whatnot, it's hard to be part of the 60Hz norm. Call of Duty (not that I play it) supposedly has the best hit registration at 125 or 250 client FPS from what I heard as well.
All games should and would run as intended at 60 FPS, provided the frame rate is stable and consistent.

Unless you are a specially trained super soldier, your brain and eyes won't be able to process 120 FPS.
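For reference, the raw numbers behind the 60 vs 120 FPS argument come down to simple frame-time arithmetic; a quick Python sketch:

```python
# Per-frame time budget at common refresh rates:
# frame time (ms) = 1000 / refresh rate (Hz)

def frame_time_ms(hz):
    return 1000.0 / hz

for hz in (60, 100, 120):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")
```

At 60 Hz a new frame arrives roughly every 16.7 ms versus about 8.3 ms at 120 Hz; whether that difference is perceptible is exactly what this thread is arguing about.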
Posted on Reply
#4
Frick
Fishfaced Nincompoop
by: Dent1
All games should and would run as intended at 60 FPS, provided the frame rate is stable and consistent.

Unless you are a specially trained super soldier, your brain and eyes won't be able to process 120 FPS.
Oh ye gods not this argument again.
Posted on Reply
#5
Steevo
by: NinkobEi
I bet some of us TPU'ers could build a budget PC sub-600 that is on par with the Xbone. This will be doubly true in a couple of years.
But unlike the console, we will have the overhead of the OS and all its integration, and we lack GDDR5 RAM with its insane bandwidth and/or low latency, custom-built for a purpose.

Current comparisons show anywhere from a 0% to 400% increase in overhead when rendering a scene.

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2

The most commonly used number is about 25% overhead, even on a "clean" gaming-only machine, as the host OS has threads running and interrupting, fouling caches and mismanaging memory.

So, a minimum of 25% on top of what an 8-core chip with 8 GB of truly high-speed RAM delivers, and perhaps 10% more performance from the new memory architecture? Sure, an Ivy with a Titan, or a "GHz" edition with overclocks, will beat it, but that isn't the point. The point is the cost of playing a given game and the entertainment factor of it.
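Taking those figures at face value (the 25% OS/API overhead and 10% memory-architecture gain are this thread's rough estimates, not measurements), a quick Python sketch shows what they imply for the raw performance a PC would need to match the console:

```python
# Illustrative only: the 25% OS/API overhead and 10% memory-architecture
# gain are rough estimates from the discussion, not measured values.

def pc_equivalent(console_raw, api_overhead=0.25, mem_arch_gain=0.10):
    # The console gets an effective boost from its memory architecture,
    # while the PC loses a fraction of its raw throughput to OS/API layers.
    console_effective = console_raw * (1 + mem_arch_gain)
    return console_effective / (1 - api_overhead)

# Normalizing console raw performance to 1.0:
print(f"PC needs ~{pc_equivalent(1.0):.2f}x the console's raw performance")
```

Under these assumptions the PC would need roughly 1.47x the console's raw throughput to break even, which is the gist of the argument above.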

http://www.zdnet.com/valve-linux-runs-our-games-faster-than-windows-7-7000002060/

Linux runs it faster as it can be tweaked more by third parties to reduce the OS overhead, but it will still have some. Both the PS4 and Xbox will benefit even more from matched hardware and software.
Posted on Reply
#6
AsRock
TPU addict
by: Mussels
my quad core APU laptop can idle at 11W.


I'd expect the range to go between 10 W and 65 W, at an educated guess (I don't know what things like Kinect will do to power).
Reasonable guess; that Kinect has vents on it, hehe. Hopefully better than an aged 90 W SONY PS3 Slim.

I'll probably wait till they bring out the second model, or even a much lower priced one, as it'd only be used for the odd game, although it will not be an Xbox.
Posted on Reply
#7
ogharaei
TPU Proofreader
I think the Playstation will win this time around. Really disappointed that Microsoft won't support Indies with the Xbox One.
Posted on Reply
#8
TRWOV
I won't buy either but as long as the AAA titles come over to the PC I'll be happy. I haven't even played half of my Gamecube/PS2 library yet (yay! bargain bins!) not to mention my Wii/U games (100+).

Now that both consoles have x86 hardware there's few reasons to not port games to the PC so hopefully we'll see a big influx of games this time...not that there's a shortage of good games lately anyway.
Posted on Reply
#9
scoutingwraith
by: TRWOV
I won't buy either but as long as the AAA titles come over to the PC I'll be happy. I haven't even played half of my Gamecube/PS2 library yet (yay! bargain bins!) not to mention my Wii/U games (100+).

Now that both consoles have x86 hardware there's few reasons to not port games to the PC so hopefully we'll see a big influx of games this time...not that there's a shortage of good games lately anyway.
This...

I don't mean to sound like a troll or anything, but I will wait it out and see who has the better exclusives. I have a 360 with probably 40+ games that I need to beat, and I think 15+ are RPGs, lol. I will probably buy another 360 for as cheap as possible just in case my old system decides to bite the dust.
Posted on Reply
#10
m1dg3t
As long as consoles/consumer TVs are using HDMI don't get your hopes up for more than 1080p/1600p @ 60Hz. Thank you HDCP! :nutkick:

2560 x 1600 would be pretty sweet for TVs though...
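For context on the HDMI ceiling being discussed, here is a rough Python sketch comparing approximate pixel clocks against HDMI 1.4's ~340 MHz maximum TMDS clock (the ~20% blanking overhead is an assumption; real video timings vary):

```python
# Rough pixel-clock estimates at 60 Hz, assuming ~20% blanking overhead,
# compared against HDMI 1.4's ~340 MHz maximum TMDS clock.

HDMI_1_4_MAX_MHZ = 340

def pixel_clock_mhz(width, height, refresh_hz, blanking=0.20):
    return width * height * refresh_hz * (1 + blanking) / 1e6

for w, h in [(1920, 1080), (2560, 1600), (3840, 2160)]:
    clk = pixel_clock_mhz(w, h, 60)
    verdict = "fits" if clk <= HDMI_1_4_MAX_MHZ else "exceeds"
    print(f"{w}x{h} @ 60 Hz: ~{clk:.0f} MHz ({verdict} HDMI 1.4)")
```

By this estimate 2560x1600 @ 60 Hz would squeeze under the HDMI 1.4 limit, while 4K @ 60 Hz would not, hence the need for a later HDMI revision to raise the ceiling.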
Posted on Reply
#11
Fourstaff
by: m1dg3t
As long as consoles/consumer TVs are using HDMI don't get your hopes up for more than 1080p/1600p @ 60Hz. Thank you HDCP! :nutkick:

2560 x 1600 would be pretty sweet for TVs though...
HDMI will be incrementally upgraded to support 4K iirc.
Posted on Reply
#12
m1dg3t
by: Fourstaff
HDMI will be incrementally upgraded to support 4K iirc.
Sure, our kids might have it. More than likely, by the time displays/HDMI are supporting those capabilities in the mainstream, I prolly won't be interested :roll:

Don't forget there has to be content that is produced in those parameters in order to use those specs ;)

DL-DVI & Optical fTw! :pimp:
Posted on Reply
#13
EpicShweetness
by: FordGT90Concept
On consumer Windows, developers have to go through layer after layer of software to reach the hardware, which means it is slower -- but less likely to crash the computer (among other undesirable outcomes). The reason there isn't direct access to the hardware in consumer Windows is that it has to account for the hundreds of graphics devices out there.
Let's not also forget that consoles have no "kernal" as it were. In situations where you launch a game, a console has the ability to completely halt this "kernal", whereas a Windows machine runs a kernal for other tasks and computations requested by and of software to hardware.

One other thing, let's not forget that the Xbox 360 and PS3 have notable differences in their visual ability, but where that difference shows itself is when the game devs are willing to take the time to exploit it. Example: Uncharted 2, bar none. The fact that this game took quite a while into the console's lifespan to show up also supports this. Thus we will have a repeat where game devs must "learn" the platform at first, and be willing to. So at first we will see no notable difference between the two, but when we do, the PS4 will shock n' awe us.
Posted on Reply
#14
Aquinus
Resident Wat-man
by: EpicShweetness
Let's not also forget that consoles have no "kernal" as it were. In situations where you launch a game, a console has the ability to completely halt this "kernal", whereas a Windows machine runs a kernal for other tasks and computations requested by and of software to hardware.
Do some research or don't talk about something you don't know. The 360 and PS3, like modern PCs, have a kernel that runs and takes control when the device boots. Also, to say that older consoles do not have kernels is incorrect. Older consoles might not have had a unified kernel that everything uses like the 360 does, but each game had its own variant of a kernel to manage the hardware that it's running on. Now as we have smartened up about technology, the best things have prevailed and that's why you now see things like *nix based kernels on smartphones left and right and microcontrollers in your coffee machine (like my Keurig).
Posted on Reply
#15
BrainCruser
by: jmcslob
For some reason I think they will both play basically the same, and in the end MS will beat SONY for some unknown reason that has absolutely no logic to it...

I think it will be a while before devs can actually take full advantage of either system, and by then it won't really matter which system is better... it doesn't now, and I see no reason for it to matter in the future.

Besides MS is giving greedy Devs exactly what they want and then some...
Actually there is a very good chance they will be able to use them within months, since the PC code line will work for almost anything, and that extra RAM... oh, a dev's paradise.
Posted on Reply
#16
remixedcat
This is scary:
In a mind blowing demo, the Kinect then switched to a mode in which it monitored the heart rate of a person standing in front of it using the color cameras to measure how flush the skin was and the infrared cameras to track blood flow underneath the skin. This could ostensibly allow a developer to determine whether a user was scared, or even lying, and could also have health monitoring implications.

Then things started to get super freaky. A demo was run that showed the faces of people standing in front of the sensor. The Kinect was able to not only detect which controllers they were holding (player 1, player 2 etc), but also exactly who they were and whether they were happy, sad or neutral. This was done using imagery of the face to see whether they were smiling or frowning. It was pretty wild.
Posted on Reply
#17
AsRock
TPU addict
by: remixedcat
This is scary:
A new way of collecting data on people without annoying them with ads? Ooh, have fun with that, Xbox...

Those cameras sound creepier each time they open their mouths.
Posted on Reply
#18
NeoXF
by: 1c3d0g
:roll: Are you new here?!? Ahahaha! That's the dumbest dumb comment I've read in a while. :laugh:

Come on, man! :shadedshu You seriously think these shitty consoles can beat any current PC, let alone the next-generation ones? A Haswell + Titan will crush any of these so-called "gaming machines" hands down. Even the developers themselves (from both platforms) have said they won't take the high-end PCs head-on, instead focusing on "good" (read: not great, let alone the best) middle-of-the-road performance for a gaming console. This time they focused more on entertainment, making the consoles a "media hub" etc. for the living room, not raw power.

By the time the consoles launch and developers get experience coding for them, Broadwell + Maxwell will be out, so these consoles stand NO CHANCE of beating PCs. Mark my words on that. :rockout:
Uhm, aside from totally missing my point and flaring out PC fanboy nonsense, what other purpose did your comment serve?


IMHO, if SONY wants to do a PS5, they can do it much sooner at this point... seeing as how they probably wouldn't have to worry about compatibility if they keep on the x86 track, that is... and seeing as how PS3s are selling for so little at this time, I can see the PS4 either selling at a similarly low price or being upgraded to a faster one, with full compatibility, down the road. Same goes for M$ at this point. Because I don't really see hardware price as big an issue as some people make it out to be... I mean, gawd... we pay so much for the games themselves...
Posted on Reply
#19
EpicShweetness
by: Aquinus
Do some research or don't talk about something you don't know.
Notice the quotation marks :banghead:
You want me to be technical I'll PM ya a 10 page essay :shadedshu
Posted on Reply
#20
BiggieShady
by: Mussels
Actually it's not that ridiculous. No one has solid numbers, but coding for set hardware is a MASSIVE benefit.
Oh yes, besides direct memory access there is a benefit for really simple actions that are executed in large numbers each frame, such as issuing draw calls, which is what Steevo is talking about.

by: Steevo


Current comparisons show anywhere from a 0% to 400% increase in overhead when rendering a scene.

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2

The most commonly used number is about 25% overhead, even on a "clean" gaming-only machine, as the host OS has threads running and interrupting, fouling caches and mismanaging memory.
Luckily, issuing draw calls is the only significant accumulated overhead here, because everything else for scene rendering is already in VRAM. I suppose that's why NVIDIA wants to integrate general-purpose ARM cores into Maxwell.

... and now some speculation: the future of PC gaming could be that the CPU does AI and physics, and streams over the PCI-E bus only the changes/differences for dynamic game objects/characters; on the GPU, the diff gets merged with the last frame's state, and general-purpose ARM cores issue draw calls crazy fast for all the geometry and positions/rotations already prepared in VRAM by the CPU.
Posted on Reply
#21
Frick
Fishfaced Nincompoop
by: EpicShweetness
Notice the quotation marks :banghead:
You want me to be technical I'll PM ya a 10 page essay :shadedshu
There's a difference between simplifying and being inaccurate. I'm pretty sure the 360 and PS3 have kernels (unless "kernal" is something else entirely).

Anyway, about processing power: Wouldn't the connection to Azure cancel this out a bit? That is the part I'm actually almost excited about.
Posted on Reply
#23
Aquinus
Resident Wat-man
by: EpicShweetness
Notice the quotation marks :banghead:
You want me to be technical I'll PM ya a 10 page essay :shadedshu
Go ahead, maybe I should send you a picture of my degree in Computer Science. :slap:

by: EpicShweetness
"kernal"
You mean how you can't spell kernel right?

by: OneCool
This thread = :banghead:
I fixed your post. :)
Posted on Reply
#24
theoneandonlymrk
by: Aquinus
Go ahead, maybe I should send you a picture of my degree in Computer Science. :slap:



You mean how you can't spell kernel right?



I fixed your post. :)
Is this a bad time to point out that the Xbox One has three kernels, as disclosed by MS, and one is always active, sometimes two?
Posted on Reply
#25
Frick
Fishfaced Nincompoop
About prices: if the prices out now are more or less correct, they will be somewhat close. In some places it's up to a €165 difference, but in most places they are identical, at about €770. Doesn't really say anything, but still.
Posted on Reply
Add your own comment