Friday, May 24th 2013

Xbox One Chip Slower Than PlayStation 4

After bagging chip supply deals for all three new-generation consoles -- Xbox One, PlayStation 4, and Wii U -- things are looking up for AMD. While the Wii U uses older-generation hardware technologies, the Xbox One and PlayStation 4 use the very latest AMD has to offer -- the "Jaguar" 64-bit x86 CPU micro-architecture, and the Graphics Core Next GPU architecture. The chips that run the two consoles have a lot in common, but also a few less-than-subtle differences.

The PlayStation 4 chip, which came to light this February, is truly an engineer's fantasy. It combines eight "Jaguar" 64-bit x86 cores clocked at 1.60 GHz with a fairly well-specced Radeon GPU, which features 1,152 stream processors, 32 ROPs, and a 256-bit wide unified GDDR5 memory interface clocked at an effective 5.50 GHz. At these speeds, the system gets a memory bandwidth of 176 GB/s. Memory is handled as UMA (unified memory architecture): there's no partition between system and graphics memory. The two are treated as allocations within the same 8 GB of memory, and either can use up a majority of it.

The Xbox One chip is a slightly different beast. It uses the same eight "Jaguar" cores at 1.60 GHz, but a slightly smaller Radeon GPU that packs 768 stream processors, and a quad-channel DDR3-2133 memory interface holding 8 GB of memory, which offers a memory bandwidth of 68.3 GB/s. Memory is shared between the two subsystems in a similar way to the PlayStation 4, with one small difference: the Xbox One chip uses a large 32 MB ESRAM cache, which operates at 102 GB/s, but at a much lower latency than GDDR5. This cache cushions data transfers for the GPU. Microsoft engineers are spinning this off as "200 GB/s of memory bandwidth," by somehow adding up the bandwidths of the various memory types in the system.
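The bandwidth figures above follow directly from bus width and effective transfer rate. A quick sketch of the arithmetic (using the published specs; these are theoretical peaks, not sustained rates):

```python
# Peak memory bandwidth (GB/s) = bus width in bytes x effective transfer rate (GT/s).

def peak_bandwidth_gbs(bus_width_bits, transfer_rate_gts):
    """Theoretical peak bandwidth of a memory interface, in GB/s."""
    return (bus_width_bits / 8) * transfer_rate_gts

ps4_gddr5 = peak_bandwidth_gbs(256, 5.5)    # 176.0 GB/s (256-bit GDDR5 @ 5.5 GT/s)
xb1_ddr3 = peak_bandwidth_gbs(256, 2.133)   # ~68.3 GB/s (quad-channel DDR3-2133)

# Microsoft's "200 GB/s" figure apparently sums the ESRAM and DDR3 paths,
# which the GPU cannot saturate simultaneously for the same working set --
# and even the naive sum falls short of 200:
xb1_summed = xb1_ddr3 + 102.0               # ~170 GB/s

print(f"PS4 GDDR5: {ps4_gddr5:.1f} GB/s")
print(f"XB1 DDR3:  {xb1_ddr3:.1f} GB/s")
print(f"XB1 DDR3 + ESRAM (naive sum): {xb1_summed:.1f} GB/s")
```

These are peak interface numbers; real-world, attainable bandwidth is lower and workload-dependent on both consoles.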

The two consoles also differ in software. While the PlayStation 4 runs a Unix-derived operating system with the OpenGL 4.2 API, the Xbox One uses software developers are more familiar with -- an operating system based on the 64-bit Windows NT 6.x kernel, running the DirectX 11 API. Despite these differences, the chips in the two consoles should greatly reduce multi-platform production costs for game studios, as the two consoles have a lot in common with the PC.

Source: Heise.de

148 Comments on Xbox One Chip Slower Than PlayStation 4

#1
Tigershark8700
by: entropy13
"an RPG game for the Desktop (a spiritual successor to a famous 90s game by Konami)"


LOL I know what that would be. I think, anyway. That Konami game was for the PlayStation...
Actually it was developed for the Super Nintendo, so we might be thinking of different titles, unless it was ported over (but not sure on that).
Posted on Reply
#2
AsRock
TPU addict
by: ManofGod
Got 3 words into that video and realized he had absolutely nothing worth listening to. :slap: Clicked the X and moved on, not even worth reading the comments.
What I find funny about it is how they went on about watching movies on it so much. Like, sorry, the last thing I would get is an Xbox to watch movies on, paying MS to go online and then paying other people as well to use the service.

Just that you'd be better off with a PS4, but they need exclusives to make people like me at least think about getting an Xbox.

And if movies and stuff is what you're going to do mostly with it, get a frigging Roku 3, as that will beat the pants off it in every way, even more so on power usage, as the unit only takes 3.2 W on load and supports 3rd party stuff too.

I would have to go PS4 for a few reasons, like Heavy Rain if there is ever another of those, and Uncharted, and then the free online, so no monthly fees.

Again, if you're just watching movies, the Roku 3 has more than enough to keep you happy for a long time.
Posted on Reply
#3
Dent1
by: Jstn7477
I have no problem paying for hardware that makes my games run smoothly, considering I have a $300 monitor that functions best at 120Hz. Playing games at 40 FPS was something I did a couple years ago with an X2 4400+ and 7800GS in 2008, then X4 9750 and a 9800 GT, and then a 4GHz 955BE and HD 5770 before I got my 2600K and HD 6950 in late 2011. My minimum framerate in TF2 almost doubled when I got the i7 (before you call out the video card differences, my 5770 was never fully stressed in TF2 to begin with). Without VSYNC, TF2 runs in the 200s but in the largest fights on 24-28 player servers, my framerate dips down to around 100 with shadows off, sometimes less in extreme situations. My main work computer with a 2.5GHz Phenom X3 8550 and a 3850 AGP hangs around in the mid 30s-40s in the same situations with an under-utilized GPU.
Maybe you don't have a problem paying for hardware, but the person I was originally responding to (RejZoR) does.

Just because you have a 120 Hz monitor doesn't mean your experience is diminished if 120 FPS isn't achieved. Most console games are capped between 25 and 30 FPS, yet high-end HD TVs can support up to 120 Hz. I'm not saying 25-30 FPS is something PC gamers should be accustomed to, as I wouldn't tolerate such a low frame rate, but I see nothing wrong with playing a game at 50-70+ FPS on my almost 4-year-old CPU/GPU. I'm not going to drop money just to see Fraps read 120 FPS vs. 70 FPS at the same detail settings for a negligible difference.
Posted on Reply
#4
NeoXF
by: Jacez
Now, the Xbox One has an underclocked (850Mhz -> 800Mhz) HD 6770 inside and the PS4 has an underclocked (900Mhz -> 800Mhz) HD 6870. The memory bandwidth appears to be accurate for both examples.

All in all, it's an Entry-Point/Mid-Range computer.
What? PS4's GPU is a 1152-core GCN v2.0 part @ 800MHz with a shared system memory of colossal size and bandwidth.

And I'm pretty sure the XBO's GPU side is no slouch either.

PS4's CPU part is @ 2GHz, it's been confirmed time and time again over the weeks after its launch, not sure about the XBO. All in all yes, probably slightly above half the i7-2600's performance, but specific coding and optimizations should bring results way above what the i7 can do on a straight-up PC platform... of course, in the following years, not at launch. However... taking into account AMD's HSA... the CPU part plus the GPU grunt work... its computational power is going to be way above anything we see in today's PCs, heck, more FLOPs than a 4-CPU 10-core Ivy Bridge server... (abstracting out the fact that the GPU as a pure graphics processing unit will be starved of resources in that scenario).


All in all, it's more like a mid-range or above gaming system, with vast untapped capabilities we aren't aware of yet... (cue AMD's Kaveri/Kaveri+ HSA demonstrations...)
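The raw-FLOPs claim above can be sanity-checked with simple peak-throughput arithmetic. A rough sketch, under stated assumptions: shader counts from the article, an assumed 800 MHz GPU clock, and the theoretical AVX peak of a Sandy Bridge i7-2600 at its 3.4 GHz base clock:

```python
# Theoretical peak single-precision throughput, in GFLOPS.
# GPU: each GCN stream processor does one fused multiply-add (2 FLOPs) per cycle.
# CPU: Sandy Bridge can issue one 8-wide AVX add plus one 8-wide AVX multiply
# per core per cycle, i.e. 16 SP FLOPs/cycle/core.

def gpu_peak_gflops(stream_processors, clock_ghz):
    return stream_processors * 2 * clock_ghz

def cpu_peak_gflops(cores, clock_ghz, flops_per_cycle=16):
    return cores * clock_ghz * flops_per_cycle

ps4 = gpu_peak_gflops(1152, 0.8)   # ~1843 GFLOPS
xb1 = gpu_peak_gflops(768, 0.8)    # ~1229 GFLOPS
i7 = cpu_peak_gflops(4, 3.4)       # ~218 GFLOPS (i7-2600, base clock, AVX)

print(f"PS4 GPU: {ps4:.0f}  XB1 GPU: {xb1:.0f}  i7-2600: {i7:.0f} (GFLOPS, peak SP)")
```

Peak numbers only: real workloads reach a fraction of these, the comparison ignores the consoles' CPU side entirely, and a GPU busy with graphics has little of that throughput to spare for compute.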
Posted on Reply
#5
theoneandonlymrk
by: Dent1
Maybe you don't have a problem paying for hardware, but the person I was originally responding to (RejZoR) does.

Just because you have a 120Hz monitor doesn't mean your experience is diminished if 120FPS isn't achieved. Most console games are capped between 25FPS and 30 FPS yet the high end HD TVs can support up to 120Hz. I'm not saying 25-30FPS is something PC gamers should be accustomed to as I wouldn't tolerate such a low frame rate, but I see nothing wrong with playing a game at 50-60-70+ FPS on my almost 4 year old GPU/GPU. I'm not going to drop money to see Fraps @ 120FPS vs 70FPS to run at the same detail settings to see a negligible difference.
Good point, but what exactly has 120 Hz/FPS gaming got to do with consoles anyway? This is an argument for 2025-2030, when the PS5 is due.
Posted on Reply
#6
NeoXF
by: theoneandonlymrk
Good point but exactly what has 120hz /fps gameing got to do with consoles anyway , this is an argument for 2025-2030 when the ps 5 is due
Not likely... more like 3840x2160@30Hz... or 60, if "we're" lucky... Seeing as how they don't see the point of 60fps when they (SONY/the developers) "think" they would rather squeeze in some extra eye-candy mumbo-jumbo and stay at 30fps... I don't see 120Hz ever happening in the console world... but then again, HDTV/PC display tech might have a big revolution down the road, who knows what will make more sense then.
Posted on Reply
#7
Ravenas
The GPU and ram are both slightly slower, however, Xbox will most likely have a better price point.
Posted on Reply
#8
btarunr
Editor & Senior Moderator
by: Fourstaff
So can we install Windows 7 (not 8)?
I imagine the OS being stored on a device other than the 500GB HDD, and the system's EFI being prevented from booting from USB/ODD. It won't be long before Linux geeks figure out a way though.
Posted on Reply
#9
AsRock
TPU addict
by: Ravenas
The GPU and ram are both slightly slower, however, Xbox will most likely have a better price point.
Maybe, until you add up the monthly fee for the damn thing to be online.
Posted on Reply
#10
purecain
our pc's will shine on all the new titles... cant wait...
Posted on Reply
#11
Mussels
Moderprator
by: purecain
our pc's will shine on all the new titles... cant wait...
OpenGL and DX11 consoles are looking good for the glorious master race.
Posted on Reply
#12
BiggieShady
by: NeoXF
All in all yes, probably slightly above half the i7-2600's performance, but specific coding and optimizations should bring results of something way above what the i7 can do on a straight-up PC platform... of course, this in the following years, not at launch.
Way above? Really? For it to be true, it would mean that more than half of i7-2600 performance is regularly lost on OS + DirectX + Driver overheads, which is simply ridiculous.
Posted on Reply
#13
Mussels
Moderprator
by: BiggieShady
Way above? Really? For it to be true, it would mean that more than half of i7-2600 performance is regularly lost on OS + DirectX + Driver overheads, which is simply ridiculous.
Actually it's not that ridiculous. No one has solid numbers, but coding for set hardware is a MASSIVE benefit.
Posted on Reply
#14
freaksavior
To infinity ... and beyond!
by: NinkobEi
Playstation historically has always had the best exclusive titles. This generation will not be any different. It has to do with which continent the console is developed on. MS can try to make an appeal to japanese developers but in the end there will always be that language barrier.


If you read the Anandtech report, the PS4 should run a lot hotter/higher power. Seems unlikely that their fan will be quieter.
That's an opinion
Posted on Reply
#15
Jstn7477
by: Dent1
Maybe you don't have a problem paying for hardware, but the person I was originally responding to (RejZoR) does.

Just because you have a 120Hz monitor doesn't mean your experience is diminished if 120FPS isn't achieved. Most console games are capped between 25FPS and 30 FPS yet the high end HD TVs can support up to 120Hz. I'm not saying 25-30FPS is something PC gamers should be accustomed to as I wouldn't tolerate such a low frame rate, but I see nothing wrong with playing a game at 50-60-70+ FPS on my almost 4 year old GPU/GPU. I'm not going to drop money to see Fraps @ 120FPS vs 70FPS to run at the same detail settings to see a negligible difference.
I will agree that some games do play fine at 60 FPS (which is what the more poorly optimized games tend to go down to on my machine on some occasions despite none of my CPU cores or GPU being maxed out), but if you play multiplayer FPS games competitively e.g. Team Fortress 2, Counter-Strike, Quake, etc. there is quite a difference in smoothness between 60Hz and 100-120Hz. Some people have hung onto their CRTs for years and play at stupid resolutions like 1024*768 and 100Hz for these particular games (if they have yet to purchase a 120Hz 1080p LCD) because your local frame rate determines how many snapshots are sent back to the server. Again, many single player games (especially slower paced ones) at 60 Hz play nicely, but in multiplayer FPS games where people tweak the hell out of their net settings, reduce their interp settings and whatnot, it's hard to be part of the 60Hz norm. Call of Duty (not that I play it) supposedly has the best hit registration at 125 or 250 client FPS from what I heard as well.
Posted on Reply
#16
KainXS
by: freaksavior
That's an opinion
it really is
If I had to pick the best exclusives, it's Nintendo without a doubt. Sony tends to keep many exclusives in the Japanese markets (and there are some very good ones that end up being unknown). MS tends to do multiplatform games better, but when there is an exclusive it's usually pretty good, minus the Kinect games.
Posted on Reply
#17
FordGT90Concept
"I go fast!1!11!1!"
by: Vinska
I'd say OGL is only a disadvantage when on Windoze. And it's not even OpenGL's fault per se.
I'd best describe it in the words one game developer said to me not long ago (not exact words; greatly shortened): "Working with OpenGL is great. OpenGL is also lighter on the CPU and helps to keep the framerate up when running on weaker CPUs. But OpenGL implementations on Windows just suck and are much slower than they could be."

Also, what midnightoil said.
Apples to apples, Direct3D will always be faster because it's hardware + software, not just software. Case in point: Direct3D created the unified shader model, and OpenGL adopted it into its own specifications. Whenever there is a performance hit on Direct3D, it is because it is doing something extra (e.g. post-processing).

As to what midnightoil said, bear in mind that Windows on Xbox isn't the same as Windows on an IBM PC compatible. Xbox developers likely have direct access to the hardware resources to squeeze every drop of performance from the hardware. On consumer Windows, developers have to go through layer after layer of software to reach the hardware, which means it is slower -- but less likely to crash the computer (among other undesirable outcomes). The reason there isn't direct access to the hardware in consumer Windows is that it has to account for the hundreds of graphics devices out there.

I have no doubt that Sony would have used DirectX if they didn't have to license it from Microsoft.

OpenGL 3.# requires Direct3D 10 hardware
OpenGL 4.# requires Direct3D 11 hardware
Posted on Reply
#18
Vinska
by: FordGT90Concept
Apples to apples
On the other hand, comparing D3D to OGL on RL usage scenarios as "apples to apples" is not possible. D3D is only implemented on Windoze [and MS devices]. There, OGL is either greatly neglected by the implementers or is simply non-existent. Say what You want, but comparing D3D to pitiful excuses for OGL implementations found on Windoze simply cannot be called as "apples to apples".
On *nix, for example, the implementations are much better.
But comparing between D3D on Windoze and OGL on *nix cannot be called "apples to apples" due to arising external factors [obvious one - different friggin' OS].

by: FordGT90Concept
Direct3D will always be faster because it's hardware + software, not just software.
What the hell were You smoking?
Posted on Reply
#19
FordGT90Concept
"I go fast!1!11!1!"
Direct3D is emulated on *nix. OpenGL and Direct3D both have to go through the layers of protection on Windows, so Windows is the closest to apples-to-apples available. The fact that most professional software is rendered using OpenGL attests that it is well implemented on Windows.

There are a lot of engines out there which run on Windows that support Direct3D and OpenGL render paths and the performance is more or less the same when trying to achieve the same degree of visuals.

A lot of EA titles (The Sims 3 and Spore, for example) are DirectX on Windows and OpenGL on Mac OS X. If DirectX was as terrible as you claim it is, why would EA go out of their way to use DirectX on Windows instead of OpenGL on both?
Posted on Reply
#20
Vinska
by: FordGT90Concept
why would EA go out of their way to use DirectX on
Windows instead of OpenGL on both?
Already said the reason many times. Do I really need to repeat myself again?

by: FordGT90Concept
Direct3D is emulated on *nix. OpenGL and Direct3D both have to go through the layers of protection on Windows.
same - what the hell were You smoking?
Posted on Reply
#21
FordGT90Concept
"I go fast!1!11!1!"
You do know that virtually all professional software (e.g. AutoCAD, 3ds Max, Photoshop, etc.) uses an OpenGL renderer, correct? OpenGL's implementation on Windows is good (much better than you claim it to be), it just isn't up to par with the purpose-built DirectX. Most x86-compatible games are released on Windows because of DirectX, not in spite of it. DirectX was created because Bill Gates wasn't satisfied with OpenGL at the time. Hell, about the only game developer that loves him some OpenGL is John Carmack (id Tech engine). That's mostly because he resents the Microsoft empire.

And don't expect a further reply from me on this topic. The discussion is circular.
Posted on Reply
#22
1c3d0g
by: NeoXF
...
All in all yes, probably slightly above half the i7-2600's performance, but specific coding and optimizations should bring results of something way above what the i7 can do on a straight-up PC platform... of course, this in the following years, not at launch. However... taking into account AMD's HSA... the CPU part plus the GPU grunt work... it's computational power is going to be way above anything we see in today's PCs, heck, more FLOPs than on a 4-CPU 10-core Ivy Bridge server... (abstracting out the fact that the GPU as a pure graphics processing unit will be starved of resources in that scenario).


All in all, it's more like a mid-range or above gaming system, with the vast untapped capabilities we aren't aware of yet... (que AMD's Kaveri/Kaveri+ HSA demostrations...)
:roll: Are you new here?!? Ahahaha! That's the dumbest dumb comment I've read in a while. :laugh:

Come on, man! :shadedshu You seriously think these shitty consoles can beat any current PC, let alone the next-generation ones? A Haswell + Titan will crush any of these so-called "gaming machines" hands down. Even the developers themselves (from both platforms) have said they won't take on high-end PCs head-on, instead focusing on "good" (read: not great, let alone the best) middle-of-the-road performance for a gaming console. This time they focused more on entertainment, making the consoles a "media hub" etc. for the living room, not raw power.

By the time the consoles launch and developers get experience coding for them, Broadwell + Maxwell will be out, so these consoles stand NO CHANCE of beating PCs. Mark my words on that. :rockout:
Posted on Reply
#23
Aquinus
Resident Wat-man
by: 1c3d0g
By the time the consoles launch and developers get experience coding for them, Broadwell Maxwell will be out, so these consoles stand NO CHANCE in beating PC's. Mark my words on that.
Consoles aren't here to take over PC gaming. They did that already without having superior graphics so what's your point? PC gamers lately have been a slowly dying niche which is a shame.

Go ahead and pay 1000 USD for your Titan. Some person who could care less will probably get an Xbox One, pay a mere fraction of the cost of a full gaming rig, and still enjoy it just as much as you and not know the difference because the general user really doesn't care as much as we do here at TPU.

I guess that depends on how you look at "winning." Image quality wise PCs will be better. Cost effectiveness, market penetration and profits wise, I think consoles are winning by a pretty large margin.

by: 1c3d0g
Are you new here?!? Ahahaha! That's the dumbest dumb comment I've read in a while.
Don't call someone else's comments stupid when your post is just as bad. :shadedshu
Posted on Reply
#24
NinkobEi
by: Aquinus
Consoles aren't here to take over PC gaming. They did that already without having superior graphics so what's your point? PC gamers lately have been a slowly dying niche which is a shame.

Go ahead and pay 1000 USD for your Titan. Some person who could care less will probably get an Xbox One, pay a mere fraction of the cost of a full gaming rig, and still enjoy it just as much as you and not know the difference because the general user really doesn't care as much as we do here at TPU.

I guess that depends on how you look at "winning." Image quality wise PCs will be better. Cost effectiveness, market penetration and profits wise, I think consoles are winning by a pretty large margin.
I bet some of us TPU'ers could build a sub-$600 budget PC that is on par with the Xbone. This will be doubly true in a couple of years. And the best thing about PC is we don't have to spend $60 for a game. It is very common to buy last year's triple-A titles for $10 during a sale.

As far as PC gaming being a dying niche, well, never has that statement been less true than right now.
The PC gaming market reached $20 billion in 2012, a healthy increase of eight percent over the previous year, the PC Gaming Alliance (PCGA) revealed this week at a news conference held in San Francisco.
Now let us compare that to console sales in 2012
Video game and console sales plunged 22 percent in 2012, according to NPD Group data published by Home Media Magazine. As consumers focused their dollars on a few high-profile titles and opted for new digital services, and publishers just released fewer titles, revenue for the year totaled $13.3 billion compared to $17.0 billion in 2011. The decline more than doubled the 9 percent decrease between 2010 and 2011, reported the Los Angeles Times.
Posted on Reply
#25
AsRock
TPU addict
Sales are bound to drop due to the supposed planned release of the new consoles. And for people like myself who only play a few classy games, it's more than $60 you're paying with an Xbox, as you've got to tag the monthly fee on top, and that would be like paying $100+. That's why they need to get the new models out the door ASAP.

So if I get one of the newer systems, it would have to not have a monthly fee.

It would also be nice to know what the power usage of these newer ones is, and how much better they are.
Posted on Reply