Friday, November 2nd 2012

Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU

According to a VG 24/7 report, Sony has begun shipping development kits of its upcoming game console, the PlayStation 4, codenamed "Orbis," to developers. The kit is described as a "normal sized PC" driven by an AMD A10 "Trinity" APU and 8 or 16 GB of memory. We've known from reports dating back to April that Sony plans to use a combination of APU and discrete GPU, similar to today's Dual Graphics setups, in which the APU's graphics core works in tandem with a discrete mid-range GPU. The design goal is to play games at 1920 x 1080 resolution with a 60 Hz refresh rate, with the ability to run stereo 3D at 60 Hz. For storage, the system combines a Blu-ray drive with a 250 GB HDD. Sony's next-generation game console is expected to be unveiled "just before E3" 2013.


Source: VG 24/7

354 Comments on Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU

#1
esrever
ps3 has a 7900m...

the A10 APU is at least 8x the rendering performance of the current ps3 hardware...
384SPs vs the 16 in the 360 which is comparable to the ps3 = 24x the general shading performance. I don't see how there would be a problem rendering only 4x as many pixels...
Posted on Reply
#2
BigMack70
by: esrever
the A10 APU is at least 8x the rendering performance of the current ps3 hardware...
384SPs vs the 16 in the 360 which is comparable to the ps3 = 24x the general shading performance. I don't see how there would be a problem rendering only 4x as many pixels...
We'll see before too long...
Posted on Reply
#3
esrever
by: BigMack70
We'll see before too long...
I am pretty sure the end resulting console will have different memory management and might even have dedicated on die ram so I wouldn't be surprised if it can do 1080p 60fps average with 4x AA.
Posted on Reply
#4
Benetanegia
by: esrever
384SPs vs the 16 in the 360 which is comparable to the ps3 = 24x the general shading performance. I don't see how there would be a problem rendering only 4x as many pixels...
Apples to oranges. Different ways of doing things. But in general:

- The 360 had 48 SP not 16.
- These SPs had 5 ALUs. Similar but not 100% equal to VLIW5.
- So while it's still kinda apples to oranges this is FAR more accurate than what you posted:

XB360 = 48 x 5 = 240 "SP" (HD2900/3800 had 320 SP, it actually had 64 VLIW5 SP)
APU = 96 x 4 = 384 SP

As you can see, it's not 8x faster, and not much faster than 7-year-old cards.

The PS3 suffers from the same BAD way of comparing things. RSX had 24 pixel shaders, but also 8 vertex shaders, so that "equals" 32 unified Nvidia SP, but again, like with Xenos, these could do up to 5 ops/cycle vs 2 ops/cycle in current unified shaders. So it's more like 32 x 2.5 = 80 "SP", versus the 128 SPs in the 8800.
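For anyone who wants to check the arithmetic, the "effective SP" figures above work out like this (a quick sketch using the poster's estimates, not official specs):

```python
# Back-of-envelope "effective SP" comparison using the figures from the post.
# These are forum estimates, not official specifications.
xb360_sp = 48 * 5        # Xenos: 48 shaders x 5 ALUs = 240 "SP"
apu_sp = 96 * 4          # Trinity A10: 96 VLIW4 units x 4 ALUs = 384 SP
ps3_sp = int(32 * 2.5)   # RSX: (24 pixel + 8 vertex) x ~2.5 op ratio = 80 "SP"

print(apu_sp / xb360_sp)  # 1.6 -- nowhere near the claimed 8x over the 360
print(apu_sp / ps3_sp)    # 4.8 -- vs the PS3's RSX, by the same estimate
```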
Posted on Reply
#5
alwayssts
Been saying this would be the case... comparable to 1080p @ 60fps on contemporary PC titles at launch.

Between the nexbox and the ps4, I've long felt the dev kits were something like 256 + 1024 (salvage trinity + 7850ish gpu) eventually moving to 384 + 896sp (28nm fully-working trinity + an example of what 8770 could be). There are other possible configs of course, but something like that.

Oh look, the final dev kit comes when 8000 launches...who would've thought? Wonder why? :rolleyes:

896 SP/16 ROPs on a 128-bit bus could run 950/6000 very efficiently, for example... and be very, very close to the average potency of a 7850... which on average is going to net you ~45fps at 1080p. With the APU adding perhaps ~40% more resources, you're very much at the 1080p60 level for most titles... assuming they find a way to make them operate seamlessly.
Posted on Reply
#6
Binge
Overclocking Surrealism
The number of completely uneducated and uninformed individuals making statements other than "Wow! I wonder how this will work out" is staggering. The people I'm referring to know who they are.

The hardware designed for consoles is specialized and commissioned to suit a specific workload. Consoles are min-maxed gaming machines, using the absolute lowest common denominator of hardware to achieve optimized results. Whatever Sony selects may be based on an existing architecture, but it will most certainly have its own specialized CPU and GPU to suit the console's intended purpose. A target BASELINE performance of 1920x1080 at 60 Hz is not impressive to us PC users, but this is designed for TV play. That limitation lets the system drive those pixels with the utmost efficiency, which allows less powerful hardware to achieve better results than it would in a PC environment... so for the most part this won't evolve the console market past TV-HD, and honestly it may be a short-lived bump.

The console makers have to supply the masses with something cheap that drives a lot of game sales and keeps the console market strong; it's got to be a balancing act for these industry giants. I bet the cost per console will be about $150 but they will sell for about $400, something they weren't able to achieve with the 360 or PS3. For once, making a console might net them a profit.
Posted on Reply
#7
1d10t
by: BigMack70
You posted downscaled pictures which have no relevance to anything in this topic and which definitely have no relevance to "how it renders" or "how it shows" because you're doing something completely opposite - rather than upscaling the low resolution images, you're downscaling the high resolution one (though you, for good measure, downscaled them all :laugh:).

If you want to compare upscaled with real resolution, then you need to post two pictures - one with dimensions 1080p which was originally rendered at 720p and then upscaled, and then that same picture with dimensions 1080p but which was natively rendered at 1080p. Alternatively, if you have a 1080p monitor, you could just look at a picture in 720p zoomed in to full screen vs that same picture at 1080p with no zooming in.

There's a big difference.

I don't think you know what you're talking about.
I never posted any downscaled image. I quoted from another site and showed the source. My next post was a camera capture showing the 1080p60Hz info on the TV, to counter your opinion about the PS3 lacking 1080p capability. You're suggesting a proper method for comparing the two, but quote any of my previous posts: did I ever mention "rendered"?

If you don't have all three of them, PS3, Xbox and PC, why argue?
Posted on Reply
#8
LiNKiN
Staff
Hmm. What I gather from this thread is a bunch of arguing over gfx capability and or processing/horsepower. In my opinion hardware does not mean a thing if the game(s) made for that hardware suck balls. No wonder developers push games with graphics over gameplay. Makes me want to dust off the old NES, SNES, or PS1 and relish the glory days of gaming.

On Topic: I think people will be surprised with the results of these APUs in the coming consoles. ;)
Posted on Reply
#9
esrever
by: Benetanegia
Apples to oranges. Different ways of doing things. But in general:

- The 360 had 48 SP not 16.
- These SPs had 5 ALUs. Similar but not 100% equal to VLIW5.
- So while it's still kinda apples to oranges this is FAR more accurate than what you posted:

XB360 = 48 x 5 = 240 "SP" (HD2900/3800 had 320 SP, it actually had 64 VLIW5 SP)
APU = 96 x 4 = 384 SP

As you can see, it's not 8x faster, and not much faster than 7-year-old cards.

The PS3 suffers from the same BAD way of comparing things. RSX had 24 pixel shaders, but also 8 vertex shaders, so that "equals" 32 unified Nvidia SP, but again, like with Xenos, these could do up to 5 ops/cycle vs 2 ops/cycle in current unified shaders. So it's more like 32 x 2.5 = 80 "SP", versus the 128 SPs in the 8800.
Guess I remembered wrong, but that is not correct either. The inefficiencies in the original R600 design made its performance extremely low. The Trinity A10 is exactly 1/4 of a 6970, and the 360 performs like a 2600 XT.
The 2600 XT gets 933 in 3DMark Vantage; the 6970 gets 21k. If Trinity weren't bandwidth-constrained it would get more than 5k, which is almost a 6x performance increase, and that's coming from an outdated and inefficient software system. 8x the performance is what Sony quotes, I think, which, given Moore's law, is very easily done.
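A quick sketch of that scaling estimate, using the scores quoted in the post (forum figures, not independently verified benchmarks):

```python
# Scaling estimate based on the 3DMark Vantage scores quoted in the post.
# The 2600 XT stands in for XB360-class performance, per the poster.
hd2600xt = 933           # quoted Vantage score
hd6970 = 21000           # quoted Vantage score ("21k")
trinity = hd6970 / 4     # "the Trinity A10 is exactly 1/4 of a 6970"

print(trinity)               # 5250.0 -- "more than 5k"
print(trinity / hd2600xt)    # ~5.6 -- the "almost 6x" increase claimed
```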
Posted on Reply
#10
BigMack70
by: 1d10t
I never posted any downscaled image. I quoted from another site and showed the source. My next post was a camera capture showing the 1080p60Hz info on the TV, to counter your opinion about the PS3 lacking 1080p capability. You're suggesting a proper method for comparing the two, but quote any of my previous posts: did I ever mention "rendered"?

If you don't have all three of them, PS3, Xbox and PC, why argue?
I already addressed your claims of the PS3 rendering at 1080p. It's nonsense. Go look at the link I posted that shows you the resolution PS3 renders various games in. 99.9% of them are upscaled 720p.

And I know that those images were from another site. They're all heavily downscaled, even on the original site. Not 1080p. Therefore irrelevant.

I already told you what you need to do if you want to compare real vs upscaled resolution. I don't see you doing it...

Like I said, you don't seem to know what you're talking about.
Posted on Reply
#11
Frick
Fishfaced Nincompoop
by: lyndonguitar
thing is, 1080p is shit, now that they got 4k Monitors,
No it isn't. Stupid argument.
Posted on Reply
#12
BigMack70
Here 1d10t, let me do your job for you and we'll take a look at 720p upscaled vs 1080p...

Here's an example of a native 1080p screenshot:


And here's that same screenshot upscaled from 720p to 1080p:


HUGE difference.
Posted on Reply
#13
Dent1
by: BigMack70


Read the thread? Maybe posts like #127, 130, and especially 134.
So 3 posts out of 189, you are still losing :)
Posted on Reply
#14
Ravenas
by: TheMailMan78
If I were you I would be mad also about the Linux thing. :toast: However with that being said you are not the average user. Most people are very happy with the PS3.



No. It would need to work 4x more efficient. Something that's been done over the past 7 YEARS.
Agreed. The PS3 is a very good all around entertainment system. Nothing Sony has done has "screwed" any of their customers in respect to the PS3.
Posted on Reply
#15
BigMack70
by: Dent1
So 3 posts out of 189, you are still losing :)
Not on that particular argument... maybe read the thread? Notice anyone still trying to argue that 1080p60 isn't 4x as demanding as 720p30? Your original post just reflects a lack of reading comprehension as even Phenom realized that my argument was correct.

As for whether the PS4 will be doing 1080p60, that's anyone's guess. Some think no, myself among them; others think yes. No way to know who's right until the hardware is released.
Posted on Reply
#16
TheGuruStud
I vote that a mod deletes everything except the screenshots lol
Posted on Reply
#17
3870x2
by: LiNKiN
Hmm. What I gather from this thread is a bunch of arguing over gfx capability and or processing/horsepower. In my opinion hardware does not mean a thing if the game(s) made for that hardware suck balls. No wonder developers push games with graphics over gameplay. Makes me want to dust off the old NES, SNES, or PS1 and relish the glory days of gaming.

On Topic: I think people will be surprised with the results of these APUs in the coming consoles. ;)
Can't disagree with you there.
Posted on Reply
#18
1d10t
by: BigMack70
I already addressed your claims of the PS3 rendering at 1080p. It's nonsense. Go look at the link I posted that shows you the resolution PS3 renders various games in. 99.9% of them are upscaled 720p
*sigh*
How many times do I have to say this...

Read my posts #1, #2, #3, #4. I never mentioned your word "rendering". Read carefully: I argued about how it shows on the TV, nonetheless.
And I know that those images were from another site. They're all heavily downscaled, even on the original site. Not 1080p. Therefore irrelevant.
The images were provided to show the differences between these three platforms.
I already told you what you need to do if you want to compare real vs upscaled resolution. I don't see you doing it...

Like I said, you don't seem to know what you're talking about.
Great, now I'm forced to do something. Maybe you know more, sir, but that doesn't justify your attitude towards another member.

by: BigMack70
Here 1d10t, let me do your job for you and we'll take a look at 720p upscaled vs 1080p...

Here's an example of a native 1080p screenshot:
http://images.eurogamer.net/articles//a/1/3/3/2/5/9/2/1080p1_pc.jpg.jpg

And here's that same screenshot upscaled from 720p to 1080p:
http://images.eurogamer.net/articles//a/1/3/3/2/5/9/2/new1080p1.bmp.jpg

HUGE difference.
Okay, what differences? Image quality? Jaggies? Colour?
Posted on Reply
#19
alwayssts
by: Benetanegia
Apples to oranges. Different ways of doing things. But in general:

- The 360 had 48 SP not 16.
- These SPs had 5 ALUs. Similar but not 100% equal to VLIW5.
- So while it's still kinda apples to oranges this is FAR more accurate than what you posted:

XB360 = 48 x 5 = 240 "SP" (HD2900/3800 had 320 SP, it actually had 64 VLIW5 SP)
APU = 96 x 4 = 384 SP

As you can see, it's not 8x faster, and not much faster than 7-year-old cards.

The PS3 suffers from the same BAD way of comparing things. RSX had 24 pixel shaders, but also 8 vertex shaders, so that "equals" 32 unified Nvidia SP, but again, like with Xenos, these could do up to 5 ops/cycle vs 2 ops/cycle in current unified shaders. So it's more like 32 x 2.5 = 80 "SP", versus the 128 SPs in the 8800.
Ah yes... Xenos. The chip with an average texture ratio of 15 unified shaders:1 TMU... more optimal than even the 16:1 or 14:1 shader:TMU ratios in use today (a big reason Radeons get crapped on for texturing). Odd, since it has been that way since the dawn of DX10... and this preceded that. Sometimes you wonder about those 'experimental' chips like Xenos that were kind of crazy brilliant. Why give it unneeded ROPs? Why give it unneeded bandwidth? Why give it unbalanced texture ability? Then... they create R600. WTF.

When you see the ratios finely and finally perfected in Kepler: 14 shaders + SFU : 1 TMU (slight overkill on texturing, but better than under, like AMD), an optimal amount of smaller SFUs compared to true shaders (32:192... aka perfect... no idea if it's an overall better use of space than AMD doing it in-shader), and the total amount of shaders/SFUs per array corresponding with ROPs (224 total units... around 230 is the sweet spot for scaling with 4 ROPs... pretty much perfect given how an array has to be set up)... it really makes you wonder where all the mad scientists at ATI went. They used to own that turf... now all they have is shader density/flexibility... which granted helps a metric ton, but that's still a carry-over from years and years ago.

I remember when I used to discuss this stuff on B3D...those were the days.
Posted on Reply
#20
BigMack70
@1d10t

Ok point taken about what the PS3 "shows". It "shows" a 1080p image that is upscaled from a 720p source (when in game).

But you do realize that the screenshots you posted were all downscaled below what "shows" on the TV, and therefore are irrelevant, right?

If you can't see a difference between the two screens I posted, you probably need glasses.
Posted on Reply
#21
3870x2
by: BigMack70
Not on that particular argument... maybe read the thread? Notice anyone still trying to argue that 1080p60 isn't 4x as demanding as 720p30? Your original post just reflects a lack of reading comprehension as even Phenom realized that my argument was correct.
You ever argue with a brick wall? We all just did. Just when we thought we were going to convince you, we realized that a brick wall is an inanimate object.
Posted on Reply
#22
OneCool
I hope Sony wises the F up and does something about its DRM.

Cinavia makes me want to smash my PS3 sometimes :mad:
Posted on Reply
#23
BigMack70
by: 3870x2
You ever argue with a brick wall? we all just did. Just when we thought we were going to convince you, we realized that a brick wall is an inanimate object.
So you argue that 720p --> 1080p (2.25x) along with 30fps --> 60fps (2x) somehow yields something other than a ~4x increase in computational difficulty?

I don't know why I keep having to state the obvious: it's basic math. In this argument I am not, and never was, talking about PS3 vs PS4 or 2006 vs 2012 hardware. I'm making a picky point about basic kindergarten-level math.
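For the record, the pixel-throughput arithmetic being argued over works out like this:

```python
# Pixels pushed per second at 720p/30fps vs 1080p/60fps.
px_720p30 = 1280 * 720 * 30     # 27,648,000 pixels/s
px_1080p60 = 1920 * 1080 * 60   # 124,416,000 pixels/s

print(px_1080p60 / px_720p30)   # 4.5 -- roughly the "~4x" in question
```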

I'm absolutely amazed so many people have failed to understand that. :shadedshu
Posted on Reply
#24
Binge
Overclocking Surrealism
Yup... completely ignored. Why are the non-engineers/programmers arguing over specs which may not even appear in the end system? These chips aren't the same ones you can buy for PCs, and the specs have less to do with performance than how well the software is written for the platform...

One of the best examples of this is ICO for the PS2. The ways they got that game to look the way it did on that system were genius, and so specific that the team which remade the game in HD had to re-code the engine to support more resolutions and objects displayable on screen.
Posted on Reply