Friday, November 2nd 2012

Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU

According to a VG 24/7 report, Sony has begun shipping development kits of its upcoming game console, the PlayStation 4, codenamed "Orbis," to developers. The kit is described as a "normal sized PC," driven by an AMD A10 "Trinity" APU and 8 or 16 GB of memory. We've known from reports dating back to April that Sony plans to use a combination of APU and discrete GPU, similar to today's Dual Graphics setups, in which the APU graphics core works in tandem with a discrete mid-range GPU. The design goal is to play games at 1920 x 1080 resolution with a 60 Hz refresh rate, with the ability to run stereo 3D at 60 Hz. For storage, the system has a combination of a Blu-ray drive and a 250 GB HDD. Sony's next-generation game console is expected to be unveiled "just before E3" 2013.


Source: VG 24/7

354 Comments on Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU

#1
sergionography
by: Benetanegia
Apples to oranges. Different ways of doing things. But in general:

- The 360 had 48 SPs, not 16.
- These SPs had 5 ALUs each. Similar, but not 100% equal, to VLIW5.
- So while it's still kind of apples to oranges, this is FAR more accurate than what you posted:

XB360 = 48 x 5 = 240 "SP" (HD2900/3800 had 320 SP, it actually had 64 VLIW5 SP)
APU = 96 x 4 = 384 SP

As you can see, not 8x faster, and not much faster than 7-year-old cards.

The PS3 suffers from the same BAD way of comparing things. RSX had 24 pixel shaders but also 8 vertex shaders, so that "equals" 32 unified Nvidia SPs; but again, as with Xenos, these could do up to 5 ops/cycle vs 2 ops/cycle in the current unified shaders. So it's more like 32 x 2.5 = 80 "SPs," versus the 128 SPs in the 8800.
Not to mention this assumes you use the same detail level.
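For anyone who wants to check the arithmetic quoted above, here's a quick sketch. The "unified SP equivalent" figure is the rough back-of-the-envelope proxy from the post, not a benchmark, and the ops-per-unit weightings are the ones asserted in the thread:

```python
# Rough "unified SP equivalent" arithmetic from the post above.
# These are illustrative conversions, not measured performance.

def sp_equivalent(units, ops_per_unit):
    """Treat each shader unit as ops_per_unit scalar ALUs."""
    return units * ops_per_unit

xbox360 = sp_equivalent(48, 5)    # Xenos: 48 units x 5 ALUs each
apu     = sp_equivalent(96, 4)    # Trinity A10: 96 VLIW4 units x 4 ALUs
rsx     = sp_equivalent(32, 2.5)  # RSX: 24 pixel + 8 vertex shaders, weighted 2.5 ops/cycle

print(xbox360)         # 240
print(apu)             # 384
print(apu / xbox360)   # 1.6 -- "not 8x faster"
```

By this crude measure the APU is roughly 1.6x the Xbox 360's shader array, which is the point being made: nowhere near 8x.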
Posted on Reply
#2
cdawall
where the hell are my stars
by: BigMack70
I'm worked up about kiddos who don't understand kindergarten math and reading comprehension. (I also got worked up earlier about people making crazy claims about what a single graphics card can do on the PC.)
It's not kindergarten math. Your little equation has zero to do with real life. If you don't believe me, play the exact same game with the exact same graphics settings at 720p and 1080p; you don't get 4 times as many frames at 720p.

by: BigMack70
It doesn't get me worked up that some people think this APU will be able to do 1080p60... I've said about a hundred times that your guess is as good as mine and we won't know till we actually see what the hardware can do. I'm skeptical and don't believe it. Others here aren't as skeptical. But it's not unreasonable either way. It depends a lot on how demanding you think the next generation of games is going to be, how well you think they can optimize for console rather than PC, and whether or not you read Sony's claims of 60fps as being an average 60fps or a minimum 60fps (and whether or not you trust Sony's word).
Why wouldn't an APU be able to play Gran Turismo at 1080p 60 FPS? A 6-year-old G70-series card can play it at 720p. If you want to get into your silly little equation, the GPU inside an A10-5800K is 4x as fast as the old G70-series card in the PS3.
Posted on Reply
#3
BigMack70
by: cdawall
It's not kindergarten math.
OK. First grade math.
Your little equation has zero to do with real life.
Sure it does. You need a hardware/software solution that can produce ~4x more pixels than at 720p30.

I never claimed it had anything to do with a specific hardware configuration, but it does have something to do with real life. We're not talking about unicorns.
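The raw pixel-rate arithmetic being argued about is easy to verify; a minimal sketch, counting pixels per second only (which is exactly the simplification under debate; note it actually comes out at 4.5x, a bit above the "~4x" figure):

```python
# Pixels per second at each target; pure counting, not a performance model.
p720_30 = 1280 * 720 * 30     # 720p at 30 fps
p1080_60 = 1920 * 1080 * 60   # 1080p at 60 fps

ratio = p1080_60 / p720_30
print(ratio)  # 4.5
```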
If you don't believe me, play the exact same game with the exact same graphics settings at 720p and 1080p; you don't get 4 times as many frames at 720p.
I've been over this a bunch of times now. I know this, and it's not relevant to my point at all, and I explained that all the way back at post #85 and have clarified further in recent posts.

Seriously guys. READ before you post. :shadedshu
Why wouldn't an APU be able to play Gran Turismo at 1080p 60 FPS? A 6-year-old G70-series card can play it at 720p. If you want to get into your silly little equation, the GPU inside an A10-5800K is 4x as fast as the old G70-series card in the PS3.
I was unaware that they were just going to re-release current-gen games on next-gen hardware with beefed-up resolution and framerate. Do you have a source suggesting that's what they're going to do?

My position on this, I think, is pretty clear. My speculation is the following:
-I take a 1080p60 + 3D claim by Sony to be a claim about either 60fps or near-60fps minimum framerates in their games.

-I assume that next gen games are going to improve graphically from current gen games and thus be more demanding

-I look to the hardware needed to run current gen games on the PC maxed out with 60fps minimum framerates at 1080p, and you need multi-GPU to do that.

-I assume that next-gen games are going to be roughly as demanding as current PC games are maxed out or near-maxed out

-I assume it's impossible to optimize an APU (or an APU + low-midrange GPU) to such a degree that it is able to do things that a 7970/680 cannot.


Could I be wrong on some or all of those counts? Sure. But they're not inherently any more unreasonable than someone who assumes the opposite and thinks that Sony can do this. We'll know in a couple years. It would be awesome if they're able to do it, but I'm not a believer yet.
Posted on Reply
#4
erocker
I think it is time to let others chime in with their thoughts. To the few people who have been posting over and over again... It is time to give it a rest. Take your arguments elsewhere.
Posted on Reply
#5
WhiteLotus
What do people think the price of the next generation consoles will be?

I've already heard that the Wii U will be sold at a loss...
Posted on Reply
#6
xenocide
by: WhiteLotus
What do people think the price of the next generation consoles will be?

I've already heard that the Wii U will be sold at a loss...
I had heard Nintendo's investors were pretty insistent that the Wii U be sold at either break-even or a slight profit, to the point of basically cutting as many corners as they could get away with. As for the PS4, I know Sony has said they will not price a console as high as the PS3 was, so I'm thinking $400 and $500 models, at a slight loss. If they skimp a little on the hardware, knock $50 off each package.
Posted on Reply
#7
Kreij
Senior Monkey Moderator
Last I read about Nintendo was this ...
Satoru Iwata, president of Nintendo, has revealed the Wii U will be sold below cost in the company's most recent financial results briefing.

"The Wii U hardware will have a negative impact on Nintendo's profits early after the launch because rather than determining a price based on its manufacturing cost, we selected one that consumers would consider to be reasonable," he said.
As for the PS4, I have no idea as we have virtually no details on anything related to it.
Posted on Reply
#8
Rei86
by: WhiteLotus
What do people think the price of the next generation consoles will be?

I've already heard that the Wii U will be sold at a loss...
The Wii U 32GB MSRP is $349.99.

I'm gonna throw it out there that the PS4 and Xbox 720 will be around the $399–$499 price points. I can't see either hardware manufacturer asking more than that. But then again, they could add value features that balloon the prices.

Also, about Nintendo: they had never sold hardware at a loss till the 3DS and the Wii U. Sony, on the other hand, has never sold the PlayStation at a profit; always at a loss that was recouped through the overwhelming sales figures.
Posted on Reply
#9
WhiteLotus
by: Rei86
The Wii U 32GB MSRP is $349.99.
Damn, really? Usually Nintendo is the cheaper option; I'm hoping they will be a little bit cheaper.
Posted on Reply
#10
xenocide
by: WhiteLotus
Damn, really? Usually Nintendo is the cheaper option; I'm hoping they will be a little bit cheaper.
Well, they couldn't keep using dated hardware, so they had to step up the cost. Plus those tablet controllers cost quite a bit. With the Wii, Nintendo made a fortune for the first few years because they were the only company selling their console at positive margins, and they were selling so many more of them than Sony and MS. But once people stopped buying Wiis, they started to suffer because they had nothing for software sales, which is where MS and Sony started making the majority of their revenue in the second half of this console cycle.

Assuming Sony and MS stick to traditional controllers it should cut at least $50 off the launch package. I'm just assuming they have beefier hardware that will drive up the price. I expect a base model PS4 at about $400--which comes with just the system and a controller--and a premium package that comes with 1 launch title, a year of PSN+ and probably a second controller, or a bigger HDD (320GB vs. the 250GB).

The new Xbox will probably be around the same price with similar hardware to the PS4, but I see them packaging all systems from launch with the Kinect which might bump the price up a bit.
Posted on Reply
#11
Rei86
by: WhiteLotus
Damn, really? Usually Nintendo is the cheaper option; I'm hoping they will be a little bit cheaper.
The controller itself costs like 50% of that machine.

by: xenocide

The new Xbox will probably be around the same price with similar hardware to the PS4, but I see them packaging all systems from launch with the Kinect which might bump the price up a bit.
The next Xbox is rumored to have Kinect 2 as part of the hardware and not as an accessory.
Posted on Reply
#12
sergionography
by: cdawall
It's not kindergarten math. Your little equation has zero to do with real life. If you don't believe me play the exact same game with the exact same graphics settings at 720P and 1080P you don't get 4 times as many frames in 720P.



Why wouldn't an APU be able to play Gran Tourismo at 1080P 60 FPS? A 6 year old G70 series card can play it at 720P. If you want to get into your silly little equation the GPU inside of an A10-5800K is 4x as fast as the old G70 series card in a PS3.
Actually, theoretical numbers are really important.
You are forgetting that resolution and GPU capability are kind of two different things: resolution is affected more by cache and memory, while detail is affected by computation power.
I hardly think running 1080p is the problem; the issue here is detail.
I am studying video game design, and the biggest concern when we do modeling is polygons. The more polygons in the models, the more GPU capability it takes, so being a good modeler means getting the best-looking shape with the least number of polygons possible.
Now, the reason you don't get 4x the performance out of the hardware is that the poly count doesn't change with resolution; only the rendering output changes to 1080p.
Not to mention the code has a lot to do with it too, of course, as well as the poly count like I mentioned.
Some of the newer games have stunning graphics and still run on old cards, or even on today's consoles, mostly due to optimizing well for the crapload of graphics shaders available today. Older games weren't designed to run on 2000+ shaders, so when you scale up to newer generations to run the older games, they don't necessarily scale linearly. But take new games optimized for current hardware and run them on older cards, and you will see the 4x-weaker card running 4x weaker (when memory bandwidth and the GPU's other specialized features are also 4x weaker).
Posted on Reply
#13
xenocide
by: Rei86
The controller itself cost like 50% of that machine.
Well, the retail cost of the controller was placed at $100–$150, so not completely inaccurate.

by: Rei86
The next Xbox is rumored to have Kinect 2 as part of the hardware and not as an accessory.
Yea it would make sense for them to bundle it as one system since they have been pushing Kinect so hard. I think it will have interesting applications in like 5 years when the sensors are accurate enough to actually measure fine movements (like individual finger movements).
Posted on Reply
#14
Rei86
by: xenocide
Well, the retail cost of the controller was placed at $100–$150, so not completely inaccurate.



Yea it would make sense for them to bundle it as one system since they have been pushing Kinect so hard. I think it will have interesting applications in like 5 years when the sensors are accurate enough to actually measure fine movements (like individual finger movements).
If you remember, Natal had that capability too, but MS was still fine-tuning its software. Even the industry insiders who were able to see Natal behind closed doors were impressed with it. The Kinect we got was a cut-down version to keep costs low.

With Kinect II, if it is truly integrated into the hardware, we'll see a big push for Kinect games since it's already part of the system, allowing MS to market it better. And I'm sure the next product we get will be more Natal than Kinect.
Posted on Reply
#15
Disruptor4
by: BigMack70
... 60fps 1080p is actually very demanding. You're not going to get 60 fps minimum framerates in all current titles without a multi-GPU setup. If you want a 60 fps minimum framerate in BF3, for example, you are either going to be using multiple GPUs or turning settings down.
I don't understand where you got your info for that. With my setup, which is single-GPU, I get well above 60 FPS with max graphics...
Posted on Reply
#16
Dent1
by: Disruptor4
I don't understand where you got your info for that. With my setup, which is single-GPU, I get well above 60 FPS with max graphics...
As much as I believe BigMack70 is a troll, to be fair, he did say minimum 60 FPS.

In BF3 you'll get average and maximum framerates way above 60 FPS. The minimum will be about 30–40 FPS regardless of setup.

But yes, I agree BigMack70 does often pull info out of his butt.
Posted on Reply
#17
mystikl
The PS3 had a custom 7800-series GPU which was slightly slower than the regular one. Remember, it was an upper-midrange GPU released one year before the PS3, so if they were to do that again, they should be aiming for something with performance similar to a midrange GPU released this year, namely a 7850.
So the PS4 having a discrete card paired with the APU makes sense; otherwise it wouldn't reach the performance target.
But this is just a theory.
Posted on Reply
#18
M3T4LM4N222
I've built an FM1 A6-3650 APU-based system and an FM2 A8-5600K system, and they both had great integrated graphics. My guess is the PS4 would be an A10 with a dedicated AMD GPU in Hybrid CrossFire mode. It'll be more than enough for console gaming, especially when you consider that the games will be 100% optimized for those specific pieces of hardware. My guess is a 65W A10 with a passive cooler plus a Hybrid CrossFire GPU with a passive cooler. Correct me if I am wrong, but I think an A10 + 6670 would be nearly equivalent to a 7750, and that would handle a decent number of PC games at 1920 x 1080 no problem. Now the only issue is that console hardware will be holding back 2560 x 1440 from becoming more mainstream. At least you can get a 2560 x 1440 IPS panel monitor for $350 online, right? :D
Posted on Reply
#19
Mussels
Moderprator
by: BigMack70
+1 to the number of people who don't understand that my point has nothing to do with hardware.

You have to churn out ~4x the pixels at 1080p60 as you do at 720p30. That's basic math.

Now, I understand (and have stated from very early on in this argument) that when you actually go look at how things perform in the real world, this breaks down and is not linear. But there are all sorts of reasons for that, and none of them change the fact that 1080p60 outputs ~4x the pixels of 720p30.

Why does it take dozens of posts to explain such a stupidly over-simple point? Go read the thread; I was making a picky point because of a lack of clarity in one of mailman's posts, and you guys have taken it to a whole other level.

Wow.
Why doesn't performance drop to 1/4 when you up the pixels by 4x?


Because it's not that simple -.-
That doesn't take into account the various hardware tweaks, lossless and lossy compression (in hardware, and in software via drivers), and a million other things. You've based this argument around something you see as simple and obvious without actually checking it yourself.


God, this thread is really full of over-simplified arguments, just one after another after another...
Posted on Reply
#21
Benetanegia
by: Mussels
Why doesn't performance drop to 1/4 when you up the pixels by 4x?


Because it's not that simple -.-
That doesn't take into account the various hardware tweaks, lossless and lossy compression (in hardware, and in software via drivers), and a million other things. You've based this argument around something you see as simple and obvious without actually checking it yourself.


God, this thread is really full of over-simplified arguments, just one after another after another...
The answer to that is actually much simpler. Performance doesn't drop because at lower resolutions the GPU is not actually performing as fast as it can; the bottleneck is elsewhere (CPU, triangle setup...). Any GPU built in the last decade is built with higher resolutions in mind. This is why ROPs, TMUs and pixel shaders have kept increasing, to the point where pixel shader counts are 100x higher than a decade ago, while triangle setup and raster engines are only 2x (AMD and Nvidia mid-range) or 4x (Fermi/Kepler high-end) higher.

A similar question would be: why is the HD 7950 the slowest card here?



Why do all cards perform nearly the same? And the answer is not anything complicated about HW & SW optimization, etc.

EDIT: Low-end cards, however...



The GTX 650 Ti, for example, goes from 38.5 fps to 23.7 fps, which is actually very close to the resolution difference of 1.77x: 38.5 / 1.77 ≈ 21.8.
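That estimate can be reproduced with a naive fill-rate-limited model. This is a sketch; the two test resolutions are an assumption (1920x1200 and 2560x1600, which differ by ~1.78x, close to the 1.77x quoted), and the model deliberately ignores everything except pixel count:

```python
# Naive model: a fully shader/fill-rate-limited card scales inversely
# with pixel count. The resolutions are assumed, not stated in the post.
low_res = 1920 * 1200
high_res = 2560 * 1600

scale = high_res / low_res
print(round(scale, 2))  # 1.78

fps_low = 38.5  # GTX 650 Ti at the lower resolution
predicted_high = fps_low / scale
print(round(predicted_high, 1))  # 21.7, vs the measured 23.7 fps
```

The measured 23.7 fps landing slightly above the naive prediction fits the argument: the low-end card is almost, but not quite, purely pixel-limited.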
Posted on Reply
#22
Mussels
Moderprator
by: Benetanegia
The answer to that is actually much simpler. Performance doesn't drop because at lower resolutions the GPU is not actually performing as fast as it can; the bottleneck is elsewhere (CPU, triangle setup...). Any GPU built in the last decade is built with higher resolutions in mind. This is why ROPs, TMUs and pixel shaders have kept increasing, to the point where pixel shader counts are 100x higher than a decade ago, while triangle setup and raster engines are only 2x (AMD and Nvidia mid-range) or 4x (Fermi/Kepler high-end) higher.

A similar question would be: why is the HD 7950 the slowest card here?

http://tpucdn.com/reviews/MSI/GTX_660_HAWK/images/skyrim_1280_800.gif

Why do all cards perform nearly the same? And the answer is not anything complicated about HW & SW optimization, etc.
The answer is simple: nothing about video game performance is simple. (Which is my problem with this 4x "fact".)
Posted on Reply
#23
Benetanegia
by: Mussels
The answer is simple: nothing about video game performance is simple. (Which is my problem with this 4x "fact".)
Sorry, but it is simple in this case. Pixels (whether final pixels, fragments, texels, or whichever step in the pipeline you want to talk about) need to be calculated and rendered, and this is done on processors*. 2x the pixel count means 2x the power required: either 2x the SP count or 2x the clock. It really is as simple as that. It's a fact.

Now, you're all talking about margins. PC graphics cards have had power to spare on nearly all fronts, lots and lots of it, for many years, and most definitely in pixel shading capability, which is why it's on this front that lower-end cards have more "leeway".

* Nowadays, I mean; any game engine in the past 5 years does everything on a per-pixel basis. There's no real escape from the law: more pixels, more power required. Some games, especially on consoles, "avoid" this law by rendering some elements at lower resolutions. For example, rendering the lighting pass(es) at 1/2 or 1/4 resolution is very common. This is a workaround, not a breaking of the law: if output resolution is increased from 720p to 1080p but everything else in the fragment data is kept at the same resolution, resolution has not really increased by as much as stated.
Posted on Reply
#24
MuhammedAbdo
Sigh

(SIGH)

This thread is full of ignorant people making false and misguided statements all along; the only people here with brains are Benetanegia and BigMack70.

First, you morons need to read some facts. Here they are:

FACT 1: Not even high-end PCs can sustain 60 FPS @ 1080p in demanding games with maximum graphics, games like Metro 2033, ARMA 3, Crysis, Dragon Age 2 and many, many more; there will be drops below 60 FPS, and on many occasions.

FACT 2: Consoles include an insane amount of code optimization, where every CPU/GPU cycle is utilized. They practically run on machine code, the lowest level of software programming, contrary to the PC, which sports many compilers and higher-level languages that waste valuable cycles.

FACT 3: Even with these optimizations, consoles run with shitty graphics. They can't even maintain 30 FPS at 720p; they usually drop to 25 and 20 FPS, and they even run at sub-1280x720 resolutions, sometimes as low as 900x600!

FACT 4: Consoles cut down on graphics severely. They decrease shadow density, lighting effects, level of detail, polygon count, alpha effects, texture resolution, texture filtering, anti-aliasing, post-processing and so many things that I can't even remember them all.

FACT 5: A console with the triple specs of
1- AMD CPU
2- AMD APU
3- AMD GPU (low-end/6670)

will barely run today's games at 1080p @ 60 FPS with PC graphics levels. All of the code optimizations will be spent covering the cost of the resolution increase (to 1080p) and the cost of the graphics increase (to PC level), such as shadows, lighting, textures, etc.

If the specs were changed and the console came with a high-end or even a mid-range AMD GPU, then the situation would be different.

FACT 6: These consoles will have to do the usual dirty business to be able to run at 1080p: cut resolution and upscale, and decrease all graphics elements (lighting, shadows, textures, etc.) below the future PC level. This happened to the previous generation too; the Xbox 360 and PS3 started out operating at 720p just fine, then they had to cut corners to increase graphics, otherwise the visuals would have stalled.

FACT 7: PCs will maintain higher visual quality and frame rates; consoles will have the graphics of a two-year-old PC.

End of Discussion.
Posted on Reply
#25
DaMobsta
298 posts and yet no Intel fanboys bashing the AMD article. Go TPU! :toast:

I have a bad feeling about using Hybrid CrossFire on a console though; even if they optimize it, that's still an extra piece of hardware that could cause problems.
Posted on Reply