Friday, November 2nd 2012
Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU
According to a VG 24/7 report, Sony has begun shipping development kits of its upcoming game console, PlayStation 4, codenamed "Orbis," to developers. The kit is described as a "normal sized PC," driven by an AMD A10 "Trinity" APU with 8 or 16 GB of memory. We've known from reports dating back to April that Sony plans to use a combination of APU and discrete GPU, similar to today's Dual Graphics setups, in which the APU's graphics core works in tandem with a discrete mid-range GPU. The design goal is to play games at 1920 x 1080 resolution with a 60 Hz refresh rate, with the ability to run stereo 3D at 60 Hz. For storage, the system combines a Blu-ray drive with a 250 GB HDD. Sony's next-generation game console is expected to be unveiled "just before E3" 2013.
Source:
VG 24/7
354 Comments on Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU
the A10 APU is at least 8x the rendering performance of the current ps3 hardware...
384 SPs vs the 16 in the 360 (which is comparable to the PS3) = 24x the general shading performance. I don't see how there would be a problem rendering only 4x as many pixels...
- The 360 had 48 SP not 16.
- Those SPs each had 5 ALUs, similar but not 100% identical to VLIW5.
- So while it's still kind of apples to oranges, this is FAR more accurate than what you posted:
XB360 = 48 x 5 = 240 "SP" (the HD 2900/3800 had 320 SP, i.e. 64 VLIW5 units)
APU = 96 x 4 = 384 SP
As you can see, not 8x faster, and not much faster than 7-year-old cards.
The PS3 suffers from the same BAD way of comparing things. The RSX had 24 pixel shaders, but also 8 vertex shaders, so that "equals" 32 unified NVIDIA SPs. But again, as with Xenos, these could do up to 5 ops/cycle vs. 2 ops/cycle in current unified shaders. So it's more like 32 x 2.5 = 80 "SP," versus the 128 SPs in the 8800.
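The back-of-envelope "effective SP" arithmetic above can be made explicit. The ops-per-unit figures below are the ones cited in the post, not an official metric; this is just the same math written out:

```python
# Rough effective-shader comparison using the per-unit figures
# quoted in the post above (a hypothetical normalization, not
# an official spec).

def effective_sp(units, ops_per_unit):
    """Effective shader count: hardware units x ops per cycle."""
    return units * ops_per_unit

xenos = effective_sp(48, 5)      # Xbox 360 Xenos: 48 units, 5 ALUs each
trinity = effective_sp(96, 4)    # A10 "Trinity": 96 VLIW4 units
rsx = effective_sp(32, 2.5)      # PS3 RSX: 24 pixel + 8 vertex shaders

print(xenos, trinity, rsx)       # 240 384 80.0
print(trinity / xenos)           # 1.6 -- nowhere near 8x
```

Even before clock speeds and architectural efficiency, the raw unit counts only support roughly a 1.6x gap between Trinity and Xenos, which is the point being made.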
Between the nexbox and the PS4, I've long felt the dev kits were something like 256 + 1024 (salvage Trinity + a 7850-ish GPU), eventually moving to 384 + 896 SP (a fully-working 28 nm Trinity + an example of what an 8770 could be). There are other possible configs, of course, but something like that.
Oh look, the final dev kit comes when 8000 launches...who would've thought? Wonder why? :rolleyes:
896 SP/16 ROPs on a 128-bit bus could run 950/6000 very efficiently, for example, and be very, very close to the average performance of a 7850, which on average will net you about ~45 fps at 1080p. With the APU adding perhaps ~40% more resources, you're very much at the 1080p60 level for most titles, assuming they find a way to make them operate seamlessly.
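That estimate can be sketched in two lines, treating the ~45 fps baseline and the ~40% APU contribution above as assumptions:

```python
# Hypothetical throughput estimate: a discrete-GPU baseline plus ~40%
# extra resources from the APU, assuming near-perfect scaling
# (a big "if", as the post notes).
base_fps = 45        # assumed 7850-class average at 1080p
apu_share = 0.40     # assumed extra resources contributed by the APU

combined_fps = round(base_fps * (1 + apu_share))
print(combined_fps)  # 63 -- comfortably around the 1080p60 target
```

Real Dual Graphics setups rarely scale this cleanly, so 63 fps is an optimistic ceiling rather than a prediction.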
The hardware designed for consoles is specialized and commissioned to suit a specific workload. Consoles are min-maxed gaming machines that use the absolute lowest common denominator of hardware to achieve optimized results. Whatever they select may be based on an existing architecture, but it will almost certainly have its own specialized CPU and GPU suited to the console's intended purpose. A target BASELINE of 1920x1080 at 60 Hz is not impressive to us PC users, but this system is designed for TV play. That limitation lets the system use hardware built specifically to drive those pixels with the utmost efficiency, which actually allows less powerful hardware to achieve better results than it would in a PC environment. So for the most part this won't push the console market past TV-HD, and honestly it may be a short-lived bump. Console makers can supply the masses with something cheap, which drives a lot of game sales; keeping the console market strong has got to be a balancing act for these industry giants. I bet the cost per console will be about $150, but they will sell for about $400, something they weren't able to achieve with the 360 or PS3. For once, making a console might net them a profit.
If you don't have all three of them (PS3, Xbox, and PC), why argue?
On Topic: I think people will be surprised with the results of these APUs in the coming consoles. ;)
The 2600 XT gets 933 in 3DMark Vantage; the 6970 gets 21k. If Trinity weren't bandwidth-constrained, it would get more than 5k, which is almost a 6x performance increase, and that's on an outdated and inefficient software stack. 8x the performance is what Sony quotes, I think, which, given Moore's law, is very easily done.
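The ratios implied by those quoted scores, made explicit (note the 5k Trinity figure is the commenter's hypothetical, not a measured result):

```python
# 3DMark Vantage GPU-score ratios from the figures quoted above.
hd2600xt_score = 933
hd6970_score = 21000
trinity_est = 5000  # assumed score if Trinity weren't bandwidth-limited

print(round(hd6970_score / hd2600xt_score, 1))  # 22.5x generational gap
print(round(trinity_est / hd2600xt_score, 1))   # 5.4x -- "almost 6x"
```

So the claimed ~6x uplift over 2600 XT-class hardware is consistent with the quoted scores, while an 8x jump would require roughly a 7,500-point result from the same chip.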
And I know that those images were from another site. They're all heavily downscaled, even on the original site. Not 1080p. Therefore irrelevant.
I already told you what you need to do if you want to compare real vs upscaled resolution. I don't see you doing it...
Like I said, you don't seem to know what you're talking about.
Here's an example of a native 1080p screenshot:
And here's that same screenshot upscaled from 720p to 1080p:
HUGE difference.
As for whether the PS4 will be doing 1080p60, that's anyone's guess. Some think no (myself among them), others think yes. There's no way to know who's right until the hardware is released.
How many times do I have to say this...
Read my posts #1, #2, #3, and #4. I never mentioned your word "rendering". Read carefully: I was arguing about how it shows on the TV. Nonetheless, the images were provided to show the differences between these three platforms. Great, now I'm forced to do something. Maybe you know more, sir, but that doesn't justify your attitude towards another member. Okay, what differences? Image quality? Jaggies? Colour?
When you see the ratios finally perfected in Kepler: 14 shading + SFU units per TMU (slight overkill on texturing, but better than under-provisioning like AMD), an optimal amount of smaller SFUs relative to true shaders (32:192, aka perfect; no idea if it's an overall better use of space than AMD doing it in the shader), or the total amount of shaders/SFUs per array corresponding with ROPs (224 total units; around 230 is the sweet spot for scaling with 4 ROPs, pretty much perfect given how an array has to be set up)... it really makes you wonder where all the mad scientists at ATi went. They used to own that turf; now all they have is shader density/flexibility, which granted helps a metric ton, but still, it's a carry-over from years and years ago.
I remember when I used to discuss this stuff on B3D...those were the days.
Ok point taken about what the PS3 "shows". It "shows" a 1080p image that is upscaled from a 720p source (when in game).
But you do realize that the screenshots you posted were all downscaled below what "shows" on the TV, and therefore are irrelevant, right?
If you can't see a difference between the two screens I posted, you probably need glasses.
cinavia makes me want to smash my ps3 sometimes :mad:
I don't know why I keep having to state the obvious: it's basic math. In the case of this argument, I am not and never was talking about PS3 vs PS4 or 2006 vs 2012 hardware. I'm making a picky point about basic kindergarten level math.
I'm absolutely amazed so many people have failed to understand that. :shadedshu
One of the best examples of this is ICO for the PS2. The ways they got that game to look the way it did on that system were genius, and so specific that the team which remade the game in HD had to re-code the engine to support more resolutions and objects displayable on screen.