Friday, November 2nd 2012

Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU

According to a VG 24/7 report, Sony has begun shipping development kits for its upcoming game console, the PlayStation 4, codenamed "Orbis," to developers. The kit is described as a "normal sized PC," driven by an AMD A10 "Trinity" APU and 8 or 16 GB of memory. We've known from reports dating back to April that Sony plans to use a combination of an APU and a discrete GPU, similar to today's Dual Graphics setups, in which the APU's graphics core works in tandem with a discrete mid-range GPU. The design goal is to play games at 1920 x 1080 resolution with a 60 Hz refresh rate, with the ability to run stereo 3D at 60 Hz. For storage, the system pairs a Blu-ray drive with a 250 GB HDD. Sony's next-generation game console is expected to be unveiled "just before E3" 2013.
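Those design targets imply a concrete pixel-throughput budget. A quick sanity check, with the resolution and refresh figures taken from the report above and everything else plain arithmetic:

```python
# Pixel throughput implied by the reported design targets
# (width * height * refresh rate; x2 for the two eye views in stereo 3D).
WIDTH, HEIGHT, HZ = 1920, 1080, 60

pixels_per_frame = WIDTH * HEIGHT            # 2,073,600 pixels per frame
mono_throughput = pixels_per_frame * HZ      # pixels/second at 1080p60
stereo_throughput = mono_throughput * 2      # stereo 3D renders each frame twice

print(mono_throughput)    # 124416000
print(stereo_throughput)  # 248832000
```

So stereo 3D at 60 Hz asks for roughly a quarter of a billion shaded pixels per second before any overdraw, which is why the rumored APU-plus-discrete-GPU pairing matters.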


Source: VG 24/7
Add your own comment

354 Comments on Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU

#1
THE_EGG
As far as the 4X performance needed for 1080p @ 60 fps and all the other arguments about the next PlayStation go, CAN WE JUST AGREE TO DISAGREE?
Posted on Reply
#2
Rei86
by: Benetanegia

Anyway, the Wii U is rumored to have a significantly more powerful GPU than the A10 APU. Is Sony truly going to release something less capable? :laugh:
Most of the Wii U's spec is out.

2 GB of shared RAM, with 1 GB always dedicated to games; a three-core Broadway-derived CPU; and a GPU on the same package, based on AMD's HD 4000/5000 series.
Posted on Reply
#3
erocker
After having to clean things up once again, frankly I'm fed up with this thread. So, if anyone feels like ignoring the posting guidelines, insulting others, trolling, going off topic, or ignoring a moderator's instructions, you will receive a vacation for a while. This goes for everyone in this thread. I'm making no exceptions. This is your only warning.
Posted on Reply
#4
jamsbong
Sounds to me like very average hardware specs. This is so un-PlayStation-like.
All past PlayStation hardware was ahead of its time at launch. It took PCs a couple of years to catch up to the PS3.

The PS4 looks more like a cost-cutting exercise.
Posted on Reply
#5
Mussels
Moderprator
why do people keep insisting that the last gen consoles had high end hardware at launch? they had derivatives of high end PC hardware, but that's not the same thing at all. the power consumption of the consoles as a whole is lower than that of the high end video cards of that era, which really should give that fact away.

consoles never launch with high end PC hardware. this launch is no exception - and it's generations faster than what we have now.
Posted on Reply
#6
happita
Let's all look on the bright side. Console gamers can finally enjoy what PC gamers have been spoiled with for a long time. I can't wait to see what developers can come up with using a DirectX 11-capable GPU, when this generation of consoles only had DX9-capable cards. It will no doubt be exciting for everyone. I'm tired of half-assed ports looking like shit and having all sorts of bugs because a developer didn't care enough to spend the hours needed for a fantastic gameplay experience. There are plenty of them out there, unfortunately.
Posted on Reply
#7
Mussels
Moderprator
i'm just glad the console ports will be native DX11 this time around. can't we all be happy with that?
Posted on Reply
#8
Rei86
by: Mussels
why do people keep insisting that the last gen consoles had high end hardware at launch? they had derivatives of high end PC hardware, but that's not the same thing at all. the power consumption of the consoles as a whole is lower than that of the high end video cards of that era, which really should give that fact away.

consoles never launch with high end PC hardware. this launch is no exception - and it's generations faster than what we have now.
Eh, the PlayStation 3 was pretty high-end tech gear when it launched.

Its own HDD, a built-in Blu-ray player, multiple outputs, four USB 2.0 ports, a memory card reader, etc.
And the STI-produced Cell CPU was pretty high end for the time.
Posted on Reply
#9
Benetanegia
by: Mussels
why do people keep insisting that the last gen consoles had high end hardware at launch? they had derivatives of high end PC hardware, but that's not the same thing at all. the power consumption of the consoles as a whole is lower than that of the high end video cards of that era, which really should give that fact away.

consoles never launch with high end PC hardware. this launch is no exception - and it's generations faster than what we have now.
Yeah, but in order to be the same it would be something like an underclocked GTX 660 Ti or HD 7950, not an APU that is 5x less powerful; we are talking about the other side of the spectrum altogether. And no, an APU is just 1 or 2 generations faster than what the Xbox 360 had, for example, and it's not launching now but at the end of 2013 or in 2014, which is why I'm sure it will have something like an underclocked HD 7850. And just like they did before, once a smaller process is available they will integrate it in SoC fashion.
Posted on Reply
#10
xenocide
by: Rei86
Its own HDD, BR player built in, multi output, four USB 2.0, memory card reader, etc
And the STI produced Cell CPU was pretty high end for the time.
Most of those things had already existed on PCs for years; the only cutting-edge feature was Blu-ray. The Cell CPU was really impressive on paper, but the simple fact is it underwhelmed. The best analogy I can think of is Bulldozer--people assumed that when AMD went from 4/6 Phenom II cores to an 8-core Bulldozer they'd see a huge performance gain, but because per-core/per-thread performance was down so much, it was a huge disappointment. The reason Cell seemed so impressive was that it had a ton of cores; the problem was that at the time most developers had only handled 1-4 threads at a time, so developing software for 7 all of a sudden was a nightmare. Couple that with the fact that the per-core performance is pretty low, and you see exactly why Sony isn't reusing it. They use Cell technology in TVs now, btw.
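The jump described there, from juggling 1-4 threads to keeping 7 cores busy at once, can be pictured with a plain fan-out sketch. This is generic Python, not Cell code; the 7-way chunking is only an illustration of the kind of work splitting developers suddenly had to do:

```python
from concurrent.futures import ThreadPoolExecutor

def worker(chunk):
    # Stand-in for per-core work; a real engine would run physics, audio, etc.
    return sum(chunk)

data = list(range(700))
# Fan the work out across 7 workers, mirroring the PS3's 7 usable SPEs.
chunks = [data[i::7] for i in range(7)]
with ThreadPoolExecutor(max_workers=7) as pool:
    partials = list(pool.map(worker, chunks))
total = sum(partials)
print(total)  # 244650, same answer as the single-threaded sum
```

The hard part in practice wasn't the fan-out itself but deciding how to carve game work into chunks that keep all the cores busy, which is exactly why low per-core performance hurt so much.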

by: Benetanegia
Yeah, but in order to be the same it would be something like an underclocked GTX 660 Ti or HD 7950, not an APU that is 5x less powerful; we are talking about the other side of the spectrum altogether. And no, an APU is just 1 or 2 generations faster than what the Xbox 360 had, for example, and it's not launching now but at the end of 2013 or in 2014, which is why I'm sure it will have something like an underclocked HD 7850. And just like they did before, once a smaller process is available they will integrate it in SoC fashion.
This is true. This generation's consoles launched with one-generation-old GPUs that had features of their successors. The Xenos was basically a streamlined X1950XT with some of the features that appeared in the 2900 series, and similarly the PS3's GPU was 7800-based with some features from the 8800 series. The saving grace for the Xenos was its incredible use of memory and its implementation of eDRAM. For this upcoming generation of consoles to impress, it would have to use, as he said, a 7800-series or 660 GPU, which is nearly impossible.
Posted on Reply
#11
Rei86
by: xenocide
Most of those things had already existed on PCs for years; the only cutting-edge feature was Blu-ray. The Cell CPU was really impressive on paper, but the simple fact is it underwhelmed. The best analogy I can think of is Bulldozer--people assumed that when AMD went from 4/6 Phenom II cores to an 8-core Bulldozer they'd see a huge performance gain, but because per-core/per-thread performance was down so much, it was a huge disappointment. The reason Cell seemed so impressive was that it had a ton of cores; the problem was that at the time most developers had only handled 1-4 threads at a time, so developing software for 7 all of a sudden was a nightmare. Couple that with the fact that the per-core performance is pretty low, and you see exactly why Sony isn't reusing it. They use Cell technology in TVs now, btw.
For an all-in-one product, I'd say it was pretty damn impressive for its time.
Posted on Reply
#12
happita
by: xenocide
The reason Cell seemed so impressive was that it had a ton of cores; the problem was that at the time most developers had only handled 1-4 threads at a time, so developing software for 7 all of a sudden was a nightmare. Couple that with the fact that the per-core performance is pretty low, and you see exactly why Sony isn't reusing it. They use Cell technology in TVs now, btw.
They all learned from experience: Sony, Toshiba, and IBM. A CPU designed with supercomputing in mind doesn't exactly translate to killer real-world game performance, as Sony saw. So I really don't blame them for not wanting to bring back the Cell architecture for a second round. I have to say, though, they did a pretty damn good job for what they paid for. The developers who make exclusives for the PS3 really push it to its limits, e.g. God of War 3, Uncharted 2 & 3, etc.
Posted on Reply
#13
Rei86
by: happita
They all learned from experience: Sony, Toshiba, and IBM. A CPU designed with supercomputing in mind doesn't exactly translate to killer real-world game performance, as Sony saw. So I really don't blame them for not wanting to bring back the Cell architecture for a second round. I have to say, though, they did a pretty damn good job for what they paid for. The developers who make exclusives for the PS3 really push it to its limits, e.g. God of War 3, Uncharted 2 & 3, etc.
Just as Nintendo has a hold on gamers' hearts, with nostalgic names as the reason to purchase a Nintendo product, Sony has been on the offensive when it comes to first-party offerings.

I mean, really: Uncharted 1/2/3, God of War 3, Killzone 2/3, GT5, and many of the smaller titles on PSN like Journey have made owning a PS3 worth it.
Posted on Reply
#14
happita
by: Rei86

I mean, really: Uncharted 1/2/3, God of War 3, Killzone 2/3, GT5, and many of the smaller titles on PSN like Journey have made owning a PS3 worth it.
That, and the fact that I don't have to pay $50 a year just to play my games online. Sony did a good job when it decided not to charge its customers to play online, which is great and is another reason why I own a PS3 and not an Xbox.
Posted on Reply
#15
Steevo
by: erocker
After having to clean things up once again, frankly I'm fed up with this thread. So, if anyone feels like ignoring the posting guidelines, insulting others, trolling, going off topic, or ignoring a moderator's instructions, you will receive a vacation for a while. This goes for everyone in this thread. I'm making no exceptions. This is your only warning.
Big meanie!!!!!!

My first A8-5600 build is together, and based on the tiny cooler they included and its very cool running temps, I'm almost scared of what it is actually going to be able to do.

Benchmarks to follow shortly.


To avoid a double post.

3DMark 11 at stock, memory @ 800 MHz, 9-9-9-24:

Score: P1189 3DMarks
Graphics: 1056 | Physics: 3932 | Combined: 1087
PC Health Check: Your PC is performing properly.

Overclocked (GPU at 1000 MHz, CPU bumped to 4.1 GHz; no change to memory speed, as the memory doesn't like it):

Score: P1371 3DMarks
Graphics: 1229 | Physics: 4163 | Combined: 1206
PC Health Check: Your PC is performing properly.

4.2 GHz CPU, 1000 MHz GPU:

Score: P1370 3DMarks
Graphics: 1227 | Physics: 4239 | Combined: 1208
PC Health Check: Your PC is performing properly.
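For what it's worth, the stock-versus-4.1 GHz gains in those runs work out as below; a quick sketch using the scores exactly as posted:

```python
# 3DMark 11 scores from the two runs above: (stock, 4.1 GHz CPU / 1000 MHz GPU).
runs = {
    "overall":  (1189, 1371),
    "graphics": (1056, 1229),
    "physics":  (3932, 4163),
    "combined": (1087, 1206),
}
# Percentage improvement of each sub-score, rounded to one decimal place.
gains = {name: round((oc - stock) / stock * 100, 1)
         for name, (stock, oc) in runs.items()}
print(gains)  # graphics gains the most (~16%), physics the least (~6%)
```

Unsurprisingly for an APU, the GPU overclock moves the graphics score far more than the CPU bump moves physics.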
Posted on Reply
#16
Binge
Overclocking Surrealism
Yes this is the thread that never ends... yes it goes on and on my friends... some people started trolling it not knowing a freaking thing, and they'll continue trolling it because.. I guess it's an e-penis win? Yes this is the thread that never...

I'd like to thank the folks who have dabbled in game development for lending their comments despite the overwhelming wave of blatherskite. That is all.
Posted on Reply
#17
cdawall
where the hell are my stars
by: Binge
Yes this is the thread that never ends... yes it goes on and on my friends... some people started trolling it not knowing a freaking thing, and they'll continue trolling it because.. I guess it's an e-penis win? Yes this is the thread that never...

I'd like to thank the folks who have dabbled in game development for lending their comments despite the overwhelming wave of blatherskite. That is all.
It works better to just unsubscribe from the thread.

Works rather well, as the silliness stops.
Posted on Reply
#18
lyndonguitar
I play games
Either way if the PS4 sucks or whatever, I'm still pretty excited for this coming new "gen", that means more GAMEZ!!!
Posted on Reply
#19
Cataclysm_ZA
Hello everybody! This is my first time posting on TPU; I've been a lurker for years. I decided today that I'd rather start adding my bit to the community here, and this seems like the perfect place to start.

by: Thefumigator
2 - The A10 is today's top Trinity APU, but it won't be the only one. Think about an A12, A14, A16; I mean, we don't really know how AMD will refresh its line of APUs, we only know the line will be compatible.
AMD won't refresh the entire lineup until the generation after Steamroller hits, possibly unifying the entire family under one socket, all with GPU and ARM components. But for now, let's assume that, with Sony giving AMD a large pile of cash and telling it what is needed, this drives AMD to accelerate production of Piledriver cores on a 22 nm process with VLIW4 GPUs on a 32 nm process. They've arguably had enough time to get something like that taped out, because the Sony deal has been in the works for years. But since it doesn't look like Steamroller will be on 22 nm, it wouldn't be a train smash if the components were stuck on 32 nm and 40 nm respectively.

I don't think they'll change from the existing lineup of A4, A6, A8 and A10 CPUs, because that would introduce more complexity. But that's on the desktop, which, for this thread, doesn't fit into the equation. What will be in the PS4 is going to be a different beast from what we're used to, and it could be based off the A10-5700 and an HD 6670 GPU. So let's leave the desktop out of this for now because, as everyone in this thread is eager to point out, they're not directly comparable. Similar in terms of hardware, yes, but with software they are very different performers.

by: Thefumigator
3 - Developers had to optimize multithreading on the PS3's very complex architecture. So adopting the A10 should make things easier, and dual GPU would be a breeze.
I also think that coding for an x86-64 architecture and a more modern instruction set will make developers' lives easier, but I'm not sold on the dual-GPU portion just yet (even on the desktop I'm hesitant to recommend SLI or CrossFire to anyone). I haven't seen results from an APU-and-GPU combo that show frame rates over time, and from what I've seen with SLI and CrossFire, the stuttering issues are enough to put some people off dual GPUs completely. However, I do know that with a strict hardware configuration the CrossFire rendering could be tweaked so that instead of rendering alternating frames, the devs could choose to divide up each frame between the two. Unfortunately, only developers with access to these machines can answer our questions, and until then everything else is just assumption.
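To make those two scheduling ideas concrete, here is a toy sketch of alternate-frame rendering versus splitting each frame's scanlines between the two GPUs. The names and the 40/60 split are purely illustrative assumptions, not anything from a real driver or SDK:

```python
def assign_afr(frame_ids, gpus=("APU", "dGPU")):
    """Alternate-frame rendering: each GPU takes whole frames in turn."""
    return {f: gpus[f % len(gpus)] for f in frame_ids}

def assign_split(frame_height, apu_share=0.4):
    """Split-frame rendering: give each GPU a band of scanlines per frame.

    apu_share is the fraction handed to the (slower) APU; a real scheduler
    would tune this from measured render times to balance the load.
    """
    cut = int(frame_height * apu_share)
    return {"APU": (0, cut), "dGPU": (cut, frame_height)}

print(assign_afr(range(4)))  # {0: 'APU', 1: 'dGPU', 2: 'APU', 3: 'dGPU'}
print(assign_split(1080))    # {'APU': (0, 432), 'dGPU': (432, 1080)}
```

The fixed hardware of a console is what makes the split-frame variant attractive: with only one configuration to tune for, the band sizes can be calibrated once, avoiding the frame-pacing stutter that alternate-frame setups are known for on the desktop.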

by: 1d10t
Why did Sony choose an APU? Sony definitely knows something that we don't. Consoles are consoles, targeted at casual gamers who don't even bother about upscaling.
TDP requirements and less complexity, mostly (along with cost, which is going to be a big factor). You can build an APU setup into a thin-client-like/ITX chassis without having to worry too much about cooling, and your GPU requirements are mostly catered for already. When I was working at a computer repair shop a year ago, I received three PS3s to diagnose and fix. When I could finally open two up for myself, one a launch version and the other a Slim, I was stumped at how the launch versions could have survived the heat generation. You had these massive, (relatively) power-sucking chips that needed a good amount of cooling to stay functional for as long as the warranty remained valid, and I often found with other units that the cooling wasn't up to scratch. I had to reflow the boards, clean out the cooling systems and lap the heatsinks so that the older units wouldn't overheat.

With desktop-class APUs, the stock cooler is perfectly fine. In fact, the A10-5700's cooling requirement peaks at only a 65 W TDP, which means there's much less work required in designing the console's cooling system. I think we might even see a launch version that is as slim as the PS3 Slim (not the recent swanky one which, IMO, looks fugly), and that in future could become as small as the PS2 Slim. Considering that the APU in question might even be the mobile A10-4600M, it's plausible.

by: 1d10t
My next post is a camera capture showing 1080p/60 Hz on the TV's info screen, to counter your opinion regarding the PS3's lack of 1080p capability. You're suggesting a proper method for comparing these two; you may quote any of my last posts. Did I mention "rendered"?
No one's ever said that the PS3 can't run games at 1080p. In fact, there's a handful that can, Prince of Persia being the only recent one I can remember. It's down to the developers to figure out what they want to sacrifice most: visual fidelity and potentially higher performance, or more stuff on the screen but potentially lower performance.

To those of you who say that a game running at 30 fps at 1080p is crap, I'd have to agree with you initially. If I notice it, it becomes a problem, until I play the game enough times to not notice it, and then it's smooth sailing from there on. Even Forza Horizon, which I got to play recently, runs at 720p and a minimum of 30 fps. Technically the developers could run the game at 1080p and get similar performance, but they'd have to sacrifice some visual fidelity and the beautiful world the game is rendered in. Personally, I don't have a problem with the speed at which the game is rendered, only with whether it looks good and doesn't suffer hiccups.

Likewise, you can't directly compare today's consoles with desktops. Well, at least not the PS3, because the RSX GPU lacks some components and instruction sets that would make it comparable to a desktop-class GPU. The Xbox 360 is closer to a proper desktop setup, but again can't be compared directly because it can't render anything in DX10. A good deal of games today include a DX10/DX11 render path, so that makes the comparison even more moot. 720p30 in DX9 and 720p30 in DX11 at their highest settings will look different and behave differently due to the rendering mode. With the PS4 and the Xbox 720 being based on modern hardware, at least we'll have consoles and computers on the same footing again in terms of graphical ability, if not in performance.

And I'm sad that the shift to an x86-64 architecture means my existing PS3 library won't be compatible with the new system, but I guess it's only fair that, six years on, a new standard is introduced. The PS3 has had an incredibly long run, and it's time for something new.
Posted on Reply
#21
1d10t
by: Cataclysm_ZA
When I was working at a computer repair shop a year ago, I received three PS3s to diagnose and fix. When I could finally open two up for myself, one a launch version and the other a Slim, I was stumped at how the launch versions could have survived the heat generation. You had these massive, (relatively) power-sucking chips that needed a good amount of cooling to stay functional for as long as the warranty remained valid, and I often found with other units that the cooling wasn't up to scratch
Yes, I remember; around January 2009 I bought a 40 GB PS3 (CECHJ, NTSC-J), which had been shrunk to 65 nm from the previous 90 nm process. It still produced a large amount of heat from the rear blower fan; after 30 minutes of playing, it sounded like a jet ready to take off :laugh:
Likewise, you can't directly compare today's consoles with desktops. And I'm sad that the shift to an x86-64 architecture means my existing PS3 library won't be compatible with the new system, but I guess it's only fair that, six years on, a new standard is introduced. The PS3 has had an incredibly long run, and it's time for something new.
And yet fellow TPU members here say they need such-and-such a GPU to do 1080p, mumble about so-called rendering, or give mumbo-jumbo calculations fit for a kindergarten exam.
Have they forgotten? This is a console, a consumer electronics device, a set-top box attached to a TV set. Sony had to make their console compatible with a wide range of TVs, so they attached a legacy A/V RCA output for analog TVs and HDMI for SD/HD/Full HD TVs. They didn't have to do 60 fps, because doing so would violate the NTSC standard (29.97 fps), the PAL standard (25 fps), or the less popular SECAM.

The current PS3 uses Linux kernel 2.4; I bet Sony will use Linux 3.0 with some HSA optimizations :p
Posted on Reply
#22
KainXS
I am wondering how much information on the Wii U's GPU is available that can be confirmed. Most of what I have seen are rumors; does anyone know any facts? All we have are some pictures showing that the GPU and CPU are on the same package, plus unconfirmed sources, just like when the Wii launched; it took years to verify its specs after the Wii's release.
Posted on Reply
#23
newtekie1
Semi-Retired Folder
by: Mussels
why do people keep insisting that the last gen consoles had high end hardware at launch? they had derivatives of high end PC hardware, but that's not the same thing at all. the power consumption of the consoles as a whole is lower than that of the high end video cards of that era, which really should give that fact away.

consoles never launch with high end PC hardware. this launch is no exception - and it's generations faster than what we have now.
Agreed completely, and it only takes a little bit of research to confirm this.

Let's look at the PS3. It launched in Nov/2006. It used basically the core from a 7800GTX, but with lower memory clocks and fewer ROPs; let's just say it was a 7800GTX for the sake of argument. At almost the exact same time, within days actually, NVIDIA released the 8800GTX, which was a huge leap ahead of what the 7800GTX was capable of. Not only did it provide 50-100% more performance (depending on the game) in DX9, it also introduced DX10.

The Xbox 360 wasn't much different. It used, basically, an X1800XT GPU (again with lower clocks and memory speeds, but we'll just say X1800XT). And while it was a little more up to date when it was released, thanks to being released a year earlier than the PS3, it was still behind, as ATI had just released the X1900 series as the Xbox was coming out.
Posted on Reply
#24
Benetanegia
by: newtekie1
Agreed completely, and it only takes a little bit of research to confirm this.

Let's look at the PS3. It launched in Nov/2006. It used basically the core from a 7800GTX, but with lower memory clocks and fewer ROPs; let's just say it was a 7800GTX for the sake of argument. At almost the exact same time, within days actually, NVIDIA released the 8800GTX, which was a huge leap ahead of what the 7800GTX was capable of. Not only did it provide 50-100% more performance (depending on the game) in DX9, it also introduced DX10.

The Xbox 360 wasn't much different. It used, basically, an X1800XT GPU (again with lower clocks and memory speeds, but we'll just say X1800XT). And while it was a little more up to date when it was released, thanks to being released a year earlier than the PS3, it was still behind, as ATI had just released the X1900 series as the Xbox was coming out.
And again, the PS4 will not release until 2014 or very late 2013. By that time the HD 8900 refresh will have been released, if not the HD 9900. Following the logic above, the PS4 should use at the very least an HD 7950 (lower clocks, maybe 256-bit), or simply an HD 7870 or whatever, OR, as said, an HD 8950, because the HD 9900 is around the corner.

They are talking about an APU, or in the best-case scenario maybe paired up with an HD 6770 or something like that. It's as if the PS3 had shipped with a GeForce 6600 instead of a 7800. Plus, in 2004/2005 when the PS3 was spec'd, the 7800 was the fastest card.

So yeah, let's stop with that argument. No one's saying it had a high-end GPU when it launched, but it definitely had a (slower) variant of the high-end GPUs from when it was designed. The PS4, by the time it launches, will have a low-end GPU 3 generations behind.
Posted on Reply
#25
T4C Fantasy
CPU & GPU DB Maintainer
It's like this every year; some people get sick of it, but some people love to see others talk about things they don't know about.

Here are the facts (I'm not wrong, and it's been said before, even in this thread): game consoles only need a fraction of the power a PC needs for gaming, because there is no heavy multitasking, no OS throttling, and devs get a dev kit that PROMISES 100% performance, so they have time to work with what little they have without worrying about component upgrades. Even today, I bet none of you can play GTA IV with 256 MB of RAM, even with an HD 7970, on Windows XP and up, with frames as good as a PS3 gets... lol. And with a Core i7-3770K, I doubt GTA IV would even open; it would just say "fuck you, I'm busy."
Posted on Reply
Add your own comment