Friday, November 2nd 2012

Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU

According to a VG 24/7 report, Sony has begun shipping development kits of its upcoming game console, the PlayStation 4, codenamed "Orbis," to developers. The kit is described as a "normal sized PC," driven by an AMD A10 "Trinity" APU and 8 or 16 GB of memory. We've known from reports dating back to April that Sony plans to use a combination of an APU and a discrete GPU, similar to today's Dual Graphics setups, where the APU's graphics core works in tandem with a discrete mid-range GPU. The design goal is to be able to play games at 1920 x 1080 resolution with a 60 Hz refresh rate, and with the ability to run stereo 3D at 60 Hz. For storage, the system has a combination of a Blu-ray drive and a 250 GB HDD. Sony's next-generation game console is expected to be unveiled "just before E3" 2013.


Source: VG 24/7

354 Comments on Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU

#2
ChidoriHV
If the next-gen consoles could run Battlefield 3 Ultra-settings-style games @ 1080p at 60 FPS average and a 30 FPS minimum, then I would be a very happy man, though BF3 at Ultra settings at 30-40 FPS is pretty unplayable. About the human eye not noticing past 30 FPS... that's really bull... I can see the ghosting difference between a 2 ms and a 5 ms screen. You don't focus on those things all the time, but when you notice them... they annoy the hell out of you.

- Actually, the worst thing about the PS3 and Xbox 360 right now is that when you run games at 720p/1080p, the more graphically intense games have a pretty noticeable stutter/lag.
Posted on Reply
#4
1nf3rn0x
The Reality Synthesizer (RSX) is a modified GeForce 7800. It has a core clock of 550 MHz, 24 pixel-shader pipes each capable of 27 FLOPs/cycle, and 8 vertex pipes each capable of 10 FLOPs/cycle, for a total FLOPS figure of
550 MHz x (24 x 27 + 8 x 10) = 550 x 728 = about 400 GFLOPS.
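That back-of-the-envelope estimate can be reproduced with a quick sketch (the figures are the ones quoted in the post, not official specs):

```python
# Rough RSX FLOPS estimate using the numbers quoted above.
CLOCK_MHZ = 550
PIXEL_PIPES, FLOPS_PER_PIXEL_PIPE = 24, 27
VERTEX_PIPES, FLOPS_PER_VERTEX_PIPE = 8, 10

# FLOPs issued per clock cycle across all pipes
flops_per_cycle = PIXEL_PIPES * FLOPS_PER_PIXEL_PIPE + VERTEX_PIPES * FLOPS_PER_VERTEX_PIPE

# Clock is in MHz, so the product is in MFLOPS; divide by 1000 for GFLOPS
mflops = CLOCK_MHZ * flops_per_cycle
print(f"{flops_per_cycle} FLOPs/cycle -> {mflops / 1000:.1f} GFLOPS")  # 728 FLOPs/cycle -> 400.4 GFLOPS
```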

This is the GPU from a Trinity A10-5800K APU:



Compare the pair. With seven years of optimization, more efficient game engines, and most likely more video memory, RAM, and CPU power (in total, more powerful hardware), full 1080p @ 60 FPS seems easily doable to me.
Posted on Reply
#5
v12dock
So a PS4 emulator will be very doable
Posted on Reply
#6
1nf3rn0x
v12dock said:
So a PS4 emulator will be very doable
Emulators are made by indie developers, which is why we can only just now emulate PS2 games at full speed. The amount of coding a single person or a small team can do, plus the optimizations needed for a platform completely different from the console, makes it pretty hard. No need to be sarcastic.
Posted on Reply
#7
Binge
Overclocking Surrealism
v12dock said:
So a PS4 emulator will be very doable
No, no it will not. The hardware will be LIKE what we have in PCs, but in reality the specialty chips they will make will behave much differently and will be addressed differently in the system.
Posted on Reply
#8
Rowsol
Good for AMD. If two GPUs are going to be powering the games, it should make SLI/CrossFire that much easier to code for when we get the ports. I, for one, can't wait for the new consoles, even though I won't be buying one. Going to be another year, though... pfft.
Posted on Reply
#9
Steevo
There were some people who made a "lite" version of XP a few years ago that was bootable; it made only mild differences in the performance of the games and applications they ran and tested.


The difference here is the lack of other configurations. Optimize for one machine, with one set of known instructions. The difference is at the software level of the application, much like a patch can improve frame rates, and a GPU driver can increase performance.
Posted on Reply
#10
Delta6326
I don't know about you people, but when I play games I spend 90% of my time with the story and gameplay and couldn't care less about visual quality. Remember the days of the NES and N64, when it was actually fun to play and you didn't just spend your whole time staring at the screen saying, "Oh, oh, look, I found a rock that's not properly rendered!! Let me go online and thread-crap the internet over it."
Posted on Reply
#11
Benetanegia
esrever said:
Guess I remembered wrong, but that is not correct either. The inefficiencies in the original R600 design made the performance extremely low. The Trinity A10 is exactly 1/4 of a 6970; the 360 performs like a 2600 XT.
The 2600 XT gets 933 in 3DMark Vantage; the 6970 gets 21k. If Trinity weren't bandwidth constrained it would get more than 5k, which is almost a 6x performance increase. That comes from an outdated and inefficient software system. 8x the performance is what Sony quotes, I think, which, given Moore's law, is very easily done.
The XB360 GPU is much more capable than the 2600 XT, a hell of a lot more capable, and so is RSX. The 2600 XT was 1/4 of an HD 2900, while Xenos was more like 2/3. Your 3DMark numbers are irrelevant and unrelated. The Trinity GPU is not much more capable than the old HD 2900, and by extension the Xenos GPU. By this I mean generationally speaking: yes, it's maybe 4x, maybe 6x faster in some regards, but nearly any dedicated GPU is 10x or 20x faster. APUs are also much more limited by memory bandwidth in their PC form (which makes them much, much slower than the HD 6570, the chip that can be said to be 1/4 of an HD 6970), but that's something I expect them to solve in the PS4.

All in all, it's incredibly irrelevant whether it's 4x-6x or 8x faster; it won't do new games at a steady 1080p@60fps unless it's at least one order of magnitude faster. The same games as the PS3 (old games)? Yes, of course, but what's the point? If a resolution upgrade is the only thing the new consoles are going to offer, I don't think anyone's going to pay so much for it, unless the consoles sell for <$200. I mean, yeah, people are stupid, but if after seven years of dismissing it they suddenly awaken to the fact that true 1080p is much better and decide they now need to shell out $400 just for that upgrade... I swear I'll visit them one by one and punch them in the face. No kidding.

Delta6326 said:
I don't know about you people, but when I play games I spend 90% of my time with the story and gameplay and couldn't care less about visual quality. Remember the days of the NES and N64, when it was actually fun to play and you didn't just spend your whole time staring at the screen saying, "Oh, oh, look, I found a rock that's not properly rendered!! Let me go online and thread-crap the internet over it."
I remember those days. They sucked, bad. I spent whole days knowing how much better my uncle's PC games were compared to the crap I was forced to play because I was a child.

Better graphics don't make games worse. There are games with bad graphics that suck too; one thing does not exclude the other. Better graphics mean more immersion, and in some games that makes them 100x better. E.g., Metro 2033 with worse graphics would not have had the same atmosphere and wouldn't have been so immersive.
Posted on Reply
#12
Thefumigator
BigMack70 said:
Didn't expect to see a "human eye can only see [x] fps" troll in this thread :laugh:

I'm not even gonna get into that nonsense...
First of all, I'm not trolling. Second, what I said is correct, and it's not something to take lightly. But of course I made the mistake of applying it to gaming; movies really are another story. (I am a professional video editor, not a gamer, at all.)

Just to polish my argument, take a Blu-ray movie: if it's NTSC, it runs at 30 fps. You don't see any movie losing frames at that framerate, even a computer-animated movie like Toy Story, unless the disc gets dirty or something and it begins stuttering. Of course, Toy Story was smoothly rendered prior to its final release, not rendered in real time. This makes a difference in games; talking with my colleagues about it, it may happen that motion blur and other things don't look good when rendering at 30 fps, and more frames are indeed needed.

Also, with real-time rendering, a colleague of mine demoed a 3D scene to me where some elements were rendered at a different pace. It may happen, depending on the developer, that some elements are rendered at faster framerates than others in the same scene. So a video card that reaches 120 frames per second may be able to overcome this issue very easily.
Posted on Reply
#13
semantics
ChidoriHV said:
If the next-gen consoles could run Battlefield 3 Ultra-settings-style games @ 1080p at 60 FPS average and a 30 FPS minimum, then I would be a very happy man, though BF3 at Ultra settings at 30-40 FPS is pretty unplayable. About the human eye not noticing past 30 FPS... that's really bull... I can see the ghosting difference between a 2 ms and a 5 ms screen. You don't focus on those things all the time, but when you notice them... they annoy the hell out of you.

- Actually, the worst thing about the PS3 and Xbox 360 right now is that when you run games at 720p/1080p, the more graphically intense games have a pretty noticeable stutter/lag.
You do know ghosting and FPS have nothing to do with each other, right? And just because a monitor is rated 2 ms or 5 ms GtG doesn't mean one 2 ms panel will look the same as another. Ghosting has to do with the monitor's ability to change a pixel from one color to another, or from on to off; when it's slower, ghosting occurs and you get a sort of bleeding of previous frames. FPS in a game is independent of that.

I'd also point out that in 1 second (1000 ms), 30 frames equates to 33.3 ms per frame, and 60 fps results in 16.6 ms per frame; to get down to your 2 ms you would need a 500 fps display. I'd also point out that if you believe you can actually see the difference between 2 ms and 5 ms, you're delusional; that would mean you can see a hummingbird's wings flap flawlessly. The bleeding effect in ghosting varies by monitor; some have ghosting as long as 100 ms+ in duration, and early monitors as long as 300 ms+. 2 ms GtG is not a good measure of how the monitor handles a picture overall; it insinuates the panel is faster at turning pixels on and off and changing their color, but it's not 100% flawless.
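The frame-time arithmetic in that post works out like this (a quick sketch; frame time is just the reciprocal of the frame rate):

```python
# Milliseconds per frame at a given frame rate: 1000 ms divided by fps.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(f"{frame_time_ms(30):.1f} ms/frame at 30 fps")   # 33.3 ms
print(f"{frame_time_ms(60):.1f} ms/frame at 60 fps")   # 16.7 ms

# Conversely, a 2 ms frame time would require this frame rate:
print(f"{1000.0 / 2:.0f} fps for a 2 ms frame time")   # 500 fps
```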
Posted on Reply
#14
Xzibit
Am I the only one that read the third paragraph of that linked article?
There are to be four versions of the dev kit, we were told. A previous version was essentially just a graphics card. The version shipping now is a “modified PC,” and the third version, appearing in January, will be close to final spec. A final version will be delivered to developers “next summer”.
The dev kit will change two more times.

So how many pages is the thread for the first dev kit at?

Can't wait for the next two dev kit threads

:toast:
Posted on Reply
#15
BigMack70
Thefumigator said:
First of all, I'm not trolling. Second, what I said is correct, and it's not something to take lightly. But of course I made the mistake of applying it to gaming; movies really are another story. (I am a professional video editor, not a gamer, at all.)

Just to polish my argument, take a Blu-ray movie: if it's NTSC, it runs at 30 fps. You don't see any movie losing frames at that framerate, even a computer-animated movie like Toy Story, unless the disc gets dirty or something and it begins stuttering. Of course, Toy Story was smoothly rendered prior to its final release, not rendered in real time. This makes a difference in games; talking with my colleagues about it, it may happen that motion blur and other things don't look good when rendering at 30 fps, and more frames are indeed needed.

Also, with real-time rendering, a colleague of mine demoed a 3D scene to me where some elements were rendered at a different pace. It may happen, depending on the developer, that some elements are rendered at faster framerates than others in the same scene. So a video card that reaches 120 frames per second may be able to overcome this issue very easily.
Oh no... the only thing worse than the "human eye can't see more than [x] fps" troll is the "human eye can't see more than [x] fps" person who seriously believes that argument.

It's bogus. It's been debunked a billion times and has no credibility at all. It's not worth going into. Like I said, if you think the human eye can't see more than 25-30fps, go get a 120 Hz monitor, set its refresh rate to 30 Hz, 60 Hz, and 120 Hz respectively, and drag some windows around the desktop. You tell me if you see a difference.

:shadedshu
Posted on Reply
#16
EpicShweetness
WOULD YOU ALL JUST SHUT THE FUCK UP! ESPECIALLY YOU, BIGMACK70. HOLY SHIT, YOU'RE SUCH A CHILD!! I went to go do my work for the day, and after a simple post I come back to find you've argued over resolution with some kids FOR 9 GODDAMN PAGES!! LET IT GO!!!
Posted on Reply
#17
KainXS
bigmack, . . . . . . . that name . . . . . . . sounds so familiar . . . . . . and you act just like that guy did . . . . . . .. :shadedshu
Posted on Reply
#18
BigMack70
EpicShweetness said:
WOULD YOU ALL JUST SHUT THE FUCK UP! ESPECIALLY YOU, BIGMACK70. HOLY SHIT, YOU'RE SUCH A CHILD!! I went to go do my work for the day, and after a simple post I come back to find you've argued over resolution with some kids FOR 9 GODDAMN PAGES!! LET IT GO!!!
I think you're taking the internet too seriously :roll:
Posted on Reply
#19
Nihilus
Newegg selling PS4s for $438:

I N WIN BP655.200BL Black Steel Mini-ITX Desktop Computer Case 200W Power Suppl
$44.99

Western Digital WD Blue WD5000AAKX 500GB 7200 RPM SATA 6.0Gb/s 3.5" Internal Hard Drive -Bare Drive
$69.99

G.SKILL Ares Series 8GB (2 x 4GB) 240-Pin DDR3 SDRAM DDR3 2133 (PC3 17000) Desktop Memory Model F3-2133C9D-8GAB
$60.99

ASUS Black Blu-ray Drive SATA Model BC-12B1ST/BLK/B/AS
$54.99

AMD A10-5800K Trinity 3.8GHz (4.2GHz Turbo) Socket FM2 100W Quad-Core Desktop APU (CPU + GPU) with DirectX 11 Graphic AMD Radeon HD 7660D AD580KWOHJBOX
ASRock FM2A75M-ITX FM2 AMD A75 (Hudson D3) SATA 6Gb/s USB 3.0 HDMI Mini ITX AMD Motherboard
$207.98

Subtotal: $438.94
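The subtotal above checks out (a quick sketch; part names abbreviated from the listing):

```python
# Sum of the listed component prices; the APU and motherboard share one combined price.
parts = {
    "IN WIN BP655 case + 200W PSU": 44.99,
    "WD Blue 500GB HDD": 69.99,
    "G.SKILL Ares 8GB DDR3-2133": 60.99,
    "ASUS Blu-ray drive": 54.99,
    "A10-5800K APU + ASRock FM2A75M-ITX board": 207.98,
}
subtotal = round(sum(parts.values()), 2)
print(f"Subtotal: ${subtotal}")  # Subtotal: $438.94
```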
Posted on Reply
#20
BigMack70
KainXS said:
bigmack, . . . . . . . that name . . . . . . . sounds so familiar . . . . . . and you act just like that guy did . . . . . . .. :shadedshu
Huh?
Posted on Reply
#21
Deadlyraver
I'm just happy that the console generation, after so many years, has picked up in graphics capability.
Posted on Reply
#22
Super XP
The only issue with these new consoles is that most of our 3+ year old PCs are superior to them, so they will once again be dated at release.

But of course it's a hell of a lot better than still mucking around with the current ones, which have been dragging down gaming quality, IMO.
Posted on Reply
#23
1d10t
BigMack70 said:
@1d10t

Ok point taken about what the PS3 "shows". It "shows" a 1080p image that is upscaled from a 720p source (when in game).

But you do realize that the screenshots you posted were all downscaled below what "shows" on the TV, and therefore are irrelevant, right?

If you can't see a difference between the two screens I posted, you probably need glasses.
Yes sir, I hope you're happy now.
I can't see the difference yet; I'm browsing TPU from a 4.3-inch HTC Sensation.

Last word: please enjoy any future games ported from consoles.

================================

A lot of haters. No, seriously.
When TPU staff write an article about AMD overclocking, Intel fanboys swarm in spreading flamebait. Someone posts news about Apple, and all the Apple haters arise. Now the best part: even news about an upcoming console gets raided by "experts" claiming to know better about the hardware's capabilities, even though they have never owned nor played any console.
So if I use Intel, don't use Apple, and have never touched a console, does that make me rich and smart?
Posted on Reply
#24
HossHuge
Everybody chill out and remember we are all here for the same reason. We all have an interest in all things Computer/Gaming.

Can we just agree that this will make gaming better finally for PC and console gamers?

Hot chicks always help!!



Has Sony given up on any sort of motion control device?
Posted on Reply
#25
Dent1
BigMack70 said:
Not on that particular argument... maybe read the thread? Notice anyone still trying to argue that 1080p60 isn't 4x as demanding as 720p30? Your original post just reflects a lack of reading comprehension, as even Phenom realized that my argument was correct.
It isn't 4x as demanding; what is your point?

Either show mathematical proof or a source link, or sit down.

You can't claim stuff with no evidence and then say you're right.
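For what it's worth, the "4x" claim can be sanity-checked with pixel-throughput arithmetic (a crude proxy only; real GPU load doesn't scale purely with pixels per second, since geometry, shading, and CPU cost don't all scale with resolution):

```python
# Pixels drawn per second at a given resolution and frame rate.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

# 1080p60 vs 720p30: how many times more pixels per second?
ratio = pixels_per_second(1920, 1080, 60) / pixels_per_second(1280, 720, 30)
print(ratio)  # 4.5
```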
Posted on Reply