Friday, November 2nd 2012

Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU

According to a VG 24/7 report, Sony has begun shipping development kits of its upcoming game console, the PlayStation 4, codenamed "Orbis," to developers. The kit is described as a "normal sized PC," driven by an AMD A10 "Trinity" APU and 8 or 16 GB of memory. We've known from reports dating back to April that Sony plans to use a combination of an APU and a discrete GPU, similar to today's Dual Graphics setups, in which the APU's graphics core works in tandem with a discrete mid-range GPU. The design goal is to play games at 1920 x 1080 resolution with a 60 Hz refresh rate, with the ability to run stereo 3D at 60 Hz. For storage, the system combines a Blu-ray drive with a 250 GB HDD. Sony's next-generation game console is expected to be unveiled "just before E3," 2013.


Source: VG 24/7

354 Comments on Sony PlayStation 4 "Orbis" Kits Shipping to Developers, Powered by AMD A10 APU

#1
SIGSEGV
by: MuhammedAbdo

...
If the specs has been changed and the console came with a high-end or even a medium AMD GPU , then the situation will be different .
Seriously, are you kidding me? :laugh:
I think you should open up your console's cover and fit your high-end GPU, six-core Intel CPU, high-end motherboard, and 32 GB of RAM inside the console box. :rockout:
FACT 7 : PCs will maintain a higher visual quality and frame rates , consoles will have the graphics of a two years old PCs .
...and many PC games are ported from console games.
Posted on Reply
#2
lyndonguitar
I play games
by: MuhammedAbdo
(SIGH)

This thread is full of ignorant people making false and misguided statements all along , the only people here with brains are : Benetanegia and BigMack70 .

First you morons need to read some facts , here they are :
FACT 1 : Not even High-End PCs can sustain 60FPS @1080p in demanding games with maximum graphics , games like : Metro 2033 , ARMA 3 , Crysis , Dragon Age 2 and many many more , there will be drops below 60FPS and in many occasions .
On many occasions, yes, but AVERAGE is what matters, bro. A few milliseconds spent at 50 FPS don't matter.
FACT 2 : Consoles include insane amount of code optimization , where every CPU/GPU cycle is utilized , they literally run on the machine code , which is the lowest language of software programming , contrary to the PC that sports many compilers and higher languages which wastes valuable cycles .
You said it yourself: heavy optimization, which is why consoles from 2005 are able to run today's games. For the same reason, a console released in 2013 (the PS4) will be able to run games up to 2020 or so.
FACT 3 : Even with these optimizations, consoles run with shitty graphics . they cant even maintain 30 FPS at 720p , they usually drop to 25 and 20 FPS , they even run at sub 1280x720 resolutions , sometimes as low as 900x600 !
Because they are pushing the console to its limit by running new games on a seven-year-old machine, yet it still looks decent enough for CONSOLE gamers.
FACT 4 : Consoles cut down on graphics severely , they decrease Shadows density , Lighting Effects , Level Of Detail , Polygon Count , Alpha Effects , Texture Resolution , Texture Filtering , Anti-Aliasing , Post Processing and so many things that I can't even remember them all .
Not severely. Almost all console games are the low-to-medium-graphics counterparts of their PC versions. High/Ultra is just additional eye candy for PC users, and most PC players mainly aim for raw performance and flexibility anyway.
FACT 5 : a console with the triple specs of,
1-AMD CPU
2-AMD APU
3-AMD GPU (low-end/6670)

Will barely run today games at 1080p @60 FPS with PC graphics level , all of the code optimizations will be spent on the cost of resolution increase (to 1080p) and the cost of graphics increase (to PC level) , such as shadows , lighting , textures and etc .

If the specs has been changed and the console came with a high-end or even a medium AMD GPU , then the situation will be different .
You said it yourself again: "PC graphics level". This is a console, bro, and it's not running at PC graphics level. Factor in all the heavy optimization and the strong possibility of a mid-range AMD GPU, just as the PS3 had a mid-range GPU, and you can do 1080p at a 60 fps average.
FACT 6 : These consoles will have to do the usual dirty business to be able to run at 1080p , cut resolution and upscale , decrease all graphics elements (lighting and shadows and textures .. etc) below the future PC level .. This happened to the previous generation too , Xbox and PS 3 stared operating at 720p just fine , then they had to cut corners to increase graphics other wise the visuals will stall .
Again, this is a console; don't expect God-like PC performance. If you want that, buy a PC. They never said the graphics would be similar to the PC's, only 1080p @60FPS.
FACT 7 : PCs will maintain a higher visual quality and frame rates , consoles will have the graphics of a two years old PCs .

End of Discussion .
Uhmm, nobody said that consoles "will maintain a higher visual quality and frame rates" than PCs. End of Discussion???

Trololololol
Posted on Reply
#3
acerace
by: DaMobsta
298 posts and yet no intel fanboys bashing the amd article. Go TPU! :toast:
You ask for it.

AMD sucks balls, Intel is a million times better. :rockout: They should just be burned to the ground, what a waste of human resources. Don't get me started on their shite GPUs. :mad:

:roll: :roll: :roll: :roll: :roll:
Posted on Reply
#4
lyndonguitar
I play games
by: acerace
You ask for it.

AMD sucks balls, Intel is a million times better. :rockout: They should just be burned to the ground, what a waste of human resources. Don't get me started on their shite GPUs. :mad:
says the Radeon User
Posted on Reply
#5
xenocide
@[USER=83138]lyndonguitar[/USER]

I wouldn't say average frame rate is all that matters. FPS dips will affect gameplay way more than adjusting the LoD so it's just a tad better at the cost of some visual effects. Say you have two situations: one where a certain game--say Modern Warfare 4--can run at 1080p with an average of 60 fps, but drops as low as 30 fps. Now compare that to the same game running at the same resolution, but it averages 55 fps and never drops below 50 fps; the only difference is that they removed a particular lighting effect. On a console the second option will be more enjoyable, because you'll get a more consistent frame rate despite a lower average.
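The trade-off described above is easy to quantify from frame-time logs. A minimal sketch, with both frame-time traces invented purely for illustration:

```python
# Two invented frame-time traces (milliseconds per frame):
# trace A averages higher but dips hard; trace B averages lower but stays steady.
trace_a = [16.7] * 95 + [33.3] * 5   # mostly ~60 fps, occasional dips to ~30 fps
trace_b = [18.2] * 100               # a steady ~55 fps

def avg_fps(frame_times_ms):
    """Average frame rate over the whole trace."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_fps(frame_times_ms):
    """Frame rate at the single slowest frame, which the player feels as a stutter."""
    return 1000.0 / max(frame_times_ms)

# Trace A wins on average; trace B wins where it matters for perceived smoothness.
print(round(avg_fps(trace_a)), round(worst_fps(trace_a)))
print(round(avg_fps(trace_b)), round(worst_fps(trace_b)))
```

Trace A's average is higher, yet its worst frame is nearly twice as slow as anything in trace B, which is why a console tuned for consistency feels smoother.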

As for console optimization, you're kind of blurring the picture a bit. The most demanding and well-optimized console games render at 720p at about 29-30 fps, with the equivalent of low-medium settings and almost no anti-aliasing. They look decent by most people's standards, but hitting that bar isn't hard. You could accomplish it on the same title with a mediocre PC (with none of that optimization), akin to a C2D and an HD 4870. Hell, you could probably do a decent amount better.

You have to realize rendering at 540p and 720p gives a lot of wiggle room. I think the biggest bottleneck in current-gen consoles is actually the RAM. Imagine making a game that can only access 256 MB of system memory (PS3), and see how well it runs. We have to keep things in perspective: the APU's GPU is capable of running modern PC games at medium settings at 1080p with moderate amounts of AA, and still posting 20-30 fps. That's really not bad, and when you throw a 6670 into the mix it only gets better. The question will always be quality. They could use that setup to render a single textured cube spinning at 1080p at 60 fps and their statements would be accurate, but people want a game that hits those settings and still looks good.
Posted on Reply
#6
Capitan Harlock
by: xenocide
@lyndonguitar

I wouldn't say Average frame rate is all that matters. ...
This seems like a marketing move, telling us "we give you 60 fps at 1920 x 1080" when most console gamers know nothing about frame rates and put money into something worth less than a PC.
If they come out with a system with high performance in EVERY GAME at high or ultra settings, good, but they'd better not give us the same FAKE NEW TECHNOLOGY spiel as with the PS3, taking money from people on the excuse of the Blu-ray.
The sad part is that we PC gamers will again get ports with bad graphics, unoptimized like GTA 4 and others.
Now we'll see.
Posted on Reply
#7
Dent1
by: MuhammedAbdo
(SIGH)

This thread is full of ignorant people making false and misguided statements all along , the only people here with brains are : Benetanegia and BigMack70 .
BigMack70's second account? Hello :roll:

by: MuhammedAbdo
(SIGH)

This thread is full of ignorant people making false and misguided statements all along , the only people here with brains are : Benetanegia and BigMack70 .

...

End of Discussion .
What has any of this got to do with BigMack70's claim that your hardware needs to be 4x as powerful to jump from 720p30 to 1080p60?


by: DaMobsta
298 posts and yet no Trickson bashing the amd article. Go TPU! :toast:
FIXED!
Posted on Reply
#8
acerace
by: lyndonguitar
says the Radeon User
In case you didn't notice, that was sarcasm. :)

Well, he asked for it. :laugh:
Posted on Reply
#9
Am*
I'm really not impressed, and pretty disappointed by this.

Rumours suggest the next Xbox is going the same route of using AMD APUs. That will make both consoles boring and pretty much identical PCs-in-a-box, with only the optical drives and the company logos setting them apart.

The worst part is, they're not even going to be high-end by today's standards, which they were back in 2005/2006 when the 360 and the PS3 launched, and they certainly NEED to be if they're to last anywhere near as long as the current consoles have. It's been well over half a decade since the launch of the 360 and the PS3, and they're looking more and more dated with every release.

The only good news I see in this is that it'll use AMD GPUs. That should take a tonne of weight off the shoulders of AMD's driver developers and make their lives a whole lot easier, with first-party developers maximizing the capabilities of their hardware (and possibly encouraging more console developers to produce, or at least release, decent ports of their games).

Other than that, the whole lineup of next-gen consoles sounds like crap. At this point, I can't see why another company, other than Sony, Microsoft and Nintendo, couldn't do the very same thing but better.

by: BigMack70
I find it hard to believe that they're claiming 1080p 60fps possible on an APU based system.

I call BS. More probably 720p upscaled
I can't see how that seems so far-fetched to you. The 360's X1950-level GPU and the PS3's crappy 7800GT can do that already, and this APU is in a whole different league compared to those old dogs.

by: No_Asylum
Actually ... that isnt even impressive at all. 60fps @ 1080p is slow by todays standards.
With a reasonable level of anti-aliasing/anisotropic filtering, it sounds good by today's standards. The problem is, it'll be painfully slow by tomorrow's standards (even if they release it by 2014, it will already be way outdated).
Posted on Reply
#10
Ferrum Master
by: Am*

The worst part is, they're not even going to be high end by today's standards
At this point, guys... they are forgetting that, at the current pace of development in the mobile market, next-gen ARM + PowerVR and Tegra will catch up with the consoles pretty soon... (I am worried about Adreno, I mean Radeon R500... it is still stuck in the DX9 era... they fetched some people from the ATI team again recently, I guess they are in a hurry)

Why you ask?

It has a larger market!

A good app ecosystem (stores). The Unreal Engine kit already works there without problems, creating no headaches for devs...

The darn thing isn't only usable for gaming, and it doesn't gather dust on the shelf while mommy won't give $$ for a new [again the same] COD :D.

Do you think it won't be capable of catching this so-called next gen?

Well, I am currently playing this on my almost two-year-old crap phone based on Tegra 2.

Horn Game
Posted on Reply
#11
Trovaricon
Hello, TPU fellows.

Today I spent more than the usual amount of time reading the whole thread, registering (after years), and preparing a reply in the '90s Internet format.

by: BigMack70
Mathematical proof (for like the 4th time):
(1920*1080)/(1280*720) = 2.25
60/30 = 2
Put them together... 2.25*2 = 4.5

Hence, 1080p60 is ~4x as demanding as 720p30.

That's not a point about hardware. In fact, strictly speaking my point here has nothing to do with hardware. It's a point about math which is apparently too simple for many of the elite minds in this thread to grasp. :shadedshu

I'm worked up about kiddos who don't understand kindergarten math and reading comprehension.
That is mathematically correct for painting pixels onto a 2D surface with a "simple C algorithm" and a predefined array of 2D vector images.
Today, rendering a 3D scene to a 2D surface is much more complex than your simple calculation.

Objects are represented as 3D meshes with additional properties tied to them (materials, textures, etc.). You place these objects into 3D space by applying transformations, outputting texture coordinates (2D, to pick a color from the texture at [x,y]) and the vertex position "on screen"; usually you use not only the [x,y] coordinates but also information about how deep "into the screen" the vertex is (vertex processing).
For each pixel covered by the resulting triangle (the output of three executions of the vertex position transformation), the pixel (fragment) shader is then run with interpolated vertex-shader outputs (here you can mix, modify, skip, etc. the on-screen pixel color).

You see? Your simple calculation matches only the pixel-shader computation. The procedure described is a very simplified, totally basic projection of textured 3D objects into 2D space, and our simple "resolution-based compute requirement" formula is already much more complex, isn't it? (Let's not even start adding basic lighting, shadowing or, God forbid, animation to the calculation.)

If you animate an object, you need to update its vertex positions within the mesh, since not only are they somewhere else on screen (as happens when you turn the camera) but their positions relative to each other change. If this is handled by the CPU, it is often responsible for the "CPU bottleneck", since it puts constant pressure on the CPU regardless of graphical settings. You can see this in multiplayer FPS games with many players or, as a perfect example, in MMORPGs (the CPU requirements for truly "mass" PvP in games like Lineage 2 are astronomical). If it is handled by the GPU, you again have a constant computation cost unaffected by render resolution.
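The point above can be put in numbers with a toy cost model (the weights are invented purely for illustration; real engines are far more complex): pixel-bound work scales with resolution times frame rate, while per-vertex and CPU-side work does not, so the true cost of going from 720p30 to 1080p60 is something less than the naive 4.5x.

```python
# Toy frame-cost model; the weights are invented for illustration only.
def frame_cost(width, height, fps, vertex_work=1.0, pixel_work_per_px=1e-6):
    """Cost per second: constant per-frame vertex/CPU work plus per-pixel work."""
    return fps * (vertex_work + pixel_work_per_px * width * height)

cost_720p30 = frame_cost(1280, 720, 30)
cost_1080p60 = frame_cost(1920, 1080, 60)

# Pure pixel throughput scales by the classic 4.5x factor...
pixel_ratio = (1920 * 1080 * 60) / (1280 * 720 * 30)
# ...but total cost grows by less, because the per-frame vertex/CPU share
# only doubles (30 -> 60 fps) and does not depend on resolution at all.
total_ratio = cost_1080p60 / cost_720p30
print(pixel_ratio, round(total_ratio, 2))
```

With these made-up weights the total cost ratio comes out around 3.2x rather than 4.5x; the exact number is meaningless, but the direction of the argument is the point.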

On topic:
Create a compiler targeting exactly one x86 implementation, with no compromises for "universal x86" instruction selection (you can take the exact memory/cache latencies and instruction latencies into account when selecting instructions), and believe me, you will see miracles.
If the next-gen consoles contain some sort of x86-based APU, did you not consider that this will force a considerable number of software developers to adopt a new way of thinking about utilizing APUs in general? That would be a great win even for future desktop development.
If you have an exact machine specification (HW, SW), you don't need statistics to determine how many operations you can execute in a given time on the (average) target hardware with the (average) software layer (drivers, OS): you can count them exactly.


AD the '90s Internet format (semi-OT):
In the past, reading almost any discussion thread on a site devoted to technical stuff resulted in gaining substantial knowledge (either from users writing information directly in posts, or from discussants pointing each other to relevant resources). After an hour of forum reading, you could take for granted that your knowledge base had expanded (not necessarily in exactly the direction you wanted).
Today, after the huge expansion of the Internet's user base, with connectivity accessible even on the toilet, you have to watch out not to end up more stupid after an hour of reading a technical forum.
If users spent a single hour reading about how 3D rendering works (you can pick the DirectX SDK samples, the NeHe tutorials, some other introductory material, or even a completely simple "How it works" or Wikipedia [1][2] reading) instead of smashing F5 for the quickest possible response to their "discussion enemies", there would be real information sharing and a knowledge gain for all. Today the Internet is not a medium for information and knowledge sharing (I sometimes have the bad feeling that the knowledge-generation process is stagnating) but one great human-based random "BS" generator that can compete without any problem with a random number generator running on a supercomputer.

Seriously, this thread contains enough text and graphics to cover a PhD or some other thesis, but the posts with real information value can be counted on one hand...
Until some genius comes up with a "BS filter", it would be interesting to emulate such a feature by having moderators, or even forum users, manually flag information-rich posts (something like the existing "thanks" function), with a forum filter to show only flagged posts.

EDIT: I just checked the second Wikipedia link, and the statement "Vertex shaders are run once for each vertex given to the graphics processor" is not always true. If you utilize the Radeon HD2k-HD4k tessellator, the number of vertices processed by the vertex shader is actually higher, because the fixed-pipeline tessellator sits before the vertex shader in the rendering pipeline (see Programming for Real-Time Tessellation on GPU).
Posted on Reply
#12
Ferrum Master
by: Trovaricon
...

That is mathematically correct for painting pixels onto a 2D surface with a "simple C algorithm" and a predefined array of 2D vector images.
Today, rendering a 3D scene to a 2D surface is much more complex than your simple calculation.
Fully agree... It reminds me of the old days when Voodoo reigned and the sucker still didn't have hardware T&L... if anyone still remembers what that did.

Simply scaling by a coefficient isn't possible, due to the large amount of data handling that also runs in parallel: first of all memory-bandwidth bottlenecks, then latency that grows as the scene gets more complex, and more shader-intensive work from light sources, the engine itself, and so on.

The coefficient hasn't been linear since the late '90s, I guess.

And here is something only consoles have, and why graphics can keep evolving on the same platform: for Metal Gear, Hideo Kojima himself stated in an interview that development took so long because they had to rewrite many engine parts in assembly to achieve the needed performance on the PS3. It is a nightmare, you know, but it also bends the math about what we can expect on screen, given the nonexistent recompiler software layer.
Posted on Reply
#13
Super XP
by: WhiteLotus
Damn really? Usually you get nintendo being the cheaper option, I'm hoping they will be a little bit cheaper.
How can it be cheaper? They are using miniature tablets for controllers. Quite interesting, but I wouldn't pay more than $250 for it.
Quote:
FACT 7 : PCs will maintain a higher visual quality and frame rates , consoles will have the graphics of a two years old PCs .
This is not a fact. Since when is the 7+ year old Xbox 360 only two years behind the PC in graphics?
by: DaMobsta
298 posts and yet no intel fanboys bashing the amd article. Go TPU! :toast:

I have a bad feeling about using hybrid crossfire on a console though, even if they optimize it, that's still an extra piece of hardware that could cause problems.
No, they would never release a console that wasn't running 100%. This may be a good idea and may benefit the PC because of these blasted console ports. I say go Hybrid CrossFire.
Posted on Reply
#14
lyndonguitar
I play games
we need more people like Trovaricon here lol
Posted on Reply
#15
douglatins
Wasn't the PS3 supposed to last 10 years?
Posted on Reply
#16
Benetanegia
by: lyndonguitar
we need more people like Trovaricon here lol
Yes, I can agree with that, because of the effort and such, BUT the info he posted is heavily outdated and because of that almost entirely irrelevant. He's describing forward rendering and, to make matters worse, forward rendering with per-vertex shading. That hasn't been used in 95% of games for 5+ years now. Nowadays everything is calculated on a per-pixel basis and several buffers are created with the resulting pixel information. That is deferred rendering/shading.

In deferred rendering, what he describes only happens in the first pass (BF3 has 8+ passes): the diffuse color pass, which carries very little information. After that, everything from lighting to shading to advanced shadowing to ambient occlusion happens on a per-pixel basis. Without all of these per-pixel calculations the end result would look like a '90s 3D game. 90% of the work is based on pixel data == buffers == frames that are afterwards mixed (by the ROPs, again pixel by pixel) into the final composition.
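Those per-pixel buffers are also why resolution is expensive in memory terms, which ties back to the RAM bottleneck raised earlier in the thread. A back-of-the-envelope sketch (the four-target, 8-bytes-per-pixel layout is an assumption for illustration, not any specific engine's G-buffer):

```python
# Back-of-the-envelope G-buffer size; the layout (4 render targets at
# 8 bytes/pixel each, e.g. RGBA16F) is an illustrative assumption only.
def gbuffer_bytes(width, height, bytes_per_pixel, num_targets):
    """Deferred shading stores several full-resolution buffers of pixel data."""
    return width * height * bytes_per_pixel * num_targets

MB = 1024 * 1024
gb_720p = gbuffer_bytes(1280, 720, 8, 4) / MB    # ~28 MB
gb_1080p = gbuffer_bytes(1920, 1080, 8, 4) / MB  # ~63 MB, 2.25x as much
print(round(gb_720p, 1), round(gb_1080p, 1))
```

Set even this toy layout against the PS3's 256 MB of system memory and it is clear why render resolution is a memory decision, not just a shading one.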
Posted on Reply
#17
Super XP
by: douglatins
Wasn't the PS3 supposed to last 10 years?
:roll: Yes, that is what Sony wanted people to believe :D
by: BigMack70
Couple quick points just since people keep interacting with my arguments.

@Trovaricon - I know. See my posts #270 & 272. I clarified, or at least tried to.

@Am* - Current consoles do ~720p30, NOT 1080p60. I tried to sum up why it seems far fetched to me that next gen consoles will do 1080p60 in post 279. Even if you think I'm wrong, that should at the very least help you understand why I make the conclusion(s) I make regarding this.

@Disruptor4 - I posted benchmarks early on in the thread showing no single GPU can get 60fps minimum framerate in all games with all settings maxed at 1080p. Not trying to make a big deal out of that at this point, just wanted to clarify where I got my claims from (I could also say my own experience with a single 7970 but I figure benchmarks are more trustworthy).
I fully agree 100%. Today's consoles do not render at 1080p; they render at 720p, which then gets upscaled to 1080p. Consoles don't have the power to render at 1920 x 1080, especially when the game is complex.
Posted on Reply
#18
Jizzler
by: douglatins
Wasn't the ps3 supposed to last 10 years?
All systems (except for maybe the Sega 32X and Nintendo Virtual Boy) have at least one or two timeless classics that keep them alive :)

I know what you meant, though, and I think it'll be close.

by: September 29th 2012
Last week, the Vice President of Hardware Marketing for the PlayStation brand, John Koller, told news outlet GameSpot that the company plans to support the PlayStation 3 until 2015. While some may think this automatically means the PlayStation 4 will not see release until 2015, this is probably not the case. As Koller explains, the PlayStation 2 stayed active for years, even after the PlayStation 3 saw release. It makes sense for Sony to continue to support the PlayStation 3 since they do not alienate those Sony fans who won't upgrade ASAP to the PlayStation 4. However, chances are good that all of Sony's major franchises will make the jump to the PlayStation 4, leaving only the smaller developers and publishers to continue to support the PlayStation 3.
Posted on Reply
#19
Trovaricon
by: Benetanegia
Yes, I can agree with that, because of the effort and such, BUT the info he posted is heavily outdated and because of that almost entirely irrelevant. He's describing forward rendering and, to make matters worse, forward rendering with per-vertex shading. That hasn't been used in 95% of games for 5+ years now. Nowadays everything is calculated on a per-pixel basis and several buffers are created with the resulting pixel information. That is deferred rendering/shading.

In deferred rendering, what he describes only happens in the first pass (BF3 has 8+ passes): the diffuse color pass, which carries very little information. After that, everything from lighting to shading to advanced shadowing to ambient occlusion happens on a per-pixel basis. Without all of these per-pixel calculations the end result would look like a '90s 3D game. 90% of the work is based on pixel data == buffers == frames that are afterwards mixed (by the ROPs, again pixel by pixel) into the final composition.
What I described in my previous post is definitely not forward rendering but exactly what I stated it was:
by: Trovaricon
basic projection of 3D objects with textures into 2D space.
And yes, such a projection looks like a '90s 3D game. EDIT: I used the "on screen" term to avoid having to explain render targets.

Both forward and deferred rendering require more outputs from the vertex shader (e.g. normals), not only texture coordinates and the vertex position in 2D space. It is hard to imagine how you could produce dynamic shadows with only a single projection (in the BF3 you mentioned).
Posted on Reply
#20
VIPER
by: btarunr
...The design goal is to be able to play games 1920 x 1080 pixels resolution, with 60 Hz refresh rate, and with the ability to run stereo 3D at 60 Hz...
13 pages of arguing about the PS4 being capable of 1080p @60fps... OK, I know I am old, and maybe ignorant, but where does it say anything about 60 fps?!
Posted on Reply
#21
Aquinus
Resident Wat-man
by: VIPER
13 pages of arguing about PS4 being capable of 1080p @60fps... OK, I know I am old, and maybe ignorant, but where does it say something about 60fps?!
The 360 can already output a "60 Hz signal"; that doesn't mean it is rendering 1080p at 60 Hz, though.
Posted on Reply
#22
theoneandonlymrk
Whilst some argue this won't be enough, IMHO this is an initial dev kit based on what they project will work for the PS4.
I recently noticed AMD has changed Trinity's successor back to Piledriver cores with Radeon cores next, which to me indicates a Trinity successor with GCN2 and HSA optimisations ahead of more CPU grunt. It's that chip, with additional IP yet to be disclosed (IMHO an interposer with a 126 MB level-4 cache and further DSPs), that I believe will be the basis of the PS4, not this Trinity chip directly, that's for sure, so apples-to-apples comparisons will count for nothing.

Add in hardware optimisations for consoles and AMD's hint at a hard-coded gaming future (API reduced/removed) and you will have a console that does 60 fps.

To the naysayers: I have a crossfire main rig with a PhysX card (hybrid), and it will do 60 fps in EVERY game at 1080p with few settings ever needing to be eased, bar AA. So to me, the games AMD doesn't do well in are Nvidia-biased or straight-up PhysX games. An interesting point is that the Xbox and PS3 implementations of PhysX use SSE-like extensions and are better optimised, so Nvidia is going to have to write PhysX to work well on AMD gear :eek:, of a kind :p

I'd swear some games sometimes work better just because an Nvidia card is present (though not used by the game).
Posted on Reply
#24
Benetanegia
by: Trovaricon
Both forward and deferred rendering require more outputs from vertex shader (e.g. normals), not only texture coordination and vertex position in 2D space. It is hard to imagine how can you produce dynamic shadows with only single projection (mentioned BF)
And who said single projection?

I'm sorry, but it looks like you're learning 3D programming and you're still stuck on Chapter 1.

Your argument was that there's more than pixel shading, which is true and which no one disputed. However, you went all out describing what is basically <5% of a modern game's render pipeline and frame time, as if it represented a big proportion of it. So it's essentially true, but, as I said, irrelevant for disputing the argument that 4x the resolution requires 4x the power*. Even modern shadows are much more than a projection into shadow maps, and they depend on pixel shading.

*I said it in a previous post: that word is the biggest problem. People equate power with performance in reviews, and that's incredibly inaccurate. A card with 2x the SPs is undeniably 2x as powerful in that department, whether it ends up producing 2x the fps or fails to because it's bottlenecked elsewhere.

Now, the problem regarding the OT is that an A10 APU is severely limited on ALL fronts: SPs, ROPs, texture units... everything. It is simply not going to do what even a high-end GPU has difficulty achieving today, no matter the optimization. And that's another focus of argument, because some people are saying we don't know whether it's going to be a custom APU with more GPU power, etc., but the article states it's an A10, and that's not a custom APU, is it? It's a commercially available APU, which is presumably why an APU is being used at all: it's available and cheap to produce. The days of heavily customized chips are over; otherwise (with a custom chip) they would have continued with the PowerPC architecture and kept backwards compatibility.
and it will do 60 Fps in EVERY game at 1080p with few settings needing to be eased ever bar AA
First of all, that's a lie, or at least very arbitrary about what "few settings needing to be eased" truly means.

Second, your crossfire setup is at least 5x more powerful than an A10 APU, so even if your claim were true, an APU would do 12 fps under the same "few settings eased" conditions. It's a console, so let's grant a MASSIVE optimization bonus for being a console, and you might or might not reach 30 fps (a 200% increase), but 60 fps? Not a chance.

Anyway, the Wii U is rumored to have a significantly more powerful GPU than the A10's. Is SONY truly going to release something less capable? :laugh:
Posted on Reply
#25
Aquinus
Resident Wat-man
by: Benetanegia
Anyway the Wii U is rumored to have a significantly more powerful GPU than A10 APU. Is SONY trully going to release something less capable?
They said a custom chip based on the A10. We don't actually know the specs, so it's quite possible it could sport a larger iGPU than the A10-5800K has. We will have to wait and see what Sony churns out.
Posted on Reply