Wednesday, April 4th 2012

Orbis Implements Multi-GPU, Too

Sony's next-generation PlayStation, reportedly codenamed "Orbis", is known from an earlier report to be powered by an AMD x86-64 CPU with graphics based on AMD's Southern Islands architecture. We're now hearing that Sony may implement a multi-GPU solution of its own. According to an IGN.com report, the CPU in question will be a custom version of AMD's A8-3850 quad-core APU. This should serve as an indication that the processor cores will be based on AMD's K10 Stars architecture, rather than K15 Bulldozer/Piledriver.

The GPU, on the other hand, will be based on the "Southern Islands" architecture, and the IGN.com report pinpoints it as resembling the Radeon HD 7670. The HD 7670 is a re-branded HD 6670, which is based on the 40 nm "Turks" GPU. Turks uses neither Graphics Core Next nor VLIW4, but the older VLIW5 number-crunching machinery. The most interesting piece of information here is talk of a multi-GPU configuration between this Turks-based GPU and the GPU embedded into the "Llano" APU. We know that the graphics core embedded into the AMD A8-3850, the Radeon HD 6550D, can work in tandem with the Radeon HD 6670 to yield an AMD Hybrid CrossFireX configuration called "Radeon HD 6690D2". This could end up being Sony's graphics weapon of choice.
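For rough context, here is a minimal back-of-envelope sketch of that pairing's theoretical FP32 throughput, using the public desktop-part specifications (a console variant, if real, could be clocked differently, and real-world Hybrid CrossFireX scaling falls well short of a simple sum):

```python
# Back-of-envelope FP32 throughput for the rumored Hybrid CrossFireX pair.
# Figures are the public desktop specs; console silicon could differ.

def gflops(stream_processors: int, clock_mhz: int) -> float:
    """Peak GFLOPS for a VLIW5 part: each SP issues one FMA (2 FLOPs) per clock."""
    return stream_processors * clock_mhz * 2 / 1000

hd6550d = gflops(400, 600)   # Radeon HD 6550D inside the A8-3850 "Llano" APU
hd6670 = gflops(480, 800)    # discrete Turks-based Radeon HD 6670

print(f"HD 6550D: {hd6550d:.0f} GFLOPS")    # 480 GFLOPS
print(f"HD 6670:  {hd6670:.0f} GFLOPS")     # 768 GFLOPS
print(f"Combined ceiling: {hd6550d + hd6670:.0f} GFLOPS")  # 1248 GFLOPS
```

The combined figure is only a theoretical ceiling; alternate-frame-rendering CrossFire rarely approaches perfect scaling in practice.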

Speaking of choices, we see Sony and AMD falling back on established technologies, be it K10 Stars CPU cores or VLIW5-based GPUs, while avoiding the Bulldozer CPU. While AMD's new Graphics Core Next architecture has proven fairly efficient in performance GPUs such as the HD 7700 series, its incompatibility with Llano's graphics could have served as a deal-breaker. Pictured above is fan-made concept art.

Source: IGN

88 Comments on Orbis Implements Multi-GPU, Too

#1
xenocide
by: Thefumigator
A standard quad-core CPU plus a video card would be more expensive. Llano is perfect; it's everything integrated into one single chip.

I believe in CrossFire; it's a good way of improving performance. In a console it's even better: developers know that if something doesn't work well, they will have to fix it before release, and this fix will work on all (PS4) consoles, because they are all the same internally.
Llano in Hybrid Crossfire is most certainly not integrated into one chip. You also have to consider that the GPUs built into Llano are not very powerful, and even in Hybrid Crossfire they're nothing spectacular compared to even an entry-level discrete GPU. As someone said earlier in here, the best Hybrid Crossfire setup you can do is only about as good as an HD 6670, which doesn't sound majorly impressive to me. The PS3 and 360 used, at the time, rather high-end and advanced GPUs, and if these companies are really pushing for longer life cycles on consoles, they need to do at least that.

I also think it's more likely they would release the games with any Crossfire-induced bugs and try to patch around it, similar to how AMD does with drivers (hopefully more successfully than AMD does with drivers, but I digress). It's adding more of a burden to developers than necessary, which is the EXACT same problem developers had with CELL.
Posted on Reply
#2
ompak5
by: xenocide
Llano in Hybrid Crossfire is most certainly not integrated into one chip. You also have to consider that the GPUs built into Llano are not very powerful, and even in Hybrid Crossfire they're nothing spectacular compared to even an entry-level discrete GPU. As someone said earlier in here, the best Hybrid Crossfire setup you can do is only about as good as an HD 6670, which doesn't sound majorly impressive to me. The PS3 and 360 used, at the time, rather high-end and advanced GPUs, and if these companies are really pushing for longer life cycles on consoles, they need to do at least that.

I also think it's more likely they would release the games with any Crossfire-induced bugs and try to patch around it, similar to how AMD does with drivers (hopefully more successfully than AMD does with drivers, but I digress). It's adding more of a burden to developers than necessary, which is the EXACT same problem developers had with CELL.
I think the APU used in this console will be more powerful compared to the current desktop APUs. It will be a redesigned APU with a powerful GPU, and it will have fewer bugs because there's only one hardware configuration.
Posted on Reply
#3
Dent1
by: xenocide

I also think it's more likely they would release the games with any Crossfire-induced bugs and try and patch around it similar to how AMD does with drivers (hopefully more successfully than AMD does with drivers but I digress). It's adding more of a burden to developers than necessary, which is the EXACT same problem developers had with CELL.
Crossfire induced bugs? Please go back to page 3, post #75. I already addressed that issue.
Posted on Reply
#4
xenocide
by: ompak5
I think the APU used in this console will be more powerful compared to the current desktop APUs. It will be a redesigned APU with a powerful GPU, and it will have fewer bugs because there's only one hardware configuration.
It said in the OP that it would be Stars-based (a Llano APU), and those have been readily available for a year. I doubt they would waste production space on a special design just for the PS4. I don't think GloFo or TSMC have the available space to do so without charging a markup (further increasing the cost of the system).

They cannot just throw a more powerful GPU onto an older CPU design, by the way. They have to consider power consumption and heat generation, as well as available die space. If none of these things were very real concerns, we'd see Trinity launching with the likes of HD7860D's on them, or something of that nature.

by: Dent1
Crossfire induced bugs? Please go back to page 3, post #75. I already addressed that issue.
Which I was responding to. I think you'd sooner see developers push that kind of stuff off than spend time fixing it right away. If AMD has issues getting Crossfire to work so frequently I doubt regular developers would be immune from the issues. I still think a single more powerful GPU would be an infinitely better solution.
Posted on Reply
#5
Irish_PXzyan
Should I be excited about the PS4 or not?????
Posted on Reply
#6
THE_EGG
by: Irish_PXzyan
Should I be excited about the PS4 or not?????
I suppose that's for you to decide.
Posted on Reply
#7
Irish_PXzyan
Usually I get all hyped up about the latest and greatest PlayStation, but these rumors aren't getting me excited at all!
I don't want to get the latest-gen console if it's still going to be 720p and look rubbish on any 1080p HD TV :/
It doesn't really sound next-gen at all to me. Not impressed.
Posted on Reply
#8
Dent1
by: xenocide
If AMD has issues getting Crossfire to work so frequently I doubt regular developers would be immune from the issues.
What issues specifically? If you are talking about performance issues, then CrossFire almost always increases performance, even in badly supported titles; granted, it isn't always a 100% performance boost, but the boost is typically present.

If you are talking about bluescreens, crashes and lock-ups, those issues are due to variables pertaining to driver conflicts, configurations, and software/hardware variations, and are not a major factor in a console due to the standardisation of equipment.


by: xenocide

I still think a single more powerful GPU would be an infinitely better solution.
I guess that's one thing we're in agreement about. I would rather see a single powerful GPU too (but for different reasons).
Posted on Reply
#9
theoneandonlymrk
by: Dent1
If you are talking about bluescreens, crashes and lock-ups, those issues are due to variables pertaining to driver conflicts, configurations, and software/hardware variations, and are not a major factor in a console due to the standardisation of equipment.
Quite right. If set up well, CrossFire (like SLI) is faultless; mine has worked well this last year with no issues.

Firstly, hopefully they will go with Trinity, as they wouldn't want the next 4/6-core Xbox piping them in performance, so I'm optimistically declaring this news BS.

But either way, not long ago AMD was talking about games coded bare-metal style (like on consoles) being a future possibility on the PC, and if you consider the fact that PCs under-utilise GPUs as it is, two low-end GPUs can be made to pack a punch. Plus, Sony is said to be adding chips for functionality/performance; without knowing the full details of these, and how they might enhance performance, it's a bit early to write Sony or the PS4 off, IMHO.
Posted on Reply
#11
vagxtr
by: btarunr
Sony's NGN PS is known to be powered by an AMD x86-64 CPU with graphics based on its Southern Islands architecture, as per the older report. We're now hearing that Sony may implement a multi-GPU solution of its own.
custom version of AMD's A8-3850 quad-core APU (based on AMD's K10 Stars architecture, rather than K15 Bulldozer/Piledriver)

and HD 7670 (HD6670 rebranding based on the 40 nm GPU "Turks")
Turks uses neither GCN nor VLIW4, but the older VLIW5 number-crunching machinery. The most interesting piece of information here is talk of a multi-GPU configuration between this Turks-based GPU, and the GPU that's embedded into the "Llano" APU. We know that the graphics core embedded into AMD A8-3850, the Radeon HD 6550D, can work in tandem with Radeon HD 6670 to yield an AMD Hybrid CrossFireX configuration called "Radeon HD 6690D2". This could end up being Sony's graphics weapon of choice.
So should this mean that next-gen consoles are going to have microstuttering as a fully advertised feature?
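The microstuttering worry can be illustrated with a toy timing sketch (all numbers made up): in alternate-frame rendering, the two GPUs hand in frames out of phase, so presentation intervals can alternate between short and long even when the average frame rate looks healthy.

```python
# Toy AFR frame-pacing example with hypothetical presentation timestamps (ms).
# The average interval suggests a healthy frame rate, but the gaps alternate
# between 23 ms and 10 ms, which the eye perceives as stutter.
present_ms = [5, 28, 38, 61, 71, 94]
gaps = [b - a for a, b in zip(present_ms, present_ms[1:])]
avg = sum(gaps) / len(gaps)
print("frame-to-frame gaps:", gaps)   # [23, 10, 23, 10, 23]
print("average gap:", avg, "ms")      # 17.8 ms
```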

CFX is a nice feature, especially if and when the hardware is already available, so there's no waste of resources. This time, the end-user experience during gameplay isn't of any crucial significance; it's rather about how to build readily available hardware on the cheapest process (40 nm?) and offer something newer than the PS3, which is already 5 years old (and in Q1 2013 will be a 6-year-old product).

I think it would be interesting to see where Sony will produce DAMN's IP-licensed chips, and at what price. TSMC's 28 nm is highly overpriced, and UMC, which should have it in H2 2012, could offer some alternative. Or it will be GloFo. It would be a shame if they produced and optimised it for the already obsolete 40 nm process, which is 2-3 times cheaper, but then it would suck 200 W more than DAMN's APU on the 32 nm SOI process that is in production now. It's craptastic for customers, who already don't know what to waste their money on.

But then DAMN's proposal is to respin consoles from a 6-7 year cycle to 3-4 years, and that could be part of this agenda.

I don't think the obsolete VLIW5 engine would be in the next-gen PS4 because it performs best, but rather just for its compatibility for building CFX, as you mentioned. And we could hail the microstuttering feature finally arriving in the console world.

Would that APU+GPU pairing really mean there will be a separate GPU in the PS4, rather than a single-chip composite, aka quasi-SoC, as before?

by: Dos101
I doubt the next generation of consoles will be as expensive as this generation's were at launch, so I think that's what MS and Sony are going for. Neither one of them wants to sell their systems at a loss anymore so they're going for reasonable specs, not high end. Plus if Sony (and MS) want to release a new console in 2013, the specs would have been generally decided a while ago, it's just how the process is.
I think that's more like DAMN's agenda to fertilise their investment in hardware that the Sony-MS-DAMN consortium agreed upon, rather than us seeing a satisfying experience from superbly optimised apps/games running on this two-generations-obsolete hardware when it finally arrives on the market.

by: Kaleid
Microstuttering now for consoles?
Yep. Finally, I might add :D
Posted on Reply
#12
vagxtr
by: Mike_b
In all honesty, if this is what Sony has planned for the next gen, I think I'll pass (and I'm probably one of Sony's biggest fans). I expected them to at least use a Trinity-based APU paired with a customized version of the HD 7750; it's not only a better-performing chip, it's also amazingly efficient. I'm very disappointed if this is true. Sony, you can do better than this.
It's impossible, because even the Piledriver APU supposedly has only a VLIW4-based engine, not GCN, and different-generation GPUs can't mix; hell, even same-generation GPUs can't mix in DAMN's drivers, only identical chips with the same features enabled at different clocks (like in the HD 4800 series). It's poor practice, but when there are BLIND SUPPORTERS for it, why the hell not abuse them and milk money out of their pockets?

I'm not disappointed with Sony or DAMN.

Why oh why does nobody ever complain about the same shabby GigaIntel practice of needing to sell chipsets with little or no upgrade every time they release a shiny tick-tock chip out of the box??? It's a shabby practice Intel has followed for 25 years. And now Sony-MS-DAMN is THE ONLY evil cartel?
Posted on Reply
#13
NdMk2o1o
All 4 pages TL;DR

Are these specs confirmed or not?
Posted on Reply