Monday, February 9th 2015

Radeon R9 380X Based on "Grenada," a Refined "Hawaii"

AMD's upcoming Radeon R9 380X and R9 380 graphics cards, with which it intends to take on the GeForce GTX 980 and GTX 970 right away, will be based on a "new" silicon codenamed "Grenada." Built on the 28 nm fab process, Grenada will be a refined variant of "Hawaii," much in the same way "Curacao" was a refined "Pitcairn" in the previous generation.

The Grenada silicon will have the same specs as Hawaii: 2,816 GCN stream processors, 176 TMUs, 64 ROPs, and a 512-bit wide GDDR5 memory interface holding 4 GB of memory. Refinements in the silicon over Hawaii could allow AMD to raise clock speeds enough to outperform the GTX 980 and GTX 970. We don't expect the chip to be any more energy-efficient at its final clocks than Hawaii; AMD's design focus appears to be performance. AMD could spare itself the embarrassment of a loud reference-design cooler by opening the chip up to quiet custom-design cooling solutions from AIB (add-in board) partners from day one.
In other news, the "Tonga" silicon, which made its debut with the performance-segment Radeon R9 285, could form the foundation of the Radeon R9 370 series, consisting of the R9 370X and the R9 370. Tonga physically features 2,048 stream processors based on the more advanced GCN 1.3 architecture, 128 TMUs, 32 ROPs, and a 384-bit wide GDDR5 memory interface. Both the R9 370 and the R9 370X could ship with 3 GB of memory as standard.

The only truly new silicon in the R9 300 series is "Fiji." This chip is designed to drive AMD's high-end single- and dual-GPU graphics cards, and is built to compete with NVIDIA's GM200 silicon and the GeForce GTX TITAN-X it will debut with. Fiji features 4,096 stream processors based on the GCN 1.3 architecture (double that of "Tonga"), 256 TMUs, 128 ROPs, and a 1024-bit wide HBM memory interface offering 640 GB/s of memory bandwidth. 4 GB could be the standard memory amount. The three cards AMD will carve out of this silicon are the R9 390, the R9 390X, and the R9 390X2.
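For readers who want to sanity-check the quoted bandwidth figures, the peak number falls out of bus width and per-pin data rate. The following is only a back-of-the-envelope sketch; the per-pin rates used are illustrative assumptions, not figures from the report:

```python
# Peak theoretical memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
# The per-pin data rates below are assumptions for illustration only.
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# A Hawaii/Grenada-style 512-bit GDDR5 interface, assuming 5 Gbps per pin:
print(peak_bandwidth_gbs(512, 5.0))    # 320.0 GB/s
# The rumoured 640 GB/s for Fiji would match, for example, a 4096-bit aggregate
# HBM interface (four 1024-bit stacks) running at 1.25 Gbps per pin:
print(peak_bandwidth_gbs(4096, 1.25))  # 640.0 GB/s
```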
Source: 3DCenter.org

156 Comments on Radeon R9 380X Based on "Grenada," a Refined "Hawaii"

#126
xfia
GCN FTW :) I would be excited to see what a desktop Carrizo could do.
#127
ZoneDymo
arbiter: Fun Fact: AMD wasn't breaking any laws if they let it go. As long as they didn't provide any support for the hack or promote it, they would have been free and clear of any liability.



"sarcasm" yea those effects don't make the game look more real as smoke would in real live "/sarcasm"

I am done with this thread; it's turning into an AMD fanboy thread trying to twist history to make AMD look like a superhero with a can-do-no-wrong persona and Nvidia the supervillain.
Ermm, very odd to throw out that AMD fanboy statement after your reaction to my comment.
That is not even close to appropriate.

With the physics I was speaking from experience as an Nvidia user (some of us just switch between brands and are not glued down).
And no, the way smoke reacts to you in PhysX-enabled Batman AA is not realistic at all; it becomes this palpable substance that almost rolls off Batman. That is just overdoing it massively.
Glowing flying orbs with every special attack in Warframe make no sense and are not realistic either; it's just shiny colored orbs (ooh, so pretty, right?).
Maaaaybe Mirror's Edge can be considered a visual improvement with it on, with the breaking glass and tearing cloth etc.

Although on that note, I am playing Splinter Cell 1 on the GameCube again and I'm mighty impressed with the way cloth reacts in that old, old game, and that without PhysX.

I like what PhysX COULD do for us, but the way it is, exclusive to Nvidia, it's going nowhere, unless Nvidia were to borderline buy the development of an entire game so they can build it around PhysX from the start.
But that will not happen so it will be just a gimmicky addition and never what it should be.
Oh well, Havok 2.0 is coming still, maybe that will move some mountains.
#128
ZoneDymo
Harry Lloyd: If AMD finally caught up, maybe Intel would have to do something. Right now they are focusing on mobile, while desktops just sit there doing nothing, because they have had the most powerful CPUs since 2006.
I miss the days where AMD CPUs were better for gaming (Duron, Athlon XP, Athlon 64), while being cheaper as well.

At least the GPUs are OK, though power hungry, but they are not really AMD's; they just bought what was good.
Kinda hard when everybody and their mother buys Intel and tells everybody and their mother to do the same.
AMD is a muuuuch smaller company and does not have anywhere near the research resources Intel has.
It might be the only competition Intel has but calling it competition is pushing it.

Luckily the pricing makes up for that, keeping them all viable options.
#129
THU31
ZoneDymo: I like what PhysX COULD do for us, but the way it is, exclusive to Nvidia, it's going nowhere, unless Nvidia were to borderline buy the development of an entire game so they can build it around PhysX from the start.
PhysX in Borderlands 2 and the Pre-Sequel is insane. It completely changes the game. It can be done, if only developers want to do it.

XBO and PS4 officially support PhysX, so developers can implement it in any game. Unfortunately those consoles have no power, which will make that rather difficult.

#130
TheHunter
And what does this nvidia stuff have to do with this AMD thread? Yeah nothing...
#131
dyonoctis
Funny to see that Nvidia is willing to work to make some of their PhysX/VisualFX effects enjoyable on consoles running AMD hardware, but does not give much option to optimize these effects on AMD desktops... This is the part of the competition that I hate the most; it basically means that AMD would have to work on a solution of their own, and a game running both Nvidia and AMD "extras" will never happen...
#132
Sony Xperia S
Unfortunately this shit always comes from Nvidia only. I don't remember AMD doing anything closed-standard... :rolleyes:
#133
HumanSmoke
Sony Xperia S: Unfortunately this shit always comes from Nvidia only. I don't remember AMD doing anything closed-standard... :rolleyes:
You don't? Google AMD's XGP. To protect what they thought was going to be a growth market, they made the connections proprietary. Needless to say, it died a horribly protracted death. If you're sailing the proprietary course, you need to determine the market viability and back the technology fully. ATI (and later AMD) did neither.
#134
ZoneDymo
Harry Lloyd: PhysX in Borderlands 2 and the Pre-Sequel is insane. It completely changes the game. It can be done, if only developers want to do it.

XBO and PS4 officially support PhysX, so developers can implement it in any game. Unfortunately those consoles have no power, which will make that rather difficult.

Insane?
Totally changes the game?

www.geforce.com/whats-new/articles/borderlands-2-physx

Ermmm you mean some extra particles that bounce away when you shoot something, or some particle-based water flowing somewhere?
Because that does not change the game in any way, shape, or form.
It's exactly the same gimmicky nonsense that PhysX does in Warframe.
Hell, in that article they do not refer to the PhysX as "effects" for nothing; that's all it adds, some effects.

It adds nothing but some orbs flying around, while it could be the entire basis for how things are built up and react (ya know... physics), like those tech demos they show of it.

The fact that you can turn it off is pretty much the dead giveaway that it in fact does not "totally change the game," because a game that is built around that PhysX would not work without it.
You cannot turn Havok off in, for example, HL2, because the game would not function anymore if that were the case.
#135
arbiter
dyonoctis: Funny to see that Nvidia is willing to work to make some of their PhysX/VisualFX effects enjoyable on consoles running AMD hardware, but does not give much option to optimize these effects on AMD desktops... This is the part of the competition that I hate the most; it basically means that AMD would have to work on a solution of their own, and a game running both Nvidia and AMD "extras" will never happen...
MS and Sony both licensed the use of PhysX for the XBone and PS4, so yeah, Nvidia will work with them.
Sony Xperia S: Unfortunately this shit always comes from Nvidia only. I don't remember AMD doing anything closed-standard... :rolleyes:
Hrm, let's see: Mantle comes to mind. FreeSync as well, since that is a proprietary implementation of the standard. Wonder what else I am forgetting.
#136
THU31
ZoneDymo: Ermmm you mean some extra particles that bounce away when you shoot something...
Not SOME.

I played the game for the first time without PhysX. When I played it the second time, I was blown away. There are hundreds of those particles, if not thousands, in huge firefights. Also, a lot of debris actually stays on the ground and interacts with your shots and grenades.
Best PhysX implementation I have ever seen.

It does not change the gameplay, it changes the visuals. If you cannot appreciate it, then you must be really spoiled.
#137
wiak
HumanSmoke: The really odd thing about this lineup is what AMD expects to field in the discrete mobile arena. Presently, the top part is Pitcairn-based (M290X), in its third generation of cards. The M295X's (Tonga) heat production in the iMac probably precludes its use in laptops, and Hawaii is clearly unsuitable.

That's where HBM starts. Better to have too much bandwidth than too little.

Hey, someone has to lead the charge. Just imagine the marketing mileage from 640 GB/s. It's like those nutty theoretical fill rates pumped up to eleven!

Fiji will do double duty as a compute chip, where on-card bandwidth will play a much greater role in GPGPU. FWIW, even Nvidia are unlikely to go below 384-bit for their compute chip. The one thing that will hold Fiji back as a FirePro is the 4 GB (not the bandwidth). AMD already has the W9100 with 16 GB of onboard GDDR5 for a reason.
well the solution is to double it to 8 GB? :p, but for consumers I'm pretty sure they will start with 4 GB on a fairly new memory type; GDDR5 was pretty new when AMD introduced it
#138
HumanSmoke
wiak: well the solution is to double it to 8 GB? :p
No mean feat, since HBM1 is limited to 4 stacks of 1 GB each (4 layers @ 2 Gbit per chip), and the Fiji chip is specced for 4 stacks (1024-bit x 4 for a 4096-bit bus width).
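For anyone following the arithmetic behind that limit, here is a quick sketch using the figures quoted in the post above (layer counts and densities are as stated there, not an official spec sheet):

```python
# HBM1 capacity arithmetic using the figures quoted above (a sketch, not a spec sheet).
layers_per_stack = 4      # 4 DRAM layers per stack
gbit_per_layer = 2        # 2 Gbit per layer
stacks = 4                # Fiji reportedly specced for four stacks

gb_per_stack = layers_per_stack * gbit_per_layer / 8   # -> 1.0 GB per stack
total_gb = stacks * gb_per_stack                       # -> 4.0 GB on the card
bus_width_bits = stacks * 1024                         # -> 4096-bit aggregate interface
print(gb_per_stack, total_gb, bus_width_bits)
```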
#139
DinaAngel
I bought Nvidia GPUs because of PhysX and I totally regretted it over time. It's just a performance drop. The PhysX effects are used by like maybe 30 games.

If the 380X or the 390X is very good, then I might jump over to AMD.
#140
petteyg359
arbiter: Hrm, let's see: Mantle comes to mind. FreeSync as well, since that is a proprietary implementation of the standard. Wonder what else I am forgetting.
Freesync: No license fee, no additional hardware needed to add to your monitor, uses a built-in feature of DisplayPort 1.2a, hmm... It's an implementation of a standard. None of the connotations of "proprietary" apply. Do you also complain that AMI and Phoenix have their own BIOS implementations, and that Intel and AMD have their own x86 implementations, and that BouncyCastle and OpenSSL each have their own crypto implementations?

Mantle: Duh. It's an interface for their specific hardware. Do you expect Atheros to write drivers that work with Broadcom chips, too?
#141
HumanSmoke
petteyg359: Freesync: No license fee, no additional hardware needed to add to your monitor, uses a built-in feature of DisplayPort 1.2a, hmm... It's an implementation of a standard. None of the connotations of "proprietary" apply.
Seems you don't understand some of the terms you're using. FreeSync is hardware-limited to the extent that even a large number of AMD's own graphics cards aren't supported (notably the HD 7000 and 6000 series, and a whole bunch of rebranded R5/R7/R9 cards launched, in some cases, less than a year ago).
petteyg359: Do you also complain that AMI and Phoenix have their own BIOS implementations,
arbiter was showing examples of AMD proprietary tech, and you quote other examples of proprietary tech for what reason exactly?
American Megatrends' and Phoenix's BIOSes are proprietary tech, and have been since Phoenix devised their BIOS and started charging $290,000 per vendor licence and $25 per BIOS ROM chip in May 1984.
petteyg359: Mantle: Duh. It's an interface for their specific hardware
"an interface for their specific hardware" is pretty much a definition of proprietary.
petteyg359: Do you expect Atheros to write drivers that work with Broadcom chips, too?
How is this any different from expecting Nvidia to write engine code for AMD's driver and hardware? Because one thing is certain, AMD has no interest in using, nor supporting PhysX.
#142
petteyg359
HumanSmoke"an interface for their specific hardware" is pretty much a definition of proprietary.
Hence the "duh" when somebody complains that it is proprietary. Of course it is proprietary. The whole point is that you don't expect nVidia to code for it, no more than you expect Atheros to write Broadcom drivers.
HumanSmokeHow is this any different from expecting Nvidia to write engine code for AMD's driver and hardware? Because one thing is certain, AMD has no interest in using, nor supporting PhysX.
1. Because nobody ever asked nVidia to write Mantle code.
2. I doubt nVidia ever offered the implementation details without requesting a bunch of money in exchange.

You think it's okay to complain that nVidia would have to license Mantle, but whine about AMD not having licensed PhysX in the same breath? :rolleyes:
HumanSmoke: Seems you don't understand some of the terms you're using. FreeSync is hardware-limited
Seems you don't understand your own argument. It's hardware-limited just like 4K over HDMI, 9k jumbo packets, and AVX. It requires DisplayPort 1.2a because that's where the specification exists to be implemented. A claim of it being "proprietary" is more like "My hardware is too old for this new stuff! Why can't I run this x64 AVX code on my Pentium 4?! I'm gonna whine about it!" nVidia supports DisplayPort; there's absolutely nothing stopping them from creating their own Adaptive-Sync driver-side implementation (and they could even call it Expensive$ync if they want to confuse people who don't understand that it is just an implementation of a damn standard, just like FreeSync).
#143
HumanSmoke
petteyg359Hence the "duh" when somebody complains that it is proprietary. Of course it is proprietary. The whole point is that you don't expect nVidia to code for it, no more than you expect Atheros to write Broadcom drivers.
So now its proprietary. You've just said it wasn't
petteyg359None of the connotations of "proprietary" apply.
You can understand how people might no be following your logic, right?
petteyg3591. Because nobody ever asked nVidia to write Mantle code.
Don't think they care TBH. Also, they'd be shit out of luck even if they had. AMD's Mantle is closed Beta. Even Intel has been denied access.
"I know that Intel have approached us for access to the Mantle interfaces, et cetera," Huddy said. " And right now, we've said, give us a month or two, this is a closed beta, and we'll go into the 1.0 [public release] phase sometime this year, which is less than five months if you count forward from June. -Richard Huddy, June 2014
petteyg3592. I doubt nVidia ever offered the implementation details without requesting a bunch of money in exchange.
And? 1. Nobody is disputing PhysX isn't proprietary, and 2. Even if the point being made concerned PhysX (which it isn't- It's about whether AMD produce proprietary IP), Nvidia paid $150 million for AGEIA - why would they give anyone free access to something they paid dearly for?
petteyg359You think it's okay to complain that nVidia would have to license Mantle, but whine about AMD not having licensed PhysX in the same breath? :rolleyes:
I've said no such thing. What you fail to understand is that stating the facts - that Mantle is not open source, automatically means that it is proprietary in nature. Now if you can find ANY post where I said that Mantle should be made available to Nvidia gratis then quote it. If you cannot (and you will not since I am already on record as stating Nvidia would never use code ultimately controlled AMD) then I kindly suggest you STFU with regards to misquoting me or anyone else. Creating straw man arguments and misquoting is not the way to advance a position.
#144
petteyg359
HumanSmoke: So now it's proprietary. You've just said it wasn't:
Never said any such thing, unless you choose to interpret it very strangely and believe that I said Atheros/Broadcom hardware/software is non-proprietary.
#145
HumanSmoke
petteyg359: Never said any such thing, unless you choose to interpret it very strangely and believe that I said Atheros/Broadcom hardware/software is non-proprietary.
I smell yet another straw man...
petteyg359: Freesync: No license fee, no additional hardware needed to add to your monitor, uses a built-in feature of DisplayPort 1.2a, hmm... It's an implementation of a standard. None of the connotations of "proprietary" apply.
Plenty of proprietary tech is free. Nvidia's CUDA is a prime example - proprietary and freeware. With that in mind, FreeSync certification is free as well, but AMD decide who gets the certification:
Certification of FreeSync monitors will be handled by AMD directly. The company says it wants to ensure its brand is synonymous with a "good experience."
FreeSync as a brand requires an AMD graphics card, a FreeSync approved monitor, and AMD's Catalyst Control Center. That makes it proprietary by definition.
#146
petteyg359
And now you're countering arguments about Mantle with arguments about FreeSync. I smell "moving the goal posts", if things must be smelled.

WRT FreeSync "certification", that's no different than any other certification. If you want to stick somebody else's brand name on your product, you've got to get permission from them. That's how trademarks work, silly.
#147
HumanSmoke
petteyg359: And now you're countering arguments about Mantle with arguments about FreeSync. I smell "moving the goal posts", if things must be smelled.
WTF are you talking about? This whole exchange is centred around whether AMD produces proprietary IP, as you initially took issue with arbiter's post. Mantle and FreeSync are both AMD tech; they are both proprietary, either in practice or in branding.
petteyg359: WRT FreeSync "certification", that's no different than any other certification. If you want to stick somebody else's brand name on your product, you've got to get permission from them. That's how trademarks work, silly.
Of course, you're welcome to point out that in your view FreeSync is just VESA's Adaptive-Sync by another name - and I wouldn't argue any differently - but FreeSync marketing is AMD-trademarked, and of course if they are exactly the same, then AMD's stance is at odds with reality:

#148
arbiter
petteyg359: Freesync: No license fee, no additional hardware needed to add to your monitor, uses a built-in feature of DisplayPort 1.2a, hmm... It's an implementation of a standard. None of the connotations of "proprietary" apply. Do you also complain that AMI and Phoenix have their own BIOS implementations, and that Intel and AMD have their own x86 implementations, and that BouncyCastle and OpenSSL each have their own crypto implementations?

Mantle: Duh. It's an interface for their specific hardware. Do you expect Atheros to write drivers that work with Broadcom chips, too?
HumanSmoke: Seems you don't understand some of the terms you're using. FreeSync is hardware-limited to the extent that even a large number of AMD's own graphics cards aren't supported (notably the HD 7000 and 6000 series, and a whole bunch of rebranded R5/R7/R9 cards launched, in some cases, less than a year ago)
I will add another bit to this: you said no additional hardware is required. Besides that list of the limited GPUs that support it, just because a monitor has DP 1.2a doesn't mean it supports Adaptive-Sync. New hardware was required in the monitors in the form of a new scaler chip, which G-Sync had on its module to start with. So the monitors needed this new scaler to support Adaptive-Sync. The adaptive part of the spec is optional, not required, for 1.2a. When AMD claimed no new hardware was needed, well, a new monitor with one of the new scalers is needed.

When AMD announced FreeSync, they claimed some current monitors supported it with no new hardware. Well, that is AMD PR marketing for ya.
HumanSmoke: WTF are you talking about? This whole exchange is centred around whether AMD produces proprietary IP, as you initially took issue with arbiter's post. Mantle and FreeSync are both AMD tech; they are both proprietary, either in practice or in branding.

Of course, you're welcome to point out that in your view FreeSync is just VESA's Adaptive-Sync by another name - and I wouldn't argue any differently - but FreeSync marketing is AMD-trademarked, and of course if they are exactly the same, then AMD's stance is at odds with reality:

In the end FreeSync is a proprietary implementation of the standard. No matter how you cut it, it's still proprietary.

"AMD freesync tech is a unique AMD hardware/software"
^ Another way to say proprietary
#149
petteyg359
arbiter: In the end FreeSync is a proprietary implementation of the standard. No matter how you cut it, it's still proprietary.
All vendor implementations of all standards are proprietary by that definition. Way to completely invalidate your own argument!
#150
AsRock
TPU addict
ZoneDymo: Insane?
Totally changes the game?

www.geforce.com/whats-new/articles/borderlands-2-physx

Ermmm you mean some extra particles that bounce away when you shoot something, or some particle-based water flowing somewhere?
Because that does not change the game in any way, shape, or form.
It's exactly the same gimmicky nonsense that PhysX does in Warframe.
Hell, in that article they do not refer to the PhysX as "effects" for nothing; that's all it adds, some effects.

It adds nothing but some orbs flying around, while it could be the entire basis for how things are built up and react (ya know... physics), like those tech demos they show of it.

The fact that you can turn it off is pretty much the dead giveaway that it in fact does not "totally change the game," because a game that is built around that PhysX would not work without it.
You cannot turn Havok off in, for example, HL2, because the game would not function anymore if that were the case.
Basically they dumbed down what could be done without PhysX hardware and made it the extra, which is not really NV PhysX. Let's face it, there's nothing in that vid that a CPU could not handle; never mind the game being cartoon-like, I've seen much better in other games, and even Arma 3 has better shit than that.

Shit, GTA 4 has better physics than that game, and that ran on a good system and today runs really well.