
Radeon R9 380X Based on "Grenada," a Refined "Hawaii"

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,378 (2.37/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Who cares about efficiency, really?
You should if you value the viability of the company.
Efficiency at the high end really isn't that pressing a concern - Nvidia built market share fielding the GT200 and GF100 - although it might be a consideration in non-gaming scenarios.
Where you should care about efficiency is whether the architecture scales efficiently down to the smaller GPUs. As I said earlier, AMD aren't competitive in discrete mobile, and that also reflects on low/mid-range OEM builds, where the vast majority of discrete sales happen.
You can say that efficiency doesn't matter (to you), but it just means one more strike against sales and revenue, which means less R&D funding, which means that AMD may not be able to field a top-to-bottom GPU refresh(!). So many things don't seem to matter with AMD - GPU efficiency, enthusiast desktop, the x86 server market... sooner or later the sum total of these "doesn't matters" must indeed matter.
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Wouldn't that be time you'd be "sleeping", and wouldn't AMD ZeroCore be in play?
Just saying.

Yes, that would further cut back on the watts used while idling. For simplicity's sake I used the idle-watt measurements from the cards reviewed on this site.
 
Joined
Apr 10, 2012
Messages
1,400 (0.32/day)
Location
78°55' N, 11°56' E
System Name -aLiEn beaTs-
Processor Intel i7 11700kf @ 5.055Ghz
Motherboard MSI Z490 Unify
Cooling Corsair H115i Pro RGB
Memory G.skill Royal Silver 4400 cl17 @ 4403mhz
Video Card(s) Zotac GTX 980TI AMP!Omega Factory OC 1418MHz
Storage Intel SSD 330, Crucial SSD MX300 & MX500
Display(s) Samsung C24FG73 144HZ
Case CoolerMaster HAF 932 USB3.0
Audio Device(s) X-Fi Titanium HD @ 2.1 Bose acoustimass 5
Power Supply CoolerMaster 850W v2 gold atx 2.52
Mouse Razer viper 8k
Keyboard Logitech G19s
Software Windows 11 Pro 21h2 64Bit
Benchmark Scores ► ♪♫♪♩♬♫♪♭
As long as they release Fiji XT, aka the 390/390X, soon. I don't mind if they rebrand the Cayman chip for that matter; Nvidia has been doing it a lot too, so it's not really important.

So you all argue about power consumption with 800W+ PSUs? Really, chill... Power consumption is overrated. :p


Also, there is no TITAN-X or whatever else you all want to call a Titan; GM200 won't have DP, so no Titan variant, just GeForce (google the Nvidia Japan conference, 30-12-2014).
Talk about Wccf spreading this false Titan-X hype to the max..:shadedshu:
 
Joined
Apr 30, 2008
Messages
4,875 (0.84/day)
Location
Multidimensional
System Name Boomer Master Race
Processor AMD Ryzen 7 7800X3D 4.2Ghz - 5Ghz CPU
Motherboard MSI B650I Edge Wifi ITX Motherboard
Cooling CM 280mm AIO + 2x 120mm Slim fans
Memory G.Skill Trident Z5 Neo 32GB 6000MHz
Video Card(s) Galax RTX 4060 8GB (Temporary Until Next Gen)
Storage Kingston KC3000 M.2 1TB + 2TB HDD
Display(s) Asus TUF 24Inch 165Hz || AOC 24Inch 180Hz
Case Cooler Master NR200P Max TG ITX Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply CoolerMaster V850 SFX Gold 850W PSU
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 10 Home 64bit
Benchmark Scores Don't do them anymore.
At least with the 680 to 770, Nvidia did clock bumps and made the 770 faster, unlike AMD, where the 7970 to 280X had its clocks lowered.

News articles can bash all they want, pointing at the past where a company completely screwed up on a product, and yes, that stock blower from a 6000 series card was a complete screw-up. Pretty bad that you lose 20% performance after 5 minutes of gaming.

I am laughing up a storm atm. Remember all the AMD fans jumping on rumors and thinking the 380X was gonna be a 4096-shader GCN monster, yet it's not even close to what they were expecting.


HD 7970 - 925MHz GPU / 1375MHz memory || R9 280X - 1000MHz GPU / 1500MHz memory. You were saying? o_O But I'm pretty sure you were referring to the HD 7970 GHz Edition. Still, the 280X traded blows well with the GTX 770 while being slightly cheaper, depending on the country you lived in.

Agreed, those reference coolers were awful, just like the GTX 480's cooler.

I don't see the big deal :wtf: If the 380X was going to be the 4096-shader monster, pretty sure the 390X would have been the dual-GPU config. Just different labels :wtf:
 
Joined
Aug 2, 2011
Messages
1,448 (0.31/day)
Processor Ryzen 9 7950X3D
Motherboard MSI X670E MPG Carbon Wifi
Cooling Custom loop, 2x360mm radiator,Lian Li UNI, EK XRes140,EK Velocity2
Memory 2x16GB G.Skill DDR5-6400 @ 6400MHz C32
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra OC Scanner core +750 mem
Storage MP600 2TB,960 EVO 1TB,XPG SX8200 Pro 1TB,Micron 1100 2TB,OCZ Vector 512GB,1.5TB Caviar Green
Display(s) Acer X34S, Acer XB270HU
Case LianLi O11 Dynamic White
Audio Device(s) Logitech G-Pro X Wireless
Power Supply EVGA P3 1200W
Mouse Logitech G502 Lightspeed
Keyboard Logitech G512 Carbon w/ GX Brown
VR HMD HP Reverb G2 (V2)
Software Win 11

I could see how it would be possible for shrinking GPUs to show many of the same problems. They are loving smaller lithography for mobile devices, but perhaps there are bigger hurdles on the high-end GPU side of things.

I find it funny people simply looked over this. For good reason too.

You're comparing apples to oranges here, for two different reasons.

1. You're comparing CPU architecture to GPU, which are very different in design.
2. You're comparing a chip produced by Intel's fabs to that of one designed by AMD, but produced at either TSMC or Global Foundries.
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Also, there is no TITAN-X or whatever else you all want to call a Titan; GM200 won't have DP, so no Titan variant, just GeForce (google the Nvidia Japan conference, 30-12-2014).
Why would the name Titan have to be linked to double precision? Why couldn't Nvidia differentiate a Titan from a 980 Ti by higher clocks, a higher board power limit, or a larger vRAM allocation? Just because something happened in a previous series, it doesn't automatically follow that the convention is set in stone. There are plenty of examples where just a difference in clock has divided two completely separate models - two that spring immediately to mind are the HD 3870/3850 and HD 4870/4850.
Talk about Wccf spreading this false Titan-X hype to the max..:shadedshu:
WTFtech is all about the hype, whatever flavour. Take it seriously at your peril. I seem to remember that this forum went batshit crazy when WCCF announced AMD's GPUs were going to be 20nm, even as people (including myself) attempted to inject some reason by showing that 20nm isn't particularly feasible for large GPUs. If the rumour fits people's wish list, good luck with trying to dissuade them.
 
Joined
Jun 13, 2012
Messages
1,316 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overclock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
HD 7970 - 925MHz GPU / 1375MHz memory || R9 280X - 1000MHz GPU / 1500MHz memory. You were saying? o_O But I'm pretty sure you were referring to the HD 7970 GHz Edition. Still, the 280X traded blows well with the GTX 770 while being slightly cheaper, depending on the country you lived in.

I was referring to the GHz edition since it's pretty much the only card people remember of the 7970. Originally released in January; two months later, in March, Nvidia dropped the 680 bombshell on them. In May AMD pushed a BIOS for GHz clocks and new cards.
 
Joined
Mar 11, 2007
Messages
702 (0.11/day)
Processor Intel Core i5 4690K
Motherboard AsRock Z97 Extreme4
Cooling Hyper 212 Evo
Memory 16GB
Video Card(s) R9 Nano
Storage 256GB SATA SSD 2TB WD Blue
Display(s) 1920x1080
Case Cooler Master Elite 130
Power Supply CX650M
Software Argh, Windows 10. I hated Windows 7. I hate Windows 10 more. Give me back XP!!!
It seems AMD is doubling down on their Fermi 2.0!
 
Joined
Apr 19, 2011
Messages
2,198 (0.46/day)
Location
So. Cal.
Yes, that would further cut back on the watts used while idling. For simplicity's sake I used the idle-watt measurements from the cards reviewed on this site.
Well, there's one site that has at times shown "monitor-off" power numbers, although I can't recall which anymore. That site has shown how something like an R9 280 will drop to like 2 amps, while a 760 is still pulling 7 amps, all while doing nothing... It's incredible that isn't factored into the conversation and equation on efficiency.
 
Joined
Apr 10, 2012
Messages
1,400 (0.32/day)
Location
78°55' N, 11°56' E
System Name -aLiEn beaTs-
Processor Intel i7 11700kf @ 5.055Ghz
Motherboard MSI Z490 Unify
Cooling Corsair H115i Pro RGB
Memory G.skill Royal Silver 4400 cl17 @ 4403mhz
Video Card(s) Zotac GTX 980TI AMP!Omega Factory OC 1418MHz
Storage Intel SSD 330, Crucial SSD MX300 & MX500
Display(s) Samsung C24FG73 144HZ
Case CoolerMaster HAF 932 USB3.0
Audio Device(s) X-Fi Titanium HD @ 2.1 Bose acoustimass 5
Power Supply CoolerMaster 850W v2 gold atx 2.52
Mouse Razer viper 8k
Keyboard Logitech G19s
Software Windows 11 Pro 21h2 64Bit
Benchmark Scores ► ♪♫♪♩♬♫♪♭
Why would the name Titan have to be linked to double precision? Why couldn't Nvidia differentiate a Titan from a 980 Ti by higher clocks, a higher board power limit, or a larger vRAM allocation? Just because something happened in a previous series, it doesn't automatically follow that the convention is set in stone. There are plenty of examples where just a difference in clock has divided two completely separate models - two that spring immediately to mind are the HD 3870/3850 and HD 4870/4850.

WTFtech is all about the hype, whatever flavour. Take it seriously at your peril. I seem to remember that this forum went batshit crazy when WCCF announced AMD's GPUs were going to be 20nm, even as people (including myself) attempted to inject some reason by showing that 20nm isn't particularly feasible for large GPUs. If the rumour fits people's wish list, good luck with trying to dissuade them.
Because that's what a Titan is, a crippled Tesla card for consumers with double precision; without it, it's not worthy of the Titan name, simple as that.

What's more, Nvidia said at that Japan tech conference there won't be any DP GPU with Maxwell, only with Pascal, and then we will see new Teslas/Titans again.
http://www.kitguru.net/components/g...lopment-of-graphics-processing-architectures/

Besides, these GM200 GeForces (GTX 1080?) will feature 6GB of VRAM anyway, so the Titan name is irrelevant now, if that extra 3GB VRAM buffer on GK110 is what made them a little more special.



Anyway, OT: can't wait for this Fiji XT, really interested in that 3D memory; it should make a big change or two at higher resolutions.
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
I find it funny people simply looked over this. For good reason too.

You're comparing apples to oranges here, for two different reasons.

1. You're comparing CPU architecture to GPU, which are very different in design.
2. You're comparing a chip produced by Intel's fabs to that of one designed by AMD, but produced at either TSMC or Global Foundries.

I suppose you're right to a certain point, but it's all silicon, and it takes a lot of money and engineering to shrink.
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Because that's what a Titan is, a crippled Tesla card for consumers with double precision; without it, it's not worthy of the Titan name, simple as that.
The company might choose to use the name in any way it sees fit. Just because the previous Titan had a certain feature set, it doesn't mean that the next is bound by the same criteria. The name is Titan, not Titan Double Precision. You're welcome to your opinion, but please don't represent it as absolute fact.
Personally, I'd like to see the name changed to Zeus (son of Titans) if only to troll AMD rumour lovers, btarunr, and RCoon ;)
What's more, Nvidia said at that Japan tech conference there won't be any DP GPU with Maxwell
That is incorrect. GM 200 likely has double precision at the same rate as GM 204 (1:32). Insufficient for Tesla duties, but that is why the rep said that Kepler will continue to be the Tesla option - simply because GK 210's development has been in tandem with GM 200.
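For a rough sense of why a 1:32 rate rules out Tesla duty, here's some back-of-the-envelope math. The shader counts and clocks below are assumptions for illustration only (GM 200's final spec isn't public); only the DP:SP ratios come from the discussion above.

```python
# Rough theoretical throughput: FP32 GFLOPS = shaders x 2 (FMA) x clock (GHz);
# FP64 is the FP32 figure divided by the DP:SP ratio.
def throughput(shaders, clock_ghz, fp64_ratio):
    fp32 = shaders * 2 * clock_ghz  # GFLOPS
    return fp32, fp32 / fp64_ratio

# Assumed figures for illustration only.
cards = [
    ("GK110 Titan, 1:3 DP", 2688, 0.84, 3),
    ("GM204 GTX 980, 1:32 DP", 2048, 1.13, 32),
    ("GM200 (rumoured), 1:32 DP", 3072, 1.00, 32),
]
for name, shaders, clock, ratio in cards:
    fp32, fp64 = throughput(shaders, clock, ratio)
    print(f"{name}: {fp32 / 1000:.1f} TFLOPS FP32, {fp64 / 1000:.2f} TFLOPS FP64")
```

Even with a higher FP32 figure, a 1:32 GM 200 ends up with a fraction of GK110's double-precision throughput, which is why Kepler stays on as the Tesla part.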
 
Joined
Apr 10, 2012
Messages
1,400 (0.32/day)
Location
78°55' N, 11°56' E
System Name -aLiEn beaTs-
Processor Intel i7 11700kf @ 5.055Ghz
Motherboard MSI Z490 Unify
Cooling Corsair H115i Pro RGB
Memory G.skill Royal Silver 4400 cl17 @ 4403mhz
Video Card(s) Zotac GTX 980TI AMP!Omega Factory OC 1418MHz
Storage Intel SSD 330, Crucial SSD MX300 & MX500
Display(s) Samsung C24FG73 144HZ
Case CoolerMaster HAF 932 USB3.0
Audio Device(s) X-Fi Titanium HD @ 2.1 Bose acoustimass 5
Power Supply CoolerMaster 850W v2 gold atx 2.52
Mouse Razer viper 8k
Keyboard Logitech G19s
Software Windows 11 Pro 21h2 64Bit
Benchmark Scores ► ♪♫♪♩♬♫♪♭
Titan is a class of its own, not really a high-end reference chip name. But if they do call it Titan, it won't have a Titan-like price.

Current Titans cost so much because of FP64 DP, not because of the extra 3GB of VRAM.
That is incorrect. GM 200 likely has double precision at the same rate as GM 204 (1:32). Insufficient for Tesla duties, but that is why the rep said that Kepler will continue to be the Tesla option - simply because GK 210's development has been in tandem with GM 200.

Exactly, it's improved FP32, but it's not FP64, so it can't be used like the GK110 Titan in FP64 mode.

Btw, that GK210 is a 2x improved and more energy-efficient GK110, each with 2496 cores, so it's not really Maxwell either :)



What I'm also trying to say is, all this means no "absurd" prices for us end-users from either camp, AMD Fiji XT or NV GM200 - the usual $550-650/€.
 
Joined
Mar 1, 2013
Messages
67 (0.02/day)
The company might choose to use the name in any way it sees fit. Just because the previous Titan had a certain feature set, it doesn't mean that the next is bound by the same criteria. The name is Titan, not Titan Double Precision. You're welcome to your opinion, but please don't represent it as absolute fact.
Personally, I'd like to see the name changed to Zeus (son of Titans) if only to troll AMD rumour lovers, btarunr, and RCoon ;)

That is incorrect. GM 200 likely has double precision at the same rate as GM 204 (1:32). Insufficient for Tesla duties, but that is why the rep said that Kepler will continue to be the Tesla option - simply because GK 210's development has been in tandem with GM 200.

http://i.imgur.com/68pxm0d.gif
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Btw, that GK210 is a 2x improved and more energy-efficient GK110, each with 2496 cores, so it's not really Maxwell either :)
Didn't say it was. GK 210 has to be Kepler (as should be apparent in the link I supplied). The revised silicon was obviously designed because Maxwell wasn't going to offer full FP64. Just for the record, GK 210 has twice the cache of the GK 110, a significant boost algorithm, and improved double precision performance in addition to the energy efficiency you mentioned - so quite a bit of design reworking, which would account for the length of its gestation (and why the company invested in the R&D if Maxwell wasn't targeting the co-processor market).
What I'm also trying to say is, all this means no "absurd" prices for us end-users from either camp, AMD Fiji XT or NV GM200 - the usual $550-650/€.
IF there aren't significant differences between the top GM 200 and the salvage parts, I'd agree. Titan's productivity popularity probably had less to do with FP64 than with the 6GB of vRAM it carries for content creation duties. I wouldn't be at all surprised if Nvidia leveraged a top price against a 12GB top part. There are already 8GB 290Xs, and I'd assume that would spill over to any rejigged SKUs. Assuming Nvidia answers that with 8GB GTX 980s, a 6GB GM 200 looks a little "underspecced" from a marketing viewpoint.
 
Joined
Oct 13, 2008
Messages
16 (0.00/day)
Location
Canada
System Name Enthusiast Built
Processor Intel Core i7 4770k @ 4.5
Motherboard MSI Z87- GD65 gaming
Cooling Kraken X60
Memory Intel overclocking series Patriot Viper 2400 Mhz
Video Card(s) EVGA 780 ti ACX Dual CLASSIFIED
Storage Intel SSD -120G, Samsung SSD-128G, ADATA SSD- 128G
Display(s) My tv: Samsung 27inch @1080i
Case NZXT
Audio Device(s) onboard 7.1 hd (
Power Supply EVGA 1000W
Software Windows 8.1 64Bit
Benchmark Scores coming soon
When will we have a universal physics support system from both companies? Games do feel better with Nvidia's PhysX applied to them. I cannot be the only one that wants this from game developers. Something like what Havok physics did, but GPU-based.
 
Joined
Oct 2, 2004
Messages
13,791 (1.94/day)
They just "feel" better, but the aren't better. Especially if you'd know all the shit NVIDIA has been doing to push their PhysX crap (removing entire physics effects from games that used to be done through CPU in other games, basic stuff like smashed glass falling and staying on the ground)

Oh, and for general public information, AMD's TressFX works on ALL graphics cards, not just AMD's, because unlike NVIDIA's proprietary crap, TressFX works through DirectCompute, which means support on all modern graphics cards.
 
Joined
Mar 28, 2014
Messages
586 (0.16/day)
Processor AMD FX-8320
Motherboard AsRock 970 PRO3 R2.0
Cooling Thermalright Ultra120 eXtreme + 2 LED Green fans
Memory 2 x 4096 MB DDR3-1333 A-Data
Video Card(s) SAPPHIRE 4096M R9 FURY X 4G D5
Storage ST1000VX000 • SV35.6 Series™ 1000 GB 7200 rpm
Display(s) Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS
Case Cooler Master HAF 912 Plus Black + Red Lights
Audio Device(s) Onboard Realtek
Power Supply OCZ ProXStream 1000W
Mouse Genius NetScroll 100X
Keyboard Logitech Wave
Software Windows 7 Ultimate 64-bit
Idle power draw for the cards

R9 290X Reference Card 17 watts
R9 290X Lightning 22 watts

GTX 980 Reference 8 watts
GTX 980 Gaming 14 watts

Mistake. :D

GTX 980 8 W;
GTX 970 9 W;

R9 290 16 W;
R9 290X 17 W.

http://www.techpowerup.com/reviews/Gigabyte/GTX_960_G1_Gaming/26.html

That is exactly double the power consumption, and the question is one of principle...
In multi-display it is even worse.

GTX 980 9 W;
GTX 970 10 W;

R9 290 51 W;
R9 290X 54 W.

:(


I can probably find more money than that on the pavement... daily :D
Idling is not an issue. People are grasping at straws...

You are mistaken. It actually shows that something is not working properly.
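For perspective, here's a quick back-of-the-envelope estimate of what those deltas cost per year. The electricity price and idle hours per day below are assumptions; plug in your own.

```python
# Rough annual cost of the idle-power deltas quoted above.
PRICE_PER_KWH = 0.15    # assumed $/kWh
IDLE_HOURS_PER_DAY = 4  # assumed hours at idle per day

def yearly_cost(delta_watts):
    kwh_per_year = delta_watts / 1000 * IDLE_HOURS_PER_DAY * 365
    return kwh_per_year * PRICE_PER_KWH

print(f"Single monitor, 17 W vs 8 W: ${yearly_cost(17 - 8):.2f}/year")
print(f"Multi-monitor,  54 W vs 9 W: ${yearly_cost(54 - 9):.2f}/year")
```

Under those assumptions the single-monitor gap is a couple of dollars a year ("pavement money"), while the multi-monitor gap is closer to ten - small in absolute terms, but it does show something isn't being power-managed properly.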
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
AMD sort-of-news devolves into flame war. Colour me shocked!
They just "feel" better, but the aren't better. Especially if you'd know all the shit NVIDIA has been doing to push their PhysX crap
FUN FACT #1: ATI looked at buying AGEIA but wouldn't meet the asking price.
FUN FACT #2: Nvidia offered AMD a PhysX licence (after paying the $150 million asking fee to buy AGEIA), but AMD decided to go with HavokFX, because OpenCL gaming was the next big thing. This is the same Havok that required a licensing fee and was supported by exactly zero games.
FUN FACT #3: When the PhysX hack for AMD cards arrived, it was AMD who threw up the roadblock.

So, ATI/AMD couldn't be bothered buying PhysX, couldn't be bothered licensing it once Nvidia purchased it, and actively blocked the development of a workaround that would have allowed the AMD community to use it. If you have an Nvidia card you can use it. If you have an AMD card, why should you care? AMD certainly don't.
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
PhysX... what's that donkey shit people assign an extra GPU to handle? Haha, it was never needed, but it sure has a nice name. Maybe most of the people at AMD didn't want to feel dirty about putting a logo on something that PCs can just do. Are we talking about FreeSync? Got confused there for a sec.
 
Joined
Jun 13, 2012
Messages
1,316 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overclock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
They just "feel" better, but the aren't better. Especially if you'd know all the shit NVIDIA has been doing to push their PhysX crap (removing entire physics effects from games that used to be done through CPU in other games, basic stuff like smashed glass falling and staying on the ground)

Oh, and for general public information, AMD's TressFX works on ALL graphics cards, not just AMD's, because unlike NVIDIA's proprietary crap, TressFX works through DirectCompute, which means support on all modern graphics cards.

You do know TressFX is limited to HAIR. PhysX does a lot more than that, such as how a body falls down stairs, or, when a bullet hits a wall, how the pieces of the wall hit the floor. Next time read up on tech before spouting off like you know anything.
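For illustration only (this is not PhysX code): a minimal sketch of the kind of per-fragment debris update - gravity, integration, a floor bounce - that gets offloaded to the GPU, where each fragment can run on its own thread. The constants are assumptions.

```python
import random

GRAVITY = -9.81
RESTITUTION = 0.4  # assumed bounciness of wall fragments
DT = 1.0 / 60.0    # one 60 fps frame

def step(particles):
    # On a GPU, each particle would be handled by its own thread.
    for p in particles:
        p["vy"] += GRAVITY * DT          # gravity
        p["x"] += p["vx"] * DT           # integrate position
        p["y"] += p["vy"] * DT
        if p["y"] < 0.0:                 # hit the floor: clamp and bounce
            p["y"] = 0.0
            p["vy"] = -p["vy"] * RESTITUTION

# A burst of wall fragments after a bullet impact.
debris = [{"x": 0.0, "y": 1.5,
           "vx": random.uniform(-2, 2), "vy": random.uniform(0, 3)}
          for _ in range(10_000)]
for _ in range(120):  # simulate two seconds
    step(debris)
print(f"Simulated {len(debris)} fragments for 120 frames")
```

With tens of thousands of fragments per effect, this kind of embarrassingly parallel per-object math is exactly what a GPU chews through faster than a CPU.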

AMD sort-of-news devolves into flame war. Colour me shocked!

FUN FACT #1: ATI looked at buying AGEIA but wouldn't meet the asking price.
FUN FACT #2: Nvidia offered AMD a PhysX licence (after paying the $150 million asking fee to buy AGEIA), but AMD decided to go with HavokFX, because OpenCL gaming was the next big thing. This is the same Havok that required a licensing fee and was supported by exactly zero games.
FUN FACT #3: When the PhysX hack for AMD cards arrived, it was AMD who threw up the roadblock.

So, ATI/AMD couldn't be bothered buying PhysX, couldn't be bothered licensing it once Nvidia purchased it, and actively blocked the development of a workaround that would have allowed the AMD community to use it. If you have an Nvidia card you can use it. If you have an AMD card, why should you care? AMD certainly don't.

Yeah, sad how so many people forget the fact that AMD had their chance to license it a long time ago yet refused, and now they create PR that they are/were locked out of it. AMD wants everything for free because, well, they don't have the money to do it themselves. Nvidia is a business; they're not UNICEF.


PhysX... what's that donkey shit people assign an extra GPU to handle? Haha, it was never needed, but it sure has a nice name. Maybe most of the people at AMD didn't want to feel dirty about putting a logo on something that PCs can just do. Are we talking about FreeSync? Got confused there for a sec.

Oh, I thought we were talking about Mantle for a second there. (Cue the AMD fan claiming FreeSync is the industry standard or Mantle is open source.)
 
Joined
Sep 15, 2007
Messages
3,944 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
AMD sort-of-news devolves into flame war. Colour me shocked!

FUN FACT #1: ATI looked at buying AGEIA but wouldn't meet the asking price.
FUN FACT #2: Nvidia offered AMD a PhysX licence (after paying the $150 million asking fee to buy AGEIA), but AMD decided to go with HavokFX, because OpenCL gaming was the next big thing. This is the same Havok that required a licensing fee and was supported by exactly zero games.
FUN FACT #3: When the PhysX hack for AMD cards arrived, it was AMD who threw up the roadblock.

So, ATI/AMD couldn't be bothered buying PhysX, couldn't be bothered licensing it once Nvidia purchased it, and actively blocked the development of a workaround that would have allowed the AMD community to use it. If you have an Nvidia card you can use it. If you have an AMD card, why should you care? AMD certainly don't.

Why would AMD give money to those crooks at Nvidia (their arch-nemesis, no less)? Nvidia locks PhysX to the GPU even though it can all be done on the CPU (even for their poor vid card owners)... it's pointless, 100% pointless.
Nvidia are such crybaby bitches that they actively block their cards from using PhysX when made secondary to an AMD card. You made that point... and it goes against your propaganda!

Obviously, you have no rebuttal against TressFX LOL. Nvidia won't have ANY of this open standard stuff. They'll bankrupt the company before they let it happen. That's how arrogant and greedy they are.

And I know you fanboys are INCREDIBLY butthurt about Mantle and FreeSync. Let me see those tears, baby!
 
Joined
Jun 13, 2012
Messages
1,316 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overclock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
Why would AMD give money to those crooks at Nvidia (their arch-nemesis, no less)? Nvidia locks PhysX to the GPU even though it can all be done on the CPU (even for their poor vid card owners)... it's pointless, 100% pointless.
Nvidia are such crybaby bitches that they actively block their cards from using PhysX when made secondary to an AMD card.

Obviously, you have no rebuttal against TressFX LOL. Nvidia won't have ANY of this open standard stuff. They'll bankrupt the company before they let it happen. That's how arrogant and greedy they are.

And I know you fanboys are INCREDIBLY butthurt about Mantle and FreeSync. Let me see those tears, baby!

Because the CPU is too slow to do the work; the GPU is much faster at the kind of calculations needed for it.

Last I checked, FreeSync and Mantle are proprietary, CLOSED software for AMD. So tell us another blind AMD fanboy lie.
 