
Radeon R9 380X Based on "Grenada," a Refined "Hawaii"

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
18,875 (3.07/day)
Location
UK\USA
Processor AMD 3900X \ AMD 7700X
Motherboard ASRock AM4 X570 Pro 4 \ ASUS X670Xe TUF
Cooling D15
Memory Patriot 2x16GB PVS432G320C6K \ G.Skill Flare X5 F5-6000J3238F 2x16GB
Video Card(s) eVga GTX1060 SSC \ XFX RX 6950XT RX-695XATBD9
Storage Sammy 860, MX500, Sabrent Rocket 4 Sammy Evo 980 \ 1xSabrent Rocket 4+, Sammy 2x990 Pro
Display(s) Samsung 1080P \ LG 43UN700
Case Fractal Design Pop Air 2x140mm fans from Torrent \ Fractal Design Torrent 2 SilverStone FHP141x2
Audio Device(s) Yamaha RX-V677 \ Yamaha CX-830+Yamaha MX-630 Infinity RS4000\Paradigm P Studio 20, Blue Yeti
Power Supply Seasonic Prime TX-750 \ Corsair RM1000X Shift
Mouse Steelseries Sensei wireless \ Steelseries Sensei wireless
Keyboard Logitech K120 \ Wooting Two HE
Benchmark Scores Meh benchmarks.
Insane?
Totally changes the game?

http://www.geforce.com/whats-new/articles/borderlands-2-physx

Ermmm, you mean some extra particles that bounce away when you shoot something, or some particle-made water flowing somewhere?
Because that does not change the game in any way, shape or form.
It's exactly the same gimmicky nonsense that PhysX does in Warframe.
Hell, that article doesn't refer to the PhysX features as "effects" for nothing; that's all it adds, some effects.

It adds nothing but some orbs flying around, while it could be the entire basis for how things are built up and react (ya know... physics), like in those tech demos they show of it.

The fact that you can turn it off is pretty much the dead giveaway that it in fact does not "totally change the game", because a game that is actually built around PhysX would not work without it.
You cannot turn Havok off in, for example, HL2, because the game would no longer function if you did.
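To make that distinction concrete, here is a minimal hypothetical sketch (all names made up, not from any real engine) of the difference between gameplay physics a game depends on and bolt-on effects physics that can be switched off without breaking anything:

```cpp
// Hypothetical sketch: why "effects-only" physics can be toggled off.
// The gameplay-critical simulation always runs; cosmetic particles are
// an optional layer nothing else depends on.
#include <vector>

struct RigidBody { float pos, vel; };        // crates, ragdolls the player interacts with
struct Particle  { float pos, vel, life; };  // debris, sparks: visual only

struct World {
    std::vector<RigidBody> gameplayBodies;
    std::vector<Particle>  effectParticles;
    bool effectsEnabled = true;              // the "PhysX effects" checkbox

    void step(float dt) {
        // Mandatory: game logic reads these results, so they can never be skipped.
        for (auto& b : gameplayBodies) b.pos += b.vel * dt;

        // Optional: nothing downstream reads these, so the toggle is safe.
        if (effectsEnabled) {
            for (auto& p : effectParticles) {
                p.pos  += p.vel * dt;
                p.life -= dt;
            }
        }
    }
};
```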


Basically, they dumbed down what could be done without PhysX hardware and made the extras "NV PhysX" when they're not really. Let's face it, there's nothing in that vid that a CPU could not handle, never mind the game being cartoon-like; I've seen it done much better in other games, and even Arma 3 has better shit than that.

Shit, GTA 4 has better physics than that game, and it ran fine on a good system back then; today it runs really well.
 
Joined
Jun 13, 2012
Messages
1,327 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
All vendor implementations of all standards are proprietary by that definition. Way to completely invalidate your own argument!

No, you are trying to twist things to fit your own logic. AMD took what was a standard and used it in their own way that is LOCKED to their hardware and software. It can't be used by Nvidia; that is what makes it proprietary. What you are trying to claim is that HDMI 2.0 on GTX 900 cards is proprietary and AMD can't use HDMI 2.0; that is your ass-backwards logic.

Basically dumbed down what could be done without PhysX hardware and made the extra which is not really NV PhysX. Lets face it it's nothing in that vid that a CPU could not handle, never mind the game being cartoon like it seen much better in other games and even Arma 3 has better shit than that.

Um, a CPU could handle it? Try using the same settings, set PhysX to CPU, and see how well the game runs then.
 

AsRock

TPU addict
No, you are trying to twist things to fit your own logic. AMD took what was a standard and used it in their own way that is LOCKED to their hardware and software. It can't be used by Nvidia; that is what makes it proprietary. What you are trying to claim is that HDMI 2.0 on GTX 900 cards is proprietary and AMD can't use HDMI 2.0; that is your ass-backwards logic.



Um, a CPU could handle it? Try using the same settings, set PhysX to CPU, and see how well the game runs then.

It would need to be optimized for the CPU, and I bet not much of that was done. Other companies can do it, so they could too if they really wanted to; it's the same ol' BS over again to make it look better than it actually is.
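For what "optimized for the CPU" might mean in practice, here is a hypothetical sketch (invented names, not the real PhysX API) of spreading an effects-particle update across all available cores instead of leaving it on a single thread:

```cpp
// Hypothetical sketch (not the real PhysX API): updating effects particles
// in parallel across CPU cores, the kind of optimization the post is asking about.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Particle { float pos, vel; };

void stepRange(std::vector<Particle>& ps, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i)
        ps[i].pos += ps[i].vel * dt;            // trivial per-particle update
}

void stepParallel(std::vector<Particle>& ps, float dt) {
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 1;                          // fallback if core count is unknown
    std::size_t chunk = ps.size() / n + 1;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end   = std::min(begin + chunk, ps.size());
        if (begin >= end) break;
        workers.emplace_back(stepRange, std::ref(ps), begin, end, dt);
    }
    for (auto& w : workers) w.join();           // wait for all chunks to finish
}
```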

BL2 is BS anyway, a half-made frigging game from lazy asses. They just got lucky: they ran out of time on the first one, people liked it anyway, so they pushed out more half-done BS instead of making what they intended to make the first time.
 
Joined
Nov 2, 2008
Messages
9 (0.00/day)
Benchmark Scores 9001
No, you are trying to twist things to fit your own logic. AMD took what was standard and used it in their own way that is LOCKED to their hardware and software. It can't be used by Nvidia that is what makes it proprietary. What you are tring to claim is HDMI 2.0 on gtx900 cards is proprietary and AMD can't use HDMI 2.0, that is your ass-backwards logic.

No, they took a standard and branded their implementation of it. No duh "FreeSync" is locked to their hardware; it's a brand name. You're complaining that Asus can't go around selling screens with "Dell UltraSharp" labels. If nVidia chooses to support Adaptive-Sync for G-Sync instead of their ACTUALLY-PROPRIETARY-WITH-ALL-THE-NEGATIVE-CONNOTATIONS-BECAUSE-IT-ISN'T-STANDARDIZED custom hardware, there's no reason a monitor couldn't be both "FreeSync certified" and "G-Sync certified" at the same time.

There's "ass-backwards logic" in here, but it ain't in my posts.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
No, they took a standard, and branded their implementation of it.
Actually, it's because they really don't have HDMI 2.0, but rather HDMI 1.4a with some modifications to support 2.0-like features. Get your terms right.
No duh the "FreeSync" is locked to their hardware. It's a brand name. You're complaining that Asus can't go around selling screens with "Dell UltraSharp" labels.
One is a technology, the other is branding. There is a big difference.

Let's clarify one thing here, because I think people don't know the definitions:
All vendor implementations of all standards are proprietary by that definition. Way to completely invalidate your own argument!
Let's take a quote:
Proprietary software is software that is owned by an individual or a company (usually the one that developed it). There are almost always major restrictions on its use, and its source code is almost always kept secret.
The fact that a company developed it does not turn it into proprietary software. It's proprietary because it's not open, which imposes restrictions. Whether something is proprietary usually comes down to how much of it you're willing to share. There is no such thing as open-source proprietary software, which is what you would get if, by your definition, a company wrote open software.

For someone with only 7 posts, you've dug a nice little hole for yourself rather quickly on a seemingly stupid topic (the definition of "proprietary").
there's no reason a monitor couldn't be both "FreeSync certified" and "G-Sync certified" at the same time.
I will agree with this statement unless there are specific rules for either that forbid having both at once.

With all that said, it's entirely possible to have both proprietary and open-source versions of an implemented specification, but doing so doesn't make the specification proprietary de facto.
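As a rough analogy, assuming we model the standard as an abstract interface (hypothetical names, not any real API), each vendor can ship its own implementation, open or closed, without the standard itself becoming proprietary:

```cpp
// Hypothetical analogy (invented names, not a real API): the published standard
// is the abstract interface; each vendor ships its own implementation of it.
struct AdaptiveSyncPort {                      // "the standard": anyone may implement it
    virtual ~AdaptiveSyncPort() = default;
    virtual void setRefreshRate(int hz) = 0;
};

// Vendor A's branded implementation. Its internals can stay secret (proprietary)
// while the interface it speaks remains the open standard.
class VendorA final : public AdaptiveSyncPort {
    void setRefreshRate(int hz) override { (void)hz; /* closed-source internals */ }
};

// Vendor B implements the same interface independently; both remain
// interoperable through the standard, and neither owns it.
class VendorB final : public AdaptiveSyncPort {
    void setRefreshRate(int hz) override { (void)hz; /* different internals, same contract */ }
};
```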
 
Joined
Nov 2, 2008
Messages
9 (0.00/day)
Benchmark Scores 9001
Let's take a quote:

The fact that a company developed it does not turn it into proprietary software. It's proprietary because it's not open, which imposes restrictions. Whether something is proprietary usually comes down to how much of it you're willing to share. There is no such thing as open-source proprietary software, which is what you would get if, by your definition, a company wrote open software.

I like how you stripped the context there so you could claim I said the opposite of what I said.
 
Joined
Jun 23, 2011
Messages
393 (0.08/day)
System Name potato
Processor Ryzen 9 5950X
Motherboard MSI MAG B550 Tomahawk
Cooling Custom WC Loop
Memory 2x16GB G.Skill Trident Z Neo 3600
Video Card(s) RTX3090
Storage 512GB, 2TB NVMe + 2TB SATA || 32TB spinning rust
Display(s) XIAOMI Curved 34" 144Hz UWQHD
Case be quiet dark base pro 900
Audio Device(s) Edifier R1800T, Logitech G733
Power Supply Corsair HX1000
Mouse Logitech G Pro
Keyboard Logitech G913
Software win 11 amd64
On every AMD thread @ TPU? Good god.
 