
GIGABYTE Outs RX 5600 XT Gaming OC VBIOS Update and Easy Updater

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,851 (0.33/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASUS ROG Strix X670E-I Gaming WiFi
Cooling ID-COOLING SE-207-XT Slim Snow
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage 2TB Samsung 990 Pro NVMe
Display(s) AOpen Fire Legend 24" (25XV2Q), Dough Spectrum One 27" (Glossy), LG C4 42" (OLED42C4PUA)
Case ASUS Prime AP201 33L White
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight (White), G303 Shroud Edition
Keyboard Wooting 60HE / NuPhy Air75 v2
VR HMD Oculus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.3447
CUDA, like most of Nvidia's tech, is closed and designed to force you onto their hardware. CUDA cores suck compared to AMD's compute units, but NV tries its hardest to kill or stall OpenCL adoption: the only company that can make CUDA cores is NV, so they cripple OpenCL on their hardware and make CUDA look so much better.

Much of the CUDA toolchain is actually open source (the compiler front end is built on LLVM), and it ties into existing OpenCL code quite well. Even by itself, it is still more fleshed out than OpenCL. Most machine learning libraries that have a CUDA backend can still run the same code directly on an x86 CPU, or on another GPGPU device, without major issues.
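To make that concrete, here is a minimal sketch of the fallback pattern these libraries use (assuming PyTorch, one such CUDA-backed library, is installed; the original post doesn't name a specific library):

```python
import torch

# Use the CUDA backend if a GPU is visible, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(128, 10).to(device)  # toy model, for illustration only
x = torch.randn(32, 128, device=device)      # batch of 32 random inputs
y = model(x)                                  # identical call on GPU or CPU
print(f"Forward pass ran on: {y.device}")
```

The model code itself never changes; only the device selection does.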

AMD doesn't care about the machine learning market as much as NVIDIA does. If it did, it would have continued with GCN (the Vega 56, Vega 64, and Radeon VII are great cards for the enthusiast researcher) instead of stepping backwards with RDNA (1.0, at least; they say 2.0 will be more robust for developers).

ROCm (which includes AMD's own port of Google's TensorFlow), and which thankfully is still alive, is still primitive compared to NVIDIA's cuDNN stack. MIOpen (AMD's deep learning math library) works extremely well even on NVIDIA's own hardware, which is not surprising, since it is HIP-compatible and HIP code can be compiled against the CUDA backend.
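A quick sketch of what that portability means in practice (assuming a tensorflow-rocm build is installed; the ROCm port keeps the stock TensorFlow API, so the same script runs unchanged on a CUDA build):

```python
import tensorflow as tf

# Under a ROCm build this lists AMD GPUs; under a CUDA build, NVIDIA GPUs.
gpus = tf.config.list_physical_devices("GPU")
print("Visible GPUs:", gpus)

# A tiny computation; TensorFlow places it on the first GPU if one exists.
a = tf.random.uniform((1024, 1024))
b = tf.random.uniform((1024, 1024))
print(tf.reduce_sum(tf.matmul(a, b)).numpy())
```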

NVIDIA's current CUDA cores are functionally equivalent to AMD's GCN/RDNA shader processors. It all depends on the libraries used; it just so happens that NVIDIA's implementation is the most compatible at the moment.
 
Joined
Jun 28, 2016
Messages
3,595 (1.26/day)
CUDA, like most of Nvidia's tech, is closed and designed to force you onto their hardware. CUDA cores suck compared to AMD's compute units, but NV tries its hardest to kill or stall OpenCL adoption: the only company that can make CUDA cores is NV, so they cripple OpenCL on their hardware and make CUDA look so much better.

Stuffing black-box code into games that devs aren't even allowed to look at, which suspiciously enhances performance for one side but causes problems for the other.
That post showed that you really don't know what you're talking about.
Large parts of CUDA are open source, and half of your absurdly long post swirls around some "black box" nonsense.
And BTW: black boxes are perfectly fine. Don't be afraid of them. Abstraction is what has let programming evolve so quickly (and become so easy).

Your main problem, albeit one shared with many AMD fans on this forum, is a lack of understanding of what these companies actually make.
You seem to think Nvidia's product is the GPU, and that the software layer should just be a free, fully hardware-agnostic contribution to humanity. Cute.

In the real world, what Nvidia makes is a solution. That solution consists of a GPU and a software layer that lets you use it as efficiently as possible. You buy a card; you pay for both.
No one is forced to write in CUDA, just like no one is forced to use AMD's in-house libraries (yes, AMD does the same thing; it's just that no one cares :) ).

The reason for CUDA's popularity is very simple. There was a time when almost no one outside of the gaming business knew how to program GPUs. We knew they were good for some problems, but coding for them was so difficult that learning it made very little sense for engineers and scientists.

My first contact with CUDA was at a condensed matter physics seminar in 2008. The lecturer (late 30s) started with something like: "Hey, there's this great new thing called CUDA and we can compute stuff on GPUs. It's super easy." And the audience (mostly late 50s and over) said: "GPUs?"
When I was leaving university in 2012, CUDA was in the syllabus of the first-year programming course.
It was a great idea by Nvidia, and it has simply paid off.

Compared to what we had in 2007 (GPGPU hacked on top of OpenGL), it was like Python vs. C++ in usability, but with no performance loss.
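A rough sketch of what that ease of use looks like (using Numba's Python CUDA bindings rather than the 2008-era CUDA C the post describes, but the amount of ceremony is comparable):

```python
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    i = cuda.grid(1)          # global thread index
    if i < out.size:          # guard threads past the end of the array
        out[i] = a[i] + b[i]

n = 1_000_000
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

threads = 256
blocks = (n + threads - 1) // threads
vector_add[blocks, threads](a, b, out)  # Numba handles the GPU transfers

assert np.allclose(out, a + b)
```

Compare that with bouncing data through textures and fragment shaders, which is what GPGPU meant before CUDA.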

As for OpenCL: it's not a bad initiative, but it is losing importance, and this won't change.
Nvidia has CUDA. Intel focuses on oneAPI. Apple left it for Metal.
OpenCL simply isn't needed. And, frankly, it's not that great with the latest AMD offerings.
 
Joined
Aug 27, 2015
Messages
555 (0.18/day)
Location
In the middle of nowhere
System Name Scrapped Parts, Unite !
Processor Ryzen 5 3600 @ 4.0 GHz
Motherboard MSI B450-A Pro MAX
Cooling Stock
Memory Team Group Elite 16 GB 3133 MHz
Video Card(s) Colorful iGame GeForce GTX1060 Vulcan U 6G
Storage Hitachi 500 GB, Sony 1TB, KINGSTON 400A 120GB // Samsung 160 GB
Display(s) HP 2009f
Case Xigmatek Asgard Pro // Cooler Master Centurion 5
Power Supply OCZ ModXStream Pro 500 W
Mouse Logitech G102
Software Windows 10 x64
Benchmark Scores Minesweeper 30fps, Tetris 40 fps, with overheated CPU and GPU
Why don't Sapphire, XFX, PowerColor, and ASRock make Nvidia GPUs? That kind of logic flows both ways, yet somehow you focus only on one side of the coin. I would be impressed if you came out and accused AMD of this!

XFX used to make Nvidia-based GPUs, then started also making AMD-based GPUs when AMD released the HD 4000 series. Nvidia punished XFX by not supplying the newer Fermi chips for the GTX 400 series; in retaliation, XFX dropped the Nvidia product line completely and made only HD 5000 series cards.

This is just one of the AIB stories that made it onto the internet. Nvidia has been pressuring smaller AIBs to make only Nvidia-based GPUs, not both.
ASUS, Gigabyte, and MSI have the leeway to make cards for both sides because they have leverage as major AIBs.

Do your research first, before you post.
 
Joined
Oct 4, 2017
Messages
695 (0.29/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
XFX used to make Nvidia-based GPUs, then started also making AMD-based GPUs when AMD released the HD 4000 series. Nvidia punished XFX by not supplying the newer Fermi chips for the GTX 400 series; in retaliation, XFX dropped the Nvidia product line completely and made only HD 5000 series cards.
This is just one of the AIB stories that made it onto the internet. Nvidia has been pressuring smaller AIBs to make only Nvidia-based GPUs, not both.

You see, this is what I don't like about some people: they take something that really happened and invent a story full of BS that suits their agenda!

We are all aware of the story between Nvidia and XFX, but that still doesn't explain why XFX doesn't make Nvidia cards today. Oh yeah, I guess you must be one of those who confuse multi-million-dollar businesses with children: because those children argued in the past, they shall never talk to each other again, right? Yeah, right...

There are three other companies on that list whose exclusively-AMD lineups you fail to explain. Not only that, but you are ready to blame Nvidia for this, while in the same breath you will not hesitate for a second to blame Nvidia for EVGA not making AMD GPUs. Indeed, DOUBLE STANDARDS FTW... This is where you people lose all credibility!

ASUS, Gigabyte, and MSI have the leeway to make cards for both sides because they have leverage as major AIBs.

Yeah, sure: either that, or this is another made-up BS argument that suits your narrative... And here comes AFOX, which makes both Nvidia and AMD cards. But I'm sure AFOX, being the small brand they are, has a ton more leverage than ASRock, PowerColor, and Sapphire, right? Right?

Do your research first, before you post.

Dude, there is a difference between doing some research and making up BS to suit your agenda; besides, this coming from you is the icing on the cake!
 
Joined
Jun 2, 2017
Messages
7,950 (3.15/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64-bit, Steam, GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Why don't Sapphire, XFX, PowerColor, and ASRock make Nvidia GPUs? That kind of logic flows both ways, yet somehow you focus only on one side of the coin. I would be impressed if you came out and accused AMD of this!

Some of those companies worked with Nvidia in the past, and the bad taste from dealing with them made them go the ATI/AMD way. You should really read an objective history of Nvidia before assuming that I am promoting one over the other.
 
Joined
Jun 28, 2016
Messages
3,595 (1.26/day)
Some of those companies worked with Nvidia in the past, and the bad taste from dealing with them made them go the ATI/AMD way. You should really read an objective history of Nvidia before assuming that I am promoting one over the other.
But how do you know the companies that are Nvidia-only haven't had a bad experience with ATI/AMD? :)
 