
Vega 56 Pulse or RTX 2060 Founder/Zotac Twin?

Vega56 or 2060?

  • Vega

    Votes: 32 50.8%
  • RTX

    Votes: 31 49.2%

  • Total voters
    63
  • Poll closed.
Status
Not open for further replies.

bug

Joined
May 22, 2015
Messages
13,245 (4.05/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
1) There was no reference to Linux made, just the classical lie
2) When I'm working, the OS doesn't matter; when I'm gaming on a PC, games simply do not support it. So, shrug, how common a task is "working under Linux with OpenCL"?

It's pretty common now, since OpenCL is used in so many places. From LibreOffice to image and video processing, there are many workflows that benefit from OpenCL. Serious compute work still eschews OpenCL in favour of CUDA, though.

(To add insult to injury, Nvidia beats AMD at OpenCL despite Nvidia's lack of support for OpenCL 2.0. They do it leveraging only OpenCL 1.2.)
 
Joined
Jun 28, 2016
Messages
3,595 (1.25/day)
2) When I'm working, the OS doesn't matter; when I'm gaming on a PC, games simply do not support it. So, shrug, how common a task is "working under Linux with OpenCL"?
I'd be really surprised if you weren't regularly running software that uses OpenCL. Even 7zip is GPU-accelerated these days.

Apart from stuff that benefits a bit from GPUs (which you may notice or not), there are some applications that can potentially get a huge boost: rendering, photo/video editing, simulations, etc.
Most of it is written around OpenCL, but sometimes there's also a CUDA option (for example in Adobe Premiere Pro, albeit that's not on Linux yet).
Also, real light is a complex matter (diffraction and whatnot), so there go your soft shadows even if your ray tracing is running on an uber machine.
Light is fairly simple as long as you can live in classical physics (and that's enough for 3D rendering we're talking about). :)
Serious compute work still eschews OpenCL in favour of CUDA though.
Depends on your definition of "serious compute". :)
If you know that software will be run on Nvidia, going with CUDA is a no-brainer. If you're building a computation environment from scratch, it usually also makes sense to go with Nvidia (yes, it's likely cheaper ;-)).
But if software has to work on both AMD and Nvidia (and something else potentially) OpenCL is perfectly fine and is often used.
For example: until not so long ago, everything for Macs had to be written in OpenCL, since you couldn't spec them with an Nvidia GPU. That's changing now thanks to eGPUs.
 

bug

Depends on your definition of "serious compute". :)
If you know that software will be run on Nvidia, going with CUDA is a no-brainer. If you're building a computation environment from scratch, it usually also makes sense to go with Nvidia (yes, it's likely cheaper ;-)).
But if software has to work on both AMD and Nvidia (and something else potentially) OpenCL is perfectly fine and is often used.
For example: until not so long ago, everything for Macs had to be written in OpenCL, since you couldn't spec them with an Nvidia GPU. That's changing now thanks to eGPUs.
"Serious compute" doesn't care about AMD or Nvidia. "Serious compute" is about getting results. CUDA is usually 10x as fast as AMD's OpenCL.
I'm not an advocate of closed source solutions, but when closed source can be that fast and open source can't... well...
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores The 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
CUDA is usually 10x as fast as AMD's OpenCL

Thanks for spreading FUD, citizen.
And it takes certain qualities to blow it to this magnitude.
 

bug

Thanks for spreading FUD, citizen.
And it takes certain qualities to blow it to this magnitude.
Unfortunately I can't source that, because apps don't sport swappable compute backends. But it's what I understood from conversations with people who have tried both frameworks.
 
Joined
Aug 6, 2017
Messages
7,412 (3.00/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Tbh, both the Vega 56 and the RTX 2060 are good choices, with strong points and flaws on both sides. The RTX GPU itself is faster, more advanced and much more efficient. Vega 56 is GCN, with all the strengths and disadvantages of an architecture that's becoming very long in the tooth. Vega is equipped with 8GB of memory, though, while the decision to put 6GB on a $350 card that will now occupy the GTX 1080/V64 slot is questionable. GCN had an advantage back in the early days of DX12 and Vulkan; Pascal improved on that significantly, and with the arrival of Turing I see nothing AMD has done over the last 3-4 years that would still give them any edge over Nvidia cards. They've slept on innovation and will pay the price now that a 1920-CUDA-core Turing card can match a Vega card not only in DX11 but in DX12 and Vulkan too. The RTX card should age better than Vega.
For me the RTX 2060 is a premium 1080p card, exceptional in any game as long as you stay at that resolution. At 1440p it's mostly fine, but you're looking at borderline VRAM usage. Not saying it's not enough, but it's definitely on the verge. E.g. all recent Ubisoft games already show 5-6GB VRAM usage on my card.
 
  • Like
Reactions: bug

bug

Tbh, both the Vega 56 and the RTX 2060 are good choices, with strong points and flaws on both sides. The RTX GPU itself is faster, more advanced and much more efficient. Vega 56 is GCN, with all the strengths and disadvantages of an architecture that's becoming very long in the tooth. Vega is equipped with 8GB of memory, though, while the decision to put 6GB on a $350 card that will now occupy the GTX 1080/V64 slot is questionable. GCN had an advantage back in the early days of DX12 and Vulkan; Pascal improved on that significantly, and with the arrival of Turing I see nothing AMD has done over the last 3-4 years that would still give them any edge over Nvidia cards. They've slept on innovation and will pay the price now that a 1920-CUDA-core Turing card can match a Vega card not only in DX11 but in DX12 and Vulkan too. The RTX card should age better than Vega.
For me the RTX 2060 is a premium 1080p card, exceptional in any game as long as you stay at that resolution. At 1440p it's mostly fine, but you're looking at borderline VRAM usage. Not saying it's not enough, but it's definitely on the verge. E.g. all recent Ubisoft games already show 5-6GB VRAM usage on my card.
See here: https://www.techspot.com/article/1785-nvidia-geforce-rtx-2060-vram-enough/
And they were keen enough to look at actual performance, because we all know allocated memory is not a good indicator ;)

Still, it's a missed opportunity: 8GB of VRAM on a 256-bit interface would have made a killer product. Maybe they figured it would cut too close to the 2070. Or maybe they're saving it for a 2060 Ti.
 
Joined
Aug 6, 2017
Messages
7,412 (3.00/day)
See here: https://www.techspot.com/article/1785-nvidia-geforce-rtx-2060-vram-enough/
And they were keen enough to look at actual performance, because we all know allocated memory is not a good indicator ;)

Still, it's a missed opportunity: 8GB of VRAM on a 256-bit interface would have made a killer product. Maybe they figured it would cut too close to the 2070. Or maybe they're saving it for a 2060 Ti.
It's enough in the scenario they tested.
I chose the 1080 Ti over the 2080 for the VRAM myself. I use a lot of AA even on a 24" 1440p monitor and have seen my GTX 1080 run out of VRAM at the settings I use. I find 24" 1080p downright terrible for games.
If you're willing to adjust the settings if there are problems, then the 2060 with 6GB will run fine at 1440p, no doubt about that.
 
Joined
Mar 10, 2015
Messages
3,984 (1.19/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
allocated memory is not a good indicator

allocated memory == used memory. You're going to have problems if all your memory is allocated and something else needs it.
 
Joined
Jun 28, 2016
Messages
3,595 (1.25/day)
"Serious compute" doesn't care about AMD or Nvidia. "Serious compute" is about getting results.
That is just some weird propaganda. :-D
Yes, computation is about getting results (serious or not).
But people that work with computing definitely care what hardware and software they use.
I'm not an advocate of closed source solutions, but when closed source can be that fast and open source can't... well...
Why aren't you an advocate of closed source? Something wrong with the results? :-D

allocated memory == used memory. You're going to have problems if all your memory is allocated and something else needs it.
Wrong.
A lot of software allocates more memory than it needs.
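The allocated-vs-used distinction is easy to demonstrate. A minimal Python sketch (sizes arbitrary, behavior assumes a typical overcommitting OS): reserve a large anonymous mapping, then touch only a small part of it.

```python
import mmap

# Reserve 256 MiB of address space with an anonymous mapping.
# On typical systems this succeeds immediately: the memory is
# "allocated", but physical pages are only committed when touched.
SIZE = 256 * 1024 * 1024
buf = mmap.mmap(-1, SIZE)

# Touch one byte per page in just the first 4 MiB; the remaining
# ~252 MiB stays allocated but never backed by physical memory.
for off in range(0, 4 * 1024 * 1024, mmap.PAGESIZE):
    buf[off] = 1

print(len(buf))  # the full reservation, regardless of what was touched
buf.close()
```

A process monitor would report the full 256 MiB as this process's allocation, while resident usage stays a few MiB — the same gap GPU tools show between "VRAM allocated" and VRAM a game actually needs.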
 

bug

Why aren't you an advocate of closed source? Something wrong with the results? :-D

I'm a software developer, I like it better when I can actually dig through the documentation and source code as opposed to paying thousands of dollars for courses teaching me how to use proprietary stuff.
I'm also an engineer, meaning I need to be pragmatic and use whatever gets the job done, so I understand the need for closed source as well. But I prefer it to be open.

allocated memory == used memory. You're going to have problems if all your memory is allocated and something else needs it.
Quite wrong. Memory is routinely allocated pre-emptively because frequent small allocations are too costly. Allocated-but-unused RAM is simply swapped out. For video cards that's much less of a problem anyway, because how often do you start several games and let them compete for VRAM?
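The pre-emptive pattern looks like this in miniature (a hypothetical sketch; sizes and names are made up): one big up-front allocation carved into zero-copy views, instead of asking the allocator for thousands of small blocks. Games do the analogous thing with VRAM: reserve a big pool, then sub-allocate textures out of it.

```python
# One slab, many chunks: amortize allocation cost up front.
N, CHUNK = 10_000, 64

slab = bytearray(N * CHUNK)                  # a single allocation...
views = [memoryview(slab)[i * CHUNK:(i + 1) * CHUNK]
         for i in range(N)]                  # ...many usable chunks

views[0][:5] = b"hello"                      # writes go straight to the slab
print(bytes(slab[:5]))
```

From the outside, the whole slab counts as "allocated" the moment the program starts, even though most chunks may never be written.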
 
Joined
Mar 10, 2015
Messages
3,984 (1.19/day)
Wrong.
A lot of software allocates more memory than it needs.

Sure, write a program that allocates all your GPU VRAM and then go play a game. Please report the results.

how often do you start several games and let them compete for VRAM?

So, if your OS decides it's going to need 4GB of your VRAM because it might need it, how do you think your games will play?
 
Joined
Jun 15, 2016
Messages
1,042 (0.36/day)
Location
Pristina
System Name My PC
Processor 4670K@4.4GHz
Motherboard Gryphon Z87
Cooling CM 212
Memory 2x8GB+2x4GB @2400MHz
Video Card(s) XFX Radeon RX 580 GTS Black Edition 1425MHz OC+, 8GB
Storage Intel 530 SSD 480GB + Intel 510 SSD 120GB + 2x500GB hdd raid 1
Display(s) HP envy 32 1440p
Case CM Mastercase 5
Audio Device(s) Sbz ZXR
Power Supply Antec 620W
Mouse G502
Keyboard G910
Software Win 10 pro
This thread lost its purpose a long time ago, except for shooting the breeze.
 

bug

So, if your OS decides it's going to need 4GB of your VRAM because it might need it, how do you think your games will play?
The OS doesn't allocate VRAM on its own; applications and games do. What a game usually does is look at its settings and, based on those, try to shove as many textures into the available VRAM as possible. Sometimes that means it loads textures it won't need until half an hour later; sometimes your VRAM will barely be enough for a few seconds.
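That preloading strategy can be sketched as a toy LRU texture cache (everything here is hypothetical: names, sizes, and the 6GB budget are just illustration): the game keeps filling VRAM until the budget is hit, then evicts the least-recently-used texture to make room.

```python
from collections import OrderedDict

class TextureCache:
    """Toy model: pre-emptively load textures up to a VRAM budget."""
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.cache = OrderedDict()          # name -> size_mb, in LRU order

    def load(self, name, size_mb):
        if name in self.cache:
            self.cache.move_to_end(name)    # already resident: mark as hot
            return
        # Evict least-recently-used textures until the new one fits.
        while self.used + size_mb > self.budget and self.cache:
            _, freed = self.cache.popitem(last=False)
            self.used -= freed
        self.cache[name] = size_mb
        self.used += size_mb

vram = TextureCache(budget_mb=6)
for tex, sz in [("level1", 3), ("level2", 2), ("ui", 1), ("level3", 4)]:
    vram.load(tex, sz)

print(sorted(vram.cache))   # ['level3', 'ui'] — older levels got evicted
```

Until eviction kicks in, a monitoring tool would show VRAM usage climbing toward the budget even if most of those textures are never sampled — which is why "6GB allocated" on its own says little about whether 6GB was actually needed.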
 
Joined
Mar 10, 2015
Messages
3,984 (1.19/day)
Not worth it. Have a nice weekend.
 