
Radeon 7 is Released, What Would You Buy?

  • Radeon 7, of course

    Votes: 30 18.9%
  • Lower tier GPUs: Vega 64/56, RTX 2070/2060, etc.

    Votes: 10 6.3%
  • Higher tier GPUs

    Votes: 5 3.1%
  • Nothing, I am good with current GPU

    Votes: 80 50.3%
  • Nothing, I am disappointed with current stack of GPUs

    Votes: 34 21.4%

  • Total voters
    159
Joined
Feb 3, 2017
Messages
3,475 (1.33/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Joined
Mar 18, 2008
Messages
5,717 (0.98/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
7nm is expensive; 16 GB of HBM2 is also expensive. I think at $699 AMD barely makes money. Radeon 7 feels more like a statement than actual competition, same as Vega and Fury: a way to show consumers they are still in the high-end game.

TBH a limited supply run also seems likely. If you really like AMD, then Radeon 7 feels like the final evolution of GCN. It is probably going to be the only high-end option until whatever comes after Navi.

For computing it is kind of hard to say. I made quite a lot of attempts at using my old Fury X for OpenCL-based genome alignment acceleration. When you can get it working it IS VERY FAST. However, the learning curve is quite steep, and there are not a whole lot of bioinformaticians coding in OpenCL whom I can get help from. Maybe it would be better for other people doing ML or AI, but then again, without good TensorFlow support it is another steep learning curve.
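To give a sense of the kind of kernel involved: below is a minimal CPU reference for Smith-Waterman local alignment, the dynamic-programming core that an OpenCL or CUDA port would parallelize (across anti-diagonals, or across many read/reference pairs at once). The sequences and scoring values are made-up toy numbers, not anything from a real pipeline.

```python
# Minimal CPU reference for Smith-Waterman local alignment.
# A GPU port parallelizes this DP table across anti-diagonals or across
# thousands of independent read/reference pairs.

def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    """Return the best local alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # Local alignment: scores are floored at zero.
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best

print(smith_waterman("ACGT", "ACGT"))  # 8 (four matches x 2)
```

The inner max over three neighbors is what makes each anti-diagonal independently computable, which is exactly the structure a GPU exploits.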
 
Joined
Jan 3, 2015
Messages
2,873 (0.85/day)
System Name The beast and the little runt.
Processor Ryzen 5 5600X - Ryzen 9 5950X
Motherboard ASUS ROG STRIX B550-I GAMING - ASUS ROG Crosshair VIII Dark Hero X570
Cooling Noctua NH-L9x65 SE-AM4a - NH-D15 chromax.black with IPPC Industrial 3000 RPM 120/140 MM fans.
Memory G.SKILL TRIDENT Z ROYAL GOLD/SILVER 32 GB (2 x 16 GB and 4 x 8 GB) 3600 MHz CL14-15-15-35 1.45 volts
Video Card(s) GIGABYTE RTX 4060 OC LOW PROFILE - GIGABYTE RTX 4090 GAMING OC
Storage Samsung 980 PRO 1 TB + 2 TB - Samsung 870 EVO 4 TB - 2 x WD RED PRO 16 TB + WD ULTRASTAR 22 TB
Display(s) Asus 27" TUF VG27AQL1A and a Dell 24" for dual setup
Case Phanteks Enthoo 719/LUXE 2 BLACK
Audio Device(s) Onboard on both boards
Power Supply Phanteks Revolt X 1200W
Mouse Logitech G903 Lightspeed Wireless Gaming Mouse
Keyboard Logitech G910 Orion Spectrum
Software WINDOWS 10 PRO 64 BITS on both systems
Benchmark Scores See more about my 2 in 1 system here: kortlink.dk/2ca4x
For me the correct answer would be "nothing, I am good with current GPU" and "nothing, I am disappointed with current stack of GPUs".

Well, I have a GTX 1080 Ti, so there really isn't anything worth replacing it with. The RTX 2080 has less VRAM, costs more, and the performance difference is not worth the trouble. The RTX 2080 Ti is way too expensive, ray tracing costs way too much performance, and the AMD Radeon VII does not impress me either. I'm good with my GTX 1080 Ti :D
 
Joined
Nov 13, 2007
Messages
10,209 (1.71/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 Pro
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
For me the correct answer would be "nothing, I am good with current GPU" and "nothing, I am disappointed with current stack of GPUs".

Well, I have a GTX 1080 Ti, so there really isn't anything worth replacing it with. The RTX 2080 has less VRAM, costs more, and the performance difference is not worth the trouble. The RTX 2080 Ti is way too expensive, ray tracing costs way too much performance, and the AMD Radeon VII does not impress me either. I'm good with my GTX 1080 Ti :D

The 1080 Ti reminds me of the 8800 GTX a bit...

If you bought one when they first came out, you were set for 2.5-3 years; it was still top dog until the GTX 280 came out 2.5 years later.

Until Nvidia goes 7nm, it probably won't be worthwhile upgrading it.

7nm is expensive; 16 GB of HBM2 is also expensive. I think at $699 AMD barely makes money. Radeon 7 feels more like a statement than actual competition, same as Vega and Fury: a way to show consumers they are still in the high-end game.

TBH a limited supply run also seems likely. If you really like AMD, then Radeon 7 feels like the final evolution of GCN. It is probably going to be the only high-end option until whatever comes after Navi.

For computing it is kind of hard to say. I made quite a lot of attempts at using my old Fury X for OpenCL-based genome alignment acceleration. When you can get it working it IS VERY FAST. However, the learning curve is quite steep, and there are not a whole lot of bioinformaticians coding in OpenCL whom I can get help from. Maybe it would be better for other people doing ML or AI, but then again, without good TensorFlow support it is another steep learning curve.

I don't understand why they did that; I'm sure they had a good reason, but 16 GB is completely unnecessary at this performance level. I wonder how much they could have saved with just 8 GB. A $550 8 GB R7 vs. a $699 2080 would be a solid matchup.
 
Last edited:
Joined
Mar 18, 2008
Messages
5,717 (0.98/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Nvidia’s 7nm flagship will probably be where most 1080 Ti owners upgrade, assuming it won't sell for $2000 by then.
 
Joined
Jan 3, 2015
Messages
2,873 (0.85/day)
System Name The beast and the little runt.
Processor Ryzen 5 5600X - Ryzen 9 5950X
Motherboard ASUS ROG STRIX B550-I GAMING - ASUS ROG Crosshair VIII Dark Hero X570
Cooling Noctua NH-L9x65 SE-AM4a - NH-D15 chromax.black with IPPC Industrial 3000 RPM 120/140 MM fans.
Memory G.SKILL TRIDENT Z ROYAL GOLD/SILVER 32 GB (2 x 16 GB and 4 x 8 GB) 3600 MHz CL14-15-15-35 1.45 volts
Video Card(s) GIGABYTE RTX 4060 OC LOW PROFILE - GIGABYTE RTX 4090 GAMING OC
Storage Samsung 980 PRO 1 TB + 2 TB - Samsung 870 EVO 4 TB - 2 x WD RED PRO 16 TB + WD ULTRASTAR 22 TB
Display(s) Asus 27" TUF VG27AQL1A and a Dell 24" for dual setup
Case Phanteks Enthoo 719/LUXE 2 BLACK
Audio Device(s) Onboard on both boards
Power Supply Phanteks Revolt X 1200W
Mouse Logitech G903 Lightspeed Wireless Gaming Mouse
Keyboard Logitech G910 Orion Spectrum
Software WINDOWS 10 PRO 64 BITS on both systems
Benchmark Scores See more about my 2 in 1 system here: kortlink.dk/2ca4x
The 1080 Ti reminds me of the 8800 GTX a bit...

If you bought one when they first came out, you were set for 2.5-3 years; it was still top dog until the GTX 280 came out 2.5 years later.

Until Nvidia goes 7nm, it probably won't be worthwhile upgrading it.

I think you are spot on there. I got my card in July 2017 and the GTX 1080 Ti was released in March 2017, so I have had it for around a year and a half now. The RTX 2080/2080 Ti released in September last year, and at the rate Nvidia releases new generations, a replacement for the RTX 2000 cards is coming by the end of 2019 at the earliest; I think an early 2020 release is more plausible. With the current prices RTX cards go for, I'm keeping my GTX 1080 Ti for sure. Don't be surprised if I'm still on the GTX 1080 Ti in 2020 as well. So yeah, claiming to keep the card for three years seems spot on this time.

If I even get a new card before then, it's probably a second GTX 1080 Ti for some SLI fun :D. Compared to the RTX cards, the GTX 1080 Ti was really spot on in my opinion when it released, and maybe the best high-end card Nvidia has released: cheap compared to the Titan, plenty of VRAM, and still performing very close to the Titan at almost half the price. That is not the case with the RTX 2080 Ti. Yes, it's faster than the 1080 Ti, but the price is stupid, it has the same VRAM size, and I don't need ray tracing.
 
Joined
Jan 8, 2017
Messages
8,863 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I made quite a lot of attempts at using my old Fury X for OpenCL-based genome alignment acceleration. When you can get it working it IS VERY FAST. However, the learning curve is quite steep, and there are not a whole lot of bioinformaticians coding in OpenCL.

That's not the fault of AMD, or anyone really. It also doesn't help that the largest GPU provider in the world, Nvidia, which is technically also the largest provider of OpenCL-capable devices, won't support anything past OpenCL 1.2 and provides zero tools for debugging. If they don't bother to make this environment more usable, why would AMD put in the effort?
 

HTC

Joined
Apr 1, 2008
Messages
4,601 (0.79/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
That's not the fault of AMD, or anyone really. It also doesn't help that the largest GPU provider in the world, Nvidia, which is technically also the largest provider of OpenCL-capable devices, won't support anything past OpenCL 1.2 and provides zero tools for debugging. If they don't bother to make this environment more usable, why would AMD put in the effort?

To entice more customers to their offerings?
 
Joined
Oct 1, 2006
Messages
4,883 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
To entice more customers to their offerings?
Exactly, it is a shame all that compute power is wasted because AMD didn't put in enough effort.

I don't understand why they did that, I am sure they had a good reason to, but 16gb is completely unnecessary at this performance level; I wonder how much they could have saved with just 8gb. A $550 8GB R7 vs a $699 2080 is a solid matchup.
The thing is, Vega 20 is a 4096-bit GPU that requires 4 stacks of HBM2.
They can either use 4 stacks of 4 GB HBM2 (RX Vega already uses 2x 4 GB stacks), or somehow source 2 GB stacks of HBM2 just to make a single low-volume product.
Nothing in AMD's product stack uses 2 GB HBM2 right now; in fact, I am not sure anyone is using 2 GB HBM2 stacks at all.
I doubt it would actually save much cost in the end.
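The wide-bus point is easy to sanity-check with back-of-envelope arithmetic. The stack count, per-stack interface width, and pin speed below are the commonly reported launch figures for this card, not numbers from this thread:

```python
# Rough HBM2 bandwidth arithmetic for a 4-stack Vega 20 card.
stacks = 4
bits_per_stack = 1024      # HBM2 interface width per stack
pin_speed_gbps = 2.0       # effective data rate per pin (reported launch figure)

bus_width = stacks * bits_per_stack             # total bus width in bits
bandwidth_gbs = bus_width / 8 * pin_speed_gbps  # bytes per second, in GB/s

print(bus_width, bandwidth_gbs)  # 4096 1024.0
```

Dropping to two stacks would halve both capacity and bandwidth at once, which is part of why an 8 GB version was not a straightforward cost-down.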
 
Last edited:
Joined
Jan 8, 2017
Messages
8,863 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
To entice more costumers to their offerings?

It's simply not a burden they should carry exclusively. I reckon they have done enough already.
 
Joined
Oct 1, 2006
Messages
4,883 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
It's simply not a burden they should carry exclusively. I reckon they have done enough already.
nVidia is in the dominant position right now; it is a conflict of interest for them to support OpenCL over CUDA.
CUDA has the bonus of locking down the market for nVidia so that they can charge whatever they want.

Gonna stick with my 8GB 580. It works fine for my 1080P monitor. Until this thing starts on fire or stops running 1080P well, I'm happy with it.
My Vega 56 did indeed literally go up in flames. o_O
 
Joined
Dec 16, 2016
Messages
37 (0.01/day)
Location
Illinois
System Name Gaming PC
Processor Ryzen 7 7800X3D
Motherboard Gigabyte B650 Gaming X AX
Cooling Coolermaster ML280
Memory 32GB G.Skill Flare X5 DDR5 6000
Video Card(s) Sapphire Pulse Radeon 7900 XT
Storage Samsung 990 Pro 1TB
Display(s) Gigabyte M28U
Case NZXT H510 Elite
Audio Device(s) HyperX DuoCast
Power Supply Corsair RM750
Mouse HyperX Pulsefire
Keyboard HyperX Alloy Elite 2
Software Windows 11 Pro
Gonna stick with my 8GB 580. It works fine for my 1080P monitor. Until this thing starts on fire or stops running 1080P well, I'm happy with it.

nVidia is in the dominant position right now; it is a conflict of interest for them to support OpenCL over CUDA.
CUDA has the bonus of locking down the market for nVidia so that they can charge whatever they want.


My Vega 56 did indeed literally go up in flames. o_O

That comes with the territory if you own an AMD GPU. It's an amazing personal space heater.
 
Joined
Mar 18, 2008
Messages
5,717 (0.98/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
That's not the fault of AMD, or anyone really. It also doesn't help that the largest GPU provider in the world which is also technically true for OpenCL capable devices, Nvidia, wont support anything past 1.2 and provides zero tools for debugging. If they don't bother to make this environment more usable why would AMD put in the effort ?
It's simply not a burden they should carry exclusively. I reckon they have done enough already.


From the standpoint of us researchers, we need hardware solutions that are relatively easy to code for and optimize. I mean, yeah, if I had multiple computer science grad students helping me optimize OpenCL code, then Radeon's GCN-based solution would be wicked good, as they ARE faster at the hardware level. But I am just by myself, and most of my colleagues don't even know how to use R, let alone code in OpenCL.

We are spending taxpayers' dollars, so at the end of the day it is also not the researchers'/users' burden to develop more OpenCL-based GPGPU applications.

I hate to say it, but for our bioinformatics applications, CUDA and Nvidia's solution "just works".
 
Joined
Nov 13, 2007
Messages
10,209 (1.71/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
From the standpoint of us researchers, we need hardware solutions that are relatively easy to code for and optimize. I mean, yeah, if I had multiple computer science grad students helping me optimize OpenCL code, then Radeon's GCN-based solution would be wicked good, as they ARE faster at the hardware level. But I am just by myself, and most of my colleagues don't even know how to use R, let alone code in OpenCL.

We are spending taxpayers' dollars, so at the end of the day it is also not the researchers'/users' burden to develop more OpenCL-based GPGPU applications.

I hate to say it, but for our bioinformatics applications, CUDA and Nvidia's solution "just works".
I keep hearing the same sentiment from the AI crowd.

They're there to solve a problem, not to spend all day trying to get the hardware to run.
 
Joined
Mar 18, 2008
Messages
5,717 (0.98/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
I keep hearing the same sentiment from the AI crowd.

They're there to solve a problem, not to spend all day trying to get the hardware to run.

Yes exactly this.

Similarly, this is also why I have built multiple Threadripper systems for molecular biology labs. The performance brought by TR is simply unmatchable for the price we pay. A TR CPU needs no coding optimization, as everything is already done at the compiler level, and our Python scripts just work right out of the box. Plug in the CPU, plug in the RAM, install Linux, and off we go: zero minutes wasted on getting things working.
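The "just works" pattern described here is essentially embarrassingly parallel batch work: fan independent per-sample jobs across all the cores with nothing but the standard library. A minimal sketch, with a made-up toy function standing in for real per-sample analysis:

```python
# Spread independent per-sample work across all available CPU cores.
from multiprocessing import Pool

def gc_content(seq):
    """Fraction of G/C bases in one sequence (toy stand-in for real analysis)."""
    return (seq.count("G") + seq.count("C")) / len(seq)

if __name__ == "__main__":
    samples = ["GATTACA", "GGCC", "ATAT"]
    with Pool() as pool:          # defaults to one worker process per CPU core
        results = pool.map(gc_content, samples)
    print(results)
```

No per-architecture tuning is needed; more cores simply mean more workers, which is the appeal being described.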
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,378 (2.37/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850w Gold (ATX3.0)
Software W10
Nvidia’s 7nm flagship will probably be where most 1080Ti owners upgrade. Assuming it wont sell for $2000 by then.

I'm a bit pissed about the price inflation, but AMD's response is pretty shit tbh. For years, people supported AMD for 'keeping it real', but they tried to go toe-to-toe with Nvidia on pricing (Fury X, Vega 64) and now they're doing the same with their top-tier card, which is not as good as a 2080 Ti. They've thrown in the towel on the value proposition, which is pretty poor. It's as if they're trying to emulate the CPU sector where, arguably, Intel has reacted fairly aggressively (I can now buy an 8-core/16-thread Intel CPU for £470; it used to be £900). Except Nvidia hasn't done the same, so AMD is stuck with a lower-tier GPU price-matched against Nvidia's 2nd (or 3rd, including the Titan RTX) best GPU.

Personally, I only recreationally game*, so I won't pay more than £700-800 for an awesome GPU. Radeon 7 is NOT that GPU.

*mid-forties, so too middling to be good at multiplayer or as is often the case - too drunk to hit anything.
 
Joined
Oct 1, 2006
Messages
4,883 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
I'm a bit pissed about the price inflation, but AMD's response is pretty shit tbh. For years, people supported AMD for 'keeping it real', but they tried to go toe-to-toe with Nvidia on pricing (Fury X, Vega 64) and now they're doing the same with their top-tier card, which is not as good as a 2080 Ti. They've thrown in the towel on the value proposition, which is pretty poor. It's as if they're trying to emulate the CPU sector where, arguably, Intel has reacted fairly aggressively (I can now buy an 8-core/16-thread Intel CPU for £470; it used to be £900). Except Nvidia hasn't done the same, so AMD is stuck with a lower-tier GPU price-matched against Nvidia's 2nd (or 3rd, including the Titan RTX) best GPU.

Personally, I only recreationally game*, so I won't pay more than £700-800 for an awesome GPU. Radeon 7 is NOT that GPU.

*mid-forties, so too middling to be good at multiplayer or as is often the case - too drunk to hit anything.
If you also include computing tasks, there is also "Poor Volta", which is just straight up better in FP32 and FP64 than Radeon 7, brute force or not.
It also has a good amount of HBM2, which everyone seems to think automatically means something beyond memory bandwidth.
So yes, that would make Radeon 7 compete with nVidia's 4th best GPU.
 
Last edited:
Joined
Dec 16, 2016
Messages
37 (0.01/day)
Location
Illinois
System Name Gaming PC
Processor Ryzen 7 7800X3D
Motherboard Gigabyte B650 Gaming X AX
Cooling Coolermaster ML280
Memory 32GB G.Skill Flare X5 DDR5 6000
Video Card(s) Sapphire Pulse Radeon 7900 XT
Storage Samsung 990 Pro 1TB
Display(s) Gigabyte M28U
Case NZXT H510 Elite
Audio Device(s) HyperX DuoCast
Power Supply Corsair RM750
Mouse HyperX Pulsefire
Keyboard HyperX Alloy Elite 2
Software Windows 11 Pro
I'm a bit pissed about the price inflation, but AMD's response is pretty shit tbh. For years, people supported AMD for 'keeping it real', but they tried to go toe-to-toe with Nvidia on pricing (Fury X, Vega 64) and now they're doing the same with their top-tier card, which is not as good as a 2080 Ti. They've thrown in the towel on the value proposition, which is pretty poor. It's as if they're trying to emulate the CPU sector where, arguably, Intel has reacted fairly aggressively (I can now buy an 8-core/16-thread Intel CPU for £470; it used to be £900). Except Nvidia hasn't done the same, so AMD is stuck with a lower-tier GPU price-matched against Nvidia's 2nd (or 3rd, including the Titan RTX) best GPU.

Personally, I only recreationally game*, so I won't pay more than £700-800 for an awesome GPU. Radeon 7 is NOT that GPU.

*mid-forties, so too middling to be good at multiplayer or as is often the case - too drunk to hit anything.

The Radeon 7 also confused me for that reason. It seems to go against everything they've tried to accomplish with the RX, TR, and Ryzen stuff. Realistically, the R7 should cost $400, with higher-end variants coming out down the road. $700 for that card is insane.
 
Joined
Oct 1, 2006
Messages
4,883 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
The Radeon 7 also confused me for that reason. It seems to go against everything they've tried to accomplish with the RX, TR, and Ryzen stuff. Realistically, the R7 should cost $400, with higher-end variants coming out down the road. $700 for that card is insane.
TBH it wouldn't feel so awkward if they had priced it at $649 or something.
But no, they had to price it toe to toe with the 2080, which is already regarded as very overpriced.
The fact that Turing does everything Vega does, with extra gimmicks stacked on top, doesn't help either.

Lisa Su spent an entire hour repeating the word "gaming" 100x, but in the end we got a compute card with no Pro drivers? o_O

It seems to me they priced it at $699 because they figured if Nvidia could do it, they could too, which is incredibly disappointing. Problem is, I don't think the RTX cards (aside from probably the 2060) are all that popular, and they have only served to hurt Nvidia's reputation, in large part because of the insane prices. This is a terrible time in the history of gaming to be asking people to pay that much for a video card.
Again, for gaming there is nothing the Radeon 7 offers that the 2080 did not already offer months ago.
It is odd that AMD figures it is fine to sell a card as expensive as nVidia's overpriced offerings but with fewer features.

AMD did a better marketing job for nVidia than even nVidia's marketing team could come up with.
 
Last edited:
Joined
Dec 16, 2016
Messages
37 (0.01/day)
Location
Illinois
System Name Gaming PC
Processor Ryzen 7 7800X3D
Motherboard GIgabyte B650 Gaming X AX
Cooling Coolermaster ML280
Memory 32GB G.Skill Flare X5 DDR5 6000
Video Card(s) Sapphire Pulse Radeon 7900 XT
Storage Samsung 990 Pro 1TB
Display(s) Gigabyte M28U
Case NZXT H510 Elite
Audio Device(s) HyperX DuoCast
Power Supply Corsair RM750
Mouse HyperX Pulsefire
Keyboard HyperX Alloy Elite 2
Software Windows 11 Pro
TBH it wouldn't feel so awkward if they had priced it at $649 or something.
But no, they had to price it toe to toe with the 2080, which is already regarded as very overpriced.

It seems to me they priced it at $699 because they figured if Nvidia could do it, they could too, which is incredibly disappointing. Problem is, I don't think the RTX cards (aside from probably the 2060) are all that popular, and they have only served to hurt Nvidia's reputation, in large part because of the insane prices. This is a terrible time in the history of gaming to be asking people to pay that much for a video card.

TBH it wouldn't feel so awkward if they had priced it at $649 or something.
But no, they had to price it toe to toe with the 2080, which is already regarded as very overpriced.
The fact that Turing does everything Vega does, with extra gimmicks stacked on top, doesn't help either.

Lisa Su spent an entire hour repeating the word "gaming" 100x, but in the end we got a compute card with no Pro drivers? o_O

Sure, higher profit margin.
 
Joined
Jun 28, 2014
Messages
2,388 (0.67/day)
Location
Shenandoah Valley, Virginia USA
System Name Home Brewed
Processor i9-7900X and i7-8700K
Motherboard ASUS ROG Rampage VI Extreme & ASUS Prime Z-370 A
Cooling Corsair 280mm AIO & Thermaltake Water 3.0
Memory 64GB DDR4-3000 GSKill RipJaws-V & 32GB DDR4-3466 GEIL Potenza
Video Card(s) 2X-GTX-1080 SLI & 2 GTX-1070Ti 8GB G1 Gaming in SLI
Storage Both have 2TB HDDs for storage, 480GB SSDs for OS, and 240GB SSDs for Steam Games
Display(s) ACER 28" B286HK 4K & Samsung 32" 1080P
Case NZXT Source 540 & Rosewill Rise Chassis
Audio Device(s) onboard
Power Supply Corsair RM1000 & Corsair RM850
Mouse Generic
Keyboard Razer Blackwidow Tournament & Corsair K90
Software Win-10 Professional
Benchmark Scores yes
Without actual reviews to see, this question may be a little too soon to ask.
In theory, I've always found myself rooting for the underdog, AMD.
This can result in diminishing returns as we experienced with Bulldozer and others, but supporting AMD has always been a necessary counterpoint to Intel and NVIDIA's monopolistic tendencies.
We all know what happens without any meaningful competition in the marketplace.

The success of Ryzen I and II was gratifying to me and I hope for more of the same.
Even if this VII GPU doesn't damage NVIDIA's stranglehold on the market, I'll buy a few of them, just to support AMD efforts.
 
Joined
Dec 16, 2016
Messages
37 (0.01/day)
Location
Illinois
System Name Gaming PC
Processor Ryzen 7 7800X3D
Motherboard Gigabyte B650 Gaming X AX
Cooling Coolermaster ML280
Memory 32GB G.Skill Flare X5 DDR5 6000
Video Card(s) Sapphire Pulse Radeon 7900 XT
Storage Samsung 990 Pro 1TB
Display(s) Gigabyte M28U
Case NZXT H510 Elite
Audio Device(s) HyperX DuoCast
Power Supply Corsair RM750
Mouse HyperX Pulsefire
Keyboard HyperX Alloy Elite 2
Software Windows 11 Pro
Without actual reviews to see, this question may be a little too soon to ask.
In theory, I've always found myself rooting for the underdog, AMD.
This can result in diminishing returns as we experienced with Bulldozer and others, but supporting AMD has always been a necessary counterpoint to Intel and NVIDIA's monopolistic tendencies.
We all know what happens without any meaningful competition in the marketplace.

The success of Ryzen I and II was gratifying to me and I hope for more of the same.
Even if this VII GPU doesn't damage NVIDIA's stranglehold on the market, I'll buy a few of them, just to support AMD efforts.

At least for me it's not so much the performance of the R7, but the price point. I think that's why people are so disappointed.
 
D

Deleted member 178884

Guest
It's very interesting: even at its price point, the 2080 here in the UK still costs more. If I was buying a new GPU, it'd be a Radeon 7. However, I'm happy with my 1080 Ti FTW3, and I've just put it under an Alphacool NexXxoS waterblock.
 
Joined
Jun 28, 2014
Messages
2,388 (0.67/day)
Location
Shenandoah Valley, Virginia USA
System Name Home Brewed
Processor i9-7900X and i7-8700K
Motherboard ASUS ROG Rampage VI Extreme & ASUS Prime Z-370 A
Cooling Corsair 280mm AIO & Thermaltake Water 3.0
Memory 64GB DDR4-3000 GSKill RipJaws-V & 32GB DDR4-3466 GEIL Potenza
Video Card(s) 2X-GTX-1080 SLI & 2 GTX-1070Ti 8GB G1 Gaming in SLI
Storage Both have 2TB HDDs for storage, 480GB SSDs for OS, and 240GB SSDs for Steam Games
Display(s) ACER 28" B286HK 4K & Samsung 32" 1080P
Case NZXT Source 540 & Rosewill Rise Chassis
Audio Device(s) onboard
Power Supply Corsair RM1000 & Corsair RM850
Mouse Generic
Keyboard Razer Blackwidow Tournament & Corsair K90
Software Win-10 Professional
Benchmark Scores yes
At least for me it's not so much the performance of the R7, but the price point. I think that's why people are so disappointed.
AMD prices seem to adjust after release. I believe it will happen again.
 