
AMD "Zen" Processors to Feature SMT, Support up to 8 DDR4 Memory Channels

Joined
Jan 8, 2016
Messages
34 (0.01/day)
Eh, if you look beyond the Titan, no they don't. AMD offers 10-bit color in consumer cards while Nvidia offers 8, and offers full DX12 on the GPU; the Fury cards have HBM while Nvidia has GDDR5; the Nano offers better performance per watt; GCN offers better performance at resolutions higher than 1080p, etc. Actually, once you get past the hype, AMD cards should be wiping the floor with Nvidia in sales.

And here I thought I was talking about prices....of course, people take whatever context they damn want to suit their needs.

BTW, AMD doesn't offer full DX12 (12_1) support, Nvidia has offered 10-bit support since the GeForce 200 series (http://nvidia.custhelp.com/app/answ...-bit-per-color-support-on-nvidia-geforce-gpus), and AFAIK Nvidia has beaten AMD in almost every DX11 game out there. I don't know (nor care) what DX12 will mean for the current gen, but my argument is that transitional generations suck, and that the real DX12 battle begins this year.

So that only leaves you with the HBM argument. Woohoo! Enjoy it for the few months that remain, fanboy.

Nothing. Intel didn't invent SMT or use it first, lol. Intel used it purely for marketing early on (it wasn't worth a crap until the i series). And if you knew anything about Bulldozer, its FPU uses SMT (although this was basically a cost-cutting measure).
It didn't make fiscal sense for AMD to use it before. They've never had the budget to make their chips even more complex (they could barely get them out the door as it was). Zen has been in the works for a long time, and with the node shrinks they have more room to implement better features.

Keep on blabbing. If APUs are so worthless, then why did intel copy it? Is your foot tasty?

Maybe because, like I said, there's a market for them? How dense can you be? The fact that I said that APUs have no use for gamers or other resource-hungry apps doesn't mean that they wouldn't be sold at all.

And the absurd argument of "OMG, INTEL/AMD INVENTED IT FIRST, SO IT'S BETTER/THEY'RE MORALLY BETTER THAN THE OTHER!" is just the last resort of fanboys trying to justify their purchase. Fortunately, I'm smart enough to make my choices based on raw performance and price/performance ratios, and not out of "loyalty" to a CORPORATION you don't even work for.

Seriously, fanboys of any kind are a nuisance, but I swear AMD fanboys are a pest. You can't say the slightest thing against 'their' brand, or in favor of the competition without them crying around in opposition.
 
Joined
Jan 21, 2016
Messages
25 (0.01/day)
Location
Jersey Boy
System Name The Terminator's baby
Processor Intel Skylake i5-6500 @ 3.2GHz
Motherboard ASRock Z170 PRO4
Cooling Stock
Memory Corsair 2x8GB DDR4-2400
Video Card(s) AMD Radeon R7 240
Storage 2x1TB WD Black in RAID 0
Display(s) AOC I2367F 23" 1080P
Case Fractal Design Define R5
Power Supply Seasonic G-550 550W
Software Windows 7 Professional 64-bit
No need to clash fists here; we can all keep it civil, can't we?

The fact of the matter is that the companies AMD competes with have a bigger budget and more resources at their disposal. This means that most of the time they will have the upper hand in price, performance, marketing, business deals, partnerships, etc.
This does not mean that AMD can't have their niche and survive, if not flourish, in their segments. It's just that they have been in the red for so long that at this point they really are struggling to survive. If Jim Keller has worked a miracle with the design of Zen, it will be a success on that front. If it fails, it will be 100% AMD's fault. Once the engine gets revved up and these chips start hitting shelves, they need to do some HEAVY marketing to appeal to the masses; otherwise I'm afraid it won't look good on that balance sheet of theirs.
 

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,680 (4.29/day)
Location
Houston
System Name All the cores
Processor 2990WX
Motherboard Asrock X399M
Cooling CPU-XSPC RayStorm Neo, 2x240mm+360mm, D5PWM+140mL, GPU-2x360mm, 2xbyski, D4+D5+100mL
Memory 4x16GB G.Skill 3600
Video Card(s) (2) EVGA SC BLACK 1080Ti's
Storage 2x Samsung SM951 512GB, Samsung PM961 512GB
Display(s) Dell UP2414Q 3840X2160@60hz
Case Caselabs Mercury S5+pedestal
Audio Device(s) Fischer HA-02->Fischer FA-002W High edition/FA-003/Jubilate/FA-011 depending on my mood
Power Supply Seasonic Prime 1200w
Mouse Thermaltake Theron, Steam controller
Keyboard Keychron K8
Software W10P
And here I thought I was talking about prices....of course, people take whatever context they damn want to suit their needs.

BTW, AMD doesn't offer full DX12 (12_1) support, Nvidia has offered 10-bit support since the GeForce 200 series (http://nvidia.custhelp.com/app/answ...-bit-per-color-support-on-nvidia-geforce-gpus), and AFAIK Nvidia has beaten AMD in almost every DX11 game out there. I don't know (nor care) what DX12 will mean for the current gen, but my argument is that transitional generations suck, and that the real DX12 battle begins this year.

So that only leaves you with the HBM argument. Woohoo! Enjoy it for the few months that remain, fanboy.

In your link it states that you need a Quadro card to use 10-bit color. My exact wording was consumer card, and I used it for a reason.

If you want to be really specific about performance, the AMD dual-GPU card from two generations ago is still the fastest single "card" solution. If you look past 1080p, the Fury series beats everything but the Titan, and does beat the Titan in some games. Again, it's exactly what I already posted.

And again, if you would look just a little closer into DX12, you will notice some things.

I am no fanboy. I have run both sets of cards and buy based off value/feature set. My last set of cards was three water-cooled GTX 470s. Those are to this day my favorite cards, followed by my original Ti 4200, which currently holds some of the highest clock speed records ever recorded.
 
Joined
Jan 8, 2016
Messages
34 (0.01/day)
In your link it states that you need a Quadro card to use 10-bit color. My exact wording was consumer card, and I used it for a reason.

If you want to be really specific about performance, the AMD dual-GPU card from two generations ago is still the fastest single "card" solution. If you look past 1080p, the Fury series beats everything but the Titan, and does beat the Titan in some games. Again, it's exactly what I already posted.

And again, if you would look just a little closer into DX12, you will notice some things.

I am no fanboy. I have run both sets of cards and buy based off value/feature set. My last set of cards was three water-cooled GTX 470s. Those are to this day my favorite cards, followed by my original Ti 4200, which currently holds some of the highest clock speed records ever recorded.

And how many "non-professional" 10-bit monitors do you currently see on the market, huh? And how much would they be overshadowed by the future HDR monitors anyway?

Whatever. Like I said, it's not my point, and I'm absolutely tired of fanboyism. Don't expect me to reply to any more of it. The same goes for anyone else.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.25/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Eh, if you look beyond the Titan, no they don't. AMD offers 10-bit color in consumer cards while Nvidia offers 8, and offers full DX12 on the GPU; the Fury cards have HBM while Nvidia has GDDR5; the Nano offers better performance per watt; GCN offers better performance at resolutions higher than 1080p, etc. Actually, once you get past the hype, AMD cards should be wiping the floor with Nvidia in sales.

10-bit color has no effect on the consumer market. 10-bit panels are expensive; they are getting cheaper, but you're still looking at $350 for a GW2765HT, and that is an exception to the rule, the next cheapest panel after that being the $500 ASUS PB287Q. And 10-bit isn't noticeably better than 8-bit to the average consumer (that is why almost no 4K TVs support it). 10-bit is a professional feature that only professionals will notice if it's lacking. And I know someone is going to post those BS pictures with the one on the left being 256-color and the one on the right being 8-bit, and claim "look at the difference 10-bit makes!" Don't waste your time; those pictures grossly exaggerate the actual difference. On top of that, AMD cards lack HDMI 2.0, a far more consumer-friendly feature than 10-bit. There are a lot of people that like to connect their computer to their TV. I'm sure there are a lot of consumers, a lot more than those that care about 10-bit support, that can only afford to buy one 4K device, and that is their TV, but who also want to connect their high-end gaming computer to it to play games in 4K. Not with AMD cards you aren't, because most 4K TVs don't have DisplayPort. So it's an extra $30 for an adapter if you want to do that.
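For rough context on the 8-bit vs 10-bit debate, the raw numbers are simple arithmetic (this is just the channel math, not a claim about visible quality):

```python
# Shades per channel and total colors for a given bit depth per channel.
def shades_per_channel(bits):
    return 2 ** bits

def total_colors(bits):
    # Three channels: R, G, B
    return shades_per_channel(bits) ** 3

print(shades_per_channel(8), total_colors(8))    # 256 shades, 16,777,216 colors (~16.7M)
print(shades_per_channel(10), total_colors(10))  # 1024 shades, 1,073,741,824 colors (~1.07B)
```

Whether those extra shades are visible to an average consumer on typical content is exactly what's being argued above.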

DX12 right now doesn't matter, and probably isn't going to matter for this generation of cards. Tomb Raider is the first AAA game we are likely to see utilize it, maybe, and it is likely the nVidia GPUs will support every feature necessary.

HBM shouldn't be a marketing bullet point when HBM cards are still losing in performance to GDDR5 cards. So does it really make a difference? Are people with the Fury cards going "Hey look, my card is so much better because it has HBM... even though it still performs worse... but it is so much better because HBM!"? And what happens when the next generation of nVidia cards comes out with HBM2 and AMD is still using HBM1? Are you going to say the nVidia card is now better because it is using the better form of HBM, the next generation of HBM? Somehow I doubt it. HBM isn't a value-added feature.

The Nano doesn't offer better performance per watt. It is worse than the 980 and ties the 980Ti at 4K, it is worse than the 980Ti and 980 at 1440p, and is worse than the 970, 980, and 980Ti at 1080p.

GCN only manages to close the gap at higher resolutions; it doesn't offer better performance. The Fury X is the best GCN card available, and it merely ties the 980Ti at 4K, while the pre-overclocked 980Tis are actually crushing the Fury X with 15%+ better performance at 4K. And since the Fury X's overclocking is lackluster, to say the least, there is no making that difference up by overclocking the Fury X.
 
Joined
Jan 8, 2016
Messages
34 (0.01/day)
DX12 right now doesn't matter, and probably isn't going to matter for this generation of cards. Tomb Raider is the first AAA game we are likely to see utilize it, maybe, and it is likely the nVidia GPUs will support every feature necessary.

According to my data, the first full DX12 game (no betas or any other crap) will probably be Hitman (March 11 release date), with Quantum Break being the first DX12-only game (April 5). That is unless Tomb Raider's DX12 patch is released sooner than that...
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
And here I thought I was talking about prices....of course, people take whatever context they damn want to suit their needs.

BTW, AMD doesn't offer full DX12 (12_1) support, Nvidia has offered 10-bit support since the GeForce 200 series (http://nvidia.custhelp.com/app/answ...-bit-per-color-support-on-nvidia-geforce-gpus), and AFAIK Nvidia has beaten AMD in almost every DX11 game out there. I don't know (nor care) what DX12 will mean for the current gen, but my argument is that transitional generations suck, and that the real DX12 battle begins this year.

So that only leaves you with the HBM argument. Woohoo! Enjoy it for the few months that remain, fanboy.



Maybe because, like I said, there's a market for them? How dense can you be? The fact that I said that APUs have no use for gamers or other resource-hungry apps doesn't mean that they wouldn't be sold at all.

And the absurd argument of "OMG, INTEL/AMD INVENTED IT FIRST, SO IT'S BETTER/THEY'RE MORALLY BETTER THAN THE OTHER!" is just the last resort of fanboys trying to justify their purchase. Fortunately, I'm smart enough to make my choices based on raw performance and price/performance ratios, and not out of "loyalty" to a CORPORATION you don't even work for.

Seriously, fanboys of any kind are a nuisance, but I swear AMD fanboys are a pest. You can't say the slightest thing against 'their' brand, or in favor of the competition without them crying around in opposition.
 

newtekie1

Semi-Retired Folder
According to my data, the first full DX12 game (no betas or any other crap) will probably be Hitman (March 11 release date), with Quantum Break being the first DX12-only game (April 5). That is unless Tomb Raider's DX12 patch is released sooner than that...

And it will really be interesting to see how much of a difference it will actually make. Obviously Microsoft is doing the same thing with Quantum Break as they did with Halo, making it DX12-only to try to force people onto Windows 10 (maybe I'll finally get around to formatting my main computer and installing it).

And when you look at the Steam survey, you've got almost 60% of users running Windows 7 or 8 and only 35% running Windows 10. So is DX12 going to be a game changer this year? (pun not intended) No, I don't think so.
 
Joined
Jul 31, 2014
Messages
479 (0.14/day)
System Name Diablo | Baal | Mephisto | Andariel
Processor i5-3570K@4.4GHz | 2x Xeon X5675 | i7-4710MQ | i7-2640M
Motherboard Asus Sabertooth Z77 | HP DL380 G6 | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Cooling Swiftech H220-X | Chassis cooled (6 fans + HS) | dual-fanned heatpipes | small-fanned heatpipe
Memory 32GiB DDR3-1600 CL9 | 96GiB DDR3-1333 ECC RDIMM | 32GiB DDR3L-1866 CL11 | 8GiB DDR3L-1600 CL11
Video Card(s) Dual GTX 670 in SLI | Embedded ATi ES1000 | Quadro K2100M | Intel HD 3000
Storage many, many SSDs and HDDs....
Display(s) 1 Dell U3011 + 2x Dell U2410 | HP iLO2 KVMoIP | 3200x1800 Sharp IGZO | 1366x768 IPS with Wacom pen
Case Corsair Obsidian 550D | HP DL380 G6 Chassis | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Audio Device(s) Auzentech X-Fi HomeTheater HD | None | On-board | On-board
Power Supply Corsair AX850 | Dual 750W Redundant PSU (Delta) | Dell 330W+240W (Flextronics) | Lenovo 65W (Delta)
Mouse Logitech G502, Logitech G700s, Logitech G500, Dell optical mouse (emergency backup)
Keyboard 1985 IBM Model F 122-key, Ducky YOTT MX Black, Dell AT101W, 1994 IBM Model M, various integrated
Software FAAAR too much to list

Eventually, even the people who try to avoid the arguments snap from the constant shitty, uninformed circlejerk. I did so myself several threads ago when people were whining about how Intel wasn't innovating on the CPU side (here, and later on in the same thread, here). Here, it looks like @Kurt Maverick had his fill of the constant "AMD can do no bad and has never done any anti-consumer behaviour" circlejerk (spoiler: they're just as bad as Intel and nVidia when they're in the lead... they're just not in the lead more often than not).
 
Joined
Dec 29, 2014
Messages
861 (0.25/day)
GCN only manages to close the gap at higher resolutions, it doesn't offer better performance. The Fury X is the best GCN card available, and it merely ties with the 980Ti at 4K, and the pre-overclocked 980Ti's are actually crushing the Fury X with 15%+ better performance at 4k. And since the Fury X's overclocking is lackluster, to say the least, there is no making that difference up by overclocking the Fury X.

I suspect that will be as good as it gets for Zen also: almost competitive, late to market, and only making sense if sold cheap. Hard to make money that way.

AMD has dug such a deep hole in the last 10 years, it would take a miracle for them to crawl out of it. Even if they manage to put out something that *beats* Intel and Nvidia on price/performance, Intel and Nvidia can just drop the price on competing products to retain market share, and keep AMD where they are. That would be nice for consumers while it lasts, but it probably isn't going to make AMD profitable. They'd need to keep that going for years.

The only way out that I see is AMD hitting home runs for the next few years to demonstrate that the company has potential, and then merging with a company that has cash to invest. Or is that even possible with the licenses?
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.43/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Yeah, because APUs are soooo useful outside of budget and office builds....

It doesn't matter. That is 98% of the CPU/APU business... not enthusiasts like us. So yeah, AMD's APUs are extremely useful.
 
Joined
Jan 8, 2016
Messages
34 (0.01/day)
And it will really be interesting to see how much of a difference it will actually make. Obviously Microsoft is doing the same thing with Quantum Break as they did with Halo, making it DX12-only to try to force people onto Windows 10 (maybe I'll finally get around to formatting my main computer and installing it).

And when you look at the Steam survey, you've got almost 60% of users running Windows 7 or 8 and only 35% running Windows 10. So is DX12 going to be a game changer this year? (pun not intended) No, I don't think so.

It's still the future. I think that if there has ever been a chance of a new API being a real game-changer, it's DX12 / Vulkan.

It doesn't matter. That is 98% of the CPU/APU business... not enthusiasts like us. So yeah, AMD's APUs are extremely useful.

And 99% of the people on the Internet think that a given percentage is false. Especially when it suits your argument so conveniently.

Anyway, I never denied that there's a market for them.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
And 99% of the people on the Internet think that a given percentage is false. Especially when it suits your argument so conveniently.

I don't have an argument. I merely point out that enthusiasts are an EXTREME minority. The majority of CPUs and APUs sold go to businesses, followed by regular users who don't do anything special.

I'm sorry if you're disappointed that your desires and mine don't count for anything with either AMD or Intel.
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
I don't have an argument. I merely point out that enthusiasts are an EXTREME minority. The majority of CPUs and APUs sold go to businesses, followed by regular users who don't do anything special.

I'm sorry if you're disappointed that your desires and mine don't count for anything with either AMD or Intel.
Gaming is a lot of money though, and gaming laptops sell well. A lot of people actually know that you want to see an AMD or Nvidia logo somewhere; that's about it though. And MSI and Asus have been making the other laptop OEMs put up a better fight to stay in business.
 
Joined
Jul 31, 2014
Messages
479 (0.14/day)
Gaming is a lot of money though, and gaming laptops sell well. A lot of people actually know that you want to see an AMD or Nvidia logo somewhere; that's about it though. And MSI and Asus have been making the other laptop OEMs put up a better fight to stay in business.

Gaming is big money for GPU manufacturers, because on the consumer-facing side of things, that's the only thing still needing serious power. The rest of the time, an iGPU is just fine. For Intel, on the other hand, gaming and enthusiasts are but a tiny bit of market share. It's mostly the same story for AMD's CPU side, just slightly less skewed because AMD has no mobile CPUs worth talking about (when was the last time you saw a major laptop vendor ship AMD?).
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
Gaming is big money for GPU manufacturers, because on the consumer-facing side of things, that's the only thing still needing serious power. The rest of the time, an iGPU is just fine. For Intel, on the other hand, gaming and enthusiasts are but a tiny bit of market share. It's mostly the same story for AMD's CPU side, just slightly less skewed because AMD has no mobile CPUs worth talking about (when was the last time you saw a major laptop vendor ship AMD?).
I think I just saw that maybe Dell or HP is shipping with a 380M for gaming, but other than that... yeah, it wasn't a good situation before Maxwell and then it only got worse. But I do recall some pretty nice gaming laptops being made with the 290M, and the 5K iMac uses AMD too.
 

cdawall

where the hell are my stars
10-bit color has no effect on the consumer market. 10-bit panels are expensive; they are getting cheaper, but you're still looking at $350 for a GW2765HT, and that is an exception to the rule, the next cheapest panel after that being the $500 ASUS PB287Q. And 10-bit isn't noticeably better than 8-bit to the average consumer (that is why almost no 4K TVs support it). 10-bit is a professional feature that only professionals will notice if it's lacking. And I know someone is going to post those BS pictures with the one on the left being 256-color and the one on the right being 8-bit, and claim "look at the difference 10-bit makes!" Don't waste your time; those pictures grossly exaggerate the actual difference. On top of that, AMD cards lack HDMI 2.0, a far more consumer-friendly feature than 10-bit. There are a lot of people that like to connect their computer to their TV. I'm sure there are a lot of consumers, a lot more than those that care about 10-bit support, that can only afford to buy one 4K device, and that is their TV, but who also want to connect their high-end gaming computer to it to play games in 4K. Not with AMD cards you aren't, because most 4K TVs don't have DisplayPort. So it's an extra $30 for an adapter if you want to do that.

DX12 right now doesn't matter, and probably isn't going to matter for this generation of cards. Tomb Raider is the first AAA game we are likely to see utilize it, maybe, and it is likely the nVidia GPUs will support every feature necessary.

HBM shouldn't be a marketing bullet point when HBM cards are still losing in performance to GDDR5 cards. So does it really make a difference? Are people with the Fury cards going "Hey look, my card is so much better because it has HBM... even though it still performs worse... but it is so much better because HBM!"? And what happens when the next generation of nVidia cards comes out with HBM2 and AMD is still using HBM1? Are you going to say the nVidia card is now better because it is using the better form of HBM, the next generation of HBM? Somehow I doubt it. HBM isn't a value-added feature.

The Nano doesn't offer better performance per watt. It is worse than the 980 and ties the 980Ti at 4K, it is worse than the 980Ti and 980 at 1440p, and is worse than the 970, 980, and 980Ti at 1080p.

GCN only manages to close the gap at higher resolutions; it doesn't offer better performance. The Fury X is the best GCN card available, and it merely ties the 980Ti at 4K, while the pre-overclocked 980Tis are actually crushing the Fury X with 15%+ better performance at 4K. And since the Fury X's overclocking is lackluster, to say the least, there is no making that difference up by overclocking the Fury X.

A lot of this is very true. My only arguments would be that, one, those adapters aren't $30 for me, and two, I wouldn't buy any Fury other than the Nano. Overclock it and you have a cheaper Fury X.

I also haven't really seen any benchmark showing GDDR5 beating HBM?
 
Joined
Jun 21, 2013
Messages
535 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
If past AMD PR and real-world performance figures are any indication, the claimed 40% improvement will just be a best-case figure in one out of 50 different benchmarks, while the average performance increase will be 20% tops.
 

cdawall

where the hell are my stars
If past AMD PR and real-world performance figures are any indication, the claimed 40% improvement will just be a best-case figure in one out of 50 different benchmarks, while the average performance increase will be 20% tops.

This is normally the case. I am hoping for once (and considering who designed the CPU) that this is not the case.
 

newtekie1

Semi-Retired Folder
A lot of this is very true my only argument would be one those adapters aren't $30 for me and I wouldn't by any fury other than the nano. Overclock it and you have a cheaper fury x.

I also haven't really seen any benchmark showing gddr5 beating hbm?

I haven't seen a DisplayPort to HDMI 2.0 adapter yet that was cheaper than $30.

Just look at the latest 980Ti Matrix benchmark here on TPU. It beats the Fury X by 17% at 4K. Sure, HBM provides more memory bandwidth than GDDR5, but when the overall card is still slower, what's the point?
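For what it's worth, the bandwidth gap being referenced here can be sketched from the published specs (Fury X: 4096-bit HBM1 at an effective 1 Gbps per pin; 980Ti: 384-bit GDDR5 at 7 Gbps per pin). This is back-of-the-envelope arithmetic, not a benchmark:

```python
# Peak memory bandwidth = bus width (bits) * effective rate per pin (Gbps) / 8 bits-per-byte.
def bandwidth_gbs(bus_width_bits, gbps_per_pin):
    return bus_width_bits * gbps_per_pin / 8

print(bandwidth_gbs(4096, 1.0))  # Fury X (HBM1): 512.0 GB/s
print(bandwidth_gbs(384, 7.0))   # 980Ti (GDDR5): 336.0 GB/s
```

Which is exactly the point being made: the raw bandwidth advantage is real, but it doesn't translate into the overall card winning.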
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
I haven't seen a DisplayPort to HDMI 2.0 adapter yet that was cheaper than $30.

Just look at the latest 980Ti Matrix benchmark here on TPU. It beats the Fury X by 17% at 4K. Sure, HBM provides more memory bandwidth than GDDR5, but when the overall card is still slower, what's the point?
It's changing though, and Fiji gets better scaling with CrossFire. It's about more than the bandwidth that HBM provides; it's the low-latency architecture that comes with it. It benefits VR, among a long list of other things. Dual Fiji will be shown to blow away 980Ti SLI in some situations.
About the article...
It looks like there's some serious architecture backing up those shiny new 14nm cores: not just low latency but high bandwidth.
Along with DX12, it's like turning a 4-lane highway into a 20-lane one and making the speed limit 100% faster. The async highway.
 

newtekie1

Semi-Retired Folder
It's changing though, and Fiji gets better scaling with CrossFire. It's about more than the bandwidth that HBM provides; it's the low-latency architecture that comes with it. It benefits VR, among a long list of other things. Dual Fiji will be shown to blow away 980Ti SLI in some situations.

Whenever someone says "some situations" I immediately add "but those situations are the exception to the norm" in my head.
 
Joined
Jan 2, 2015
Messages
1,099 (0.33/day)
Whenever someone says "some situations" I immediately add "but those situations are the exception to the norm" in my head.
It does happen a lot when VRAM becomes the processing bottleneck, and it does happen at 4K and up.
 
Joined
Apr 2, 2009
Messages
3,505 (0.64/day)
Me is not gonna say anything until I see an ES chip review (or leak). Right now, we have nothing other than words.
 