
AMD "Fiji XT" SKU Name Revealed, ATI Rage Legacy Reborn?

Joined
Sep 7, 2011
Messages
2,785 (0.92/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
They should be, given all their GPUs since 2012 seem to be DX12 Tier 3 compliant
Yet another adventure in incomplete information. The resource binding Tiers are actually subsets of the DirectX 11 hardware specification, as has been recounted by virtually every DX12 article published since the specification was publicized, and hardly surprising since DirectX 12 grew out of the DirectX 11.x API used in the Xbox One console.
Your 7850 is already DX12 capable; you just need Windows 10 and a driver
Like any "DirectX 12 capable card", it depends upon what features are actually implemented at a game/app level. The card could be DirectX 12 capable, but if the game developer chooses to use, for example, 12_1 conservative rasterization, then the HD 7850 won't be compliant with that feature. Making a blanket statement that a GCN 1.0 architecture card is DX12 capable requires some definite caveats.



It's kind of like how AMD's GPUs were supposedly more "future proof" because they were DX11.2 compliant, yet a total of zero games ever used those resources on any discrete graphics card. Hardware support is DX11 based; the feature set is both DX11 and DX12 based.
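The hardware-level vs. feature-set distinction can be sketched as a toy capability check (the per-architecture table below is a simplified illustration, not a full DX12 capability matrix; the grounded bits are that conservative rasterization is a 12_1 feature and GCN 1.0 lacks it):

```python
# Toy illustration of "DX12 capable" vs. "implements every DX12 feature".
# The per-architecture feature sets below are simplified examples only.
GPU_FEATURES = {
    # GCN 1.0 (e.g. HD 7850): runs the DX12 API, supports resource
    # binding Tier 3, but lacks 12_1 features like conservative rasterization.
    "GCN 1.0": {"resource_binding_tier_3", "async_compute"},
    # Maxwell 2 (e.g. GTX 980): has the 12_1 features, binding Tier 2 only.
    "Maxwell 2": {"resource_binding_tier_2", "conservative_rasterization",
                  "raster_ordered_views"},
}

def runs_with_all_features(gpu, required_features):
    """True only if the card implements every feature the game opts into."""
    return set(required_features) <= GPU_FEATURES[gpu]

# A game that opts into 12_1 conservative rasterization:
needs_cr = {"conservative_rasterization"}
```

So a GCN 1.0 card is "DX12 capable" in the API sense while still failing a title that opts into a 12_1-only feature.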
 
Joined
Apr 30, 2012
Messages
3,477 (1.25/day)
Yet another adventure in incomplete information. The resource binding Tiers are actually subsets of the DirectX 11 hardware specification, as has been recounted by virtually every DX12 article published since the specification was publicized, and hardly surprising since DirectX 12 grew out of the DirectX 11.x API used in the Xbox One console.

Like any "DirectX 12 capable card", it depends upon what features are actually implemented at a game/app level. The card could be DirectX 12 capable, but if the game developer chooses to use, for example, 12_1 conservative rasterization, then the HD 7850 won't be compliant with that feature. Making a blanket statement that a GCN 1.0 architecture card is DX12 capable requires some definite caveats.



It's kind of like how AMD's GPUs were supposedly more "future proof" because they were DX11.2 compliant, yet a total of zero games ever used those resources on any discrete graphics card. Hardware support is DX11 based; the feature set is both DX11 and DX12 based.
So what you're saying is Tier 1 is higher than Tier 3?

Feature set 12_1 just makes what are optional features in the lower sets mandatory. If the GPU supports Tier 3 then there are no caveats other than software implementation.
 
Last edited:
Joined
Jun 13, 2012
Messages
1,110 (0.40/day)
System Name desktop
Processor i7-4770k
Motherboard Asus z87-plus
Cooling Corsair h80
Memory 32gb G.Skill Ares @ 2400mhz
Video Card(s) EVGA GeForce GTX 1080 SC (ACX 3.0)
Storage 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB271HU 27inch IPS G-Sync 165hz
Audio Device(s) Sound Blaster x-FI Platium, Turtle beach Elite pro 2 + superamp.
Power Supply OCZ Z Series 850W
Mouse Razer Deathadder Elite
Keyboard Logitch G710+
Power issue is irrelevant anyway for most gamers. If the 390x uses 50 more watts than the Titan X and you game 15 to 20 hours a week at the USA national average of 12 cents per kWh then it adds 39 to 43 cents per month to your power bill.
Here is the thing to think about: the 290X had 1x8-pin and 1x6-pin and drew 300 watts. The new "Fury" has 2x8-pin. So 50 watts more? Given AMD's history, it kinda seems like that 50 watts could be 150 watts. Yeah, the power bill may not seem like much, but that is still a lot of heat being made.
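To put numbers on both scenarios, here's the power-bill arithmetic as a quick sketch (inputs are the figures quoted above: the extra wattage, 15-20 hours of gaming a week, and the ~12 US cents/kWh average):

```python
def monthly_cost_usd(extra_watts, hours_per_week, usd_per_kwh=0.12):
    """Added monthly electricity cost of a card drawing extra_watts more."""
    kwh_per_month = extra_watts / 1000 * hours_per_week * 52 / 12
    return kwh_per_month * usd_per_kwh

# The 50 W scenario at 15-20 h/week:
low = monthly_cost_usd(50, 15)     # ~$0.39/month
high = monthly_cost_usd(50, 20)    # ~$0.52/month
# The feared 150 W scenario at 20 h/week:
worst = monthly_cost_usd(150, 20)  # ~$1.56/month
```

Even the pessimistic 150 W case is only about a dollar and a half a month on the bill; the heat dumped into the case and the room is the more practical complaint.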
 
Joined
Sep 7, 2011
Messages
2,785 (0.92/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
So what you're saying is Tier 1 is higher than Tier 3?
No. I pretty much thought it was obvious. The feature set and the hardware level aren't the same thing, and saying that a GCN 1.0 card is DirectX 12 capable doesn't take into account the features actually implemented at a software level in a particular game.
Just for the record, I made no mention of Tier levels, and as far as I can tell, neither did you...so why bother trying to insinuate them into an argument?
If the GPU supports Tier 3 then there are no caveats other than software implementation.
That's actually a pretty big caveat. :rolleyes:
 
Last edited:
Joined
Apr 30, 2012
Messages
3,477 (1.25/day)
No. I pretty much thought it was obvious. The feature set and the hardware level aren't the same thing, and saying that a GCN 1.0 card is DirectX 12 capable doesn't take into account the features actually implemented at a software level in a particular game.
Well, you made it seem it was. The one I linked clearly says conservative rasterization Tier 3. You implied somehow that CR Tier 2 wouldn't be supported even though it's listed as supporting CR Tier 3. If you want to argue about implementation, you're a little late, since almost all games never implement more than they need to or can afford to. Then again, I don't recall you making such points for every game ever released.

Not sure what the fuss is about given the question that was asked.
 
Joined
Sep 7, 2011
Messages
2,785 (0.92/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Well, you made it seem it was. The one I linked clearly says conservative rasterization Tier 3. You implied somehow that CR Tier 2 wouldn't be supported even though it's listed as supporting CR Tier 3.
Uh, no. You're talking about hardware-level support and I was obviously referring to making the distinction between hardware-level support and feature-level support. The first is baked into the architecture; the second is a set of features for inclusion in software.
The list of DX12 "supported" hardware has been out for some time. It basically covers all DirectX 11 hardware with the exception of VLIW architectures, so technically most (if not all) current GPUs are DX12 compliant - it does not follow that all GPUs have access to all DX12 features.
If you want to argue about implementation, you're a little late, since almost all games never implement more than they need to or can afford to.
You were having an extended BRB while both AMD's Gaming Evolved and Nvidia's GameWorks/TWIMTBP have been adding features over and above the "need" level for some years, whether it be useless compute shader cycles, or PhysX, or extreme tessellation. By your reckoning EA wouldn't have looked to optimize CPU performance by implementing DX11.1 features for BF4... yet they did, for some unfathomable reason aside from giving Win8 systems a 3%-6% edge in performance over the same hardware running DX11/Win7.
Not sure what the fuss is about given the question that was asked.
I would tend to note that a blanket question that has caveats attached might warrant a fuller explanation. Given the length of your answering post, it would seem trivial to add that not all feature sets are available to each architecture. But to each their own.
 
Joined
Apr 30, 2012
Messages
3,477 (1.25/day)
Uh, no. You're talking about hardware-level support and I was obviously referring to making the distinction between hardware-level support and feature-level support. The first is baked into the architecture; the second is a set of features for inclusion in software.
The list of DX12 "supported" hardware has been out for some time. It basically covers all DirectX 11 hardware with the exception of VLIW architectures, so technically most (if not all) current GPUs are DX12 compliant - it does not follow that all GPUs have access to all DX12 features.

You were having an extended BRB while both AMD's Gaming Evolved and Nvidia's GameWorks/TWIMTBP have been adding features over and above the "need" level for some years, whether it be useless compute shader cycles, or PhysX, or extreme tessellation. By your reckoning EA wouldn't have looked to optimize CPU performance by implementing DX11.1 features for BF4... yet they did, for some unfathomable reason aside from giving Win8 systems a 3%-6% edge in performance over the same hardware running DX11/Win7.

I would tend to note that a blanket question that has caveats attached might warrant a fuller explanation. Given the length of your answering post, it would seem trivial to add that not all feature sets are available to each architecture. But to each their own.
You seem bothered. :laugh:

Simple solution for a grumpy old man: you could have just answered the man's question in your own way instead.
 
Joined
Oct 22, 2014
Messages
7,165 (3.81/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E5-2680 10c/20t 2.8GHz @ 3.0GHz
Motherboard Asrock X79 Extreme 11
Cooling Coolermaster 240 RGB A.I.O.
Memory G. Skill 16Gb (4x4Gb) 2133Mhz
Video Card(s) Nvidia GTX 710
Storage Sandisk X 400 256Gb
Display(s) AOC 22" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Home Premium 64 bit
The brand naming indicates that AMD wants to change the terms on which its top-end product competes with NVIDIA's. Low noise and high-performance will be the focus, not power draw. Nobody buys an Aventador for its MPG.
Reading between the lines, that means they couldn't do it. :rolleyes:
 
Joined
Aug 20, 2007
Messages
12,100 (2.69/day)
System Name Pioneer
Processor Intel i9 9900k
Motherboard ASRock Z390 Taichi
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory G.SKILL TridentZ Series 32GB (4 x 8GB) DDR4-3200 @ 13-13-13-33-2T
Video Card(s) EVGA GTX 1080 FTW2
Storage Mushkin Pilot-E 2TB NVMe SSD w/ EKWB M.2 Heatsink
Display(s) LG 32GK850G-B 1440p 32" AMVA Panel G-Sync 144hz Display
Case Thermaltake Core X31
Audio Device(s) Onboard TOSLINK to Schiit Modi MB to Schiit Asgard 2 Amp to AKG K7XX Ruby Red Massdrop Headphones
Power Supply EVGA SuperNova T2 850W 80Plus Titanium
Mouse ROCCAT Kone EMP
Keyboard WASD CODE 104-Key w/ Cherry MX Green Keyswitches, Doubleshot Vortex PBT White Transluscent Keycaps
Software Windows 10 x64 Enterprise... yes, it's legit.
As in Erinyes ( Furies of Greek mythology) ?
Bear in mind Furies came into creation from the blood spilled when Uranus was castrated.
I don't know what you're talking about, but you keep your castrator away from my anus!

Fury aye? I can dig it... Would love AMD/ATi to bring back Ruby advertisements/graphical benchmarks
Aye, Ruby, she's fallen a long way since her glory days...

 
Joined
Sep 17, 2014
Messages
10,469 (5.47/day)
Location
Mars
Processor i7 8700k 4.7Ghz @ 1.26v
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) MSI GTX 1080 Gaming X @ 2100/5500
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Eizo Foris FG2421
Case Fractal Design Define C TG
Power Supply EVGA G2 750w
Mouse Logitech G502 Protheus Spectrum
Keyboard Sharkoon MK80 (Brown)
Software W10 x64
It may have, but I think it's silly. Rage. It is a negative emotion. I seem to be alone in this, but to me it is like calling it AMD Depression, AMD Violence, or AMD Malicious. It is just negative.


I will post this link for something that talks about constructive anger. Got it from a very quick google search. I even just skimmed it instead of really reading it.
http://www.spring.org.uk/2012/03/the-upside-of-anger-6-psychological-benefits-of-getting-mad.php

I don't think any of the 6 benefits need anger at all. You can achieve every one of them with a positive emotion. Anger is negative. Anger is low quality.
Wait here, I'll go ask AMD to name their new GPU after flowers and cookies.
 
Joined
May 9, 2012
Messages
6,658 (2.40/day)
Location
Ovronnaz, Wallis, Switzerland
System Name Monster Panzer Max [MPM]/Nostalg33k/Fiio X5 3rd gen/Xiaomi Mi Box S/Honor View 20
Processor i5-6600K 3.9/E8500/RK3188/S905X 4X1.5 A53/Kirin 980 2xA76 (2.6 GHz)+2xA76 (1.92)+4xA53 (1.8GHz)
Motherboard Gigabyte Z170X Gaming 7/XFX 650i Ultra/Fiio/uh?/uh?
Cooling Corsair H115i /Alphacool Eisberg /uh?/Aluminum heatsink/Heatpipe
Memory 4x4gb HyperX Predator 2800 CL14/2gb DDR2 800/1gb/2gb LPDDR3/6gb LPDDR4X dual channel
Video Card(s) MSI GTX 1070 ARMOR 8gb OC/Asus 8800 Ultra/Mali 400MP4/Mali 450MP5/Mali G76MP10
Storage 120gb OCZ VertexIII,1tb/8gb SSHD,2xToshiba 1tb/none/32gb+64gb/8gb/128gb UFS 2.1
Display(s) Medion X58222 32"5ms OC 75hz 2880x1620/Philips 273E3LHSB 27"1ms 1920x1080/~4" 480x800/6.4 inch FHD+
Case Cougar Panzer Max/none/Aluminum and tempered glass/None/alu frame +back/front glass
Audio Device(s) Fiio Q1 Mark II+Logitec Z333/SB Audigy2 Platinum/dual AK4490EN /HDMI audio output/Trn V60/Fiio Fa1
Power Supply Seasonic M12II Evo 750 /Enermax Coolergiant 480/12v 1.5A/Aukey QC3.0 9-12V 1.96A
Mouse Asus ROG Spatha/touch/Xiaomi XMRM-006/touch
Keyboard GMMK TKL+Gateron Red+white keys/touch/none/touch
Software Win10 64/none/Android 5.1.1 custom/Android TV 8.1/Android 9.1.0
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
.........ah....is Kingston willing to share?
AMD/ATI was still first on that ... so Kingston has nothing to say in it :rolleyes:
people tend to forget that ATI has used Fury in its naming scheme since 1999 ;)

but some other brands have surely used Fury before them ... well, maybe not in computing
(although when people say Fury, I think ATI :laugh: )

Wait here, I'll go ask AMD to name their new GPU after flowers and cookies.
NO! cookies are evil, cookies are dark side!
Flower ... well ok .... but only:
Drosera, Nepenthes, Dionea and Darlingtonia...
 
Joined
Mar 13, 2014
Messages
4,300 (2.04/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI GTX 980 Ti GAMING
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) ASUS 27 inch 1440p PLS PB278Q
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Here is the thing to think about: the 290X had 1x8-pin and 1x6-pin and drew 300 watts. The new "Fury" has 2x8-pin. So 50 watts more? Given AMD's history, it kinda seems like that 50 watts could be 150 watts. Yeah, the power bill may not seem like much, but that is still a lot of heat being made.
Let's have a look at the last Flagships from Nvidia and AMD. The reference GTX 780 Ti drew 229 watts average and 269 watts peak. The reference R9 290x drew 246 watts average and 282 peak. The difference is irrelevant on the power bill for most gamers.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_780_Ti/25.html

http://www.techpowerup.com/reviews/AMD/R9_290X/25.html

Do you really believe that the 390x will draw 150 watts more than the Titan X? Looking at the review on the Titan X that would mean the 390x would draw 373 watts average and 393 watts peak.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/27.html

Can we just stick to the tech and not be hysterical please?
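For scale, the flagship gap in those review numbers works out like this (a quick sketch using only the wattages quoted above, plus the earlier 20 h/week and 12 cents/kWh assumptions):

```python
# Flagship power gap, using the review figures quoted above (watts).
gtx_780_ti_avg = 229
r9_290x_avg = 246

extra_watts = r9_290x_avg - gtx_780_ti_avg      # 17 W
pct_more = extra_watts / gtx_780_ti_avg * 100   # ~7.4% higher average draw

# Monthly cost of that gap at 20 h/week of gaming and $0.12/kWh:
monthly_usd = extra_watts / 1000 * 20 * 52 / 12 * 0.12   # ~$0.18/month
```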
 
Joined
Sep 5, 2004
Messages
1,901 (0.34/day)
Location
The Kingdom of Norway
System Name Wiak's Gaming Rig 2017
Processor Ryzen 1700X
Motherboard ASUS PRIME X370-PRO
Cooling Noctua NB-9B SE2
Memory Corsair Vengenace LPX 3200 CL16 @ 2933
Video Card(s) MSI Radeon 480 8GB Gaming X
Storage Samsung 960 EVO 500GB / Samsung 850 EVO 1TB
Display(s) 3x Dell U2412M
Case Corsair 200R
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850
Mouse Corsair Sabre Laser
Keyboard Logitech Orion Brown (G610)
Software Windows 10?
if it's Fury, they might have two cards up their sleeves, like a "Radeon Fury" and a "Radeon Fury MAXX" (dual GPU)
Why do I think there will be a dual-GPU one? Why not? HBM will save like 94% of the board space, and if they use watercooling they can cool it pretty easily. Given that HBM also uses a lot less power for its speed, and if they underclock each GPU, it's a viable option
 
Joined
Sep 5, 2004
Messages
1,901 (0.34/day)
Location
The Kingdom of Norway
System Name Wiak's Gaming Rig 2017
Processor Ryzen 1700X
Motherboard ASUS PRIME X370-PRO
Cooling Noctua NB-9B SE2
Memory Corsair Vengenace LPX 3200 CL16 @ 2933
Video Card(s) MSI Radeon 480 8GB Gaming X
Storage Samsung 960 EVO 500GB / Samsung 850 EVO 1TB
Display(s) 3x Dell U2412M
Case Corsair 200R
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850
Mouse Corsair Sabre Laser
Keyboard Logitech Orion Brown (G610)
Software Windows 10?
given the fact they use an interposer now, it might even be possible to put a pair of GPUs on it together with HBM memory..
think of the awesome perf of such a card
 
Joined
Sep 7, 2011
Messages
2,785 (0.92/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
given the fact they use an interposer now, it might even be possible to put a pair of GPUs on it together with HBM memory.
Unlikely at this stage. The interposer for Fiji scales out to ~830-860 mm², judging from available metrics. The cost and yield rate for such a low-production part of at least twice that size (remember that the interposer module assembled by UMC includes the whole package - HBM stacks and GPU) would likely be prohibitive for the size of the production run, and that's assuming UMC could produce viable interposers of that size in the first place, which seems doubtful.
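As a rough sanity check on that estimate, an area budget with assumed round-number footprints (the die and stack sizes below are illustrative guesses, not vendor figures) lands in the same ballpark:

```python
# Hypothetical area budget for a Fiji-class interposer (illustrative numbers).
gpu_die_mm2 = 600        # assumed large 28 nm GPU die
hbm_stack_mm2 = 40       # assumed footprint of one HBM stack
n_stacks = 4
routing_margin = 1.10    # ~10% extra for spacing and routing

single_mm2 = (gpu_die_mm2 + n_stacks * hbm_stack_mm2) * routing_margin
# ~836 mm^2, in line with the ~830-860 mm^2 estimate above

# Doubling the GPUs and stacks roughly doubles the interposer:
dual_mm2 = (2 * gpu_die_mm2 + 2 * n_stacks * hbm_stack_mm2) * routing_margin
```

Around 1,700 mm² for a single dual-GPU module is far beyond that estimate, which is why separate interposer modules on one board looks like the more plausible route.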

There wouldn't be anything to stop AMD devising a dual card with dual interposer modules as far as I'm aware.
 
Joined
Dec 22, 2011
Messages
2,993 (1.03/day)
System Name Zimmer Frame Rates
Processor Intel i7 920 @ Stock speeds baby
Motherboard EVGA X58 3X SLI
Cooling True 120
Memory Corsair Vengeance 12GB
Video Card(s) Palit GTX 980 Ti Super JetStream
Storage Of course
Display(s) Crossover 27Q 27" 2560x1440
Case Antec 1200
Audio Device(s) Don't be silly
Power Supply XFX 650W Core
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 10
Benchmark Scores Epic
Let's have a look at the last Flagships from Nvidia and AMD. The reference GTX 780 Ti drew 229 watts average and 269 watts peak. The reference R9 290x drew 246 watts average and 282 peak. The difference is irrelevant on the power bill for most gamers.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_780_Ti/25.html

http://www.techpowerup.com/reviews/AMD/R9_290X/25.html

Do you really believe that the 390x will draw 150 watts more than the Titan X? Looking at the review on the Titan X that would mean the 390x would draw 373 watts average and 393 watts peak.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/27.html

Can we just stick to the tech and not be hysterical please?
Indeed, makes you wonder why people kicked up such a stink about the GTX 480, pesky double standards.
 
Joined
Jun 13, 2012
Messages
1,110 (0.40/day)
System Name desktop
Processor i7-4770k
Motherboard Asus z87-plus
Cooling Corsair h80
Memory 32gb G.Skill Ares @ 2400mhz
Video Card(s) EVGA GeForce GTX 1080 SC (ACX 3.0)
Storage 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB271HU 27inch IPS G-Sync 165hz
Audio Device(s) Sound Blaster x-FI Platium, Turtle beach Elite pro 2 + superamp.
Power Supply OCZ Z Series 850W
Mouse Razer Deathadder Elite
Keyboard Logitch G710+
Do you really believe that the 390x will draw 150 watts more than the Titan X? Looking at the review on the Titan X that would mean the 390x would draw 373 watts average and 393 watts peak.
The other GCN 1.2 card, the 285, has half the cores Fury has and drew 200 watts. AMD has now doubled that, so 350-400 watts is a very possible draw for the card. Remember, they are shipping the damn thing with a water cooler on it.
 
Joined
Mar 13, 2014
Messages
4,300 (2.04/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI GTX 980 Ti GAMING
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) ASUS 27 inch 1440p PLS PB278Q
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Indeed, makes you wonder why people kicked up such a stink about the GTX 480, pesky double standards.
The tech sites definitely had a field day on that GPU. :)
 
Joined
Mar 10, 2010
Messages
6,850 (1.92/day)
Location
Manchester uk
System Name RyzenGtEvo
Processor Amd R7 3800X@4.350/525
Motherboard Crosshair hero7 @bios 2703
Cooling 360EK extreme rad+ 360$EK slim all push, cpu Monoblock Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in two sticks.
Video Card(s) Sapphire refference Rx vega 64 EK waterblocked
Storage Samsung Nvme Pg981, silicon power 1Tb samsung 840 basic as a primocache drive for, WD2Tbgrn +3Tbgrn,
Display(s) Samsung UAE28"850R 4k freesync, LG 49" 4K 60hz ,Oculus
Case Lianli p0-11 dynamic
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi
Mouse Roccat Kova
Keyboard Roccat Iksu force fx
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy
People like a debate, it's the way :)

If you're going to spread FUD it might as well be viable, sooo if a Fury is a 390X with stacked memory, that might allow four derivatives: 8GB/4GB, each watercooled or air

Then another two versions with GDDR5, so says the rumour mill. Sounds odd to me, so some of this has to be nonsense


I think the 8GB watercooled one will be the Fury. Dunno if I'm buying or not; I might hold off.
 
Last edited:
Joined
Sep 29, 2013
Messages
97 (0.04/day)
Processor Intel i7 4960x Ivy-Bridge E @ 4.6 Ghz @ 1.42V
Motherboard x79 AsRock Extreme 11.0
Cooling EK Supremacy Copper Waterblock
Memory 65.5 GBs Corsair Platinum Kit @ 666.7Mhz
Video Card(s) PCIe 3.0 x16 -- Asus GTX Titan Maxwell
Storage Samsung 840 500GBs + OCZ Vertex 4 500GBs 2x 1TB Samsung 850
Audio Device(s) Soundblaster ZXR
Power Supply Corsair 1000W
Mouse Razer Naga
Keyboard Corsair K95
Software Zbrush, 3Dmax, Maya, Softimage, Vue, Sony Vegas Pro, Acid, Soundforge, Adobe Aftereffects, Photoshop
I only care about relative power draw. If I can get 50% more perf for the same power draw, that's all good. I'd rather have 50% more perf at the same power than the same perf at 50% power. Titan X disappointed (stock level) in that regard, as its perf wasn't 'awesome' (but its power wasn't bad).

If 'Fury' (if that is to be it, though it sounds too much like the new Mad Max movie title) delivers the perf I want to see at the same power draw, it's the way forward for me. And I must add, it looks more and more as if the 980 Ti is being thrust out fast to make a quick buck before the AMD card drops and takes the crown. BUT, if the AMD card doesn't take the crown, it'll be a very bad day (IMO) for AMD. Like I say though, I feel Nvidia are rushing a bit too much here, so I'm leaning toward AMD being the better option.

Source:
Hagedoorn, Hilbert, 05-30-2015, EVGA GeForce GTX 980 Ti SKU lineup leaks with prices:
http://www.guru3d.com/news-story/evga-leaks-geforce-gtx-980-ti-sku-lineup-and-prices.html

Above is bolded to support my point in this post. With the NVidia GTX 980 Ti going for between $799.99 and $827.99 despite the drop in performance from a GTX Titan X, and taking into consideration what I've bolded above, it's a safe bet that this will be true. Also, this is the worst-case scenario, but it also takes into account that the GTX 980 Ti was never going to beat the AMD R9 390X, aka Rage, by much on performance differences or averages.

I always suspected, ahead of the announcement of D3D12.0, that on the CPU side there has always been a bottleneck of some form. This is what brought performance down for PC games. On the GPU side, I believe it's the framebuffer speeds, taking into account that there are other areas besides the CPU bottleneck and framebuffer speeds that drop performance. The CPU does a lot of other rendering work like shadows, particle effects, physics, etc. If these things could be offloaded to the GPU, performance would go up even further. D3D12.0, aka DirectX 12.0, is a solution to the CPU bottleneck, and in some ways utilization of HBM is a solution for gaining performance on discrete graphics cards. Implementing HBM now is going to be to AMD's benefit this generation, and this is probably the defining x-factor that's making NVidia sell their units faster in comparison to previous generations. So yes, in conclusion, NVidia will possibly start thrusting out their products faster, this generation, before AMD goes live with their flavor of the generation, to make a quick buck.


Back to the post:

The Rage branding is nothing more than a marketing gimmick. Gimmicks don't sell when paired with AMD's past history of lacking performance and issues. It works for NVidia because they have a track record of churning out products that work, or have a higher rate of not failing, or consumers are happier with NVidia products over AMD products. NVidia sells on gimmicks because it has a history of delivering or meeting expectations probably 75% to 90% of the time. This is why NVidia can sell products at a higher price on marketing gimmicks. AMD, I feel you are getting a little ahead of yourself with this. Build a reputation of products that work, that bring the performance, and are idiot-proof for consumers. Once you make a big deal about that, then you can start selling your consumers the b.s. that NVidia has been good at for the past 10 years.

Just for the record, Uranus or Ouranos isn't a Titan. He's a Primordial God. The Titans are just the reject inbreeds that were brought into existence between the Primordial Gods. A lot of people believe that Uranus is related to the Zodiac Gemini because Ouranos represents Heaven, and Heaven is in the air. This ideology is false. What Gemini represents as a Zodiac, or a connection to God or Gods, is the ideology of Duality. Duality is exerted on or intertwined with everything. If Taurus represents Mother Earth, aka Gaia (Gaea), then Uranus is possibly the dualistic opposite of Gaia.

The whole Titan history and mythology is just an interpretation of how things came into an existence that they (pre-Greeks) couldn't explain and didn't fully understand. In addition, Greek mythology is just refined religion and information that they got from the Mesopotamians (Babylonians and Assyrians). The Hebrews did it with the Abrahamic God in relation to the Babylonian god Anu (who is the chief god, the god who created the Heavens and Creation itself).

For NVidia, the Titan name-brand is nothing more than a marketing gimmick. AMD is just following NVdia's idea of it with the Rage gimmick.
 
Joined
Dec 22, 2011
Messages
2,993 (1.03/day)
System Name Zimmer Frame Rates
Processor Intel i7 920 @ Stock speeds baby
Motherboard EVGA X58 3X SLI
Cooling True 120
Memory Corsair Vengeance 12GB
Video Card(s) Palit GTX 980 Ti Super JetStream
Storage Of course
Display(s) Crossover 27Q 27" 2560x1440
Case Antec 1200
Audio Device(s) Don't be silly
Power Supply XFX 650W Core
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 10
Benchmark Scores Epic
The tech sites definitely had a field day on that GPU. :)
Even AMD did one of their dodgy videos about it, which from what I gather just confirmed how well it sold. :p
 
Joined
Sep 7, 2011
Messages
2,785 (0.92/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
If you're going to spread FUD it might as well be viable, sooo if a Fury is a 390X with stacked memory, that might allow four derivatives: 8GB/4GB, each watercooled or air
You're in bonus territory right there.
Aren't the Fury and the 390X two completely different SKUs? The Fury (Fiji-based) with 4GB of HBM (as AMD themselves have confirmed regarding the 4GB limitation), while the 390X is a reworked Hawaii-based card - essentially the 290X with 4GB or 8GB of GDDR5?
The Rage Branding is nothing more than a marketing gimmick. Gimmicks don't sell, even with a history of lacking performance and issues for AMD's past.
They can sell, it just depends upon how they are marketed, and what status the brand already enjoys.
It works for NVidia because they have a track record of churning out products that work, or have a higher rate of not failing, or consumers are happier with NVidia products over AMD products. NVidia sells on gimmicks because it has a past history of delivering or meeting expectations probably over 75% to 90% of the time.
Comes down to brand awareness - what is generally described as TOMA, Top of Mind Awareness. Market leaders tend to get that way just as much by public perception as by products. Lead or create a market rather than follow another company's strategy; build a company persona based on the company's products and services rather than comparing with the competition (when was the last time Intel publicly compared any of its products with AMD's?). Being perceived as a market leader is half the battle. The other half is building a successful product/service base, as you've said, and showing the market well-defined strategies with consistent meeting of strategic goals. Changing company emphasis at the drop of a hat, or revolving-door management changes, tends to convey the impression of indecisiveness - not a quality associated with a TOMA brand.
For NVidia, the Titan name-brand is nothing more than a marketing gimmick.
Pretty much. The company saw that their product lines were seeing significant uptake in other markets (desktop "supercomputers", rendering, etc.) with boards like the GTX 580, and slapped on a new brand name designed to leverage a high price with an aura of exclusivity. It seems to have worked...
AMD is just following Nvidia's idea of it with the Rage gimmick.
...bearing in mind the old adage, "Imitation is the sincerest form of flattery".
"Fury" leverages a name revered in the annals of ATI, so why not go with it? As to whether it succeeds, that might depend on how it fares in the benchmark war. It could be "Radeon Fury cows Nvidia", or "Radeon Fury at not being top dog"... or, more likely given previous releases from both companies, the name will count for little if the benchmarks are split between vendors depending on title, game IQ, and resolution. The fact that the 980 Ti is being clocked by vendors at 1200MHz+ out of the box might suggest a close race.
 
Last edited:
Joined
May 9, 2012
Messages
6,658 (2.40/day)
Location
Ovronnaz, Wallis, Switzerland
System Name Monster Panzer Max [MPM]/Nostalg33k/Fiio X5 3rd gen/Xiaomi Mi Box S/Honor View 20
Processor i5-6600K 3.9/E8500/RK3188/S905X 4X1.5 A53/Kirin 980 2xA76 (2.6 GHz)+2xA76 (1.92)+4xA53 (1.8GHz)
Motherboard Gigabyte Z170X Gaming 7/XFX 650i Ultra/Fiio/uh?/uh?
Cooling Corsair H115i /Alphacool Eisberg /uh?/Aluminum heatsink/Heatpipe
Memory 4x4gb HyperX Predator 2800 CL14/2gb DDR2 800/1gb/2gb LPDDR3/6gb LPDDR4X dual channel
Video Card(s) MSI GTX 1070 ARMOR 8gb OC/Asus 8800 Ultra/Mali 400MP4/Mali 450MP5/Mali G76MP10
Storage 120gb OCZ VertexIII,1tb/8gb SSHD,2xToshiba 1tb/none/32gb+64gb/8gb/128gb UFS 2.1
Display(s) Medion X58222 32"5ms OC 75hz 2880x1620/Philips 273E3LHSB 27"1ms 1920x1080/~4" 480x800/6.4 inch FHD+
Case Cougar Panzer Max/none/Aluminum and tempered glass/None/alu frame +back/front glass
Audio Device(s) Fiio Q1 Mark II+Logitec Z333/SB Audigy2 Platinum/dual AK4490EN /HDMI audio output/Trn V60/Fiio Fa1
Power Supply Seasonic M12II Evo 750 /Enermax Coolergiant 480/12v 1.5A/Aukey QC3.0 9-12V 1.96A
Mouse Asus ROG Spatha/touch/Xiaomi XMRM-006/touch
Keyboard GMMK TKL+Gateron Red+white keys/touch/none/touch
Software Win10 64/none/Android 5.1.1 custom/Android TV 8.1/Android 9.1.0
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
Source:
Hagedoorn, Hilbert, 05-30-2015, EVGA GeForce GTX 980 Ti SKU lineup leaks with prices:
http://www.guru3d.com/news-story/evga-leaks-geforce-gtx-980-ti-sku-lineup-and-prices.html

The above is bolded to support my point in this post. With the Nvidia GTX 980 Ti going for between $799.99 and $827.99 for a small drop in performance from the GTX Titan X, and taking into consideration what I've bolded above, it's a safe bet that this will be true. This is also the worst-case scenario, taking into account that the GTX 980 Ti was never going to beat the AMD R9 390X (aka Rage) by much on performance differences or averages.

I've always suspected, ahead of the announcement of D3D12.0, that on the CPU side there has always been a bottleneck of some form. This is what brought performance down for PC games. On the GPU side, I believe it's framebuffer speeds, though there are other areas besides the CPU bottleneck and framebuffer speeds that drag performance down. The CPU also handles a lot of other rendering work like shadows, particle effects, physics, etc. If these things could be offloaded to the GPU, performance would go up even further. D3D12.0, aka DirectX 12.0, is a solution to the CPU bottleneck, and in some ways utilization of HBM is a solution for gaining performance on discrete graphics cards. Implementing HBM now is going to be to AMD's benefit this generation, and this is probably the defining X-factor that's making Nvidia sell their units faster compared to previous generations. So yes, in conclusion, Nvidia will possibly start pushing out their products faster this generation, before AMD goes live with their flavor of the generation, to make a quick buck.


Back to the post:

The Rage branding is nothing more than a marketing gimmick. Gimmicks don't sell, especially given AMD's past history of lacking performance and reliability issues. It works for Nvidia because they have a track record of churning out products that work, have a higher rate of not failing, or leave consumers happier with Nvidia products over AMD products. Nvidia can sell on gimmicks because it has a past history of delivering on or meeting expectations probably 75% to 90% of the time. This is why Nvidia can sell products at a higher price on marketing gimmicks. AMD, I feel you are getting a little ahead of yourself with this. Build a reputation for products that work, that bring the performance, and that are idiot-proof for consumers. Once you make a big deal about that, then you can start selling your consumers the b.s. that Nvidia has been good at for the past 10 years.

Just for the record, Uranus, or Ouranos, isn't a Titan. He's a primordial god. The Titans are just the reject inbreeds brought into existence by the primordial gods. A lot of people believe that Uranus is related to the zodiac sign Gemini because Ouranos represents Heaven, and Heaven is in the air. This ideology is false. What Gemini represents as a zodiac sign, or as a connection to a god or gods, is the idea of duality. Duality is exerted on, or intertwined with, everything. If Taurus represents Mother Earth, aka Gaia (Gaea), then Uranus is possibly the dualistic opposite of Gaia.

The whole Titan history and mythology is just an interpretation of how things came into existence that they (the pre-Greeks) couldn't explain and didn't fully understand. In addition, Greek mythology is just refined religion and lore that the Greeks got from the Mesopotamians (Babylonians and Assyrians). The Hebrews did the same with the Abrahamic God in relation to the Babylonian god Anu (the chief god, the god who created the heavens and creation itself).

For Nvidia, the Titan name-brand is nothing more than a marketing gimmick. AMD is just following Nvidia's idea of it with the Rage gimmick.
Except that the branding is Fury and not Rage.
 
Joined
Sep 15, 2011
Messages
5,322 (1.77/day)
Processor Intel Core i7 3770k @ 4.3GHz
Motherboard Asus P8Z77-V LK
Memory 16GB(2x8) DDR3@2133MHz 1.5v Patriot
Video Card(s) MSI GeForce GTX 1080 GAMING X 8G
Storage 59.63GB Samsung SSD 830 + 465.76 GB Samsung SSD 840 EVO + 2TB Hitachi + 300GB Velociraptor HDD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Anker
Software Win 10 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Source:
Hagedoorn, Hilbert, 05-30-2015, EVGA GeForce GTX 980 Ti SKU lineup leaks with prices:
http://www.guru3d.com/news-story/evga-leaks-geforce-gtx-980-ti-sku-lineup-and-prices.html

The above is bolded to support my point in this post. With the Nvidia GTX 980 Ti going for between $799.99 and $827.99 for a small drop in performance from the GTX Titan X, and taking into consideration what I've bolded above, it's a safe bet that this will be true. This is also the worst-case scenario, taking into account that the GTX 980 Ti was never going to beat the AMD R9 390X (aka Rage) by much on performance differences or averages.
My guess is that this 390X will sell for $850 or more...

There you go. This is how prices evolved; in only a couple of years they literally DOUBLED the price of the top cards. Greedy bastards!
 
Joined
May 9, 2012
Messages
6,658 (2.40/day)
My guess is that this 390X will sell for $850 or more...

There you go. This is how prices evolved; in only a couple of years they literally DOUBLED the price for the top cards. Greedy bastards!
Yeah, only the price of the hardware for the customer doubled... not the manufacturing cost...
In the end, who is the greedy bastard (not directed at you, more the general picture): the customer who wants the most expensive piece of hardware at half the price, or the manufacturer who needs to sell the hardware at a price dictated by supply and demand and general manufacturing costs?
I am of the 1st kind... I paid 1/3 of a 290's initial launch price for a 290... but 2nd hand and after 3 weeks of use... well... greedy or not, who cares as long as you have the means to get it cheaper. Yep, indeed, humans are greedy in general :)


Plus, if that card is $850, it's still a whopping 15% less than a Titanic... woops, sorry, Titan X. And if it manages to beat it while costing more than $850, then... it's not a problem xD. As long as the price is under a Titanic... (rargh, Titan X, sorry, I caught an ICEBERG... aherm, I mean a cold... I can't think straight) we are OK :roll:

Sidenote... could we drop the analogies to mythological stories? It means nothing that AMD chose to retake the name of the three sisters born of the castration of the father of the Titans (well... does it mean they intend to castrate Nvidia? :rolleyes:), since that ATI branding originates from well before the Titan came along... (well, they did nut-kick Nvidia more than once...)
 
Last edited: