
AMD "Fiji XT" SKU Name Revealed, ATI Rage Legacy Reborn?

Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
They should be, given all their GPUs since 2012 seem to be DX12 Tier 3 compliant
Yet another adventure in incomplete information. The resource binding Tiers are actually subsets of the DirectX 11 hardware specification, as has been recounted by virtually every DX12 article published since the specification was publicized, and hardly surprising since DirectX 12 grew out of the DirectX 11.x API used in the Xbox One console.
Your 7850 is already DX12 capable; you just need Windows 10 and a driver
Like any " DirectX 12 capable card", it depends upon what features are actually implemented at a game/app level. The card could be DirectX 12 capable, but if the game developer chooses to use - as example, 12_1 conservative rasterization, then the HD 7850 wont be compliant with that feature. Making a blanket statement that a GCN 1.0 architecture card is DX12 capable requires some definite caveats.



It's kind of like how AMD's GPUs were supposedly more "future proof" because they were DX11.2 compliant, yet a total of zero games ever actually used those resources on any discrete graphics card. Hardware support is DX11 based; the feature set is both DX11 and DX12 based.
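To make the hardware-versus-feature distinction concrete, this is roughly how a DX12 title would probe for those optional features at runtime (a minimal sketch assuming an already-created device; the function name and the reporting are just for illustration):

#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Sketch: a device can be created fine ("DX12 capable") yet still
// report D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED, which
// is exactly the HD 7850 situation described above.
void ReportDx12Features(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options))))
    {
        printf("Resource binding tier: %d\n", options.ResourceBindingTier);
        if (options.ConservativeRasterizationTier ==
            D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED)
        {
            // A 12_1 code path (e.g. conservative rasterization)
            // must fall back or be skipped on this hardware.
            printf("Conservative rasterization: not supported\n");
        }
    }
}

The point being: the binding tier and the optional feature caps are reported separately, which is why "DX12 capable" and "supports every DX12 feature" are two different claims.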
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
Yet another adventure in incomplete information. The resource binding Tiers are actually subsets of the DirectX 11 hardware specification, as has been recounted by virtually every DX12 article published since the specification was publicized, and hardly surprising since DirectX 12 grew out of the DirectX 11.x API used in the Xbox One console.

Like any " DirectX 12 capable card", it depends upon what features are actually implemented at a game/app level. The card could be DirectX 12 capable, but if the game developer chooses to use - as example, 12_1 conservative rasterization, then the HD 7850 wont be compliant with that feature. Making a blanket statement that a GCN 1.0 architecture card is DX12 capable requires some definite caveats.



It's kind of like how AMD's GPUs were supposedly more "future proof" because they were DX11.2 compliant, yet a total of zero games ever actually used those resources on any discrete graphics card. Hardware support is DX11 based; the feature set is both DX11 and DX12 based.

So what you're saying is Tier 1 is higher than Tier 3?

Feature set 12_1 just makes what are optional features in the lower sets mandatory. If the GPU supports Tier 3 then there are no caveats other than software implementation.
 
Joined
Jun 13, 2012
Messages
1,316 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
Power issue is irrelevant anyway for most gamers. If the 390x uses 50 more watts than the Titan X and you game 15 to 20 hours a week at the USA national average of 12 cents per kWh then it adds 39 to 43 cents per month to your power bill.
Here is the thing to think about: the 290X had 1x 8-pin and 1x 6-pin and drew 300 watts. The new "Fury" has 2x 8-pin. So 50 watts more? Given AMD's history it kinda seems like that 50 watts could be 150 watts. Yeah, the power bill may not seem like much, but that is still a lot of heat being made.
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
So what you're saying is Tier 1 is higher than Tier 3?
No. I pretty much thought it was obvious. The feature set and the hardware level aren't the same thing, and saying that a GCN 1.0 card is DirectX 12 capable doesn't take into account the features actually implemented at a software level in a particular game.
Just for the record, I made no mention of Tier levels, and as far as I can tell, neither did you...so why bother trying to insinuate them into an argument?
If the GPU supports Tier 3 then there are no caveats other than software implementation.
That's actually a pretty big caveat. :rolleyes:
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
No. I pretty much thought it was obvious. The feature set and the hardware level aren't the same thing, and saying that a GCN 1.0 card is DirectX 12 capable doesn't take into account the features actually implemented at a software level in a particular game.

Well, you made it seem it was. The one I linked clearly says conservative rasterization Tier 3. You implied somehow CR Tier 2 wouldn't be supported even though it's listed as supporting CR Tier 3. If you want to argue about implementation you're a little late, since almost all games never implement more than they need to or can afford to. Then again, I don't recall you making such points for every game ever released.

Not sure what the fuss is about given the question that was asked.
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Well, you made it seem it was. The one I linked clearly says conservative rasterization Tier 3. You implied somehow CR Tier 2 wouldn't be supported even though it's listed as supporting CR Tier 3.
Uh, no. You're talking about hardware level support and I was obviously referring to making the distinction between hardware level support and feature level support. The first is baked into the architecture; the second is a set of features for inclusion in software.
The list of DX12 "supported" hardware has been out for some time. It basically covers all DirectX 11 hardware with the exception of VLIW architectures, so technically most (if not all) current GPUs are DX12 compliant - it does not follow that all GPUs have access to all DX12 features.
If you want to argue about implementation you're a little late, since almost all games never implement more than they need to or can afford to.
You were having an extended BRB while both AMD's Gaming Evolved and Nvidia's GameWorks/TWIMTBP have been adding features over and above the "need" level for some years, whether it be useless compute shader cycles, or PhysX, or extreme tessellation. By your reckoning EA wouldn't have looked to optimize CPU performance by implementing DX11.1 features for BF4... yet they did, for some unfathomable reason aside from giving Win8 systems a 3%-6% edge in performance over the same hardware running DX11/Win7.
Not sure what the fuss is about given the question that was asked.
I would tend to note that a blanket question that has caveats attached might warrant a fuller explanation. Given the length of your answering post, it would seem trivial to add that not all feature sets are available to each architecture. But to each their own.
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
Uh, no. You're talking about hardware level support and I was obviously referring to making the distinction between hardware level support and feature level support. The first is baked into the architecture; the second is a set of features for inclusion in software.
The list of DX12 "supported" hardware has been out for some time. It basically covers all DirectX 11 hardware with the exception of VLIW architectures, so technically most (if not all) current GPUs are DX12 compliant - it does not follow that all GPUs have access to all DX12 features.

You were having an extended BRB while both AMD's Gaming Evolved and Nvidia's GameWorks/TWIMTBP have been adding features over and above the "need" level for some years, whether it be useless compute shader cycles, or PhysX, or extreme tessellation. By your reckoning EA wouldn't have looked to optimize CPU performance by implementing DX11.1 features for BF4... yet they did, for some unfathomable reason aside from giving Win8 systems a 3%-6% edge in performance over the same hardware running DX11/Win7.

I would tend to note that a blanket question that has caveats attached might warrant a fuller explanation. Given the length of your answering post, it would seem trivial to add that not all feature sets are available to each architecture. But to each their own.

You seem bothered. :laugh:

Simple solution for a grumpy old man: you could have just answered the man's question in your own way instead.
 
Joined
Oct 22, 2014
Messages
13,210 (3.83/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
The brand naming indicates that AMD wants to change the terms on which its top-end product competes with NVIDIA's. Low noise and high-performance will be the focus, not power draw. Nobody buys an Aventador for its MPG.

Reading between the lines, that means they couldn't do it. :rolleyes:
 
Joined
Aug 20, 2007
Messages
20,714 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches
Software Windows 11 Enterprise (legit), Gentoo Linux x64
As in Erinyes (the Furies of Greek mythology)?
Bear in mind the Furies came into creation from the blood spilled when Uranus was castrated.

I don't know what you're talking about, but you keep your castrator away from my anus!

Fury aye? I can dig it... Would love AMD/ATi to bring back Ruby advertisements/graphical benchmarks

Aye, Ruby, she's fallen a long way since her glory days...

 
Joined
Sep 17, 2014
Messages
20,782 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
It may have, but I think it to be silly. Rage. It is a negative emotion. I seem to be alone in this, but to me it is like calling it AMD Depression, AMD Violence, or AMD Malicious. It is just negative.


I will post this link for something that talks about constructive anger. Got it from a very quick google search. I even just skimmed it instead of really reading it.
http://www.spring.org.uk/2012/03/the-upside-of-anger-6-psychological-benefits-of-getting-mad.php

I don't think any of the 6 benefits need anger at all. You can do every one of those with a positive emotion. Anger is negative. Anger is low quality.

Wait here, I'll go ask AMD to name their new GPU after flowers and cookies.
 
Joined
May 9, 2012
Messages
8,380 (1.93/day)
Location
Ovronnaz, Wallis, Switzerland
System Name main/SFFHTPCARGH!(tm)/Xiaomi Mi TV Stick/Samsung Galaxy S23/Ally
Processor Ryzen 7 5800X3D/i7-3770/S905X/Snapdragon 8 Gen 2/Ryzen Z1 Extreme
Motherboard MSI MAG B550 Tomahawk/HP SFF Q77 Express/uh?/uh?/Asus
Cooling Enermax ETS-T50 Axe aRGB /basic HP HSF /errr.../oh! liqui..wait, no:sizable vapor chamber/a nice one
Memory 64gb Corsair Vengeance Pro 3600mhz DDR4/8gb DDR3 1600/2gb LPDDR3/8gb LPDDR5x 4200/16gb LPDDR5
Video Card(s) Hellhound Spectral White RX 7900 XTX 24gb/GT 730/Mali 450MP5/Adreno 740/RDNA3 768 core
Storage 250gb870EVO/500gb860EVO/2tbSandisk/NVMe2tb+1tb/4tbextreme V2/1TB Arion/500gb/8gb/256gb/2tb SN770M
Display(s) X58222 32" 2880x1620/32"FHDTV/273E3LHSB 27" 1920x1080/6.67"/AMOLED 2X panel FHD+120hz/FHD 120hz
Case Cougar Panzer Max/Elite 8300 SFF/None/back/back-front Gorilla Glass Victus 2+ UAG Monarch Carbon
Audio Device(s) Logi Z333/SB Audigy RX/HDMI/HDMI/Dolby Atmos/KZ x HBB PR2/Edifier STAX Spirit S3 & SamsungxAKG beans
Power Supply Chieftec Proton BDF-1000C /HP 240w/12v 1.5A/4Smart Voltplug PD 30W/Asus USB-C 65W
Mouse Speedlink Sovos Vertical-Asus ROG Spatha-Logi Ergo M575/Xiaomi XMRM-006/touch/touch
Keyboard Endorfy Thock 75% <3/none/touch/virtual
VR HMD Medion Erazer
Software Win10 64/Win8.1 64/Android TV 8.1/Android 13/Win11 64
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
.........ah....is Kingston willing to share?
AMD/ATI was still first on that ... so Kingston gets no say in it :rolleyes:
people tend to forget that ATI has used Fury in its naming scheme since 1999 ;)

but some other brands have surely used Fury before them ... well, maybe not in computers
(although when people say Fury I think ATI :laugh: )

Wait here, I'll go ask AMD to name their new GPU after flowers and cookies.
NO! Cookies are evil, cookies are the dark side!
Flowers ... well, ok .... but only:
Drosera, Nepenthes, Dionaea and Darlingtonia...
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Here is the thing to think about: the 290X had 1x 8-pin and 1x 6-pin and drew 300 watts. The new "Fury" has 2x 8-pin. So 50 watts more? Given AMD's history it kinda seems like that 50 watts could be 150 watts. Yeah, the power bill may not seem like much, but that is still a lot of heat being made.

Let's have a look at the last flagships from Nvidia and AMD. The reference GTX 780 Ti drew 229 watts average and 269 watts peak. The reference R9 290X drew 246 watts average and 282 watts peak. The difference is irrelevant on the power bill for most gamers.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_780_Ti/25.html

http://www.techpowerup.com/reviews/AMD/R9_290X/25.html

Do you really believe that the 390X will draw 150 watts more than the Titan X? Looking at the review on the Titan X, that would mean the 390X would draw 373 watts average and 393 watts peak.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/27.html

Can we just stick to the tech and not be hysterical please?
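To put actual numbers on the power-bill point, here is a back-of-the-envelope sketch (the 20 hours/week and 12 cents/kWh figures are the ones quoted earlier in the thread; 17 W is the 290X-vs-780 Ti average gap above):

#include <cstdio>

// Monthly cost of a given wattage gap while gaming.
// Assumptions: 20 h/week of gaming, $0.12 per kWh (US average).
int main()
{
    const double hoursPerWeek  = 20.0;
    const double weeksPerMonth = 52.0 / 12.0; // ~4.33
    const double dollarsPerKWh = 0.12;

    // 17 W = measured 290X vs 780 Ti average gap; 50 W and 150 W
    // are the two figures being argued about in this thread.
    for (double deltaWatts : {17.0, 50.0, 150.0})
    {
        double kWhPerMonth = deltaWatts / 1000.0 * hoursPerWeek * weeksPerMonth;
        printf("%3.0f W extra -> $%.2f per month\n",
               deltaWatts, kWhPerMonth * dollarsPerKWh);
    }
}

Even the worst-case 150 W scenario works out to roughly $1.56 a month.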
 
Joined
Sep 5, 2004
Messages
1,956 (0.27/day)
Location
The Kingdom of Norway
Processor Ryzen 5900X
Motherboard Gigabyte B550I AORUS PRO AX 1.1
Cooling Noctua NB-U12A
Memory 2x 32GB Fury DDR4 3200mhz
Video Card(s) PowerColor Radeon 5700 XT Red Dragon
Storage Kingston FURY Renegade 2TB PCIe 4.0
Display(s) 2x Dell U2412M
Case Phanteks P400A
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850 (from 2012)
Software Windows 10?
If it's Fury, they might have two cards up their sleeves, like a "Radeon Fury" and a "Radeon Fury Maxx" (dual GPU).
Why do I think there will be a dual-GPU one? Why not? HBM will save like 94% of the board area, and if they use watercooling they can cool it pretty easily. Given the fact that HBM also uses a lot less power for the speed, and if they underclock each GPU, it's a viable option.
 
Joined
Sep 5, 2004
Messages
1,956 (0.27/day)
Location
The Kingdom of Norway
Processor Ryzen 5900X
Motherboard Gigabyte B550I AORUS PRO AX 1.1
Cooling Noctua NB-U12A
Memory 2x 32GB Fury DDR4 3200mhz
Video Card(s) PowerColor Radeon 5700 XT Red Dragon
Storage Kingston FURY Renegade 2TB PCIe 4.0
Display(s) 2x Dell U2412M
Case Phanteks P400A
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850 (from 2012)
Software Windows 10?
Given the fact they use an interposer now, it might even be possible to put a pair of GPUs on that together with HBM memory..
Think of the awesome perf of such a card.
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Given the fact they use an interposer now, it might even be possible to put a pair of GPUs on that together with HBM memory.
Unlikely at this stage. The interposer for Fiji scales out to ~830-860 mm² judging from available metrics. The cost and yield rate for such a low-production part at least twice that size (remember that the interposer module assembled by UMC includes the whole package - HBM stacks and GPU) would likely be prohibitive for the size and production run, and that's assuming UMC could produce viable interposers of that size in the first place, which seems doubtful.

There wouldn't be anything to stop AMD devising a dual card with dual interposer modules as far as I'm aware.
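For a rough sense of why one giant doubled-up interposer is the sticking point (a back-of-the-envelope sketch; the 26 mm x 33 mm field is the standard single-exposure lithography reticle limit, not a UMC-confirmed figure):

#include <cstdio>

// Compare the estimated Fiji interposer to the standard reticle field.
int main()
{
    const double reticleMm2        = 26.0 * 33.0; // ~858 mm^2 per exposure
    const double fijiInterposerMm2 = 845.0;       // midpoint of the ~830-860 mm^2 estimate above
    const double dualGpuMm2        = 2.0 * fijiInterposerMm2;

    printf("Reticle limit:          %4.0f mm^2\n", reticleMm2);
    printf("Fiji interposer (est.): %4.0f mm^2 (already near the limit)\n", fijiInterposerMm2);
    printf("Dual-GPU interposer:    %4.0f mm^2 (would need reticle stitching)\n", dualGpuMm2);
}

Hence two interposer modules on one board looks far more plausible than one double-size interposer.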
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Let's have a look at the last flagships from Nvidia and AMD. The reference GTX 780 Ti drew 229 watts average and 269 watts peak. The reference R9 290X drew 246 watts average and 282 watts peak. The difference is irrelevant on the power bill for most gamers.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_780_Ti/25.html

http://www.techpowerup.com/reviews/AMD/R9_290X/25.html

Do you really believe that the 390X will draw 150 watts more than the Titan X? Looking at the review on the Titan X, that would mean the 390X would draw 373 watts average and 393 watts peak.

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/27.html

Can we just stick to the tech and not be hysterical please?

Indeed, makes you wonder why people kicked up such a stink about the GTX 480, pesky double standards.
 
Joined
Jun 13, 2012
Messages
1,316 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
Do you really believe that the 390X will draw 150 watts more than the Titan X? Looking at the review on the Titan X, that would mean the 390X would draw 373 watts average and 393 watts peak.

The other GCN 1.2 card, the 285, has half the cores Fury has and drew 200 watts. AMD has now doubled that, so 350-400 watts is a very possible draw for the card. Remember, they are shipping the damn thing with a water cooler on it.
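For what it's worth, naive linear scaling from those Tonga numbers lands even higher (a sketch using the commonly cited shader counts; it deliberately ignores the power HBM saves over GDDR5 and any binning/voltage differences, so treat it as an upper bound):

#include <cstdio>

// Naive estimate: scale R9 285 (Tonga) board power linearly with
// shader count to guess at Fiji. Ignores HBM vs GDDR5 memory power,
// clocks, voltage and binning.
int main()
{
    const double tongaWatts   = 200.0;  // figure claimed above
    const double tongaShaders = 1792.0; // R9 285
    const double fijiShaders  = 4096.0; // rumoured full Fiji

    printf("Naive Fiji estimate: %.0f W\n",
           tongaWatts * fijiShaders / tongaShaders); // ~457 W
}

So even 350-400 W assumes Fiji claws a fair chunk back from HBM and binning.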
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Indeed, makes you wonder why people kicked up such a stink about the GTX 480, pesky double standards.

The tech sites definitely had a field day on that GPU. :)
 
Joined
Mar 10, 2010
Messages
11,878 (2.31/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
People like a debate, it's the way. :)

If you're going to spread FUD it might as well be viable, sooo if a Fury is a 390X with stacked memory, that might allow four derivatives: 8GB/4GB, each watercooled or air.

Then another two versions with GDDR5, so says the rumour mill. Sounds odd to me, so some of this has to be nonsense.


I think the 8GB watercooled one will be the Fury. Dunno if I'm buying or not, I might hold off.
 
Joined
Sep 29, 2013
Messages
97 (0.03/day)
Processor Intel i7 4960x Ivy-Bridge E @ 4.6 Ghz @ 1.42V
Motherboard x79 AsRock Extreme 11.0
Cooling EK Supremacy Copper Waterblock
Memory 65.5 GBs Corsair Platinum Kit @ 666.7Mhz
Video Card(s) PCIe 3.0 x16 -- Asus GTX Titan Maxwell
Storage Samsung 840 500GBs + OCZ Vertex 4 500GBs 2x 1TB Samsung 850
Audio Device(s) Soundblaster ZXR
Power Supply Corsair 1000W
Mouse Razer Naga
Keyboard Corsair K95
Software Zbrush, 3Dmax, Maya, Softimage, Vue, Sony Vegas Pro, Acid, Soundforge, Adobe Aftereffects, Photoshop
I only care about relative power draw. If I can get 50% more perf for the same power draw, that's all good. I'd rather have 50% more perf at the same power than the same perf at 50% power. Titan X disappointed (at stock) in that regard, as its perf wasn't 'awesome' (but its power wasn't bad).

If 'Fury' (if that is to be it, though it sounds too much like the new Mad Max movie title) delivers the perf I want to see at the same power draw, it's the way forward for me. And I must add, it looks more and more as if the 980 Ti is being thrust out fast to make a quick buck before the AMD card drops and takes the crown. BUT, if the AMD card doesn't take the crown, it'll be a very bad day (IMO) for AMD. Like I say though, I feel Nvidia are rushing a bit too much here, so I'm erring towards AMD being the better option.


Source:
Hagedoorn, Hilbert, 05-30-2015, EVGA GeForce GTX 980 Ti SKU lineup leaks with prices:
http://www.guru3d.com/news-story/evga-leaks-geforce-gtx-980-ti-sku-lineup-and-prices.html

The above is bolded to support my point in this post. With the NVidia GTX 980 Ti going for between $799.99 and $827.99, for the drop in performance relative to a GTX Titan X, and taking into consideration what I've bolded above, it's a safe bet that this will be true. Also, this is the worst case scenario, but it also takes into account that the GTX 980 Ti was never going to beat the AMD R9 390X aka Rage by much on performance differences or averages.

I've always suspected, ahead of the announcement of D3D12.0, that on the CPU side there has always been a bottleneck of some form. This is what brought performance down for PC games. On the GPU side, I believe it's the framebuffer speeds, while taking into account that there are other areas besides the CPU bottleneck and framebuffer speeds that drop performance. Other areas that slow down performance: the CPU does a lot of other rendering work like shadows, particle effects, physics, etc... If these things could be offloaded to the GPU, performance would go up even further. D3D12.0 aka DirectX 12.0 is a solution to the CPU bottleneck, and in some ways, utilization of HBM is a solution to the gain in performance for discrete graphics cards. Implementing HBM now is going to be to AMD's benefit for this generation, and this is probably the defining x-factor that's making NVidia sell their units faster in comparison to previous generations. So yes, in conclusion, NVidia will possibly start thrusting out their products faster, this generation, before AMD goes live with their flavor of the generation, to make a quick buck.


Back to the post:

The Rage branding is nothing more than a marketing gimmick. Gimmicks don't sell when there's a history of lacking performance and issues, as with AMD's past. It works for NVidia because they have a track record of churning out products that work, or have a higher rate of not failing, or consumers are happier with NVidia products over AMD products. NVidia sells on gimmicks because it has a past history of delivering or meeting expectations probably 75% to 90% of the time. This is why NVidia can sell products at a higher price on marketing gimmicks. AMD, I feel you are getting a little ahead of yourself with this. Build a reputation of products that work, that bring the performance, and are idiot-proof for consumers. Once you make a big deal about that, then you can start selling your consumers the b.s. that NVidia has been good at for the past 10 years.

Just for the record, Uranus or Ouranos isn't a Titan. He's a Primordial God. The Titans are just the reject inbreeds that were brought into existence between the Primordial Gods. A lot of people believe that Uranus is related to the Zodiac Gemini because Ouranos represents Heaven, and Heaven is in the air. This ideology is false. What Gemini represents as a Zodiac, or a connection to God or Gods, is the ideology of Duality. Duality is exerted or intertwined on everything. If Taurus represents Mother Earth aka Gaia (Gaea), then Uranus is possibly the dualistic opposite of Gaia.

The whole Titan history and mythology is just an interpretation of how things that they (pre-Greeks) couldn't explain and didn't fully understand came into existence. In addition, Greek mythology is just refined religion and information that they got from the Mesopotamians (Babylonians and Assyrians). The Hebrews did it with the Abraham-God in relation to the Babylonian god Anu (who is the chief god, the god who created the Heavens and Creation itself).

For NVidia, the Titan name-brand is nothing more than a marketing gimmick. AMD is just following NVidia's idea of it with the Rage gimmick.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
The tech sites definitely had a field day on that GPU. :)

Even AMD did one of their dodgy videos about it, which from what I gather just confirmed how well it sold. :p
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
If you're going to spread FUD it might as well be viable, sooo if a Fury is a 390X with stacked memory, that might allow four derivatives: 8GB/4GB, each watercooled or air.
You're in bonus territory right there.
Aren't the Fury and the 390X two completely different SKUs? The Fury (Fiji-based) with 4GB of HBM (as AMD themselves have confirmed regarding the 4GB limitation), while the 390X is a reworked Hawaii-based card - essentially the 290X with 4GB or 8GB of GDDR5?
The Rage branding is nothing more than a marketing gimmick. Gimmicks don't sell when there's a history of lacking performance and issues, as with AMD's past.
They can sell, it just depends upon how they are marketed, and what status the brand already enjoys.
It works for NVidia because they have a track record of churning out products that work, or have a higher rate of not failing, or consumers are happier with NVidia products over AMD products. NVidia sells on gimmicks because it has a past history of delivering or meeting expectations probably 75% to 90% of the time.
Comes down to brand awareness - what is generally referred to as TOMA, Top of Mind Awareness. Market leaders tend to get that way just as much by public perception as by products. Lead or create a market rather than follow another company's strategy; build a company persona based on the company's products and services rather than comparing with the competition (when was the last time Intel publicly compared any of its products with AMD's?). Being perceived as a market leader is half the battle. The other half is building a successful product/service base, as you've said, and showing the market well-defined strategies with consistent meeting of strategic goals. Changing company emphasis at the drop of a hat, or revolving-door management changes, tends to convey an impression of indecisiveness - not a quality associated with a TOMA brand.
For NVidia, the Titan name-brand is nothing more than a marketing gimmick.
Pretty much. The company saw that their product lines were seeing significant uptake in other markets (desktop "supercomputers", rendering, etc.) with boards like the GTX 580, and slapped on a new naming nomenclature designed to leverage a high price with an aura of exclusivity. It seems to have worked...
AMD is just following NVidia's idea of it with the Rage gimmick.
...bearing in mind the old adage, "Imitation is the sincerest form of flattery".
"Fury" leverages a name revered in the annals of ATI, so why not go with it? As to whether it succeeds, that might depend upon how it fares in the benchmark war. Could be "Radeon Fury cowers Nvidia" , or " Radeon Fury at not being top dog"....or more likely, given previous releases between the companies, the name will engender little if the benchmarks are split between vendors depending upon title, game IQ, and resolution. The fact that the 980 Ti is being clocked by vendors at 1200MHz+ out of the box, might suggest a close race.
 
Joined
May 9, 2012
Messages
8,380 (1.93/day)
Location
Ovronnaz, Wallis, Switzerland
System Name main/SFFHTPCARGH!(tm)/Xiaomi Mi TV Stick/Samsung Galaxy S23/Ally
Processor Ryzen 7 5800X3D/i7-3770/S905X/Snapdragon 8 Gen 2/Ryzen Z1 Extreme
Motherboard MSI MAG B550 Tomahawk/HP SFF Q77 Express/uh?/uh?/Asus
Cooling Enermax ETS-T50 Axe aRGB /basic HP HSF /errr.../oh! liqui..wait, no:sizable vapor chamber/a nice one
Memory 64gb Corsair Vengeance Pro 3600mhz DDR4/8gb DDR3 1600/2gb LPDDR3/8gb LPDDR5x 4200/16gb LPDDR5
Video Card(s) Hellhound Spectral White RX 7900 XTX 24gb/GT 730/Mali 450MP5/Adreno 740/RDNA3 768 core
Storage 250gb870EVO/500gb860EVO/2tbSandisk/NVMe2tb+1tb/4tbextreme V2/1TB Arion/500gb/8gb/256gb/2tb SN770M
Display(s) X58222 32" 2880x1620/32"FHDTV/273E3LHSB 27" 1920x1080/6.67"/AMOLED 2X panel FHD+120hz/FHD 120hz
Case Cougar Panzer Max/Elite 8300 SFF/None/back/back-front Gorilla Glass Victus 2+ UAG Monarch Carbon
Audio Device(s) Logi Z333/SB Audigy RX/HDMI/HDMI/Dolby Atmos/KZ x HBB PR2/Edifier STAX Spirit S3 & SamsungxAKG beans
Power Supply Chieftec Proton BDF-1000C /HP 240w/12v 1.5A/4Smart Voltplug PD 30W/Asus USB-C 65W
Mouse Speedlink Sovos Vertical-Asus ROG Spatha-Logi Ergo M575/Xiaomi XMRM-006/touch/touch
Keyboard Endorfy Thock 75% <3/none/touch/virtual
VR HMD Medion Erazer
Software Win10 64/Win8.1 64/Android TV 8.1/Android 13/Win11 64
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
Source:
Hagedoorn, Hilbert, 05-30-2015, EVGA GeForce GTX 980 Ti SKU lineup leaks with prices:
http://www.guru3d.com/news-story/evga-leaks-geforce-gtx-980-ti-sku-lineup-and-prices.html

The above is bolded to support my point in this post. With the NVidia GTX 980 Ti going for between $799.99 and $827.99, for the drop in performance relative to a GTX Titan X, and taking into consideration what I've bolded above, it's a safe bet that this will be true. Also, this is the worst case scenario, but it also takes into account that the GTX 980 Ti was never going to beat the AMD R9 390X aka Rage by much on performance differences or averages.

I've always suspected, ahead of the announcement of D3D12.0, that on the CPU side there has always been a bottleneck of some form. This is what brought performance down for PC games. On the GPU side, I believe it's the framebuffer speeds, while taking into account that there are other areas besides the CPU bottleneck and framebuffer speeds that drop performance. Other areas that slow down performance: the CPU does a lot of other rendering work like shadows, particle effects, physics, etc... If these things could be offloaded to the GPU, performance would go up even further. D3D12.0 aka DirectX 12.0 is a solution to the CPU bottleneck, and in some ways, utilization of HBM is a solution to the gain in performance for discrete graphics cards. Implementing HBM now is going to be to AMD's benefit for this generation, and this is probably the defining x-factor that's making NVidia sell their units faster in comparison to previous generations. So yes, in conclusion, NVidia will possibly start thrusting out their products faster, this generation, before AMD goes live with their flavor of the generation, to make a quick buck.


Back to the post:

The Rage branding is nothing more than a marketing gimmick. Gimmicks don't sell when there's a history of lacking performance and issues, as with AMD's past. It works for NVidia because they have a track record of churning out products that work, or have a higher rate of not failing, or consumers are happier with NVidia products over AMD products. NVidia sells on gimmicks because it has a past history of delivering or meeting expectations probably 75% to 90% of the time. This is why NVidia can sell products at a higher price on marketing gimmicks. AMD, I feel you are getting a little ahead of yourself with this. Build a reputation of products that work, that bring the performance, and are idiot-proof for consumers. Once you make a big deal about that, then you can start selling your consumers the b.s. that NVidia has been good at for the past 10 years.

Just for the record, Uranus or Ouranos isn't a Titan. He's a Primordial God. The Titans are just the reject inbreeds that were brought into existence between the Primordial Gods. A lot of people believe that Uranus is related to the Zodiac Gemini because Ouranos represents Heaven, and Heaven is in the air. This ideology is false. What Gemini represents as a Zodiac, or a connection to God or Gods, is the ideology of Duality. Duality is exerted or intertwined on everything. If Taurus represents Mother Earth aka Gaia (Gaea), then Uranus is possibly the dualistic opposite of Gaia.

The whole Titan history and mythology is just an interpretation of how things that they (pre-Greeks) couldn't explain and didn't fully understand came into existence. In addition, Greek mythology is just refined religion and information that they got from the Mesopotamians (Babylonians and Assyrians). The Hebrews did it with the Abraham-God in relation to the Babylonian god Anu (who is the chief god, the god who created the Heavens and Creation itself).

For NVidia, the Titan name-brand is nothing more than a marketing gimmick. AMD is just following NVidia's idea of it with the Rage gimmick.

Except that the branding is Fury and not Rage.
 
Joined
Sep 15, 2011
Messages
6,457 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Source:
Hagedoorn, Hilbert, 05-30-2015, EVGA GeForce GTX 980 Ti SKU lineup leaks with prices:
http://www.guru3d.com/news-story/evga-leaks-geforce-gtx-980-ti-sku-lineup-and-prices.html

The above is bolded to support my point in this post. With the NVidia GTX 980 Ti going for between $799.99 and $827.99, for the drop in performance relative to a GTX Titan X, and taking into consideration what I've bolded above, it's a safe bet that this will be true. Also, this is the worst case scenario, but it also takes into account that the GTX 980 Ti was never going to beat the AMD R9 390X aka Rage by much on performance differences or averages.

My guess is that this 390X will sell for 850$ or more...

There you go. This is how prices evolved; in only a couple of years they literally DOUBLED the price for the top cards. Greedy bastards!
 
Joined
May 9, 2012
Messages
8,380 (1.93/day)
Location
Ovronnaz, Wallis, Switzerland
System Name main/SFFHTPCARGH!(tm)/Xiaomi Mi TV Stick/Samsung Galaxy S23/Ally
Processor Ryzen 7 5800X3D/i7-3770/S905X/Snapdragon 8 Gen 2/Ryzen Z1 Extreme
Motherboard MSI MAG B550 Tomahawk/HP SFF Q77 Express/uh?/uh?/Asus
Cooling Enermax ETS-T50 Axe aRGB /basic HP HSF /errr.../oh! liqui..wait, no:sizable vapor chamber/a nice one
Memory 64gb Corsair Vengeance Pro 3600mhz DDR4/8gb DDR3 1600/2gb LPDDR3/8gb LPDDR5x 4200/16gb LPDDR5
Video Card(s) Hellhound Spectral White RX 7900 XTX 24gb/GT 730/Mali 450MP5/Adreno 740/RDNA3 768 core
Storage 250gb870EVO/500gb860EVO/2tbSandisk/NVMe2tb+1tb/4tbextreme V2/1TB Arion/500gb/8gb/256gb/2tb SN770M
Display(s) X58222 32" 2880x1620/32"FHDTV/273E3LHSB 27" 1920x1080/6.67"/AMOLED 2X panel FHD+120hz/FHD 120hz
Case Cougar Panzer Max/Elite 8300 SFF/None/back/back-front Gorilla Glass Victus 2+ UAG Monarch Carbon
Audio Device(s) Logi Z333/SB Audigy RX/HDMI/HDMI/Dolby Atmos/KZ x HBB PR2/Edifier STAX Spirit S3 & SamsungxAKG beans
Power Supply Chieftec Proton BDF-1000C /HP 240w/12v 1.5A/4Smart Voltplug PD 30W/Asus USB-C 65W
Mouse Speedlink Sovos Vertical-Asus ROG Spatha-Logi Ergo M575/Xiaomi XMRM-006/touch/touch
Keyboard Endorfy Thock 75% <3/none/touch/virtual
VR HMD Medion Erazer
Software Win10 64/Win8.1 64/Android TV 8.1/Android 13/Win11 64
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
My guess is that this 390X will sell for 850$ or more...

There you go. This is how prices evolved; in only a couple of years they literally DOUBLED the price for the top cards. Greedy bastards!
Yeah, only the price of the hardware for the customer doubled ... not the manufacturing cost...
In the end, who is the greedy bastard? (not directed at you, but more the general picture) The customer who wants the most expensive piece of hardware at half the price, or the manufacturer who needs to sell the hardware at a price dictated by supply and demand and general manufacturing cost?
I am of the 1st kind ... I paid 1/3 of a 290's initial launch price for a 290 ... but 2nd hand and with 3 weeks of use ... well ... greedy or not, who cares as long as you have the means to get it cheaper; yep, indeed humans are greedy in general :)


Plus, if that card will be 850$ it's still a whopping 15% less than a Titanic .... whoops, sorry, Titan X, and if it manages to beat it at a cost of more than 850$ then ... it's not a problem xD as long as the price is under a Titanic ... (argh, Titan X, sorry, I got an ICEBERG... ahem, I mean a cold ... I can't think straight) we are ok :roll:

Sidenote ... could we drop the analogies to mythological stories? It means nothing that AMD chose to retake the name of the 3 sisters born of the castration of the father of the Titans (well ... does it mean they intend to castrate nvidia? ... :rolleyes: ) since that branding from ATI originates from way before the Titan came along... (well, they did nut-kick nvidia more than one time ...)
 