
Radeon R9 380X Based on "Grenada," a Refined "Hawaii"

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Who cares about efficiency, really? If they can still manage noise levels, I don't really care. Efficiency mattered for miners, but you don't play games 24/7... or do you?
I care that the 970 has a multi-monitor idle consumption of <5-watts and the 290 is closer to 55-watts. So, yes. People like me, who aren't gaming most of the time (but do still game regularly) but are using multiple monitors for productivity reasons, do care a little bit about power usage, as it adds up over time. Is it a huge factor? No. Is it one worth considering? Sure.

Also, higher efficiency would mean lower temps or more overhead for higher clocks, which is never a bad thing.
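
For a rough sense of how an idle-draw gap like that adds up, here's a minimal sketch; the wattage gap, idle hours, and electricity rate are illustrative assumptions, not measurements:

```python
# Rough yearly cost of a multi-monitor idle-draw gap between two cards.
# All figures below are illustrative assumptions, not measured numbers.
IDLE_GAP_W = 50        # e.g. ~55 W multi-monitor idle vs. ~5-10 W
HOURS_PER_DAY = 12     # hours the desktop sits idling rather than gaming
RATE_PER_KWH = 0.12    # USD, rough average electricity rate

kwh_per_year = IDLE_GAP_W * HOURS_PER_DAY * 365 / 1000
cost_per_year = kwh_per_year * RATE_PER_KWH
print(f"{kwh_per_year:.0f} kWh/year, about ${cost_per_year:.2f}/year")
# -> 219 kWh/year, about $26.28/year
```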
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Your GTX 970 is NEVER using just 5W when idling on a desktop...
My bad, it's 10 watts. Still 40 lower... Also, I find it funny that you say "your 970" when I don't own one; I'm pretty sure my specs still say I'm rocking two 6870s, which also suck at multi-monitor power consumption. :)

All I'm saying is that it's a selling point and nothing bad comes out of improving efficiency.
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
Because a multi-monitor setup is somehow what the majority of users have? And if you can afford a GTX 970 and two monitors, surely those few watts of difference make zero real-world difference.

Now show me the single-monitor difference, something the large majority of users have...
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,147 (2.94/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Because a multi-monitor setup is somehow what the majority of users have? And if you can afford a GTX 970 and two monitors, surely those few watts of difference make zero real-world difference.

Now show me the single-monitor difference, something the large majority of users have...
Why would I care what other users want when I'm looking at a GPU for myself? I think you misread my post...
I care that the 970 has a multi-monitor idle consumption of <5-watts and the 290 is closer to 55-watts.

I never said most people have multiple monitors... but I do. You already know the answer to your own question, too; I don't appreciate the rhetorical questions. It's not a lot of money, but it's easily a 12-pack of beer every month or two that I wouldn't otherwise have. And while the GPU alone might not make a difference, inefficiencies add up, and they contribute to the amount of heat in your machine. If you want a quiet machine, you're not going to get one by eating power for breakfast. The simple fact is that it doesn't come down only to the cost of electricity: an HTPC with a 200-watt idle like my tower wouldn't be a very quiet HTPC, now would it?
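
To put the heat/noise side of that into rough numbers, here's a minimal sketch of the steady-state air temperature rise a case has to cope with at a given airflow; the idle wattages and the 30 CFM figure are illustrative assumptions:

```python
# Every watt a PC draws at idle ends up as heat the case airflow must remove.
# Estimate the exhaust-air temperature rise for a low-airflow HTPC-style build.
RHO_AIR = 1.2          # air density, kg/m^3
CP_AIR = 1005.0        # specific heat of air, J/(kg*K)
CFM_TO_M3S = 0.000471947

def exhaust_temp_rise(power_w, airflow_cfm):
    """Steady-state temperature rise of the air passing through the case."""
    m_dot = RHO_AIR * airflow_cfm * CFM_TO_M3S   # mass flow of air, kg/s
    return power_w / (m_dot * CP_AIR)

for idle_w in (60, 200):   # assumed idle draws for a lean vs. hungry system
    print(f"{idle_w} W idle at 30 CFM -> ~{exhaust_temp_rise(idle_w, 30):.1f} °C rise")
# -> 60 W: ~3.5 °C rise; 200 W: ~11.7 °C rise
```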
 
Joined
Dec 15, 2011
Messages
1,962 (0.43/day)
Location
Sydney NSW, Australia
System Name Shoebox Sandwich | Razer Blade 14 | Synology DS418 w/ 4X8TB WD Red | iPhone 15 Pro
Processor AMD Ryzen 7 7800X3D | AMD Ryzen 9 5900HX
Motherboard ASRock B650E PG-ITX WiFi | Razer thing
Cooling NZXT Kraken 240 RGB w/ Lian Li P28 + Corsair AF120 Slim | Double fan vapour chamber
Memory G.Skill Flare X5 2x16GB 6000MHz CL32 DDR5 | 16GB 3200MHz DDR4
Video Card(s) Sapphire Radeon RX 7900 XTX | NVIDIA GeForce RTX 3070 8GB
Storage Seagate FireCuda 520 2TB + Lexar NM790 4TB | 1TB NVMe SSD + 2TB NVMe external SSD
Display(s) Alienware AW3423DW & Sony A90K 42" | 14" QHD IPS 165Hz & Sony X90J 65" 4K TV
Case FormD T1 - v2.1 - Titanium colourway w Aluminium mesh | Razer Blade 14
Audio Device(s) Schiit Hel 2E (Focal Clear & HiFiMAN HE-4XX & DROP PC38X) | Samsung Q950A
Power Supply Corsair SF750 Platinum | Razer 230W brick
Mouse Razer Viper Ultimate Mercury | Roccat Kain 202
Keyboard DIY Geek 64 PCB (Kailh Box Deep Sea - Matcha XDA Keycaps) | Corsair K63
VR HMD Oculus Quest
Software Windows 11 Home | Windows 11 Home
Benchmark Scores 2022 MINI Countryman SE PHEV Hybrid ALL4 | British Racing Green| Malt Brown Interior| YOURS Trim
Who cares about efficiency, really? If they can still manage noise levels, I don't really care. Efficiency mattered for miners, but you don't play games 24/7... or do you?
I care about efficiency as long as it doesn't come at an ungodly increase in price for the end product. The reason is that a more efficient graphics card produces less heat for the same work, so its cooler's fans can run slower (and therefore quieter) than on a hungrier card that has to spin its fan(s) at higher rpm (noisier) to reach the same or similar temperatures. At the high end of the spectrum it seems that liquid cooling could become the norm for managing temperatures at a reasonable noise level (e.g. the 295X2). That being said, if I were into custom liquid cooling, I probably would not care all that much about efficiency, as it would be pretty easy to add more rads.
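
A minimal sketch of that relationship: assuming a fixed target temperature, the airflow a cooler needs scales roughly linearly with the heat it has to remove, and airflow scales roughly linearly with fan rpm (fan affinity laws). The wattages and baseline rpm below are made-up illustrative numbers, not measurements of any real card:

```python
# Estimate how fan speed scales with card power at a fixed target temperature.
# Baseline figures are illustrative assumptions, not data for a specific cooler.
def required_rpm(card_power_w, baseline_power_w=165.0, baseline_rpm=1200.0):
    """Fan rpm needed relative to a baseline card dissipating baseline_power_w."""
    return baseline_rpm * (card_power_w / baseline_power_w)

for watts in (145, 165, 250):
    print(f"{watts:>3} W card -> roughly {required_rpm(watts):.0f} rpm")
# -> 145 W: ~1055 rpm, 165 W: ~1200 rpm, 250 W: ~1818 rpm
```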
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.65/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
For Nvidia, GTX 760 is a GTX 680 rebrand. GTX 960 is a GTX 780 rebrand.

The GTX 770 was a refresh of the GTX 680. The GTX 960 is a totally different architecture from the GTX 780. When the GM210 drops, we will see that Nvidia intended the 960, 970, and 980 to be mid-range GPUs, even though the 980 is the fastest GPU so far in the Maxwell lineup.

I've been reading some rumors that say the R9 380X will be more power efficient, with the same number of cores as the R9 290X and more room to push the clocks higher thanks to the improved efficiency; if that holds, it should take the title of fastest GPU around from Nvidia.
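
For a rough sense of what "same core count, higher clocks" buys in raw throughput, here's a minimal back-of-envelope sketch; the 1200 MHz clock for the rumored 380X is purely an assumption for illustration:

```python
# Back-of-envelope single-precision throughput: shaders x 2 ops/clock x clock.
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1e6

print(f"R9 290X, 2816 SPs @ 1000 MHz:        {tflops(2816, 1000):.1f} TFLOPS")
print(f"GTX 980, 2048 cores @ 1126 MHz base: {tflops(2048, 1126):.1f} TFLOPS")
print(f"Rumored 380X, 2816 SPs @ 1200 MHz:   {tflops(2816, 1200):.1f} TFLOPS (assumed clock)")
# -> roughly 5.6, 4.6 and 6.8 TFLOPS respectively
```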
 
Joined
May 13, 2008
Messages
669 (0.11/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
It would have been nice to see a 20nm chip. It probably could have lowered the power draw anyway, but maybe HBM is more efficient and will help out. I have not seen anything about the efficiency...

Yeah, but understandable given cost/yields/etc. 20nm would have likely saved 20% on the GPU side. HBM should save about 15 W on the memory side (vs. the old product; granted, this is faster).

640GB/sec, why? Unless AMD is planning to address the "4K problem" by swapping out textures all the time, I don't see any benefit to this, and lots of drawbacks (price being one of them). Considering nVIDIA's GPUs have always been able to match AMD's for performance while using much narrower bus widths (Hawaii: 512-bit, Maxwell: 256-bit), I'm not seeing any good reason, unless of course AMD's architecture is far more bandwidth-dependent than nVIDIA's.

Okay, that's a loaded question. The short-ish answer is yes, AMD's arch is more memory dependent because of (typically) greater available floating-point/shader resources (AMD's units do double duty for special functions as well, whereas Nvidia uses fewer FPUs and smaller special-function units at a ratio that is often close to fully utilized in many scenarios), plus the fact that around 1/3 of what would be Nvidia's similar required bandwidth comes from a mixture of greater on-die cache and whatever benefits their newer compression allows. If I had to guess, the split on that is something like 75% cache improvements, 25% compression improvements. IOW, the compression improvements help around 8% or slightly more, just like Tonga does for AMD.
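
As a quick illustration of the bus-width comparison raised above: peak memory bandwidth is just bus width times effective data rate. A minimal sketch, where the HBM per-pin rate is an assumption backed out of the 640 GB/s rumor rather than a confirmed spec:

```python
# Peak memory bandwidth = bus width (bits) * effective data rate (Gbps) / 8.
def bandwidth_gbs(bus_bits, data_rate_gbps):
    return bus_bits * data_rate_gbps / 8

print(f"Hawaii, 512-bit GDDR5 @ 5 Gbps:   {bandwidth_gbs(512, 5):.0f} GB/s")
print(f"GM204, 256-bit GDDR5 @ 7 Gbps:    {bandwidth_gbs(256, 7):.0f} GB/s")
print(f"Rumored 4096-bit HBM @ 1.25 Gbps: {bandwidth_gbs(4096, 1.25):.0f} GB/s")
# -> 320, 224 and 640 GB/s
```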

The really odd thing about this lineup is what AMD expects to field in the discrete mobile arena. Presently, the top part is Pitcairn based (M290X), in its third generation of cards. The M295X's (Tonga) heat production in the iMac probably precludes its use in laptops, and Hawaii is clearly unsuitable.

That's where HBM starts. Better to have too much bandwidth than too little.

Hey, someone has to lead the charge. Just imagine the marketing mileage from 640GB/sec. It's like those nutty theoretical fillrates pumped up to eleven!

Fiji will do double duty as a compute chip, where on-card bandwidth will play a much greater role in GPGPU. FWIW, even Nvidia are unlikely to go below 384-bit for their compute chip. The one thing that will hold Fiji back is the 4GB as a FirePro (not the bandwidth). AMD already has the W9100 with 16GB of onboard GDDR5 for a reason.

First, never say never... IIRC Nvidia sold big-ass Fermi (granted, cut wayyy down and clocked in the basement) in laptops.

Right about HBM... plus, if they shrink/reconfigure the core design on 14/16nm for 50% more resources, the memory controller could probably be reasonably recycled... it's also possible they could cut cache or whatnot because of that off-die bandwidth. Not saying they did/will... but who knows? It's possible it's there for more than being ahead of its time (but a necessary evil) on bandwidth and behind its time on density configuration. Even if they didn't change anything, it should be good for a good extra chunk of performance (double the required bandwidth typically gives around a 16% boost... this in essence could give something like 8% over what one might expect given the other specs and typical core usage).

Either way you look at it, this thing *has* to compete with a GM200 21-SMM part. Say that can do 1400MHz best-case; that essentially means this has to do 1200 to compete. The bandwidth required for quite literally 10TF is... well... a lot. You'd be talking about needing an 8GHz/512-bit controller, which wouldn't exactly be small or power efficient (if it's even possible within die-size limits). As odd as it sounds, twice the controllers at (what apparently amounts to) 5GHz is likely both fewer transistors and more efficient within the GPU logic.
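
Working those numbers through as a minimal sketch; the 21-SMM configuration, both clocks, and the 4096-shader Fiji figure are the assumptions used above and in the rumor mill, not confirmed specs:

```python
# Rough throughput/bandwidth math for the comparison above. All core counts
# and clocks here are rumor/assumption, not confirmed specifications.
def tflops(shaders, clock_mhz):
    # single precision: 2 FLOPs per shader per clock
    return shaders * 2 * clock_mhz / 1e6

def bandwidth_gbs(bus_bits, data_rate_gbps):
    return bus_bits * data_rate_gbps / 8

print(f"GM200, 21 SMM (2688 cores) @ 1400 MHz: {tflops(2688, 1400):.1f} TFLOPS")
print(f"Rumored Fiji, 4096 SPs @ 1200 MHz:     {tflops(4096, 1200):.1f} TFLOPS")
print(f"512-bit GDDR5 @ 8 Gbps:                {bandwidth_gbs(512, 8):.0f} GB/s")
print(f"4096-bit HBM @ 1.25 Gbps:              {bandwidth_gbs(4096, 1.25):.0f} GB/s")
# -> roughly 7.5 and 9.8 TFLOPS; 512 and 640 GB/s
```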

Don't try comparing the arguments over the 970's memory shenanigans to AMD's next uber chip (Fiji). I'm not clued up on it, but many say HBM only caters for a 4GB memory allowance (for now?...). The 970 is the cheaper Maxwell performance part, whereas the 390X will be the single-GPU top tier.

And yes, those who bought cards with 4GB (or not quite, in the 970's case) would have figured that into their choice. If the 390X is to be AMD's next-gen top-tier card, you would hope it would have more, as AIBs have already seen fit to release an 8GB 290X, albeit with small rewards at 4K.

IMO, I don't know if we need >4GB for gaming purposes except in poorly coded things (look at the recent CoD for bad memory hogging, or Titanfall, IIRC). But if we do need >4GB in the next year or so, I'm pretty sure there will be developments to allow higher memory usage on the AMD chips.

So, to be plain - 4GB won't be an immediate worry and I'm sure it will be addressed when needed.

Correct. HBM is currently 1GB per stack, and the implementation, unlike a GDDR5 setup, is limited to four stacks. That means 4GB. Second gen is due at the end of the year. Does that mean a refresh before 14/16nm? Conceivably... but who knows how fast AMD is transitioning to the smaller process. I continue to disagree about 4GB being enough... if one were to argue things should be properly coded for 4K/8GB (or possibly 6GB in many instances), we could have a conversation. That said, it's not going to stop HBM memory density from increasing, or badly optimized console ports targeted toward that shared pool of memory at a low resolution from being a scaling factor in some regards. I still stand by GM200/R9 390X for the most part being 1440p-targeted chips (think in terms of a 900p 30fps PS4 game scaled to 1440p/60)... just like GM204 is mostly a 1080p-targeted chip. In those respects, it can be argued that 4GB (/6GB in some cases) is sufficient.
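
A minimal sketch of that first-gen limit; the 1GB capacity and 1024-bit interface per stack are the HBM1 figures, while the per-pin rate is an assumption backed out of the 640 GB/s rumor:

```python
# First-generation HBM: four stacks, 1 GB and 1024 bits per stack.
STACKS = 4
GB_PER_STACK = 1
BITS_PER_STACK = 1024
RATE_GBPS = 1.25          # per pin; assumed from the rumored 640 GB/s figure

capacity_gb = STACKS * GB_PER_STACK
bus_bits = STACKS * BITS_PER_STACK
bandwidth_gbs = bus_bits * RATE_GBPS / 8
print(f"{STACKS} stacks -> {capacity_gb} GB, {bus_bits}-bit bus, {bandwidth_gbs:.0f} GB/s")
# -> 4 stacks -> 4 GB, 4096-bit bus, 640 GB/s
```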
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Well, now we seem to be getting a bigger picture of what's going on: it seems the R9 380X is going to be an upgraded R9 290X, similar to how the R9 285 was an upgraded R9 280 (though probably without reducing the memory this time). If that's the case, then the power savings from the improved design are going to be put towards higher overclocks to knock the performance up a bit. The R9 390X sounds cool, but I do agree the 4GB, even with that insane bandwidth, is going to be the part I worry about on the gaming front, especially since it's probably going to be the first chip (depending on when Nvidia releases its Titan II) that can handle 4K on a single card (not perfectly, of course, but decently enough). Hopefully by that point we might see 8GB versions of the cards for the hardcore users out there who want the extra VRAM, though that is questionable given current HBM limits.

Cannot wait to see more of the development!
 
Joined
Mar 10, 2014
Messages
1,793 (0.48/day)
Well, now we seem to be getting a bigger picture of what's going on: it seems the R9 380X is going to be an upgraded R9 290X, similar to how the R9 285 was an upgraded R9 280 (though probably without reducing the memory this time). If that's the case, then the power savings from the improved design are going to be put towards higher overclocks to knock the performance up a bit.

Not really comparable. The R9 280 is GCN 1.0 and the R9 285 is GCN 1.2, two generations more advanced. The R9 290X is GCN 1.1 and the R9 380X seems to still be GCN 1.1, a refined, optimized Hawaii. It's more like the HD 7970 GHz (Tahiti) versus the R9 280X (Tahiti XTL).

The R9 390X sounds cool, but I do agree the 4GB, even with that insane bandwidth, is going to be the part I worry about on the gaming front, especially since it's probably going to be the first chip (depending on when Nvidia releases its Titan II) that can handle 4K on a single card (not perfectly, of course, but decently enough). Hopefully by that point we might see 8GB versions of the cards for the hardcore users out there who want the extra VRAM, though that is questionable given current HBM limits.

Cannot wait to see more of the development!
There are no HBM memories bigger than 1GB out yet. 8GB of memory would need higher-density 2GB HBM packages from Hynix before we get an 8GB card. With the memory available now, it would need a 2048-bit memory interface from the GPU (I don't think you can split memory bandwidth between two HBM memory packages; I could be wrong, though).
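
As a minimal sketch of the capacity side of that, comparing the current 1GB stacks with the higher-density 2GB stacks mentioned above (the layouts are hypothetical illustrations, not announced products):

```python
# Ways to reach 8 GB with stacked memory: more 1 GB stacks (wider bus) or
# denser 2 GB stacks. Purely illustrative; availability is the open question.
def config(stacks, gb_per_stack, bits_per_stack=1024):
    """Return (capacity in GB, total bus width in bits) for a stack layout."""
    return stacks * gb_per_stack, stacks * bits_per_stack

for stacks, density in ((4, 1), (8, 1), (4, 2)):
    capacity, bus = config(stacks, density)
    print(f"{stacks} x {density} GB stacks -> {capacity} GB on a {bus}-bit bus")
# -> 4 x 1 GB: 4 GB / 4096-bit; 8 x 1 GB: 8 GB / 8192-bit; 4 x 2 GB: 8 GB / 4096-bit
```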

In short, the R9 380/X, R9 370/X and R7 260/X (I really hope this isn't rebranded Pitcairn as 3DCenter rumored it to be) are kind of meh, been there, done that. The really interesting parts will be the R9 390 series.
 
Joined
Apr 29, 2014
Messages
4,180 (1.15/day)
Location
Texas
System Name SnowFire / The Reinforcer
Processor i7 10700K 5.1ghz (24/7) / 2x Xeon E52650v2
Motherboard Asus Strix Z490 / Dell Dual Socket (R720)
Cooling RX 360mm + 140mm Custom Loop / Dell Stock
Memory Corsair RGB 16gb DDR4 3000 CL 16 / DDR3 128gb 16 x 8gb
Video Card(s) GTX Titan XP (2025mhz) / Asus GTX 950 (No Power Connector)
Storage Samsung 970 1tb NVME and 2tb HDD x4 RAID 5 / 300gb x8 RAID 5
Display(s) Acer XG270HU, Samsung G7 Odyssey (1440p 240hz)
Case Thermaltake Cube / Dell Poweredge R720 Rack Mount Case
Audio Device(s) Realtec ALC1150 (On board)
Power Supply Rosewill Lightning 1300Watt / Dell Stock 750 / Brick
Mouse Logitech G5
Keyboard Logitech G19S
Software Windows 11 Pro / Windows Server 2016
Not really comparable. The R9 280 is GCN 1.0 and the R9 285 is GCN 1.2, two generations more advanced. The R9 290X is GCN 1.1 and the R9 380X seems to still be GCN 1.1, a refined, optimized Hawaii. It's more like the HD 7970 GHz (Tahiti) versus the R9 280X (Tahiti XTL).
Yes, but my reference was to the fact that it's supposed to include the upgrades currently shown only in Tonga (for instance, the compression method), not so much the size of the generational jump.

There are no HBM memories bigger than 1GB out yet. 8GB of memory would need higher-density 2GB HBM packages from Hynix before we get an 8GB card. With the memory available now, it would need a 2048-bit memory interface from the GPU (I don't think you can split memory bandwidth between two HBM memory packages; I could be wrong, though).

In short, the R9 380/X, R9 370/X and R7 260/X (I really hope this isn't rebranded Pitcairn as 3DCenter rumored it to be) are kind of meh, been there, done that. The really interesting parts will be the R9 390 series.
That's why I said it's "questionable given current HBM limits"; it will only be revealed once we get the full rundown on the stacked memory whether doubling up will be possible at all, or whether we'll be waiting for bigger chips.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
12,945 (2.60/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Razer Huntsman Tournament Edition
Software Windows 11 Pro 64-Bit
I wonder what the pricing will be like on the 390/390x.
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.65/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
I wonder what the pricing will be like on the 390/390x.

Just speculation right now, but the R9 280X launched at $300, the R9 290 at $400, and the R9 290X at $550. I would guess that the 300 series will launch within $50 of the 200 series. I expect the R9 380X will outperform the GTX 980 if the rumors are true. That would be something if AMD launched that GPU at $300-$350 in a few months. $200-$250 cheaper than the 980. Ouch.
 
Joined
Mar 28, 2014
Messages
586 (0.16/day)
Processor AMD FX-8320
Motherboard AsRock 970 PRO3 R2.0
Cooling Thermalright Ultra120 eXtreme + 2 LED Green fans
Memory 2 x 4096 MB DDR3-1333 A-Data
Video Card(s) SAPPHIRE 4096M R9 FURY X 4G D5
Storage ST1000VX000 • SV35.6 Series™ 1000 GB 7200 rpm
Display(s) Acer S277HK wmidpp 27" 4K (3840 x 2160) IPS
Case Cooler Master HAF 912 Plus Black + Red Lights
Audio Device(s) Onboard Realtek
Power Supply OCZ ProXStream 1000W
Mouse Genius NetScroll 100X
Keyboard Logitech Wave
Software Windows 7 Ultimate 64-bit
Yes, AMD cards' severe weakness is their tremendous idle power consumption. It's ridiculously high. They are amateurs in this regard compared to Nvidia's engineers. :(

Also, yes, stacked memory needs some time to ramp up but I guess they needed to start at some point and it is now.
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
40W is ridiculously high for GPUs that came out 2.5 years ago, and you're all comparing them to a brand-new GPU from NVIDIA released 4 months ago. Ooooook...
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.65/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Yes, AMD cards' severe weakness is their tremendous idle power consumption. It's ridiculously high. They are amateurs in this regard compared to Nvidia's engineers. :(

Also, yes, stacked memory needs some time to ramp up but I guess they needed to start at some point and it is now.

Idle power draw for the cards

R9 290X Reference Card 17 watts
R9 290X Lightning 22 watts

GTX 980 Reference 8 watts
GTX 980 Gaming 14 watts

If you leave your computer on 24 hours a day, every day, idling, and you pay the national average per kWh (12 cents), then the reference R9 290X will add 78 cents a month to your power bill compared to the reference GTX 980. For the factory-OC cards, the R9 290X will add about 70 cents a month. It's just not much at all, and if that amount matters to you, or you pay a lot more for electricity, then I would say turn your rig off when not in use.
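
A minimal sketch reproducing that arithmetic; the deltas are the reference-vs-reference and Lightning-vs-Gaming idle figures listed above, and 12 cents/kWh is the rate assumed in the post:

```python
# Monthly cost of the idle-draw difference, left on 24/7 for a 30-day month.
RATE_PER_KWH = 0.12        # USD, national average assumed above
HOURS_PER_MONTH = 24 * 30

def monthly_cost(delta_watts):
    return delta_watts * HOURS_PER_MONTH / 1000 * RATE_PER_KWH

print(f"Reference 290X vs. reference 980: ${monthly_cost(17 - 8):.2f}/month")
print(f"290X Lightning vs. 980 Gaming:    ${monthly_cost(22 - 14):.2f}/month")
# -> about $0.78 and $0.69 per month
```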
 
Joined
Sep 7, 2011
Messages
233 (0.05/day)
Location
Pekanbaru - Riau - Indonesia - Earth - Universe
System Name My Best Friend...
Processor Qualcomm Snapdragon 650
Motherboard Made By Xiaomi
Cooling Air and My Hands :)
Memory 3GB LPDDR3
Video Card(s) Adreno 510
Storage Sandisk 32GB SDHC Class 10
Display(s) 5.5" 1080p IPS BOE
Case Made By Xiaomi
Audio Device(s) Snapdragon ?
Power Supply 2A Adapter
Mouse On Screen
Keyboard On Screen
Software Android 6.0.1
Benchmark Scores 90339
40W is ridiculously high for GPUs that came out 2.5 years ago, and you're all comparing them to a brand-new GPU from NVIDIA released 4 months ago. Ooooook...

Ouchhhh, bullseye :)

Good point, bro. Just wait for the R300 series and then we can talk and compare it to the GeForce 900 series, but if AMD is too late, it'll be another story...
 
Joined
Nov 3, 2013
Messages
2,141 (0.56/day)
Location
Serbia
Processor Ryzen 5600
Motherboard X570 I Aorus Pro
Cooling Deepcool AG400
Memory HyperX Fury 2 x 8GB 3200 CL16
Video Card(s) RX 6700 10GB SWFT 309
Storage SX8200 Pro 512 / NV2 512
Display(s) 24G2U
Case NR200P
Power Supply Ion SFX 650
Mouse G703 (TTC Gold 60M)
Keyboard Keychron V1 (Akko Matcha Green) / Apex m500 (Gateron milky yellow)
Software W10
Idle power draw for the cards

R9 290X Reference Card 17 watts
R9 290X Lightning 22 watts

GTX 980 Reference 8 watts
GTX 980 Gaming 14 watts

If you leave your computer on 24 hours a day, every day, idling, and you pay the national average per kWh (12 cents), then the reference R9 290X will add 78 cents a month to your power bill compared to the reference GTX 980. For the factory-OC cards, the R9 290X will add about 70 cents a month. It's just not much at all, and if that amount matters to you, or you pay a lot more for electricity, then I would say turn your rig off when not in use.
I can probably find more money than that on the pavement... daily :D
Idling is not an issue. People are grasping at straws...
 
Joined
Nov 18, 2010
Messages
7,125 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
IMHO AMD screwed up on one thing... Bermuda should have been a triangle :D a triple head :laugh:
 
Joined
Apr 19, 2012
Messages
122 (0.03/day)
Location
San Diego, California
System Name Mi Negra
Processor Intel Core i7-2600K Sandy Bridge 3.4GHz (3.8GHz Turbo Boost) LGA 1155 95W Quad-Core Desktop Processo
Motherboard Gigabyte GA-Z68XP-UD3-iSSD LGA 1155 Intel Z68 HDMI SATA 6Gb/s USB 3.0 ATX Intel Motherboard
Cooling Arctic Cooling Freezer 7 Pro Rev.2 with 92mm PWM Fan
Memory Patriot Viper Xtreme Series DDR3 8 GB (2 x 4 GB) PC3-12800 1600MHz
Video Card(s) Nvidia Founders Edition GeForce GTX 1080 8GB GDDR5X PCI Express 3.0 Graphics Card
Storage Samsung 750 EVO 250GB 2.5" 250G SATA III Internal SSD 3-D 3D Vertical Solid State Drive MZ-750250BW
Display(s) Samsung UN40JU6500 40" Class 4K Ultra HD Smart LED TV
Case In Win 303 Black SECC Steel/Tempered Glass Case ATX Mid Tower, Dual Chambered/High Air Computer Case
Audio Device(s) Creative Sound Blaster X-Fi Titanium Fatal1ty Professional 70SB088600002 7.1 Channels 24-bit 96KHz P
Power Supply Antec High Current Pro HCP-1200 1200W ATX12V / EPS12V SLI Ready 80 PLUS GOLD Certified Yes, High Cur
Mouse Logitech G700s Black 13 Buttons Tilt Wheel USB RF Wireless Laser 5700 dpi Gaming Mouse
Keyboard Logitech G810 Orion Spectrum RGB Mechanical Gaming Keyboard
Software Microsoft Windows 10 Professional 64-bit
They need to release them already. I'M VIDEO CARDLESS.
 
Joined
Feb 14, 2012
Messages
2,323 (0.52/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
Joined
Feb 11, 2009
Messages
5,403 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Kinda disappointed with this, just some higher clocks? That's not going to put much of a dent in the landscape :(. I want actual new cards with new tech that are actually more power efficient, faster, and fully 4K capable :(

#4Kapable
 
Joined
Jun 13, 2012
Messages
1,328 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
Yeah, that's wrong too. 760 was the 670 and the 770 was the 680. Both Kepler.

As Breit rightly says, GM is not GK.

At least with the 680 to 770, Nvidia did clock bumps and made the 770 faster, unlike AMD, where the 7970 to 280X had its clocks lowered.

Actually you both are exaggerating... 290x stock blower is a disaster and that's a fact, it's poorly made to begin with. But yes you are right, news article should not contain such bashing already, that's our task to do :D.

News articles can bash all they want when pointing at a past where a company completely screwed up on a product, and yes, that stock blower from the 6000 series cards was a complete screw-up. Pretty bad that you lose 20% performance after 5 minutes of gaming.

Wow, I was disappointed... I thought the 380X was Fiji and Bermuda was the 390X. Well, I guess to put this in perspective, the 380X will shoot straight at the 970. The 390X will compete with the future Titan II/980 Ti.

What could be disappointing is the efficiency. With these chips you will only be getting 285-like efficiency, which is nowhere near Maxwell. Heat should be maintained pretty well, IMO. AIB coolers with double fans are going to keep the temperatures down.

I am laughing up a storm atm. Remember all the AMD fans jumping on rumors and thinking the 380X was gonna be a 4096-shader GCN monster, yet it's not even close to what they were expecting.
 