Friday, October 15th 2021

Gigabyte Launches AMD Radeon RX 6600 Eagle 8G Graphics Card

Gigabyte Technology Co. Ltd, a leading manufacturer of premium gaming hardware, today announced a new AMD Radeon RX 6600 graphics card - the Gigabyte Radeon RX 6600 Eagle 8G. The Eagle graphics card is the best choice for those who desire a unique design optimized for power efficiency and durability, along with the ability to experience incredible high-framerate 1080p gaming.

AMD Radeon RX 6600 graphics cards are based on the breakthrough AMD RDNA 2 gaming architecture, designed to deliver the optimal balance of performance and power efficiency. Offering 32 MB of high-performance AMD Infinity Cache, 8 GB of GDDR6 memory, AMD Smart Access Memory technology and other advanced features, the new graphics cards are designed to bring next-generation desktop gaming experiences to the midrange market. They also support AMD FidelityFX Super Resolution, an open-source spatial upscaling solution designed to increase framerates in select titles while delivering high-resolution gaming experiences.
The Gigabyte Windforce 3X cooling system features three unique blade fans with alternate spinning, composite copper heat pipes in direct contact with the GPU, 3D active fans, screen cooling and graphene nano lubricant, all working together to provide efficient heat dissipation. These cooling technologies keep temperatures low at all times, resulting in higher and more stable performance. The middle fan spins in reverse to optimize airflow for heat dissipation, enabling more efficient performance at a lower temperature. Screen cooling extends the heatsink to allow air to pass through, providing better heat dissipation and preventing heat accumulation so as to improve stability. In addition, graphene nano lubricant can extend fan life by 2.1 times, delivering nearly the lifespan of a double ball bearing while providing quiet operation.

The design of the Eagle graphics card is inspired by science fiction and mechanical materials, featuring a transparent cover and a bright logo. In addition, the back plate not only strengthens the overall structure of the graphics card, but also prevents the PCB from bending or sustaining damage. GIGABYTE graphics cards use a multi-phase power supply, providing over-temperature protection and load balancing for each MOSFET and allowing the MOSFETs to operate at lower temperatures. The Ultra Durable-certified chokes and capacitors provide excellent performance and longer system life.
Source: Gigabyte
Add your own comment

53 Comments on Gigabyte Launches AMD Radeon RX 6600 Eagle 8G Graphics Card

#26
lexluthermiester
ValantarThe "value" lies mainly in GPU makers adding $30-60 to the MSRP for a "premium" SKU that delivers no tangible benefits to users.
Once again, you are expressing an opinion. Some people WANT the extra thermal performance and are willing to spend a bit extra to get it. Also once again, if that's not your jam, buy something else. There's no reason to continue harping on about it.
Calmmowell some, like me, get OCD about temps, i get anxious whenever my GPU goes above 65c for example and start tweaking system fan curves (aka me every time it starts getting hotter around summer)
Exactly. For some people it's important, and for various reasons.
DeathtoGnomesI do like the engineering on this card, you dont see too many 'over engineered' cards.
I love it when manufacturers put the extra effort into a product. So for me, this card is a win for anyone who buys it and wants exceptional thermal performance.
Posted on Reply
#27
Valantar
lexluthermiesterOnce again, you are expressing an opinion. Some people WANT the extra thermal performance and are willing to spend a bit extra to get it. Also once again, if that's not your jam, buy something else. There's no reason to continue harping on about it.


Exactly. For some people it's important, and for various reasons.


I love it when manufacturers put the extra effort into a product. So for me, this card is a win for anyone who buys it and wants exceptional thermal performance.
I see value in over-engineering high-end products where there is something to play with for those inclined to do so. For a midrange product like this? No. It's a marketing exercise designed to drive up ASPs and provide aspirational SKUs that trick people into buying stuff that doesn't give them anything more for their money. I mean, sure, if you're one of the <5 people in the world who wants to LN2 OC a 6600, go for it. For everyone else? Useless. And over-engineering an air cooler for a 130W GPU with strict OC limits? That's doubly useless, as you're not getting anything even remotely rational in return. Of course, human desires are hardly rational, but trying to be more so is often healthy. And divesting ourselves of silly beliefs like "temperatures in the 70s-80s can hurt my GPU" is healthy.
Calmmowell some, like me, get OCD about temps, i get anxious whenever my GPU goes above 65c for example and start tweaking system fan curves (aka me every time it starts getting hotter around summer)
I understand that - but I also understand that this is a behaviour we have been conditioned into over decades of marketing and poor thermal engineering: a one-two punch of fearing that our systems will overheat and expecting them to do so because many coolers are crap. Your GPU doesn't care whether it's at 30, 50 or 70 degrees. 80 is fine, though it might boost marginally lower. 90 is perfectly fine too, though you'll lose even more clocks - after all, GPU boost is essentially a thermal and power dependent auto OC. But it's healthy for both our minds and wallets to unlearn these habits.
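That description of boost as "a thermal and power dependent auto OC" can be sketched as a toy model. Everything below is illustrative: the clock range, limits and scaling are made-up numbers, not any vendor's actual boost algorithm.

```python
def boost_clock_mhz(temp_c, power_w,
                    base=1600, max_boost=2400,       # hypothetical clock range
                    temp_limit=95, power_limit=130):  # hypothetical limits
    """Toy boost model: the chip boosts as far as its tightest
    constraint (thermal or power headroom) allows."""
    # headroom fractions, clamped to [0, 1]
    thermal = max(0.0, min(1.0, (temp_limit - temp_c) / 40))
    power = max(0.0, min(1.0, (power_limit - power_w) / power_limit))
    return base + (max_boost - base) * min(thermal, power)

# cooler silicon boosts higher; at or past the limit, clocks fall back to base
print(boost_clock_mhz(55, 100))   # plenty of thermal headroom
print(boost_clock_mhz(90, 100))   # near the thermal limit, boost tapers off
```

The point the model makes is the same one in the post: higher temperatures cost you some boost clock gradually, rather than flipping a switch from "safe" to "damaged".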
Posted on Reply
#28
trsttte
I think you guys are confusing over-engineering with saving a buck by reusing the design parts from other SKUs.
Posted on Reply
#29
lexluthermiester
trsttteI think you guys are confusing over-engineering with saving a buck by reusing the design parts from other SKUs.
That is also a possibility. The effect is the same, excellent cooling.
ValantarAnd divesting ourselves of silly beliefs like "temperatures in the 70s-80s can hurt my GPU" is healthy.
Was going to respond to this earlier, but wanted to avoid offending you and sounding like a broken record. So here's how I'll respond:

70C, you're right, it'll be mostly harmless. However, 80C is where silicon starts to get "iffy" with its semiconductor properties. Electron migration begins to be a problem, as does heat-induced electron pathway degradation. Saying that 80C is acceptable depends greatly on what "acceptable" means to someone personally.

Me? I think 75C is barely acceptable. I strive to make sure the temps on my system, except for VRMs, stay below 70C, ideally with 60C as a ceiling. So this card, with its huge heatsink and triple fans, will maintain a temperature threshold that I would consider acceptable.
Posted on Reply
#30
GerKNG
it's probably a triple fan card because they have too many leftovers from other RDNA2 cards nobody wants.
Posted on Reply
#31
Bomby569
if the heatsink and PCB are bigger, then it's a bit more cost to make. If it's just more plastic and fans, the difference is marginal for Gigabyte
Posted on Reply
#32
Valantar
lexluthermiesterWas going to respond to this earlier, but wanted to avoid offending you and sounding like a broken record. So here's how I'll respond:

70C, you're right, it'll be mostly harmless. However, 80C is where silicon starts to get "iffy" with its semiconductor properties. Electron migration begins to be a problem, as does heat-induced electron pathway degradation. Saying that 80C is acceptable depends greatly on what "acceptable" means to someone personally.

Me? I think 75C is barely acceptable. I strive to make sure the temps on my system, except for VRMs, stay below 70C, ideally with 60C as a ceiling. So this card, with its huge heatsink and triple fans, will maintain a temperature threshold that I would consider acceptable.
Meh. Electron migration isn't an issue unless you're pushing clocks and voltages very high - much higher than modern boost algorithms will do, and much higher than the firmware/driver limitations of these cards will allow. I've also never, ever heard of a GPU's silicon failing from running in the 80s. Other faults can occur if QC or production quality is too low - solder joints breaking from thermal expansion/contraction cycles; PCBs warping; that kind of thing. But seriously, show me any hard evidence of reasonably modern high end hardware failing from running in the 80s, please? I'd love to see it, but until I do, I'll trust the tJMax specs. If anything, the recent addition of hotspot temperature measurements has shown us that the temperatures we've been looking at for years are 10-20 degrees lower than what parts of the die are running at, and things have still worked fine. AMD specs their GPUs for up to 110 degrees hotspot temperature, and I don't see that as an issue. Intel specs their CPUs for 100 degrees, but allows for increasing this to 115 - at the user's risk, of course. The point being, 80 degrees does not cause meaningful electromigration in a stock or moderately overclocked chip over its useful lifetime. Unless you can provide significant data to prove otherwise (and, crucially, this data must show actual silicon failures rather than undiagnosed "my GPU died" failures), this is a safe assumption to make.
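For a sense of scale on the temperature dependence being argued about here: electromigration lifetime is commonly modeled with Black's equation, MTTF ∝ J⁻ⁿ · exp(Ea/kT). A quick sketch of the temperature term alone - the activation energy and comparison temperatures below are illustrative assumptions, not measured values for any particular GPU:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def em_mttf_ratio(t1_c, t2_c, ea_ev=0.9):
    """Relative electromigration MTTF at t1 vs t2 (same current density),
    per the exp(Ea/kT) term of Black's equation. Ea = 0.9 eV is an assumed
    ballpark activation energy, not a datasheet figure."""
    t1_k, t2_k = t1_c + 273.15, t2_c + 273.15
    return math.exp(ea_ev / K_BOLTZMANN_EV * (1 / t1_k - 1 / t2_k))

# running at 80 °C instead of 70 °C cuts the modeled EM lifetime roughly
# in half - a large relative change, but irrelevant in practice if the
# absolute lifetime at stock current densities is measured in decades
print(em_mttf_ratio(80, 70))
```

On this model temperature matters exponentially, which is why tJmax margins exist; but the ratio alone can't settle the disagreement above, since the real question is whether the absolute baseline lifetime leaves enough margin.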

And, again, we aren't looking at those temperature ranges in the first place. I showed you a mid-range dual fan cooler that can keep a more power hungry card well below 80.
GerKNGit's probably a triple fan card because they have too many leftovers from other RDNA2 cards nobody wants.
Given that RDNA2 cards are just as out of stock as Ampere cards, that's quite unlikely.
Posted on Reply
#33
HenrySomeone
lexluthermiesterThat is also a possibility. The effect is the same, excellent cooling.


Was going to respond to this earlier, but wanted to avoid offending you and sounding like a broken record. So here's how I'll respond:

70C, you're right, it'll be mostly harmless. However, 80C is where silicon starts to get "iffy" with its semiconductor properties. Electron migration begins to be a problem, as does heat-induced electron pathway degradation. Saying that 80C is acceptable depends greatly on what "acceptable" means to someone personally.

Me? I think 75C is barely acceptable. I strive to make sure the temps on my system, except for VRMs, stay below 70C, ideally with 60C as a ceiling. So this card, with its huge heatsink and triple fans, will maintain a temperature threshold that I would consider acceptable.
While there are merits to (relatively) lower power cards having a larger cooler, I certainly wouldn't trust Gigabyte with its execution. Their basic triple fan solutions are habitually the loudest or hottest (and sometimes both, but mostly the former) among competitors.
Gigabyte Radeon RX 6700 XT Gaming OC Review - Temperatures & Fan Noise | TechPowerUp
What I would like to see done on a card of this power (120 - 140W) is a shorter, yet taller (= more rectangular) cooler with a single 140mm fan on top. Something akin to the Palit KalmX series, but obviously actively cooled. The large single fan spinning at low rpm should do wonders for temperatures and especially noise levels.
Posted on Reply
#34
lexluthermiester
ValantarElectron migration isn't an issue unless you're pushing clocks and voltages very high
That is a myth.
ValantarI've also never, ever heard of a GPU's silicon failing from running in the 80s.
Not short term, no. Given 4 or 5 years though, it takes its toll. And before you say that most people will upgrade by that time, that isn't the point.
Posted on Reply
#35
trsttte
HenrySomeoneWhile there are merits to (relatively) lower power cards having a larger cooler, I certainly wouldn't trust Gigabyte with its execution. Their basic triple fan solutions are habitually the loudest or hottest (and sometimes both, but mostly the former) among competitors.
Gigabyte Radeon RX 6700 XT Gaming OC Review - Temperatures & Fan Noise | TechPowerUp
What I would like to see done on a card of this power (120 - 140W) is a shorter, yet taller (= more rectangular) cooler with a single 140mm fan on top. Something akin to the Palit KalmX series, but obviously actively cooled. The large single fan spinning at low rpm should do wonders for temperatures and especially noise levels.
It would be amazing if the industry moved away from their cheap small proprietary fans to standard 120/140 mm fans (probably 120 mm; 140 mm starts getting too big for some PC cases), like the Asus 3070 Noctua edition. That would be golden. It probably won't happen because it would eat into their profit margins, but a man can dream.
Posted on Reply
#37
HenrySomeone
The one retailer on the whole continent where they actually have some meaningful stock of Radeons is not representative of the whole picture; we've been through this before. The 6000 series is woefully represented globally.
Posted on Reply
#38
trsttte
This is not ideal by any stretch of the imagination, but, for reference, this is how much cooling the RX 6600 needs



It's even a single-slot card, with both higher base and higher boost clocks (and certainly much higher dBA :D)
Posted on Reply
#39
eidairaman1
The Exiled Airman
trsttteI think you guys are confusing over-engineering with saving a buck by reusing the design parts from other SKUs.
Considering the 6600 XT requires more, it makes sense to use the same PCB. All card makers do it, no matter if it's AMD or Nvidia.
trsttteThis is not ideal by any stretch of the imagination, but, for reference, this is how much cooling the RX 6600 needs



It's even a single-slot card, with both higher base and higher boost clocks (and certainly much higher dBA :D)
Considering it's a blower design. Btw, Asus has "turbo" cards for Nvidia now

Where are your system specs?
Posted on Reply
#40
Valantar
lexluthermiesterThat is a myth.

Not short term, no. Given 4 or 5 years though, it takes its toll. And before you say that most people will upgrade by that time, that isn't the point.
As I said, you're going to have to provide some actual proof for that. I definitely am not speaking of when people upgrade, I'm thinking of a 5-10 year lifespan, and obviously a reasonable end-user usage scenario (i.e. no 24/7 100% loads, but perhaps 8h/day under load). GPUs and electronics in general fail for myriad reasons, and I would place thermally induced electromigration pretty far down that list. Heck, laptops typically run near their throttling point under any load, and while those do die more often than desktops, they're generally quite reliable (and failures are much more likely from other factors such as PCB failures or BGA contact issues due to repeated thermal expansion cycles). If GPUs regularly failed after <5 years from electromigration issues we should see troves of GTX 10-series cards dying right about now. I certainly haven't seen any indication of that.
Posted on Reply
#41
Bomby569
trsttteThis is not ideal by any stretch of the imagination, but, for reference, this is how much cooling the RX 6600 needs



It's even a single-slot card, with both higher base and higher boost clocks (and certainly much higher dBA :D)
And this is how much cooling a 5700 XT "needed", I guess

Posted on Reply
#42
ARF
TheLostSwedeBecause moar fans sell?
ixiBecause more silent and better cooling?
No matter how much lipstick you put on a pig's lips, it's still an ugly pig :kookoo:
Posted on Reply
#43
lexluthermiester
ValantarAs I said, you're going to have to provide some actual proof for that.
No, I'm not. I didn't pull the science of these known problems out my butt. You want to know more, go research. That's on you.
Posted on Reply
#44
Valantar
lexluthermiesterNo, I'm not. I didn't pull the science of these known problems out my butt. You want to know more, go research. That's on you.
You're the one making a new claim here - that silicon will fail over a reasonably short time span at what are generally seen as safe temperatures - and the responsibility of providing some backing for that claim is thus on you. Saying "go do your own research" is nothing but a cop-out.
Posted on Reply
#45
eidairaman1
The Exiled Airman
ARFNo matter how much lipstick you put on a pig's lips, it's still an ugly pig :kookoo:
Still better than the Asus Turbo 3070...
lexluthermiesterNo, I'm not. I didn't pull the science of these known problems out my butt. You want to know more, go research. That's on you.
ValantarYou're the one making a new claim here - that silicon will fail over a reasonably short time span at what are generally seen as safe temperatures - and the responsibility of providing some backing for that claim is thus on you. Saying "go do your own research" is nothing but a cop-out.
Take the back n forth to private messages please
Posted on Reply
#46
ARF
eidairaman1Still better than the Asus Turbo 3070...
How?
3070 is 85% faster.

Posted on Reply
#47
Valantar
ARFHow?
3070 is 85% faster.

More like 42% at a resolution that this class of GPU can be expected to work at

But I don't think that was the point - rather that the 3070 Turbo is kind of the antithesis of this card, with woefully underbuilt cooling for a relatively power hungry card, rather than extremely overbuilt cooling for a low-power card.
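The gap between "85% faster" and "42% faster" is just a matter of which resolution's framerates you divide. The FPS figures below are made up purely to illustrate the arithmetic - they are not benchmark results for either card:

```python
def faster_pct(fps_a, fps_b):
    """How much faster card A is than card B, in percent."""
    return (fps_a / fps_b - 1) * 100

# hypothetical numbers: the same two cards can show very different
# percentage gaps depending on the resolution being tested
print(faster_pct(100, 70))  # ~43% faster at a lower resolution
print(faster_pct(74, 40))   # 85% faster at a higher resolution
```

So both figures can be "correct" at the same time; the disagreement is really about which test resolution is representative for this class of card.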
Posted on Reply
#48
eidairaman1
The Exiled Airman
ValantarMore like 42% at a resolution that this class of GPU can be expected to work at

But I don't think that was the point - rather that the 3070 Turbo is kind of the antithesis of this card, with woefully underbuilt cooling for a relatively power hungry card, rather than extremely overbuilt cooling for a low-power card.
Exactly my point: I'd rather have overbuilt cooling and components for an entry-level chip than crappy underbuilt cooling and components for a high-end chip.
Posted on Reply
#49
Solid State Soul ( SSS )
neatfeatguyI think Futurama broke me because I can only think about this clip when I read "eagles":


I play games with headphones on, so a slightly louder GPU over one that's a little quieter probably wouldn't bother me. Coil whine, on the other hand... those high-pitched noises pierce through my eardrums, even with headphones on. Thankfully none of my recent GPUs have had that issue.
What GPUs had coil whine for you?

Gigabyte used to have bad coil whine problems with 900 and 1000 series GPUs

My 980 Ti G1 has coil whine, but not bad enough to distract me
Posted on Reply
#50
HenrySomeone
ValantarBut I don't think that was the point - rather that the 3070 Turbo is kind of the antithesis of this card, with woefully underbuilt cooling for a relatively power hungry card, rather than extremely overbuilt cooling for a low-power card.
Woefully underbuilt? It's not the 3090, and cards of this power (and higher too) have always existed in blower variants, just like the use cases for them (= several of them in a rack). If this review is to be believed, it's actually very good at keeping the temps in check:

www.digitalcitizen.life/asus-turbo-geforce-rtx-3070-review/
"While running the benchmarks, we also found out that the ASUS Turbo GeForce RTX 3070 video card needs about 200 Watts of power. And, in the end, another important element is the heat produced by the video card. To see how hot the card gets, we ran Furmark for about an hour while monitoring the temperatures. What we found was that the ASUS Turbo GeForce RTX 3070 card doesn’t get hotter than 66 degrees Celsius (151 Fahrenheit). In other words, this is one cool graphics card!"

Now, it obviously won't be quiet at load, but in its primarily intended use case (AI, rendering), this isn't an issue.
Posted on Reply