
GIGABYTE RTX 4090 WindForce V2 Unveiled, with Unique Tail-ended Power Connector

enjoy your broken PCB indeed!!!


this dude is just a resonance box for whatever is the latest drama. The real drama queen.
Unsurprising, his content is so bad, it's the way to get views.
 
It's more like half, but I don't see the point of cutting the rest of the fins just to cram the connector a little bit deeper into the card. It shows what's wrong with modern day gaming in general: form over function.

IMO, leave the fin stack alone, or make the card shorter. This intermediate solution is the worst of both worlds.
But if it's shorter, it's going to be louder :D. Or they will have to make the card thicker again :D. You are always going to lose in some way with the 4090.
 
Much better if the cutout was at the bottom of the card. Although it would make it necessary to take the card out to insert or remove the connector.

Why not put the connector on an extension to make it end-mounted?
Yup, and with all of their cards getting a crack by the PCIe locking lug, it's not worth it.
 
I was hoping for a dual fan 100mm WF2 version, 3.5 slots but 22-25 cm. From the looks of it V2 introduced a new small PCB the height of normal card. Perfect for a water-cooled mini pc.
 
R.I.P. surface area.
 
But if it's shorter, it's going to be louder :D. Or they will have to make the card thicker again :D. You are always going to lose in some way with the 4090.
IMO, it's better to keep one's expectations in check, play at a resolution that makes sense, and buy a card that isn't overly expensive, has good power consumption and comes with a cooler that fits your case while not being too loud - but it might be just me.
 
So they cut half of the fin stack under the third fan, so it's blowing through nothing. How pointless!
On the contrary, it IS blowing over the connection point, so it should theoretically keep the pins a bit cooler and further reduce the risk of melting.
 
It's more like half, but I don't see the point of cutting the rest of the fins just to cram the connector a little bit deeper into the card. It shows what's wrong with modern day gaming in general: form over function.

IMO, leave the fin stack alone, or make the card shorter. This intermediate solution is the worst of both worlds.

Practical observations are good. A thermal demonstration of how this design actually performs inside a couple of what we would define as typical builds is going to be needed before I pass too strong a comment.

This passthrough had to (should?) have been designed with a fair amount of sculpting, based on the knowledge that airflow will immediately be impeded on impact with the cable and connector. Meaning I'm fairly sure that fan noise and vibration tuning informed the layout. It's also important to note this occurs at the first entry point for cool outside air. A lot of question marks here, to be sure, could prove... "It shows what's wrong with modern day gaming in general: form over function."
 
Is it just me, or does the gap between the connector and the backplate look pretty narrow?
 
IMO, it's better to keep one's expectations in check, play at a resolution that makes sense, and buy a card that isn't overly expensive, has good power consumption and comes with a cooler that fits your case while not being too loud - but it might be just me.
To be fair, the majority of 4090 coolers turned out to be overtuned and created a set of fitment issues that don't feel justified, along with the bulkiness of the 12VHPWR adapter. The V1 of that Gigabyte card had no issue staying around 65°C at 32~35 dB. The coolers were meant to cool 600 W, when in reality you don't even come close to that... and those issues trickled down all the way to some 4070 Ti cards.

I won't comment too much on the price/resolution aspect, since IMO the 4090 is only wearing a "gaming" skin. I've seen lots of high-profile motion/3D designers buying those GPUs to do work for big clients... and not caring about gaming at all. (And people with a large disposable income are always outliers anyway :D)
 
Like this?
It would be nice if it was standardised.


Probably. The problem, in my opinion, is not where the cables are or how they look. The problem is that the third fan cools basically nothing. It's just there for the noise.

Edit: If they had cut the third fan completely, and put the power connector at the rear end of the card, it would have been a much better design, methinks.
That would be up to PCI-SIG to formalize that power connector on the motherboard, though it would mean having to connect multiple PCIe power connectors to the motherboard (which would become a stupid headache quickly on server/WS/HEDT platforms where multiple GPUs are the norm).
 
That would be up to PCI-SIG to formalize that power connector on the motherboard, though it would mean having to connect multiple PCIe power connectors to the motherboard (which would become a stupid headache quickly on server/WS/HEDT platforms where multiple GPUs are the norm).
Still better than bothering with the cable every time you swap your GPU.

Practical observations are good. A thermal demonstration of how this design actually performs inside a couple of what we would define as typical builds is going to be needed before I pass too strong a comment.

This passthrough had to (should?) have been designed with a fair amount of sculpting, based on the knowledge that airflow will immediately be impeded on impact with the cable and connector. Meaning I'm fairly sure that fan noise and vibration tuning informed the layout. It's also important to note this occurs at the first entry point for cool outside air. A lot of question marks here, to be sure, could prove... "It shows what's wrong with modern day gaming in general: form over function."
A fair point. I still think the design is kind of pointless, though. If you need a shorter graphics card, then just... dunno... buy a shorter graphics card?
 
this dude is just a resonance box for whatever is the latest drama. The real drama queen.
Unsurprising, his content is so bad, it's the way to get views.

Did you even watch the video? The PCB on the Gigabyte card was THICKER than the PCB of the Asus card, yet the Asus card exhibited little to no flex, whereas the Gigabyte card flexed so badly it was making me nervous. :fear:

So you tell me: how can a thicker PCB have so much flex compared to a thinner one in that specific area of the GPU?
 
Did you even watch the video? The PCB on the Gigabyte card was THICKER than the PCB of the Asus card, yet the Asus card exhibited little to no flex, whereas the Gigabyte card flexed so badly it was making me nervous. :fear:

So you tell me: how can a thicker PCB have so much flex compared to a thinner one in that specific area of the GPU?

Things that flex are more resistant to breaking; the ones that don't flex are the ones that break easily. Oh boy!
 
Probably already said, but damn, the connector is so close to the heatsink.

If it gets too hot, feels like it could be problematic.
 
Did you even watch the video? The PCB on the Gigabyte card was THICKER than the PCB of the Asus card, yet the Asus card exhibited little to no flex, whereas the Gigabyte card flexed so badly it was making me nervous. :fear:

So you tell me: how can a thicker PCB have so much flex compared to a thinner one in that specific area of the GPU?


You mean when Jay was holding the caliper and his hand twitched, and instead of 1.6 it locked to 2.1?

Gigabyte flexed more because the Asus had a pillar standoff on top of the PCB that provided reinforcement in the opposite direction.

And yeah, of course Gigabyte probably wasn't using 2 oz copper; they skimped on that promise. One would expect that if the motherboards are being marketed as 2 oz, the GPUs would employ the same principle.
 
Things that flex are more resistant to breaking; the ones that don't flex are the ones that break easily. Oh boy!
It's not just about the PCB, but the construction of the card itself. The Asus doesn't flex because the structure of the cooler itself helps handle the weight, whereas it seems Gigabyte just lets the PCB handle a big chunk of the weight. The Founders Edition cards are a good example of a strong structure: they are pretty heavy for their size, but they don't flex... and don't make the news for breaking their PCBs :D
 
It better have an ironclad warranty against cracks. It also pales in comparison to Asus’s 4090 Matrix!
 
It's not just about the PCB, but the construction of the card itself. The Asus doesn't flex because the structure of the cooler itself helps handle the weight, whereas it seems Gigabyte just lets the PCB handle a big chunk of the weight. The Founders Edition cards are a good example of a strong structure: they are pretty heavy for their size, but they don't flex... and don't make the news for breaking their PCBs :D

"Making the news", aka one YouTuber made a video about an old issue.

I have Gigabyte products and I knew about this from their subreddit (where it's mostly complaints of all sorts, warranty especially), hardly news. Happens from time to time, mostly with prebuilts, and I bet it is just because of... being in prebuilds.
 
That's a stupid idea for a virtually nonexistent problem, considering they are sacrificing a significant portion of the heatsink.

But hey, it's all about cutting corners (pun intended) and pitching it as an advance.

If Gigabyte wants to fix some real issues on their GPUs, then how about they design a real support bracket and start to honour their warranties?

Oh yeah, that won't happen; they would have to stop cutting corners for that.
 
That's a stupid idea for a virtually nonexistent problem, considering they are sacrificing a significant portion of the heatsink.

But hey, it's all about cutting corners (pun intended) and pitching it as an advance.

If Gigabyte wants to fix some real issues on their GPUs, then how about they design a real support bracket and start to honour their warranties?

Oh yeah, that won't happen; they would have to stop cutting corners for that.
Stop thinking about problems! Just buy this shiny new thing that you never needed, and RMA it when it breaks! :p
 
Stop thinking about problems! Just buy this shiny new thing that you never needed, and RMA it when it breaks! :p
RMA and get a free red arrow sticker back!
 
"Making the news", aka one YouTuber made a video about an old issue.

I have Gigabyte products and I knew about this from their subreddit (where it's mostly complaints of all sorts, warranty especially), hardly news. Happens from time to time, mostly with prebuilts, and I bet it is just because of... being in prebuilds.
Jay wasn't the only one; Louis Rossmann made the first video, and that video itself started because someone had to repair a lot of Gigabyte GPUs sent in by people who didn't get any help from Gigabyte, and even with the 4000 series, he keeps receiving a lot of cracked GPUs from that brand. Other media outlets then picked up the story. That's bad RMA on their part. Shutting up about it will just entice Gigabyte to keep providing bad RMA.
 
Jay wasn't the only one; Louis Rossmann made the first video, and that video itself started because someone had to repair a lot of Gigabyte GPUs sent in by people who didn't get any help from Gigabyte, and even with the 4000 series, he keeps receiving a lot of cracked GPUs from that brand. Other media outlets then picked up the story. That's bad RMA on their part. Shutting up about it will just entice Gigabyte to keep providing bad RMA.

The Gigabyte bad RMA is legendary; it's hardly news. Their subreddit is 80% bad RMA.
I know he wasn't the first; he just jumped on the drama as usual. He is the ambulance-chaser guy.
Rossmann isn't a PC tech guy; he just reports on bad consumer practices for his right-to-repair cause. For him, I believe, the Gigabyte RMA can be news. For someone in the business, it can't be.
Rossmann also said it was mostly from prebuilds, but I guess that was lost in the drama.

Like with Asus and many other recent dramas, there is a story, but the drama is overblown (the Gigabyte PSUs and the case that caught fire were the exceptions), out of context, and clickbait.
 