Wednesday, September 7th 2011

MSI Calls Bluff on Gigabyte's PCIe Gen 3 Ready Claim

In August, Gigabyte made a claim that baffled at least MSI: that scores of its motherboards are "Ready for Native PCIe Gen. 3." Along with the likes of ASRock, MSI was among the first to ship motherboards featuring PCI-Express 3.0 slots, and the company took pains to educate buyers on what PCI-E 3.0 is and how to spot a motherboard that features it. MSI thinks Gigabyte made a factual blunder bordering on misinformation by claiming that as many as 40 of its motherboards are "Ready for Native PCIe Gen. 3," and set its engineering and PR teams to work building a technically sound presentation rebutting Gigabyte's claims.

More slides, details follow.

MSI begins by explaining that PCIe 3.0 support isn't as simple as laying a wire between the CPU and the slot: it requires specification-compliant lane switches and electrical components, and buyers therefore can't count on certain Gigabyte boards for future-proofing.

MSI did some PCI-Express electrical testing using a 22 nm Ivy Bridge processor sample.
MSI claims that apart from the G1.Sniper 2, none of Gigabyte's so-called "Ready for Native PCIe Gen. 3" motherboards are what the badge claims to be, and that the badge is extremely misleading to buyers. Time to refill the popcorn bowl.
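Rather than trusting a badge, Linux users can check what rate a slot actually negotiates: the kernel exposes `current_link_speed` and `max_link_speed` per device in sysfs. A minimal sketch that decodes those strings into a PCIe generation (the device address below is a hypothetical example):

```python
def pcie_gen(speed_string: str) -> int:
    """Map a sysfs link-speed string (e.g. '8.0 GT/s PCIe' or '5 GT/s')
    to a PCIe generation number."""
    rate = float(speed_string.strip().split()[0])
    return {2.5: 1, 5.0: 2, 8.0: 3}[rate]

def read_link_gen(device: str = "0000:01:00.0", which: str = "current") -> int:
    """Read and decode {current,max}_link_speed for a PCI device (Linux only).
    The device address is a placeholder; list /sys/bus/pci/devices for yours."""
    path = f"/sys/bus/pci/devices/{device}/{which}_link_speed"
    with open(path) as f:
        return pcie_gen(f.read())

if __name__ == "__main__":
    # Offline demo using the string format the kernel reports:
    print(pcie_gen("8.0 GT/s PCIe"))  # a Gen 3 link
```

Note that `current_link_speed` reflects the rate the link actually trained at, which can be lower than the slot's maximum when a power-saving state or an older switch sits in the path.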
Source: MSI

286 Comments on MSI Calls Bluff on Gigabyte's PCIe Gen 3 Ready Claim

#51
erocker
*
nvidiaintelftwwhat about Asrock Extreme Gen3 boards? are those true PCI-e 3.0?
My Asrock Z68 Extreme4 Gen3 has NXP L04083B PCI-E 3.0 switches on it.
Posted on Reply
#52
Andrea deluxe
TheLostSwedeSome of them are, but according to MSI's reasoning, their high-end model doesn't, as it has an nForce 200 chip on it, yet ASRock is claiming PCI Express 3.0 support...
they support 3.0 only in a single slot of the mobo...

is explained on the manual.....
Posted on Reply
#53
Steven B
well the UD7 was never on the GB list, right?

PCI-E 3.0 is not an interesting topic though; it's like the least important thing on a motherboard you buy today.
Posted on Reply
#54
TheLostSwede
News Editor
Andrea deluxethey support 3.0 only in a single slot of the mobo...

is explained on the manual.....
And if that is the case, then so should Gigabyte's boards...
Posted on Reply
#55
STCNE
I haven't trusted Gigabyte since they sent me back the same DOA board 3 times. 5 months of RMAing and $40 for shipping and I'm still stuck with their dead board.
Posted on Reply
#56
ranom
TheLostSwedeAnd if that is the case, then so should Gigabyte's boards...
The reason for that on the ASRock Ext7 is because the other PCIe slots are attached to the NF200, hence no Gen3. But one of the slots does get the full 16 lanes since they are using NXP L04083B PCIe 3.0 switches (but it becomes disabled when using the other x16 slots in crossfire/SLI situations)

Now, boards that don't use PCIe lane switches might in theory be able to support the full 16 lanes of PCIe 3.0 (I have no idea if they will be electrically stable). But on the Gigabyte boards that do have lane switches, other than the new G1.Sniper 2 and possibly the new rev. 1.3 boards, the chances of having native Gen3 16-lane support might be a bit slim.
Posted on Reply
#57
Scheich
All these lies, goddamit :shadedshu
Posted on Reply
#58
dazz
what happens in 8x8x configurations with PCIe 2.0 switches involved? Do you get 8 lanes at 3.0 speeds and 8 lanes at 2.0, or do they all downgrade to 2.0?
Posted on Reply
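On dazz's question: each PCIe link negotiates its rate independently, so lanes routed through a Gen 2 switch train at 5 GT/s even when the CPU and card are both Gen 3 capable, while directly wired lanes can still train at 8 GT/s (which lanes sit behind the switch depends on the board layout). A back-of-envelope sketch of what the split costs, using the standard published rates:

```python
# Rough usable one-direction bandwidth per PCIe lane, in MB/s, after
# line-encoding overhead: Gen 1/2 use 8b/10b encoding, Gen 3 uses 128b/130b.
LANE_MBPS = {
    1: 2.5e9 * (8 / 10) / 8 / 1e6,     # 250 MB/s
    2: 5.0e9 * (8 / 10) / 8 / 1e6,     # 500 MB/s
    3: 8.0e9 * (128 / 130) / 8 / 1e6,  # ~985 MB/s
}

def link_bandwidth_mbps(gen: int, lanes: int) -> float:
    """Usable bandwidth of one link in one direction, in MB/s."""
    return LANE_MBPS[gen] * lanes

# An x8/x8 split where one link trains at Gen 3 and the other is held
# to Gen 2 by an older switch, versus a single full Gen 3 x16 link:
mixed = link_bandwidth_mbps(3, 8) + link_bandwidth_mbps(2, 8)
gen3_x16 = link_bandwidth_mbps(3, 16)
```

In this sketch the mixed configuration lands at roughly three quarters of the full Gen 3 x16 figure, which is the kind of gap the "Ready for Native PCIe Gen. 3" badge glosses over.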
#59
neliz
TheLostSwedeSome of them are, but according to MSI's reasoning, their high-end model doesn't, as it has an nForce 200 chip on it, yet ASRock is claiming PCI Express 3.0 support...
Actually, the Extreme7 is a different beast altogether.

It switches between the SINGLE Gen3x16 slot and the NF200 controller.

If you installed a card in slot2, it will disable ALL OTHER PCI EXPRESS SLOTS from the NF200.
If you install a x16 in any other slot than slot 2, it will disable Gen3.

Now tell me, who feels like paying $300 for a single slot ATX board?


Oh, I checked the Gigabyte website:
G1 Sniper v2 lists Gen3 support:
www.gigabyte.com/products/product-page.aspx?pid=3962#ov

All other Z68 boards etc? NO Gen3 support listed anymore (clue clue!)
www.gigabyte.com/products/product-page.aspx?pid=3863#ov
Posted on Reply
#60
Vimes
nelizActually, the Extreme7 is a different beast altogether.

It switches between the SINGLE Gen3x16 slot and the NF200 controller.

If you installed a card in slot2, it will disable ALL OTHER PCI EXPRESS SLOTS from the NF200.
If you install a x16 in any other slot than slot 2, it will disable Gen3.

Now tell me, who feels like paying $300 for a single slot ATX board?


Oh, I checked the Gigabyte website:
G1 Sniper v2 lists Gen3 support:
www.gigabyte.com/products/product-page.aspx?pid=3962#ov

All other Z68 boards etc? NO Gen3 support listed anymore (clue clue!)
www.gigabyte.com/products/product-page.aspx?pid=3863#ov
They are, at this moment in time, all listed as being compatible with Gen 3...

uk.gigabyte.com/press-center/news-page.aspx?nid=1048
Posted on Reply
#61
cdawall
where the hell are my stars
Derek12very bad, Gigabyte, if this is true and not marketing BS for MSI, You disappointed me if is the case :mad:




Don't insult to people that trust or used to trust Gigabyte, dude. :mad: if you are AntiGigabyte at least don't insult, can you?, the same manner I am antiPCCHIPS and don't insult their users
i make fun of PCChips all the time. how can you get a lower end ECS mobo? i mean thats a hell of a feat.
ensabrenoirIvory bridge sample that's confirmed to exist......Wonder if msi have a bulldozer sample :).....
BD samples have been out and about a year now.
Posted on Reply
#62
sneekypeet
Retired Super Moderator
I take it those in glass houses should throw stones?

www.techpowerup.com/forums/showthread.php?t=151660

SO its OK for MSI to say they are the first on something they are not, but not OK for gigabyte to try something similar.

MSI needs to climb out of everyone's ass, and worry about their own house of cards for a while IMHO!

On topic....meh. The educated won't fall for gigabyte's tricks anyways!
Posted on Reply
#63
cdawall
where the hell are my stars
sneekypeetI take it those in glass houses should throw stones?

www.techpowerup.com/forums/showthread.php?t=151660

SO its OK for MSI to say they are the first on something they are not, but not OK for gigabyte to try something similar.

MSI needs to climb out of everyone's ass, and worry about their own house of cards for a while IMHO!

On topic....meh. The educated won't fall for gigabyte's tricks anyways!
It's possible that the MSI card fart works. I shut mine off every night; who knows, maybe it will slow down dust build-up. It is still nowhere near the full support for tech that won't be fully compatible.
Posted on Reply
#64
sneekypeet
Retired Super Moderator
Not that it works; they claim they are the first with the tech, which they are not, as someone in that thread brought up, and IMHO it's something bta should have caught, calling MSI's bluff in its OP instead of just following the marketing.

Again point is, people in glass houses shouldn't throw stones, but it seems MSI used bullet proof glass on their house!

So MSI sold the tech before they used it themselves? www.techpowerup.com/115132/SPARKLE-Announces-Calibre-X240-X240G-Graphics-Cards-With-Dual-Layer-Fan-Blade-Cooling.html
Posted on Reply
#65
ensabrenoir
cdawalli
BD samples have been out and about a year now.
Yeah, every one benched was said to be false, inaccurate, etc. etc. Haven't heard of an accountable one yet.
Posted on Reply
#66
cdawall
where the hell are my stars
ensabrenoirYeah everyone benched said 2 b false inaccurate etc etc. Haven't heard of a accountable one yet
never said they had been benched and posted just said the samples have been out a while.
Posted on Reply
#67
ensabrenoir
cdawallnever said they had been benched and posted just said the samples have been out a while.
Got me on that one:D
Posted on Reply
#68
werez
The only problem they have right now is this: claiming NATIVE 3.0 support. Native is the wrong word to use. However, they could have just claimed that any upcoming 3.0 card will be compatible with their boards without any major performance loss due to bandwidth and other crap that will prolly yield 2 fps higher in a random game anyway. Just my two cents...
The real marketing BS is the 3.0 implementation. I can already see it coming: everybody upgrading their systems, buying a new motherboard for 3.0 support, and the upcoming cards will perform the same on 2.0 (2.1) / 3.0. I've seen it in the past and I'm pretty sure I'll see this crap again.

The fact that the cards maybe won't run at full potential is another story, and we shall see the impact upon performance. But like I said, I'm pretty sure they know what they are doing. Just remember that motherboard manufacturers get early engineering samples to test on their motherboards, so they can actually build that BIOS, UEFI or whatever, and I'm pretty sure they have a 3.0 card around there somewhere. You wouldn't want your HD 7*** or whatever not working (or not fully compatible) in existing motherboards just because the card itself is 3.0. It should be backward compatible, and basically that's what a UEFI or BIOS update is for. Why not update the BIOS prior to the actual GPU launch?
And I am also pretty sure that they will actually launch motherboards with TRUE NATIVE (hardware-wise) PCIe 3.0 support after the actual video cards hit the market.
I'm still going to buy Gigabyte motherboards anyway...
Posted on Reply
#69
Mussels
Freshwater Moderator
i'll go get the butter.
Posted on Reply
#70
cdawall
where the hell are my stars
Musselsi'll go get the butter.
can i bring the toast?
Posted on Reply
#71
Mussels
Freshwater Moderator
LAN_deRf_HAReminds me of that time Gigabyte said the Asus epu chips don't do anything. Asus just turned around and sued them and they folded pretty easy. Wonder what they'll do here.
Except they were right; those chips really didn't do jack shit. I tested a few systems personally with a power meter at the wall, and they didn't do jack - it was just a software mechanism to control EIST, which didn't do anything if it was already enabled in the BIOS.
Posted on Reply
#72
sneekypeet
Retired Super Moderator
Ha it would be awesome if gigabyte was doing a revision and MSI just looks like more of an ass for being a douche all the way around, instead of just stealing ideas and calling it your own!

IDK guess I'm just so tired of people who are obviously wrong making others look bad to get themselves on the next rung of the ladder.
Posted on Reply
#73
Lordbollo
Steven BWell i think what GIGABYTE meant, is that there is native support for PCI-E Gen 3. now MSI took the one board, the UD7 that GIGABYTE did NOT list on their list of PCI-E 3.0 capable boards and attacked it, but GB never said the UD& had PCI-E gen 3 capability.
Um, no they didn't. The pics posted with this story clearly show CPU-Z reporting that they used the P67A-UD4-B3 board, which GB claims does have it, dude. Not the P67A-UD7-B3.
Posted on Reply
#74
Steven B
Just because something technical is put on a slide and delivered from a company for once doesn't mean what the text says is true or applicable to the matter at hand.

I think it is very nice that MSI took the time to make those slides, and I think it's a great way for them to show the community how PCI-E lane allotment and the standards work; the sad part is it doesn't disprove that the first eight lanes of the first x16 slot on the UD4 are PCI-E 3.0 capable.

Besides if what VRZone says is true, then this doesn't matter.
Posted on Reply
#75
AsRock
TPU addict
DannibusXFalse advertising is pretty serious in the States. Can't the FTC look into this? It might suck to be Gigabyte pretty soon. I hope not though, they make good stuff.
Yes...

Isn't Gen 3 to Gen 2 like Gen 2 to Gen 1 for performance gains? Maybe that's why Gigabyte (if they did, that is) skimped out, as the bandwidth is not used most of the time anyways.
Posted on Reply