
Gigabyte Responds to MSI's Bluff Call

Discussion in 'News' started by btarunr, Sep 20, 2011.


Do such public face-offs help the consumer?

  1. Yes

    37 vote(s)
    52.1%
  2. No

    34 vote(s)
    47.9%
  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,713 (11.16/day)
    Thanks Received:
    13,669
    Location:
    Hyderabad, India
    A little earlier this month, MSI's PR team dished out a presentation in which it claimed that Gigabyte was misleading buyers into thinking that as many as 40 of its recently-launched motherboards are "Ready for Native PCIe Gen.3". MSI tried to make its argument plausible by explaining what exactly goes into making a Gen 3-ready motherboard. The presentation caused quite a bit of drama in the comments. Gigabyte responded with a presentation of its own, in which it counter-claimed that those making the accusations ignored some key details, such as: what if the Ivy Bridge CPU's PCIe lanes are wired straight to the first PCIe slot, so that lane switches don't matter?

    [Gigabyte's five presentation slides]

    In its short presentation of no more than five slides, Gigabyte tried to provide an explanation for its claim that most of its new motherboards are Gen 3-ready. The presentation begins with a diplomatic-sounding statement of its agenda, followed by a disclaimer that three of its recently-launched boards, the Z68X-UD7-B3, P67A-UD7, and P67A-UD7-B3, lack Gen 3 readiness. This could be because those boards make use of a Gen 2 NVIDIA nForce 200 bridge chip; even the first PCI-E x16 slot is wired to that chip.

    The next slide forms the key component of Gigabyte's rebuttal: in motherboards with just one PCI-E x16 slot, there is no switching circuitry between the CPU's PCI-E root complex and the slot, so PCI-E Gen 3 will work.

    The following slides explain that in motherboards with more than one PCI-Express x16 slot wired to the CPU, a Gen 3 switch redirects the unused x8 PCIe lanes from the second slot to the first slot, giving the graphics card in it the full x16 PCIe bandwidth. MSI, however, already established that barring the G1.Sniper 2, none of Gigabyte's boards with more than one PCI-E x16 slot has Gen 3 switches.

    Likewise, the following slides explain how Gen 3 switches handle cases in which more than one graphics card is wired to the CPU. Again, we'd like to point out that barring the G1.Sniper 2, none of Gigabyte's 40 "Ready for Native PCIe Gen.3" boards has Gen 3 switches.

    The last slide, however, successfully rebuts MSI's argument: even on motherboards with Gen 2 switches, a Gen 3 graphics card can run in Gen 3 mode on the first slot, albeit at an electrical x8 data rate. Sure, it's not going to give you PCI-Express 3.0 x16, and sure, it only works for one graphics card, but it adequately validates Gigabyte's "Ready for Native PCIe Gen.3" claim from a purely logical point of view.
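    To make the logic of the slides concrete, here is a toy sketch of the routing decision as we read it (our own illustration; the function and its names are hypothetical, not anything from Gigabyte or MSI):

    Code:
    # Toy model of the slides' argument: which link the first x16 slot can
    # negotiate, given the CPU and the generation of the board's lane switch.
    def first_slot_link(ivy_bridge, switch_gen):
        """switch_gen: None (single slot, direct wiring), 2, or 3."""
        if not ivy_bridge:
            return ("Gen 2", 16)  # Sandy Bridge tops out at Gen 2 anyway
        if switch_gen is None:
            return ("Gen 3", 16)  # lanes wired straight from the CPU to the slot
        if switch_gen == 3:
            return ("Gen 3", 16)  # Gen 3 switch re-routes the second slot's x8
        return ("Gen 3", 8)       # Gen 2 switch: only the directly-wired x8 runs Gen 3

    for cpu, sw in [(False, None), (True, None), (True, 3), (True, 2)]:
        gen, width = first_slot_link(cpu, sw)
        print("Ivy Bridge=%s, switch=%s -> slot 1: %s x%d" % (cpu, sw, gen, width))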

    Now that Gigabyte has entered the debate, the onus will be on it to also clarify that, apart from the switching argument, all its "Ready for Native PCIe Gen.3" boards use Gen 3-compliant electrical components, which MSI claimed Gigabyte's boards lack in this slide.
     
    Last edited: Sep 20, 2011
    neliz, cadaveca and Derek12 say thanks.
  2. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,379 (11.54/day)
    Thanks Received:
    9,683
    inb4 flame war between the company reps again >.>
     
  3. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    21,140 (7.82/day)
    Thanks Received:
    7,669
    [image]
     
    AndreiD, 1c3d0g, tomkaten and 18 others say thanks.
  4. Derek12

    Joined:
    Jan 12, 2010
    Messages:
    1,099 (0.63/day)
    Thanks Received:
    163
    The only way to check any claims 100% is to try a PCIe 3 video card on PCIe 3-ready MSI & Gigabyte boards and see the differences.
     
  5. random

    random

    Joined:
    Oct 19, 2008
    Messages:
    3,043 (1.38/day)
    Thanks Received:
    686
    So... people who own Gigabyte mid-to-high-end motherboards with more than one PCIe slot will get x8/x8 3.0 in both single and dual GPU mode? :confused:
     
  6. tigger

    tigger I'm the only one

    Joined:
    Mar 20, 2006
    Messages:
    10,183 (3.24/day)
    Thanks Received:
    1,399
    Guaranteed to make me laugh, Mailman, thanks.
     
  7. kereta New Member

    Joined:
    Sep 3, 2011
    Messages:
    11 (0.01/day)
    Thanks Received:
    0
    I knew Gigabyte had it covered.
     
  8. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    42,379 (11.54/day)
    Thanks Received:
    9,683
    x8 3.0 on the first slot, x8 2.0 on the second (or disabled, according to other comments... which would suck).
     
    random says thanks.
  9. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,044 (6.15/day)
    Thanks Received:
    6,104
    I love how it is all marketing, and we won't see a graphics card that even begins to make use of the extra bandwidth in PCI-E 3.0 until long after these boards are obsolete.

    I'm more interested in PCI-E 3.0 x1 than anything else, so we can have SATA 6.0Gb/s cards that actually get the bandwidth they need from an x1 slot and don't need an x4 slot.
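    Rough math on that (my back-of-the-envelope, ignoring protocol overhead): a PCI-E 2.0 lane signals at 5 GT/s with 8b/10b encoding, roughly 500 MB/s, while a SATA 6.0Gb/s link can move up to ~600 MB/s of payload, so an x1 Gen 2 slot is a bottleneck. A PCI-E 3.0 lane signals at 8 GT/s with 128b/130b encoding, roughly 985 MB/s, plenty for a SATA 6.0Gb/s controller on a single lane.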
     
    1c3d0g says thanks.
    Crunching for Team TPU 50 Million points folded for TPU
  10. TheLostSwede

    TheLostSwede

    Joined:
    Nov 11, 2004
    Messages:
    933 (0.26/day)
    Thanks Received:
    164
    No, you'll only have the top slot working in x8 mode, at least according to these slides, and the second slot will be disabled.

    The original idea, as I understood it, was for the first slot to run in Gen 3 x8 mode and the second slot in Gen 2 x8 mode, but maybe that didn't work so well...
     
    Last edited: Sep 20, 2011
    Mussels says thanks.
  11. TheLostSwede

    TheLostSwede

    Joined:
    Nov 11, 2004
    Messages:
    933 (0.26/day)
    Thanks Received:
    164
    Not going to happen for quite some time, as Sandy Bridge-E doesn't support it, nor does Ivy Bridge, so yeah...
     
  12. repman244

    repman244

    Joined:
    Apr 7, 2011
    Messages:
    1,104 (0.85/day)
    Thanks Received:
    456
    Well, this is a never-ending flame war that doesn't benefit anyone, I think.

    It goes like this:
    [image]
     
  13. Fourstaff

    Fourstaff Moderator Staff Member

    Joined:
    Nov 29, 2009
    Messages:
    9,197 (5.14/day)
    Thanks Received:
    1,982
    Location:
    Home
    You probably want the extra features from PCI-E 3.0 rather than the bandwidth.
     
  14. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,044 (6.15/day)
    Thanks Received:
    6,104
    Sure they do. Sandy Bridge-E has 40 PCI-E 3.0 lanes to work with and Ivy Bridge is going to have 16; they can be used for anything, they don't have to be used for a graphics card.

    No, not really; the extra features are pretty useless to everyone...
     
    Last edited: Sep 20, 2011
    1c3d0g says thanks.
    Crunching for Team TPU 50 Million points folded for TPU
  15. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    21,140 (7.82/day)
    Thanks Received:
    7,669
    Until games stop being ports, most GPU upgrades are useless overall. Unless you fold or do 3D rendering, it's all fluff.
     
    garyinhere says thanks.
  16. DigitalUK

    DigitalUK New Member

    Joined:
    Oct 16, 2009
    Messages:
    503 (0.27/day)
    Thanks Received:
    68
    Location:
    UK South
    Thanks Mailman, was laughing my head off when I saw that.
     
  17. TheLostSwede

    TheLostSwede

    Joined:
    Nov 11, 2004
    Messages:
    933 (0.26/day)
    Thanks Received:
    164
    Sure, but I think the OP meant x1 slots from the chipset, which isn't going to happen until maybe Ivy Bridge-E or even later.
     
  18. bbmarley

    Joined:
    Jul 24, 2008
    Messages:
    486 (0.21/day)
    Thanks Received:
    79
    Wasn't it last year that ASUS called out Gigabyte?
    I wonder why there's all this Gigabyte hate from other brands.
     
  19. DannibusX

    DannibusX

    Joined:
    Aug 17, 2009
    Messages:
    2,528 (1.33/day)
    Thanks Received:
    979
    Location:
    United States
    Because they are competitors?
     
  20. TheMailMan78

    TheMailMan78 Big Member

    Joined:
    Jun 3, 2007
    Messages:
    21,140 (7.82/day)
    Thanks Received:
    7,669
    No. I don't want to own either of their products because of this childish BS. You can call a competitor out without pointing fingers. Have a little class, people.
     
    Damn_Smooth and cadaveca say thanks.
  21. cadaveca

    cadaveca My name is Dave

    Joined:
    Apr 10, 2006
    Messages:
    14,050 (4.50/day)
    Thanks Received:
    7,206
    Location:
    Edmonton, Alberta
    +1.

    I gotta say though, ASRock pulled the same thing, and all that Gigabyte has done in this situation is respond, as you would to a spoilt child throwing a tantrum.



    Kinda sucks they got dragged into this crap, but at the same time, I do dare say that they were one of the first to claim PCIe 3.0 support, so maybe they drew the attention to themselves.


    All I know is that I have been highlighting PCIe 3.0 switches in my reviews on every product that has them, and now, because of this, ANY PCIe switch on a board I review gets highlighted. I kind of appreciate the fact that this has provided me with more work, really, I DO. :banghead:


    Poo-poo on ASRock and MSI for starting this crap!
     
  22. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,713 (11.16/day)
    Thanks Received:
    13,669
    Location:
    Hyderabad, India
    Added a poll.
     
    v12dock and cadaveca say thanks.
  23. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,713 (11.16/day)
    Thanks Received:
    13,669
    Location:
    Hyderabad, India
    ASRock didn't get itself involved. It just sensed the situation and spammed us with the same PCIe Gen3 PR material it sent us months ago when it launched its Gen 3 motherboards. The same "Gen3 gets you laid" stuff.
     
  24. HalfAHertz

    HalfAHertz

    Joined:
    May 4, 2009
    Messages:
    1,893 (0.95/day)
    Thanks Received:
    380
    Location:
    Singapore
    So, um, what are the differences between PCI-E 2.0 x16 and PCI-E 3.0 x8? Don't they provide the exact same bandwidth?
     
  25. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    28,713 (11.16/day)
    Thanks Received:
    13,669
    Location:
    Hyderabad, India
    Yes, provided:
    • You use an Ivy Bridge CPU
    • The graphics card is PCI-E 3.0 compliant
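    The rough math (back-of-the-envelope, before protocol overhead): PCI-E 2.0 signals at 5 GT/s per lane with 8b/10b encoding, about 500 MB/s per lane, so x16 gives ~8 GB/s per direction. PCI-E 3.0 signals at 8 GT/s per lane with 128b/130b encoding, about 985 MB/s per lane, so x8 gives ~7.9 GB/s per direction. Practically the same pipe.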
     
