Thursday, August 4th 2011

Production Gigabyte G1.Sniper 2 Features PCI-Express Gen. 3

About a month ago, we were treated to the first pictures of G1.Sniper 2, Gigabyte's first LGA1155 motherboard in its G1.Killer series of motherboards targeting the gamer-overclocker market segment. High-resolution pictures showed the prototype having PCI-Express Gen. 2 slots like most other LGA1155 boards. It turns out, according to a photo session of a production sample by tech blog SIN's Hardware, that Gigabyte refined the design for the production version (the one that will actually be sold in the markets): it features PCI-Express 3.0 (Gen. 3) graphics slots.

For an LGA1155 motherboard to have Gen. 3 PEG slots, it requires Gen. 3-compliant switching circuitry, which wasn't available to motherboard vendors when they were designing their first LGA1155 boards. Now that those components are available, motherboard vendors are wasting no time rolling out new variants of their LGA1155 boards that feature Gen. 3 PCI-Express slots. Gigabyte applied the "PCI-Express 3.0" marking to the board manually using stickers, suggesting that adding Gen. 3 slots may indeed have been a last-minute decision at Gigabyte. The other interesting marking on the G1.Sniper 2 carton is the mention of the board being ready for upcoming 22 nm "Ivy Bridge" Core processors. More pictures, and a preview, at the source.
Source: SIN's Hardware

19 Comments on Production Gigabyte G1.Sniper 2 Features PCI-Express Gen. 3

#1
dickobrazzz
Hmm, from what I can see, Gigabyte wrote that their mobo can take 22 nm Ivy Bridge - that sounds good :)
But I think the managers from Intel will present us a new chipset, e.g. P77, and say that only on P77 can you overclock the new Ivy Bridge using 133 MHz BCLK, but if you want 150 MHz BCLK you should buy Z77 :banghead:
Posted on Reply
#2
N.E.A
dickobrazzzHmm, from what I can see, Gigabyte wrote that their mobo can take 22 nm Ivy Bridge - that sounds good :)
But I think the managers from Intel will present us a new chipset, e.g. P77, and say that only on P77 can you overclock the new Ivy Bridge using 133 MHz BCLK, but if you want 150 MHz BCLK you should buy Z77 :banghead:
Can't agree more on Intel doing such a thing, but I think it would be a scar on their face if they did. They're not that stupid to do such a thing... or are they?!
Anyway, it's always good to see new spicy components :D and I think it's time for me to get prepared for Ivy Bridge.
Posted on Reply
#3
newtekie1
Semi-Retired Folder
So, other than a sticker on the board that says PCI-E Gen 3, is there anything physically different about PCI-E 3.0 slots compared to PCI-E 2.0 slots?
Posted on Reply
#5
mastrdrver
So what's with the NB heat sink when there is no northbridge? :shadedshu
Posted on Reply
#6
newtekie1
Semi-Retired Folder
Steven BI believe PCI-E 3.0 needs a PCI-E 3.0 switch, and this board seems to have it.

tech.163.com/digi/11/0729/15/7A4VKDTB00162OUT_5.html
But the PCI-E 3.0 controller is built into the CPU, so if you put a CPU that has a PCI-E 3.0 controller in a board that was originally designed for PCI-E 2.0 ports, shouldn't the board have 3.0 ports anyway? From everything I've read, nothing on the board is special; it's all about the controller on the CPU. No?
Posted on Reply
#7
Steven B
The whole PCI-E 3.0 thing, I think, has to do with Ivy Bridge, as it will have a PCI-E 3.0 controller, so these new boards from ASRock, MSI, Gigabyte, and probably ASUS soon, will all be ready to support Ivy Bridge's features.
Posted on Reply
#8
Th3pwn3r
mastrdrverSo what's with the NB heat sink when there is no northbridge? :shadedshu
It's all about looks.

Also, ASRock has taken a stab at ASUS and Gigabyte for having faux PCI-E 3.0 slots.

Err, not sure what I posted is correct. Perhaps they were saying ASUS and Gigabyte are lacking PCI-E 3.0.
Posted on Reply
#9
Steven B
I think they made that slide before they knew GB did this; they said that GB is using PCI-E 2.0 switches, but that is obviously not the case. GB is using the same switches as in that link, same as MSI. ASRock is using NXP PLX switches. What surprises me is that GB hasn't been advertising PCI-E 3.0; they have a microsite for this board as well now.
Posted on Reply
#10
dickobrazzz
PCIe 3.0, PCIe 3.0... we haven't got compatible VGA cards yet, but if your mobo hasn't got PCIe Gen 3.0, you're a loser :laugh:
I think Gen 3.0 is like USB 3.0, because I can't see the superiority of USB 3.0 over 2.0. Yes, it's a bit faster, but not much faster than 2.0.
Of course, 3.0 x8 = 2.0 x16 is good, but I can't see the superiority of 2.0 x16 over x8 either :banghead:
Only in benchmarks, and sometimes in games you lose around ~1-2 fps.
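The "3.0 x8 = 2.0 x16" claim checks out on paper. A rough sketch of the arithmetic, using the published PCIe line rates and encoding overheads (spec values, not figures from this thread):

```python
# Usable per-lane bandwidth after encoding overhead:
# Gen 2 runs at 5 GT/s with 8b/10b encoding (80% efficient),
# Gen 3 runs at 8 GT/s with 128b/130b encoding (~98.5% efficient).
gen2_lane_gbps = 5.0 * 8 / 10      # 4.0 Gbit/s usable per Gen 2 lane
gen3_lane_gbps = 8.0 * 128 / 130   # ~7.88 Gbit/s usable per Gen 3 lane

gen2_x16 = gen2_lane_gbps * 16 / 8  # 8.0 GB/s for a Gen 2 x16 slot
gen3_x8 = gen3_lane_gbps * 8 / 8    # ~7.88 GB/s for a Gen 3 x8 slot

print(f"Gen 2 x16: {gen2_x16:.2f} GB/s, Gen 3 x8: {gen3_x8:.2f} GB/s")
```

So a Gen 3 x8 link lands within about 2% of a Gen 2 x16 link, which matters mostly for dual-card setups that split the CPU's sixteen lanes.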
Posted on Reply
#12
KashunatoR
I read the comments, but I still don't understand what actual difference PCI-E 3.0 makes over the 2.0 version.
Posted on Reply
#14
cadaveca
My name is Dave
dickobrazzzit's a bit faster, but not much faster than 2.0
I dunno about you, but my USB 3.0 drive is easily 3x faster on average on its native interface than on USB 2.0. It's nearly as fast as SATA 3.0 Gb/s. I can't wait to try an SSD on USB 3.0, but I need an enclosure.
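The interface ceilings back that observation up. A quick sketch with the nominal spec rates (assumed values; real drives land well below these ceilings, but the ordering is the same):

```python
# Theoretical usable throughput per interface, in MB/s:
usb2_mbs = 480 / 8                # 60 MB/s raw ceiling for USB 2.0 (480 Mb/s)
usb3_mbs = 5000 * 8 / 10 / 8      # 500 MB/s ceiling for USB 3.0 (5 Gb/s, 8b/10b)
sata2_mbs = 3000 * 8 / 10 / 8     # 300 MB/s ceiling for SATA 3.0 Gb/s (8b/10b)

# USB 3.0's ceiling actually exceeds SATA 3.0 Gb/s, so a fast external
# drive being "nearly as fast as SATA" is entirely plausible.
print(usb2_mbs < sata2_mbs < usb3_mbs)
```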

[Benchmark screenshots: USB 3.0 vs. SATA 3.0 Gb/s]
Likewise, PCIe 3.0 will offer similar benefits to devices that are natively capable of making use of it. Specifically, with future AMD cards planning to use IOMMU capabilities, traffic over the PCIe bus could potentially see a dramatic increase that current stuff just isn't capable of. Don't forget that PCIe isn't just CPU-to-device, but also device-to-device. ;)
Posted on Reply
#15
Hotobu
Steven BI think PCI-E 3.0 offers more bandwidth, but at this point its a gimmick.
Eh, I wouldn't say that. The upcoming generation of GPUs will use PCI-E 3.0, and so will the next generation of Intel's processors. The motherboards that support this stuff have to come out first in order to ensure compatibility. How silly would it be for gas stations to carry ethanol fuel if there weren't at least some cars on the market that could use it first?
Posted on Reply
#16
newtekie1
Semi-Retired Folder
HotobuEh, I wouldn't say that. The upcoming generation of GPUs will use PCI-E 3.0, and so will the next generation of Intel's processors. The motherboards that support this stuff have to come out first in order to ensure compatibility. How silly would it be for gas stations to carry ethanol fuel if there weren't at least some cars on the market that could use it first?
No, he is saying the bandwidth is a gimmick, because no card will use it.

But that isn't really the case; it's just that the traditional high-bandwidth cards we normally think of won't use it. The fact is that even the next generation of graphics cards won't come close to maxing out PCI-E 2.0 x16, and will probably barely show a difference at PCI-E 2.0 x8. However, the cards that will show a difference are PCI-E x1 cards, specifically SATA 6.0 Gb/s RAID cards. Sadly, most motherboard manufacturers will probably keep wiring the x1 slots to the southbridge, so they will still be 2.0 slots...
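The x1-slot point is easy to quantify. Rough numbers using spec line rates (assumed values): a single SATA 6 Gb/s port can already outrun a PCI-E 2.0 x1 link.

```python
# Usable throughput in MB/s after 8b/10b or 128b/130b encoding overhead:
sata6_mbs = 6000 * 8 / 10 / 8      # 600 MB/s per SATA 6 Gb/s port (8b/10b)
pcie2_x1 = 5000 * 8 / 10 / 8       # 500 MB/s per PCI-E 2.0 x1 link (8b/10b)
pcie3_x1 = 8000 * 128 / 130 / 8    # ~985 MB/s per PCI-E 3.0 x1 link (128b/130b)

print(sata6_mbs > pcie2_x1)   # a 2.0 x1 slot bottlenecks even one fast port
print(sata6_mbs < pcie3_x1)   # a 3.0 x1 slot has headroom to spare
```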
Posted on Reply
#17
Hotobu
Yes, but I'm saying it's a much easier transition for everyone involved when PCI-E 3.0 slots are already ubiquitous by the time that bandwidth is needed. What about the guy who gets an SB setup now, throws in an Ivy chip later, and then is shopping for a GPU two years from now? Chances are he'll be happy that his board supports 3.0.
Posted on Reply
#18
Th3pwn3r
Yeah, I've been kicking myself for cheaping out on my board, which is socket 775, because my other 775 board died. I should have swapped everything then, because now that I want an SSD, I'll be bottlenecked by my SATA 2.0 ports.
Posted on Reply
#19
mastrdrver
newtekie1No, he is saying the bandwidth is a gimmick, because no card will use it.

But that isn't really the case; it's just that the traditional high-bandwidth cards we normally think of won't use it. The fact is that even the next generation of graphics cards won't come close to maxing out PCI-E 2.0 x16, and will probably barely show a difference at PCI-E 2.0 x8.
We'll see when it does show up, but I wonder if cards really are bandwidth-limited when I see things like this:
GTX 480 article


Only a 7% loss in performance when limiting bandwidth to 25% of the original? Put another way, you lose about 5 fps on average when going from 5 GT/s to 1.25 GT/s. On a GTX 480, that seems odd to me.

The results were almost the same with the 5870:
5870 article
Posted on Reply