| System Name | An experiment in continuous upgrading | | |
|---|---|---|---|
| Processor | Intel Core i7-2600K @ 4.4 GHz | FX-8570 @ 4.0 GHz | Phenom II X4 965 |
| Motherboard | MSI P67A-GD53 | MSI 990FXA-GD80 | Asus M4A79 Deluxe |
| Cooling | Noctua NH-D14 | Zalman CNPS10X | Cooler Master 212+ |
| Memory | 24 GB DDR3-1866 | 8 GB | 8 GB |
| Video Card(s) | 2x ASUS GTX 970 Turbo (SLI) | Sapphire Radeon 7970 + GTX 670 (PhysX) | Radeon 4870 1 GB |
| Storage | 2x 240 GB SSD + 4 TB SSHD + HDDs | 240 GB SSD + HDDs | 120 GB SSD + WD Blue 500 GB |
| Display(s) | ASUS VG248 144 Hz + Samsung S23A700D 120 Hz + 3D Vision | 40" Sony 1080p TV | 23" 1080p |
| Case | Cooler Master HAF-X | Lian Li PC-8 | Antec 302 |
| Audio Device(s) | Sennheiser PC360 G4ME | Sound Blaster Zx | Generic |
| Power Supply | Corsair TX850W | Corsair TX750 | OCZ 700 |
| Mouse | SteelSeries Sensei | Logitech G402 W/L | Generic |
| Keyboard | Filco Majestouch Ninja Tenkeyless (MX Black) | Logitech wireless | SteelSeries 6Gv2 (MX Red) |
| Software | About 800 top-rated games | 200 top-rated games | No games |
| Benchmark Scores | No time for benching, I prefer gaming. | | |
Nope... My cousin was lagging in StarCraft II, and when I looked at CPU usage it was under 20% the whole time... Once I switched from Power Saver to High Performance, it went up to 60-80%. This happened Friday lol
Buggy, because that's not how it should behave. I'll try turning off Cool'n'Quiet and Turbo Core, although I like the idea of Turbo Boost. Or Core. Whatever.
In the meantime I flashed my motherboard BIOS again, plus the BIOS of both 570s, updated my motherboard drivers, etc. No DICE.
EDIT: It seems a lot of people on AMD rigs have the same problem. Someone posted the following interesting theory about it on the nVidia forums:
> I'm having the same issue as the OP, and I'm using two GTX 285s in SLI. I'd actually get decent FPS if I could use both of my cards to their full potential.
>
> A lot of people are having this issue, however...
>
> If the OP turns out to be using a 990FX chipset, I'm going to throw a brick through Nvidia's window... and DICE's too... for general purposes.
>
> I think Nvidia still has an 8-bit switch in their drivers left over from the old nVidia-chipset days, lol. Some of the older AMD nVidia chipsets were built with 8-bit HyperTransport links to the northbridge (all modern AMD HT implementations use 16-bit). That effectively limited HT 2.0 at the time to 2000 MB/s; an 8-bit switch on a modern HT 3.0 interface would limit bandwidth to 4000 MB/s. That is enough to saturate a PCIe x16 1.x link, but only half of what's needed to saturate a PCIe x16 2.0 link (8000 MB/s).
>
> On an AMD chipset that doesn't use the NF200 repeater chip (seen primarily in conjunction with Intel chipsets), both cards need full bandwidth capability when running SLI. Not to mention the latency discrepancy between PCIe and the HT bus. Even if the HT bus were being used to its full potential, I'm sure there is still little optimization within the drivers themselves for HT communication with the northbridge.
>
> So as it stands, there is enough bandwidth on an AMD chipset to satisfy ONE PCIe graphics card's needs, but once SLI is introduced, the "bottleneck" (as so many are anxious to call it) is in reality a lack of optimization to use the chipset's full potential. This is also why, I believe, AMD users see low GPU usage while utilizing so little CPU. It's the only logical explanation for the symptoms seen with AMD + SLI, and furthermore for why these symptoms are not seen with AMD's CrossFire.
>
> OPTIMIZE FOR AMD CHIPSETS ALREADY!
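For what it's worth, the bandwidth figures in that quote do add up. Here's a quick sketch of the arithmetic; the HT clocks used (1 GHz for HT 2.0, 2 GHz for HT 3.0) are assumptions chosen to match the quoted numbers, since actual links can run at other clocks:

```python
# HyperTransport is double data rate: 2 transfers per clock cycle,
# with width_bits / 8 bytes moved per transfer, per direction.
def ht_bandwidth_mb_s(clock_mhz: int, width_bits: int) -> int:
    return clock_mhz * 2 * width_bits // 8

# PCIe x16 bandwidth from per-lane throughput:
# ~250 MB/s per lane for 1.x, ~500 MB/s per lane for 2.0.
def pcie_x16_mb_s(per_lane_mb_s: int) -> int:
    return 16 * per_lane_mb_s

print(ht_bandwidth_mb_s(1000, 8))    # 8-bit HT 2.0 link  -> 2000
print(ht_bandwidth_mb_s(2000, 8))    # 8-bit HT 3.0 link  -> 4000
print(ht_bandwidth_mb_s(2000, 16))   # 16-bit HT 3.0 link -> 8000
print(pcie_x16_mb_s(250))            # PCIe 1.x x16       -> 4000
print(pcie_x16_mb_s(500))            # PCIe 2.0 x16       -> 8000
```

So an 8-bit-limited HT 3.0 link (4000 MB/s) would indeed cover one PCIe 1.x x16 card but only half of a PCIe 2.0 x16 card's bandwidth, which is the core of the theory.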
What do you think?