
Problem with 9800GT in SLI mode

I can't believe we are still arguing about this. :shadedshu

Where did you see me say that all AM2 boards support AM3 procs?

All I said was that AM3 CPUs support (i.e., are backwards compatible with) the AM2(+) socket and thus the AM2(+) platform.

If you just say "AM3 CPUs support AM2(+) and DDR2" and don't go any further, you are leaving out the fact that the board has to support the CPU. What you said essentially means that any AM3 CPU will work in any AM2/AM2+ motherboard, and that is not true.
 
That is just your implication.
I said AM3 CPUs support the AM2(+) socket and DDR2.
The chip does plug into the AM2 socket correctly, and it does have a functional DDR2 IMC on it.
Both of those statements are correct.

Whether the board manufacturer provides proper BIOS updates is another matter.
The fact that a particular board does not work, for whatever reason, does not mean my statement is wrong.

If your particular board doesn't work with AM3 procs because:
1. some greedy bastard wants you to buy a new board,
2. your board has cheap power circuitry, or
3. you did not upgrade your BIOS (if an update exists),
that doesn't mean you go and sue AMD for it. Same goes here.
 
I'm not arguing any of that. I'm arguing that when you say "AM3 CPUs support AM2 motherboards", while true, it is confusing and implies that all AM2/AM2+ motherboards will work with AM3 CPUs. By saying what you said and leaving out the fact that the board also has to support the CPU, you are implying that board support doesn't matter when it really does.
 
I can assure you that Green Edition cards can run with normal cards without issue. However, the normal card should downclock to match the Green Edition's speeds. Use GPU-Z to make sure both cards are lowering to the Green Edition clock speeds (550MHz core / 1375MHz shader). A clock mismatch would lead to crashes and artifacts, though, not a lack of performance.
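
If you'd rather script that check than eyeball GPU-Z, here is a minimal sketch, assuming the nvidia-ml-py (pynvml) Python package and an NVIDIA driver new enough to expose NVML (a G92-era card may need an older driver that lacks it):

import pynvml

# Print the current core/shader clocks of every NVIDIA GPU so you can
# confirm both SLI cards are running at matching speeds.
pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)  # bytes on older pynvml versions
        core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        shader = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_SM)
        print(f"GPU {i} ({name}): core {core} MHz, shader {shader} MHz")
finally:
    pynvml.nvmlShutdown()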

That's interesting. When I have SLI enabled, the cards are running at their default speeds when I check them with GPU-Z. The Sparkle's speeds are not lowered to those of the Palit card.

As for the CPU support: according to the Gigabyte website, my motherboard does support AM2+ and AM3 CPUs, so please don't argue anymore.
 
Did you try the SLI indicator yet?
 
Come to think of it, I don't remember if GPU-Z shows the lowered clock speeds when in SLI...

But, yeah, as I said, the clock speed issue wouldn't cause a lack of performance; it would cause crashing.

Also, as others have said, 3DMark is pretty CPU bound, especially at the standard resolution. I remember that with my SLI 9800GTXs I scored about 13,000 in 3DMark06 at the standard resolution and 13,000 at the highest resolution, because 3DMark06 was CPU bound. This was with an E6600 @ 3.6GHz, IIRC.

What resolution are you using? Games might also become CPU bound at low enough resolutions; SLI might only help at higher resolutions.

Edit: I scored 16,050 with my 9800GTX cards in SLI at the default 1280x1024 resolution, and 16,000 at 1680x1050. I was CPU limited.
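
To put numbers on the CPU-bound argument: if the score barely drops when the resolution goes up, the GPU has headroom and the CPU is the limit. A quick sketch using the scores above:

# Near-zero score drop at a higher resolution means the GPU has headroom
# and the CPU is the bottleneck.
def score_drop(low_res_score, high_res_score):
    return (low_res_score - high_res_score) / low_res_score

# 9800GTX SLI: 16,050 at 1280x1024 vs 16,000 at 1680x1050
print(f"Drop: {score_drop(16050, 16000):.1%}")  # ~0.3% -> CPU limited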
 
Now, back to the OP's concerns: his board, the GA-M57SLI-S4 (all rev.), does support everything up to the latest AM3 hexa-core CPUs.
http://www.gigabyte.tw/Support/Motherboard/CPUSupport_Model.aspx?ProductID=2539&ver=#anchor_os
NO. His board DOES NOT SUPPORT THUBANS.
Look at the bold N/A next to those processors; that means NO.
Also, no 940BEs or 965/955BEs.

Sorry, but I had to make a point there.

Back to SLI: it's a fact that CrossFire doesn't work in windowed mode, and I'm not sure if the same applies to SLI, so try using fullscreen and see if anything changes.
 
I got home today and tried a few things that some of you suggested. First I tried to enable the SLI indicator, but with no success. I don't have that option in the control panel. The only SLI-related option I have in the "Manage 3D settings" section is "SLI performance mode". Could this be because I don't have an SLI bridge connected? The Palit card simply does not have an SLI bridge connector.

I also ran benchmarks in 3DMark06 with SLI disabled/enabled, but at a higher resolution (1920x1440). I scored 7250/7890 points with SLI disabled/enabled respectively. This is the first time I have actually seen any difference in performance between SLI enabled and disabled. Until now I ran 3DMark only at the default 1280x1024 resolution.

Another interesting thing I found out: my CPU is not even a 5000+. It's a 4200+ :roll: I overclocked it about 2 years ago when I bought it and completely forgot about it until today. I thought it was a 5000+ because Everest recognized it that way, but now, after the BIOS update, the CPU is at its default clock and shows as a 4200+.

I think there's no doubt I need a new CPU even if I had just one 9800GT. Correct me if I'm wrong, but with a better CPU the 9800GT could score much more than 7000 points in 3DMark06.
 
Yes, a single 9800GT should be scoring a fair bit more than 7,000 at that resolution. It should actually be closer to 9,500.

The SLI indicator option is hidden pretty well in the control panel. If you highlight "Set PhysX and SLi Configuration" on the left-hand side, an extra option called "3D Settings" appears up at the top. It shows up in the File/Edit/View bar at the very top of the window, and only when you have something highlighted under "3D Settings" on the left. The option to enable the SLI indicator is in there.

Not having the SLI bridge won't affect whether the option is there. However, it will affect performance. The bridge allows the cards to communicate directly with each other instead of doing it over the PCI-E bus. Since your cards have to use the PCI-E bus, performance in SLI will suffer. It will suffer even more since your board only runs the second card at x8 instead of the full x16 (rough bandwidth math after this post). However, performance will still be noticeably better than one card.

In your case, with the weaker CPU, SLI might not help in getting more frames per second; however, it will allow you to raise the graphical settings and resolution in most games.
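
On the x16 vs x8 point, here is the rough per-direction bandwidth math, assuming PCI-E 1.1 lanes (~250 MB/s per lane each way), which is typical for boards of that generation:

# Approximate per-direction PCI-E bandwidth; the second slot at x8 has
# half the bandwidth of a full x16 slot.
MB_PER_LANE = 250  # assumed PCI-E 1.1 throughput per lane, per direction

for lanes in (16, 8):
    print(f"x{lanes}: ~{lanes * MB_PER_LANE / 1000:.1f} GB/s per direction")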
 
Let them loose and enjoy them; they are a truly formidable pair of cards. If you need any help with SLI or anything, you can PM me.

Good Luck :rockout:
 
I've just made a quick test with the SLI indicator enabled (I finally found it, thanks to newtekie1). Here are two screenshots, one from 3DMark and one from Crysis. Crysis is with all settings at Enthusiast level.

[Screenshot: 3dmarkz.jpg — 3DMark06 with the SLI indicator]

[Screenshot: crysiswh.jpg — Crysis with the SLI indicator]
 
I just got the Athlon II X3 435. I ran 3DMark06 benchmarks again at default settings with SLI disabled/enabled and I get 12100/13100 points. I have no idea if these results are normal for my configuration. With the Athlon 4200+ I scored around 8900 points (default 3DMark settings again) no matter whether SLI was enabled or disabled.
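
For a quick read on how much SLI is actually adding, a little sketch using the scores in this post:

# SLI scaling from the scores above: the gain is small because 3DMark06
# at default settings is still largely CPU bound.
def sli_gain(single_card_score, sli_score):
    return sli_score / single_card_score - 1

print(f"Athlon II X3 435: ~{sli_gain(12100, 13100):.0%} gain from SLI")  # ~8%
# With the Athlon 4200+ the score was ~8900 either way, i.e. ~0% gain.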
 
Much better. With a quad core you could reach something like 18000; 13000 seems fine for your system.
 
Isn't an Athlon II X4 a quad core? I think you meant with an overclock on the quad core...

IMO, it doesn't matter what 3DMark06 scores anymore; Vantage is where enabling SLI will show the improvements.
 
Athlon II X3, if you read again carefully :laugh:
 
Too bad Vantage needs Vista to run.
 