
[PCPER] Frame-rating, what it is, what it does, and why it's important.

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.61/day)
AMD made it sound like all these issues exist because they didn't know they were there.

Prove they did know. That's gonna be hard. I mean, many enthusiasts have been complaining about "Microstutter" for many years. Did they ever see those complaints?


Did they recognize the problem?


I don't think it's really that important, unless it's a problem they cannot fix.
 
Joined
Feb 19, 2006
Messages
6,270 (0.94/day)
Location
New York
Processor INTEL CORE I9-9900K @ 5Ghz all core 4.7Ghz Cache @1.305 volts
Motherboard ASUS PRIME Z390-P ATX
Cooling CORSAIR HYDRO H150I PRO RGB 360MM 6x120mm fans push pull
Memory CRUCIAL BALLISTIX 3000Mhz 4x8 32gb @ 4000Mhz
Video Card(s) EVGA GEFORCE RTX 2080 SUPER XC HYBRID GAMING
Storage ADATA XPG SX8200 Pro 1TB 3D NAND NVMe,Intel 660p 1TB m.2 ,1TB WD Blue 3D NAND,500GB WD Blue 3D NAND,
Display(s) 50" Sharp Roku TV 8ms responce time and Philips 75Hz 328E9QJAB 32" curved
Case BLACK LIAN LI O11 DYNAMIC XL FULL-TOWER GAMING CASE,
Power Supply 1600 Watt
Software Windows 10
AMD made it sound like all these issues exist because they didn't know they were there.

My guess is they did not know until it was pointed out to them. I doubt that the red team would behead themselves, and I am sure they will have a resolution to the issue.
 

HammerON

The Watchful Moderator
Staff member
Joined
Mar 2, 2009
Messages
8,397 (1.51/day)
Location
Up North
System Name Threadripper
Processor 3960X
Motherboard ASUS ROG Strix TRX40-XE
Cooling XSPC Raystorm Neo (sTR4) Water Block
Memory G. Skill Trident Z Neo 64 GB 3600
Video Card(s) PNY RTX 4090
Storage Samsung 960 Pro 512 GB + WD Black SN850 1TB
Display(s) Dell 32" Curved Gaming Monitor (S3220DGF)
Case Corsair 5000D Airflow
Audio Device(s) On-board
Power Supply EVGA SuperNOVA 1000 G5
Mouse Roccat Kone Pure
Keyboard Corsair K70
Software Win 10 Pro
Benchmark Scores Always changing~
Thanks Dave for the info.
I switched to the HD 7970s back in November from two GTX 580s as they are far better GPUs for crunching. I worried about the "microstuttering" that I had been hearing about for years. The last AMD/ATI multi-card setup I had was two 4870s. I am a little surprised at the findings, as I have not had an issue with BF3 and only one issue with Crysis 3, where I have to Alt+Tab to get CrossFire to work. So I have been under the assumption (yes, I know better than to assume :)) that all has been working well, when clearly it is not.
I do not seem to have issues with microstuttering, and I am able to get (at least I think, as I am using Fraps) 60 to 70 FPS in Crysis 3 MP running at 2560x1600 with vsync and motion blur off and SMAA low (1x), while everything else is set to Very High (AF 16).
I have looked at Afterburner on many occasions to see what it was saying as far as GPU usage and this is what I normally see:


So from what I am reading in this thread, these GPUs are underperforming and I am not getting the most for my money. Sad indeed :(
 
Last edited:

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.61/day)
So from what I am reading these GPUs are underperforming and I am not getting the most for my money. Sad indeed

Well, on the other hand, does a single GPU deliver the same experience for you?

Because for me, it does, even slightly better on occasion.



However, if we take into account things like Blu-ray's 24 FPS being good enough for many, and we say that your 70 FPS is actually 35, it is understandable that many users would just never notice.

What kind of sucks, to me, is that this means all the extra power usage is for nothing as well, never mind the cost of the cards.

I am also pretty sure that GPU-Compute is handled differently.
 
Joined
Oct 26, 2011
Messages
3,145 (0.69/day)
Processor 8700k Intel
Motherboard z370 MSI Godlike Gaming
Cooling Triple Aquacomputer AMS Copper 840 with D5
Memory TridentZ RGB G.Skill C16 3600MHz
Video Card(s) GTX 1080 Ti
Storage Crucial MX SSDs
Display(s) Dell U3011 2560x1600 + Dell 2408WFP 1200x1920 (Portrait)
Case Core P5 Thermaltake
Audio Device(s) Essence STX
Power Supply AX 1500i
Mouse Logitech
Keyboard Corsair
Software Win10
Basically, that's what I've experienced too in past years.

If a game was smooth on a single GPU, then it was no problem on multi-GPU.

As soon as the single GPU configuration wasn't enough, everything started stuttering in a multi-GPU config.

So that's why triple 30" was unplayable most of the time?
 
Joined
Oct 3, 2012
Messages
229 (0.05/day)
Location
Brazil
System Name Tiffany
Processor Intel i7 4770K @ 4.5Ghz
Motherboard Gigabyte Z87X-UD4H
Cooling Corsair H100i Push-Pull
Memory Corsair Vengeance 1600MHz 16GB
Video Card(s) XFX R9 290 Crossfire @ 925Mhz
Storage WD 500GB + Seagate 1 TB
Display(s) BenQ XL2420TX 120hz
Case Cooler Master CM690II Black & White
Audio Device(s) Onboard
Power Supply Cooler Master Silent Pro 1000W
Software Windows 7
We’re proud of the work that we’ve put into this – and we think it can help gamers get the experience they’re paying for. So we’re opening up our FCAT solution, making the scripts and software associated with FCAT freely-modifiable and redistributable. The technical press has already dug in, and the results have been dramatic.

Our hope: that third-party apps can replicate and replace our tools, giving gamers what they need to be sure they’re getting all of the graphics quality they’re paying for.

I'm surprised, actually. I thought they would take advantage of the tool. I was totally wrong (:shadedshu).

And that's great. :D
 
Joined
Apr 30, 2012
Messages
3,881 (0.88/day)
They already put the last part to the test.

Our hope: that third-party apps can replicate and replace our tools, giving gamers what they need to be sure they’re getting all of the graphics quality they’re paying for.

You can provide support, but Nvidia won't open-source it. So expect all the Nvidia AIB partners with benching & monitoring tools for the GPU to implement support.

I suspect this is how the PR will go. Nvidia will dump it to reviewers with the kits like they already have and will encourage AIB partners to implement support. Then run a marketing campaign and make it seem they had nothing to do with it. Sneaky but effective.
 
Joined
Oct 1, 2006
Messages
4,899 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
They already put the last part to the test.



You can provide support, but Nvidia won't open-source it. So expect all the Nvidia AIB partners with benching & monitoring tools for the GPU to implement support.

I suspect this is how the PR will go. Nvidia will dump it to reviewers with the kits like they already have and will encourage AIB partners to implement support. Then run a marketing campaign and make it seem they had nothing to do with it. Sneaky but effective.
Even if they let people know they have something to do with the tool, as long as the results are true they are in a good spot.
After all, whose damn fault is it that CrossFire has problems? ;)
 
Joined
Apr 30, 2012
Messages
3,881 (0.88/day)
It's AMD's, of course.

But Nvidia's FCAT is just FRAPS at the other end of the pipeline. We need to know more about what goes on in between at each step for a better analysis.

FRAPS lets you know how the game engine is spitting out frames.

Nvidia's FCAT tells you how they are displayed at the end of the pipeline.

Game Engine is X--X--X--X--X

GPU 1 displays X--X-----X--X

GPU 2 displays X-X-X-X-X

You need a better understanding of how it's being interpreted along the way. GPU 2 can look smoother, but its timing is off; GPU 1 might be missing something, but its timing reflects the game's intention.

The TechReport did a better job combining the two.
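Just to make that concrete, here is a quick Python sketch (my own toy numbers, nothing from FRAPS or FCAT) that compares the two views: how evenly frames are paced on screen versus how closely the on-screen timing tracks the game engine's cadence, using timestamps shaped like the X-- diagrams above.

Code:
# Hypothetical frame timestamps in milliseconds, invented to mirror the diagrams above.
# "engine" = when the game engine handed off each frame (the FRAPS-style view).
# "gpu1"/"gpu2" = when each frame actually hit the screen (the FCAT-style view).
engine = [0, 30, 60, 90, 120]    # even 30 ms cadence: X--X--X--X--X
gpu1   = [0, 30, 85, 115, 145]   # one long gap:       X--X-----X--X
gpu2   = [0, 20, 40, 60, 80]     # evenly spaced but off the engine's pace: X-X-X-X-X

def intervals(ts):
    """Frame-to-frame intervals in ms."""
    return [b - a for a, b in zip(ts, ts[1:])]

def pacing_spread(ts):
    """Max minus min interval: 0 means perfectly even pacing."""
    iv = intervals(ts)
    return max(iv) - min(iv)

def cadence_error(ts, ref):
    """Average difference between displayed intervals and the engine's intervals."""
    iv, ref_iv = intervals(ts), intervals(ref)
    return sum(abs(a - b) for a, b in zip(iv, ref_iv)) / len(ref_iv)

for name, ts in (("GPU 1", gpu1), ("GPU 2", gpu2)):
    print(name, "| pacing spread:", pacing_spread(ts), "ms",
          "| cadence error vs engine:", round(cadence_error(ts, engine), 1), "ms")

# GPU 2 wins on pacing spread (it looks smoother), but GPU 1's intervals track the
# engine's timing more closely -- the trade-off described in the post above.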
 
Joined
Oct 1, 2006
Messages
4,899 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
You need a better understanding of how it's being interpreted along the way. GPU 2 can look smoother, but its timing is off; GPU 1 might be missing something, but its timing reflects the game's intention.

The TechReport did a better job combining the two.
More information is always nice, but to be honest, now that AMD admits it's a problem on their hands, I am more interested in when they will get a fix.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,233 (2.61/day)
More information is always nice, but to be honest, now that AMD admits it's a problem on their hands, I am more interested in when they will get a fix.

Soon, I suppose. I understand that AMD even has an open offer for a job to any programmer that can fix this immediately. Interested parties should contact Justin Boggs @ AMD. If you don't know how to get a hold of him, and really want that job, send me a PM.


Now, I know, 100%, that I told AMD staff that this was an issue, but that staff member may have just kept that to themselves. I also told W1zz that I thought this was happening many months ago, too. But what I found when telling people about this problem was that the responses were rather... well... bland. I approached every type of user, from the general public to contacts at hardware companies, and they all ignored the issue.



We heard a few months ago that AMD has issues with memory management and that they were working to fix it (the expected release of that driver was NOW, but there is no such driver yet; instead, we get more information and another time extension).


We also have a 7990 launch imminent. Or not. It seems to me that this driver problem is what held AMD back from releasing the 7990, and I am pretty sure that even though AMD might get the 7990 working right, that does not mean single-card CrossFire will magically be fixed, either.


All those "7990's" already sold by Powercolor and ASUS...don't work right. :roll: That's really funny.
 
Joined
Apr 30, 2012
Messages
3,881 (0.88/day)
We also have a 7990 launch imminent. Or not. It seems to me that this driver problem is what held AMD back from releasing the 7990, and I am pretty sure that even though AMD might get the 7990 working right, that does not mean single-card CrossFire will magically be fixed, either.


All those "7990's" already sold by Powercolor and ASUS...don't work right. :roll: That's really funny.

Whoever bought a 7990 should have been paying closer attention.

In a Hexus interview with PowerColor last year, there were hints of issues with the original concept
@ 4:00 mark



Here is something interesting on Nvidia FCAT

Kyle_Bennett, HardOCP Editor-in-Chief, posted on it:

Again this comes down to interpretation of the data and what is being compared in any review. If CrossFire was no better than a comparable single GPU card, then real world gaming testing of highest attainable settings and resolutions would expose this easily. Again, it is why we were first in the industry to start this tremendously resource intensive testing years ago.

We have been talking to NVIDIA about frametime testing and collection for a long time now and there is good information back from inside the NVIDIA organization that HardOCP GPU reviews was the catalyst for this coming about. We had the opportunity to help develop the program tools with NVIDIA but chose not to. PCPer has put an incredible amount of time and money into this program that we were simply not comfortable with spending. PCPer has done a great deal of needed work on this with NVIDIA, which is commendable, but I am not sure data collection on this front will prove to be the end all be all in GPU reviews. It all still comes down to evaluating the end user gaming experience and how well the hardware allows you to achieve your wants and needs on this front. Frame time data collection will never be something that any users can use at home easily so it will never be more than a review data point. Focus on the user experience will still have the most impact on video card sales making sure the end user gets what he wants and needs.
 
Last edited by a moderator:
Joined
Feb 24, 2009
Messages
3,516 (0.63/day)
System Name Money Hole
Processor Core i7 970
Motherboard Asus P6T6 WS Revolution
Cooling Noctua UH-D14
Memory 2133Mhz 12GB (3x4GB) Mushkin 998991
Video Card(s) Sapphire Tri-X OC R9 290X
Storage Samsung 1TB 850 Evo
Display(s) 3x Acer KG240A 144hz
Case CM HAF 932
Audio Device(s) ADI (onboard)
Power Supply Enermax Revolution 85+ 1050w
Mouse Logitech G602
Keyboard Logitech G710+
Software Windows 10 Professional x64
I don't get it. If the maximum resolution of the capture card is 2560x1440 at 60hz, then how is he capturing Eyefinity/Surround resolutions?

Is the capture card only 1 of the 3 monitors?

I find this odd since just about everyone who has experienced both nVidia and AMD multi-monitor solutions always says to go with AMD.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.22/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
I don't get it. If the maximum resolution of the capture card is 2560x1440 at 60hz, then how is he capturing Eyefinity/Surround resolutions?

Is the capture card only 1 of the 3 monitors?

Basically, yes. He only has to capture the one monitor with the color bar.

I find this odd since just about everyone who has experienced both nVidia and AMD multi-monitor solutions always says to go with AMD.

Yeah, but they only say that because the AMD solution has more memory (typically 3GB vs. 2GB) and the AMD solution was giving better framerates. But they were basing their framerate numbers on FRAPS or some other similar program, which obviously isn't giving the real picture.
 
Joined
Dec 24, 2010
Messages
449 (0.09/day)
Location
mississauga, on, Canada
System Name YACS amd
Processor 5800x,
Motherboard gigabyte x570 aorus gaming elite.
Cooling bykski GPU, and CPU, syscooling p93x pump
Memory corsair vengeance pro rgb, 3600 ddr4 stock timings.
Video Card(s) xfx merc 310 7900xtx
Storage kingston kc3000 2TB, amongst others. Fanxiang s770 2TB
Display(s) benq ew3270u, or acer XB270hu, acer XB280hk, asus VG 278H,
Case lian li LANCOOL III
Audio Device(s) obs,
Power Supply FSP Hydro Ti pro 1000w
Mouse logitech g703
Keyboard durogod keyboard. (cherry brown switches)
Software win 11, win10pro.
Meh. Most people seem to not care.

yes!...

(slight bit off topic WRT crossfire):
how many people care that Second Life runs better on Nvidia cards compared to AMD cards... (and Second Life is "CPU bound"... and it's OpenGL, and runs on Linux!...)

annoys me that a GT 640/GTX 650 runs Second Life as fast as or faster than a 7970...

well, this issue is in the same league as why WoW runs 50% faster on an Nvidia card in the same price range as the AMD cards...


TL;DR just watch... a few months after the PS4 is released, all of these AMD problems will have disappeared... meaning it will take about a year for these CrossFire/GPU problems to get fixed.
 
Joined
Apr 30, 2012
Messages
3,881 (0.88/day)
The Nvidia FCAT testing method is worse than FRAPS.

The misunderstanding of it and its interpretation is just odd to me. I wouldn't be surprised if it came back to bite the four known sites Nvidia sent these test kits to.

To be clear:
Nvidia's FCAT addresses a whole new issue, not the FRAPS test nor the AMD issue they were already addressing.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.22/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
The Nvidia FCAT testing method is worse than FRAPS.

Why exactly?

It seems to me that reading the framerate the user actually sees would be a far better method and far more informative to the user than reading the framerate the game engine generates long before it ever actually makes it to the user.
 
Joined
Apr 30, 2012
Messages
3,881 (0.88/day)
Why exactly?

It seems to me that reading the framerate the user actually sees would be a far better method and far more informative to the user than reading the framerate the game engine generates long before it ever actually makes it to the user.

Neither gives a real picture of what's going on. FRAPS tries to give a static picture at a certain point, but even those vary tremendously depending on system setup. FCAT combines data from 3 different points: a time interval after the data has gone through the GPU render, and a time cap for the overlay prior to entering the queuing process, much like FRAPS.

If the goal is to measure output frames, then FCAT should dump the overlay capture. The time interval after the data has gone through the GPU render would serve best, without imposing data from prior to the queuing process.
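For the overlay side specifically, here is a rough sketch of what an FCAT-style extractor conceptually does (my own simplified illustration in Python, not NVIDIA's actual scripts): each rendered frame carries a distinct overlay colour, and counting how many scanlines of each colour show up in the captured video tells you which frames were fully displayed, which were runts, and which were dropped entirely.

Code:
from collections import Counter

# Each captured scanline is tagged with the overlay colour of the rendered frame it
# belongs to. A real extractor reads these colours out of the captured video; the
# sequence below is invented purely for illustration.
captured_scanlines = (
    ["frame1"] * 1080                      # frame 1 shown in full
    + ["frame2"] * 1060 + ["frame3"] * 20  # frame 3 is a runt (only 20 lines)
    + ["frame5"] * 1080                    # frame 4 never appears at all: dropped
)
rendered_frames = ["frame1", "frame2", "frame3", "frame4", "frame5"]

RUNT_THRESHOLD = 21  # assumed cut-off: fewer scanlines than this counts as a runt

lines_per_frame = Counter(captured_scanlines)
for frame in rendered_frames:
    n = lines_per_frame[frame]
    if n == 0:
        status = "dropped"  # rendered (and counted by FRAPS) but never displayed
    elif n < RUNT_THRESHOLD:
        status = "runt"     # shown as a sliver; adds little to perceived smoothness
    else:
        status = "full"
    print(frame, "->", n, "scanlines,", status)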
 
Last edited:
Joined
Feb 24, 2009
Messages
3,516 (0.63/day)
System Name Money Hole
Processor Core i7 970
Motherboard Asus P6T6 WS Revolution
Cooling Noctua UH-D14
Memory 2133Mhz 12GB (3x4GB) Mushkin 998991
Video Card(s) Sapphire Tri-X OC R9 290X
Storage Samsung 1TB 850 Evo
Display(s) 3x Acer KG240A 144hz
Case CM HAF 932
Audio Device(s) ADI (onboard)
Power Supply Enermax Revolution 85+ 1050w
Mouse Logitech G602
Keyboard Logitech G710+
Software Windows 10 Professional x64
Basically, yes. He only has to capture the one monitor with the color bar.

The problem I have is that if it is this problematic (as Ryan suggests), then where is all the crying and screaming from people who have this setup? I've yet to hear of anyone complaining about this (the blank screen).

Yeah, but they only say that because the AMD solution has more memory (typically 3GB vs. 2GB) and the AMD solution was giving better framerates. But they were basing their framerate numbers on FRAPS or some other similar program, which obviously isn't giving the real picture.

The reasons I've been given have always been a better experience on the AMD side. Never has the reason been frame buffer size or framerate.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.22/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Neither gives a real picture of what's going on. FRAPS tries to give a static picture at a certain point, but even those vary tremendously depending on system setup. FCAT combines data from 3 different points: a time interval after the data has gone through the GPU render, and a time cap for the overlay prior to entering the queuing process, much like FRAPS.

If the goal is to measure output frames, then FCAT should dump the overlay capture. The time interval after the data has gone through the GPU render would serve best, without imposing data from prior to the queuing process.

That would only be an issue if the overlay capture were happening in software; it isn't. The overlay is captured after it is output to the monitor.

The problem I have is that if it is this problematic (as Ryan suggests), then where is all the crying and screaming from people who have this setup? I've yet to hear of anyone complaining about this (the blank screen).



The reasons I've been given have always been a better experience on the AMD side. Never has the reason been frame buffer size or framerate.

Most people just won't notice. They'll look at their framerate counter and assume everything is peachy because it is reading more than what they had with a single card. You really have to analyze it to notice that every 3rd frame is basically being wasted with AMD's crossfire. Every 3rd frame is being rendered, so it counts towards the framerate reported by software, but it actually isn't being displayed to the user.
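A quick back-of-the-envelope version of that point in Python (my own numbers, not from any review): a software counter that counts every rendered frame will read well above what actually reaches the screen if, say, every third frame never gets displayed.

Code:
# Hypothetical one-second sample, numbers invented for illustration:
# 72 frames are rendered, but every 3rd one never reaches the display.
rendered = 72
displayed = sum(1 for i in range(1, rendered + 1) if i % 3 != 0)

print("reported by software:", rendered, "FPS")   # what a FRAPS-style counter shows
print("actually displayed:  ", displayed, "FPS")  # what the viewer really gets
# reported by software: 72 FPS / actually displayed: 48 FPS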

Everything I've seen always pointed to more memory on the AMD side and better framerates, which a lot of idiots seem to equate to a better experience, but this kind of proves that those idiots don't have a clue.
 
Joined
Apr 30, 2012
Messages
3,881 (0.88/day)
That would only be an issue if the overlay capture were happening in software; it isn't. The overlay is captured after it is output to the monitor.

Might want to read the article again.




Following that is t_present, the point at which the game engine and graphics card communicate to say that they are ready to pass information for the next frame to be rendered and displayed. What is important about this time location is that this is where FRAPS gets its time stamps and data and also where the overlay that we use for our Frame Rating method is inserted.

NVIDIA was responsible for developing the color overlay that sits between the game and DirectX (in the same location of the pipeline as FRAPS essentially) as well as the software extractor that reads the captured video file to generate raw information about the lengths of those bars in an XLS file.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.98/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
CrossFire looks pretty craptastic from what I've seen in that PC Per video. Those problems shouldn't have been there in the first place, let alone waiting 120 days or so for a fix! It doesn't look like AMD makes any good products nowadays, does it?

I'm glad that I've stuck with Nvidia and Intel for the last few years. I bought a GTX 590 dual-GPU card recently for a good price and I've not noticed any problems with it in the little bit of gaming I've done on it so far.

Are you gonna go Nvidia now, Dave?
 
Joined
Feb 24, 2009
Messages
3,516 (0.63/day)
System Name Money Hole
Processor Core i7 970
Motherboard Asus P6T6 WS Revolution
Cooling Noctua UH-D14
Memory 2133Mhz 12GB (3x4GB) Mushkin 998991
Video Card(s) Sapphire Tri-X OC R9 290X
Storage Samsung 1TB 850 Evo
Display(s) 3x Acer KG240A 144hz
Case CM HAF 932
Audio Device(s) ADI (onboard)
Power Supply Enermax Revolution 85+ 1050w
Mouse Logitech G602
Keyboard Logitech G710+
Software Windows 10 Professional x64
Most people just won't notice. They'll look at their framerate counter and assume everything is peachy because it is reading more than what they had with a single card. You really have to analyze it to notice that every 3rd frame is basically being wasted with AMD's crossfire. Every 3rd frame is being rendered, so it counts towards the framerate reported by software, but it actually isn't being displayed to the user.

Everything I've seen always pointed to more memory on the AMD side and better framerates, which a lot of idiots seem to equate to a better experience, but this kind of proves that those idiots don't have a clue.

Maybe, but if the problem (in Eyefinity) is as bad as Ryan says, then I have an extremely hard time believing most people would just play it off after looking at their frame rates.

On another note, there was an interesting comment over on the B3D forum suggesting that the problem may lie with Hyper-Threading on the Intel processor. There was an article referenced that showed little frame time variance between an IVB Pentium and an i3, with the only difference being that the i3 has HT. When the article used an IVB i7, frame times started going all over the place (the graph was with BF3 and the GTX 650 Ti). It was suggested that since the i7 shouldn't be as stressed as, say, the i3, Windows 7 may be shuffling core parking on the processor. Since nVidia has hardware to get frames timed right (if this theory is correct), it would be less susceptible to any of this.

I know from my own experience that I see a solid 20 fps drop with my two 5870s in BF3 MP when I enable HT. Even when I reduce in-game settings to make the frame rate higher, it doesn't feel the same even though the frame rate is showing the same. I have not turned HT back on since.

This might also explain why the Tom's Hardware review showed "less" problems in their CrossFire setup, since they used an i5 without HT. At least in their review, the second card did not become useless, even though PCPer used some of the same games that Tom's did and PCPer showed far different results with the 7970 (as compared to the 7870 with Tom's).
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.98/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Maybe, but if the problem (in Eyefinity) is as bad as Ryan says, then I have an extremely hard time believing most people would just play it off after looking at their frame rates.

On another note, there was an interesting comment over on the B3D forum suggesting that the problem may lie with Hyper-Threading on the Intel processor. There was an article referenced that showed little frame time variance between an IVB Pentium and an i3, with the only difference being that the i3 has HT. When the article used an IVB i7, frame times started going all over the place (the graph was with BF3 and the GTX 650 Ti). It was suggested that since the i7 shouldn't be as stressed as, say, the i3, Windows 7 may be shuffling core parking on the processor. Since nVidia has hardware to get frames timed right (if this theory is correct), it would be less susceptible to any of this.

I know from my own experience that I see a solid 20 fps drop with my two 5870s in BF3 MP when I enable HT. Even when I reduce in-game settings to make the frame rate higher, it doesn't feel the same even though the frame rate is showing the same. I have not turned HT back on since.

This might also explain why the Tom's Hardware review showed "less" problems in their CrossFire setup, since they used an i5 without HT. At least in their review, the second card did not become useless, even though PCPer used some of the same games that Tom's did and PCPer showed far different results with the 7970 (as compared to the 7870 with Tom's).

We really need to see PC Per run their tests with HT on and off. I'm sure they've thought of this by now and people have suggested it in their forums.

HT is one of those things that can hurt performance in some cases, and this might be one of them, due to the real-time nature of gaming graphics rendering.
 