Joined: Oct 6, 2004 | Messages: 58,413 (8.18/day) | Location: Oystralia
System Name | Rainbow Sparkles (Power efficient, <350W gaming load) |
---|---|
Processor | Ryzen R7 5800x3D (Undervolted, 4.45GHz all core) |
Motherboard | Asus x570-F (BIOS Modded) |
Cooling | Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate |
Memory | 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V) |
Video Card(s) | Galax RTX 3090 SG 24GB: Underclocked to 1700MHz @ 0.750V (375W down to 250W) |
Storage | 2TB WD SN850 NVMe + 1TB Samsung 970 Pro NVMe + 1TB Intel 6000P NVMe USB 3.2 |
Display(s) | Philips 32" 32M1N5800A (4K144) | LG 32" (4K60) | Gigabyte G32QC (2K165) | Philips 328M6FJRMB (2K144) |
Case | Fractal Design R6 |
Audio Device(s) | Logitech G560 | Corsair Void Pro RGB | Blue Yeti mic |
Power Supply | Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY) |
Mouse | Logitech G Pro wireless + Steelseries Prisma XL |
Keyboard | Razer Huntsman TE ( Sexy white keycaps) |
VR HMD | Oculus Rift S + Quest 2 |
Software | Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware! |
Benchmark Scores | Nyooom. |
I wrote up a brief summary (inspired by another thread, will find link later) of why benchmarks can be misleading, and why first-hand experience is key to judging a video card's performance.
I hope this can be stickied or made into an article here at some point. Enjoy!
*********************************************************
This is a brief guide explaining the flaw behind benchmark comparisons of video cards.
Most benchmarks (excluding the 3DMark series) base their results on the minimum and maximum FPS (frames per second).
This was OK back in the day, as no one saw anything wrong with more performance as an end goal - but it's flawed.
Let me give you an example:
Video card one gives 10 FPS as a minimum and 100 FPS as a maximum - the average of the two is 55.
Video card two gives 20 FPS minimum and 80 FPS maximum - the average of the two is 50.
In the average review, card 1 is the winner by 5 FPS, or 10%. People would buy this card thinking it's faster.
But wait a second - what about the minimum FPS? 20 FPS is slightly laggy, but 10 FPS is half that speed and totally unplayable. Some would correctly argue this card is in fact slower! For the best gameplay, card 2 is the winner.
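As a quick sanity check of the arithmetic above (the FPS figures are the hypothetical ones from this example, not real cards), a few lines of Python:

```python
# Hypothetical min/max FPS figures from the example above.
card1_min, card1_max = 10, 100
card2_min, card2_max = 20, 80

# Averaging only the two extremes, as the flawed method does:
avg1 = (card1_min + card1_max) / 2  # 55.0
avg2 = (card2_min + card2_max) / 2  # 50.0

# Card 1 "wins" on average, yet its 10 FPS minimum is unplayable.
print(avg1, avg2)
```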
Some might find this revelation enough, but there's even more to it - what was the FPS doing between those extremes?
Was that minimum of 10 FPS a one-off, or was the score poor the whole way through the test, with one huge boost at some point?
As a basic breakdown, consider the following:
Think of this as an FPS timeline, with readings spaced 10 seconds apart:
20 25 30 25 30 25 10 10 12 15 35 40 100
There is a 10 and a 100 in there, but most of the readings are below 30 FPS (the minimum framerate most gamers are happy with).
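Treating the numbers above as per-reading FPS samples, a small Python sketch shows how the summary statistics hide the shape of the run:

```python
# The sample FPS timeline from this post, one reading every 10 seconds.
timeline = [20, 25, 30, 25, 30, 25, 10, 10, 12, 15, 35, 40, 100]

minimum = min(timeline)                  # 10
maximum = max(timeline)                  # 100
average = sum(timeline) / len(timeline)  # 29.0
below_30 = sum(1 for fps in timeline if fps < 30) / len(timeline)

print(f"min={minimum} max={maximum} avg={average:.1f}, "
      f"{below_30:.0%} of readings below 30 FPS")
```

The single 100 FPS spike drags the maximum (and the average) up, yet over 60% of the readings sit below 30 FPS.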
So which card is faster? Do we examine the minimum, maximum, or average frame rate when deciding what to buy?
The simple answer: none of them alone. Benchmarks need to be updated, and the slightly old game F.E.A.R. (First Encounter Assault Recon) had an in-game benchmark with the answer.
This benchmark listed:
Minimum FPS
Average FPS
Maximum FPS
That's all three - but which one to look at?
How about the other inclusion?
It showed the percentage of frames below a given threshold.
What if benchmarks were like this?
% of frames below 30 FPS
% of frames below 50 FPS
% of frames below 75 FPS
% of frames below 100 FPS
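A minimal sketch of how such a breakdown could be computed, reusing the hypothetical FPS samples from the timeline earlier with the thresholds proposed above:

```python
# Hypothetical per-frame FPS samples (the timeline from earlier, not real review data).
samples = [20, 25, 30, 25, 30, 25, 10, 10, 12, 15, 35, 40, 100]

breakdown = {}
for threshold in (30, 50, 75, 100):
    below = sum(1 for fps in samples if fps < threshold)
    breakdown[threshold] = below / len(samples)
    print(f"% of frames below {threshold} FPS: {breakdown[threshold]:.0%}")
```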
It's far better, because we can see where the card lies in overall performance: does it generally stay above 50 FPS for smooth gameplay? Does it lag briefly, or are the results smooth and consistent? Say 95% of all results were above 50 FPS but below 75 FPS - that's perfectly fine for all but the fussiest gamers. The thresholds could even be labelled as tiers:
% of frames @ <25 (unplayable/borderline)
% of frames @ 25-30 (acceptable)
% of frames @ 30-60 (great)
% of frames @ 60-100 (win!)
If a card gets 1% of its frames above 100 FPS but 80% of them below 30, it could have beaten the old system, but will be shown here to be poor for most gaming.
This is a brief writeup, and it only deals with frames per second. Updates/extensions may come at a later date - feel free to leave a comment if you want something expanded upon.