
R700 Crossfire X Scaling Efficiency up to 104%

btarunr

Editor & Senior Moderator
Staff member
Chinese hardware website ChipHell has put up preliminary comparisons between a single Radeon HD 4870 X2 and a CrossFire X setup of two of these cards, charting how efficiently they scale when paired. The cards scale by up to 104% in the best case, and in several tests the performance gain stayed above 50%. On launch drivers, the average performance increment of a pair of these cards over a single card is noted to be 66%. It is generally believed that entry-level and mid-range cards scale more efficiently than high-end cards, yet scaling efficiency for these cards remains high even with settings such as 4x AA and 8x AF enabled. The HD 4870 X2 is expected to be formally launched this Tuesday, the 12th.
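To make the quoted figures concrete, here is a minimal sketch of how scaling efficiency is worked out from frame rates. The FPS numbers below are made up purely for illustration; only the 104% and 66% figures come from the article.

```python
# Minimal sketch: scaling efficiency is the extra performance the second GPU adds,
# expressed as a percentage of a single card's performance.
def scaling_efficiency(single_fps: float, dual_fps: float) -> float:
    return (dual_fps / single_fps - 1.0) * 100.0

# Hypothetical frame rates chosen only to reproduce the quoted percentages.
print(f"{scaling_efficiency(40.0, 81.6):.0f}%")  # ~104% -> best case, better than doubling
print(f"{scaling_efficiency(40.0, 66.4):.0f}%")  # ~66%  -> the reported average gain
```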



View at TechPowerUp Main Site
 
OMG, how did they manage that? :)
 
I'm like OMG too, still using whatever help Google Translate can give me in trying to find out more about their bench.
 
Interesting that they didn't show that Nvidia-loving game that loses performance in CrossFire: Crysis. It would have been interesting if AMD had finally found a way around that.
 
That is definitely great news. Can't wait till they come out; I might get my first one very soon after release, then another one to pair it with afterwards. It should be a beast.

Somehow, Crysis does do a lot better with Nvidia though. Wonder why that is?
 
:roll: Love AMD/ATI. All they need now is to sort out their act in the CPU market...

I think they are going to sort out much of it with the 45 nm Phenoms; those things are looking great. However, let me tell you something: forget about benchmarks. When you actually play a game, there is not much difference between, let's say, my quad-core Phenom at 3.1 GHz and my buddy's Q6600 at 3.2 GHz. I have a single 2900 XT, he has a 3870 X2. In the real world of gaming and everyday use, AMD is not far behind at all. Just my two cents, of course. I could be crazy :D
 
ATI/AMD, this is great news... I'm really loving the HD 4xxx launch. AMD still needs to catch up with Intel though; clock for clock they may be doing great, but AMD needs to pull out some good dual-cores. Many people don't buy quads for gaming, and when not gaming, the Phenoms are behind. Wait till Nehalem, what then, AMD?
 
Interesting that they didn't show that Nvidia-loving game that loses performance in CrossFire: Crysis. It would have been interesting if AMD had finally found a way around that.

Those Call of Juarez benches are close to answering that. CoJ hasn't been very ATI-friendly.
 
Are these drivers going to improve HD 4850 CrossFireX too? When they are cheap (like the 3850s are now) I will get another for sure. Even the current drivers scale really well.
 
Those Call of Juarez benches are close to answering that. CoJ hasn't been very ATI-friendly.

What are you talking about?

CoJ has always been an ATI game. It's one of the few games where ATI worked with the developers, and the result is that the 2900 XT performs better than the 8800 GTX.
 
What are you talking about?

CoJ has always been an ATI game. It's one of the few games where ATI worked with the developers.

Yeah, if I remember correctly the HD 2900 XT dominated in that game.
 
You're right. How about UT3?
 
The issue is simple: in games that weren't designed for ATI cards but for general shaders, the Nvidia card wins; it has more shader power when the game doesn't use ATI's simple shaders. Once a game does use them, ATI gets a leg up.
 
I can't wait to get my hands on one of these, but it's not gonna be till next year after taxes (trying to be a responsible adult and pay my bills first).
 
Nice numbers. I may actually be able to pull out my 30" to play games with this card :toast:
 
The issue is simple: in games that weren't designed for ATI cards but for general shaders, the Nvidia card wins; it has more shader power when the game doesn't use ATI's simple shaders. Once a game does use them, ATI gets a leg up.

Nvidia has more shader power?
What do you mean by shader power?
I think you mean processing power, i.e. the number of floating-point operations a GPU can deliver, which can be calculated with the following formula:
number of shader cores x shader clock x FLOPs each core can handle in a single clock cycle.
The 4870 has 800 x 750 MHz x 2 = 1,200 GFLOPS.
The GTX 280 has 240 x 1296 MHz x 3 ≈ 933 GFLOPS.
So, the 4870 has more "shader power" than the GTX 280.
Crysis for Nvidia cards is like Call of Juarez for ATI cards: it's optimized to work better on GeForce cards.
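Put as code, that back-of-the-envelope formula looks like this. A rough sketch only; the function name is mine, and peak numbers say nothing about real-game utilization.

```python
# Theoretical peak throughput: shader cores x shader clock (MHz) x FLOPs per core per cycle.
# MHz x FLOPs/cycle gives MFLOPS, hence the division by 1000 to get GFLOPS.
def peak_gflops(cores: int, shader_clock_mhz: float, flops_per_cycle: int) -> float:
    return cores * shader_clock_mhz * flops_per_cycle / 1000.0

print(peak_gflops(800, 750, 2))    # HD 4870: 1200.0 GFLOPS
print(peak_gflops(240, 1296, 3))   # GTX 280: ~933.1 GFLOPS
```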
 
Nvidia has more shader power?
What do you mean by shader power?
I think you mean processing power, i.e. the number of floating-point operations a GPU can deliver, which can be calculated with the following formula:
number of shader cores x shader clock x FLOPs each core can handle in a single clock cycle.
The 4870 has 800 x 750 MHz x 2 = 1,200 GFLOPS.
The GTX 280 has 240 x 1296 MHz x 3 ≈ 933 GFLOPS.
So, the 4870 has more "shader power" than the GTX 280.
Crysis for Nvidia cards is like Call of Juarez for ATI cards: it's optimized to work better on GeForce cards.

He meant that one shader on ATI cards equals five virtual shaders, or something like that.
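A quick sketch of what that means in practice: the HD 4870's 800 stream processors are really 160 VLIW5 units of 5 ALUs each, so realized throughput depends on how many of those 5 slots the shader compiler can fill per cycle. The utilization figures below are illustrative assumptions, not measurements.

```python
# Illustrative only: effective HD 4870 throughput as a function of VLIW5 slot utilization.
VLIW_WIDTH = 5
UNITS = 800 // VLIW_WIDTH      # 160 VLIW5 units make up the "800 shaders"
CLOCK_GHZ = 0.750
FLOPS_PER_ALU = 2              # one multiply-add per cycle

for slots_filled in (5, 4, 3, 2):
    gflops = UNITS * slots_filled * CLOCK_GHZ * FLOPS_PER_ALU
    print(f"{slots_filled}/5 slots filled: ~{gflops:.0f} GFLOPS (peak is 1200)")
```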
 
Interesting that they didn't show that Nvidia-loving game that loses performance in CrossFire: Crysis. It would have been interesting if AMD had finally found a way around that.

It only takes 20 seconds of Google searching to figure out how to get CrossFire to scale in Crysis... I'm sure that AMD would get into trouble for modding the game.
 
It only takes 20 seconds of Google searching to figure out how to get CrossFire to scale in Crysis... I'm sure that AMD would get into trouble for modding the game.

OK, I read some translated Korean article and didn't understand any of it. Now you can tell me how to make CrossFire scale.
 