
NVIDIA GeForce GTX 780 Ti 3 GB

Joined
Jun 11, 2009
Messages
41 (0.01/day)
Location
Slovakia
Processor AMD Phenom II x3 720 2.8GHz @ 3.6 GHz
Motherboard GA-MA790FXT-UD5P
Cooling CPU-Arctic Cooling Freezer Extreme
Memory 4GB Corsair XMS3 DHX DDR3 1600
Video Card(s) SAPPHIRE HD7870 GHz EDITION OC 2GB GDDR5
Storage Western Digital Caviar Blue 500GB
Display(s) 19" Philips LCD 1440x900
Case Xigmatek Midgard W
Audio Device(s) onboard
Power Supply CORSAIR TX550M
Software Windows 7 Ultimate 64bit
I don't know if this has been discussed, but it would be nice to include some BF4 benchmarks.

Or at least change the description for BF3 as it's no longer "Arguably one of the most anticipated online shooters of recent times" :)
W1zzard mentioned that he will add BF4 along with other games next month.
 
Joined
Jul 19, 2006
Messages
42,966 (8.82/day)
Processor i7 8700K
Motherboard Asus Maximus Hero X WiFi
Cooling Water
Memory 32GB G.Skill 3200Mhz CL14
Video Card(s) GTX 1080
Storage SSD's
Display(s) Nixeus EDG27
Case Thermaltake Core X5
Audio Device(s) SoundBlaster Zx
Power Supply Corsair H1000i
Mouse Finalmouse Pro
Keyboard Razer BlackWidow Tournament Ed.
AMD should just sell the bare boards at this point. :laugh:

That being said, I really dig the reference cooler's look with the red highlights. I wonder if it would fit onto a 7970 :confused: (unlikely)
A 280X cooler should... maybe. Not sure if the VRM section would line up "perfectly".
 
Joined
Aug 17, 2009
Messages
1,585 (0.42/day)
Location
Los Angeles/Orange County CA
System Name Vulcan
Processor i5 6600K
Motherboard GIGABYTE Z170X UD3
Cooling Thermaltake Frio Silent 14
Memory 16GB Corsair Vengeance LPX 16GB (2 x 8GB)
Video Card(s) ASUS Strix GTX 970
Storage Mushkin Enhanced Reactor 1TB SSD
Display(s) QNIX 27 Inch 1440p
Case Fractal Design Define S
Audio Device(s) On Board
Power Supply Cooler Master V750
Software Win 10 64-bit
No one cares about 4K performance. None of you are playing games on 4K monitors. If you think 4K is "the future" then you're probably the same people who bought into 1080p at 4x the cost with ZERO content coming for 5 years. There isn't even a significant gaming population above 1080p, which is why reviewers generally look at 1080p and 1440p or 1600p results, where the 780 Ti smokes the 290X without taking overclocking into consideration.

You AMD fans need to stop trying to skew results, it's ridiculous. Wait until aftermarket 290X and aftermarket 780 Ti cards are out and reviewed, and then let's see what the 1440p results say. I can tell you this right now: no factory 290X card (not on water) is going to beat the 780 Ti, and I highly doubt that even on water any of them will beat a 780 Ti on water. The 290X has major heat issues and major noise issues (SLI 780 Ti is quieter than a 290X in uber mode).

I will gladly pay the $150 to avoid AMD's software, drivers, heat and noise. When you're talking about two GPUs at $1100 or $1400, who cares? If you can afford that system in the first place then $300 is pretty much nothing when considering the total system cost...
I agree with you on 4K. Worthless to even bring up right now. 1440p is what many people can afford.

But talk about skewing things. Maybe $300 is nothing to you, but that is skewed financial sense to me.

Particularly when W1zzard's performance numbers at 2560x1600 show the 780 Ti at 100% and the 290 (non-X) at 87%.

Again, maybe 13% more performance is worth 75% more in cost to you, but I would say that is skewed reasoning.

...
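Putting rough numbers on that trade-off: a quick Python sketch using W1zzard's 2560x1600 summary figures quoted above (780 Ti = 100%, 290 = 87%) and assumed launch prices of $699 and $399 — the prices are my assumption, not from the review.

```python
# Rough perf-per-dollar check using the figures quoted above:
# W1zzard's 2560x1600 summary (780 Ti = 100%, R9 290 = 87%) and
# assumed launch prices of $699 and $399 (my numbers, not the review's).
cards = {"GTX 780 Ti": (100, 699), "R9 290": (87, 399)}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 100:.1f} perf points per $100")

premium = (699 - 399) / 399 * 100  # 780 Ti price premium over the 290
lead = (100 - 87) / 87 * 100       # 780 Ti performance lead over the 290
print(f"price premium: {premium:.0f}%, performance lead: {lead:.0f}%")
```

(The oft-quoted 13% is the same gap measured downward from the 780 Ti: 87 is 13% below 100; measured up from the 290 it is ~15%.)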
 

swagnuggets123

New Member
Joined
May 30, 2013
Messages
8 (0.00/day)
Location
Avon Indiana USA
System Name Beaufighter mk 1
Processor i7 3970x @ 5ghz
Motherboard ASrock x79 Extreme 11
Cooling 5 swiftech 480mm radiator with a cpu waterblock and 4 full cover gpu waterblock
Memory 64gb Kingston Hyperx Beast 2133 Quad Channel @ 2770
Video Card(s) 4 Galaxy Geforce Gtx 780 HOF editions
Storage Samsung EVO 1tb SSD Western Digital WD SE 4 tb
Display(s) 3 Asus PQ321 31.5in 4k monitors in Nvidia Surround
Case Corsair 900D
Power Supply LEPA G Series 1600 Watt 80 PLUS GOLD Certified
Software Windows 8 Pro, Linux Mint
It was never intended to do so. IMHO it is a great card: considering it has more transistors than the competitor's card yet produces less heat, that is impressive. If nothing else, it may force AMD to start thinking about bundling more games with R9 series graphics cards, so it will be a win-win situation for us customers.
But the NVIDIA chips are also on a much bigger die
 
Joined
Dec 22, 2011
Messages
2,962 (1.02/day)
System Name Zimmer Frame Rates
Processor Intel i7 920 @ Stock speeds baby
Motherboard EVGA X58 3X SLI
Cooling True 120
Memory Corsair Vengeance 12GB
Video Card(s) Palit GTX 980 Ti Super JetStream
Storage Of course
Display(s) Crossover 27Q 27" 2560x1440
Case Antec 1200
Audio Device(s) Don't be silly
Power Supply XFX 650W Core
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 10
Benchmark Scores Epic
I agree with you on 4K. Worthless to even bring up right now. 1440p is what many people can afford.

But talk about skewing things. Maybe $300 is nothing to you, but that is skewed financial sense to me.

Particularly when W1zzard's performance numbers at 2560x1600 show the 780 Ti at 100% and the 290 (non-X) at 87%.

Again, maybe 13% more performance is worth 75% more in cost to you, but I would say that is skewed reasoning.

...
The problem is people have been bringing up 4K benchmarks in order to show the benefits of the 290(x) over the 780 Ti.

Kinda flies in the face of their "good value" argument.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
15,925 (3.65/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K at stock (hits 5 gees+ easily)
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (4 x 4GB Corsair Vengeance DDR3 PC3-12800 C9 1600MHz)
Video Card(s) Zotac GTX 1080 AMP! Extreme Edition
Storage Samsung 850 Pro 256GB | WD Green 4TB
Display(s) BenQ XL2720Z | Asus VG278HE (both 27", 144Hz, 3D Vision 2, 1080p)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair HX 850W v1
Software Windows 10 Pro 64-bit
Is there any game that can max out 2GB of RAM at 1080p?

Just wondering, because the 690 can now be had for around the same price as a 780 Ti in some places and it's generally faster, if you don't mind SLI (I don't). However, it's only got 2GB of usable RAM (2x2 config) whereas the 780 Ti has 3GB. Obviously, if the RAM maxes out, performance will tank, or the game will crash if it's badly written, so that is to be avoided at all costs.

By the time 4K performance actually matters, with affordable single-panel 4K monitors, all of today's cards will be obsolete anyway, hence I don't care about 4K performance (sorry AMD).
 
Joined
Aug 17, 2009
Messages
1,585 (0.42/day)
Location
Los Angeles/Orange County CA
System Name Vulcan
Processor i5 6600K
Motherboard GIGABYTE Z170X UD3
Cooling Thermaltake Frio Silent 14
Memory 16GB Corsair Vengeance LPX 16GB (2 x 8GB)
Video Card(s) ASUS Strix GTX 970
Storage Mushkin Enhanced Reactor 1TB SSD
Display(s) QNIX 27 Inch 1440p
Case Fractal Design Define S
Audio Device(s) On Board
Power Supply Cooler Master V750
Software Win 10 64-bit
The problem is people have been bringing up 4K benchmarks in order to show the benefits of the 290(x) over the 780 Ti.

Kinda flies in the face of their "good value" argument.
I'm sure with their 512-bit bus they will do fine at higher resolutions.

Personally, I just think that talk about 4K is still very premature.

In making their decision, people should stick to what matters now, and realize $300 is a lot of money and they are not using 4K.

I find people bringing up unimportant points for the sake of arguing pointless.

...
 
Joined
Jun 11, 2009
Messages
41 (0.01/day)
Location
Slovakia
Processor AMD Phenom II x3 720 2.8GHz @ 3.6 GHz
Motherboard GA-MA790FXT-UD5P
Cooling CPU-Arctic Cooling Freezer Extreme
Memory 4GB Corsair XMS3 DHX DDR3 1600
Video Card(s) SAPPHIRE HD7870 GHz EDITION OC 2GB GDDR5
Storage Western Digital Caviar Blue 500GB
Display(s) 19" Philips LCD 1440x900
Case Xigmatek Midgard W
Audio Device(s) onboard
Power Supply CORSAIR TX550M
Software Windows 7 Ultimate 64bit
A 280X cooler should... maybe. Not sure if the VRM section would line up "perfectly".
What do you say, would it fit?

1st: Gigabyte WindForce 280X, 2nd: Asus DirectCU 280X, 3rd: 290X
 

Attachments

Joined
Jul 19, 2006
Messages
42,966 (8.82/day)
Processor i7 8700K
Motherboard Asus Maximus Hero X WiFi
Cooling Water
Memory 32GB G.Skill 3200Mhz CL14
Video Card(s) GTX 1080
Storage SSD's
Display(s) Nixeus EDG27
Case Thermaltake Core X5
Audio Device(s) SoundBlaster Zx
Power Supply Corsair H1000i
Mouse Finalmouse Pro
Keyboard Razer BlackWidow Tournament Ed.
Joined
Sep 29, 2013
Messages
97 (0.04/day)
Processor Intel i7 4960x Ivy-Bridge E @ 4.6 Ghz @ 1.42V
Motherboard x79 AsRock Extreme 11.0
Cooling EK Supremacy Copper Waterblock
Memory 65.5 GBs Corsair Platinum Kit @ 666.7Mhz
Video Card(s) PCIe 3.0 x16 -- Asus GTX Titan Maxwell
Storage Samsung 840 500GBs + OCZ Vertex 4 500GBs 2x 1TB Samsung 850
Audio Device(s) Soundblaster ZXR
Power Supply Corsair 1000W
Mouse Razer Naga
Keyboard Corsair K95
Software Zbrush, 3Dmax, Maya, Softimage, Vue, Sony Vegas Pro, Acid, Soundforge, Adobe Aftereffects, Photoshop
@W1zzard: any chance you could show this card's version of this chart?

http://tpucdn.com/reviews/AMD/R9_290X/images/analysis_quiet.gif

Would like a better sense of how much throttling this card has, if any.
Has anybody done this test or review with Ultra Low Power State (ULPS) set to 0 in the registry? Personally, I think the down-throttling is due to the fact that the card doesn't need to push higher core frequencies or GPU loads to do the same amount of work.

In CrossFireX, in a lot of current games, the first GPU won't push full GPU load or core frequency because it isn't necessary to get the output needed to finish drawing frames. In other games that are more intensive, both GPUs will push 100% at full stock frequencies, at 95 °C tops.

Disabling ULPS would probably prove whether people are making a big deal out of the throttling anomalies for nothing, or whether there is merit to the claims. I'm leaning towards the possibility that it's just NVIDIA consumers blowing something simple and insignificant out of proportion...

About the GTX 780 Ti: nice card, but in some ways I feel as if NVIDIA kicked its consumers in the balls, after they bought the GTX Titan for $1069 and $1099 with only part of its 2880 CUDA cores enabled, and the K6000 with a whopping price tag of almost $4,000.00 for 6GB more VRAM, 64-bit floating-point precision, and the small minor additions that come with workstation cards.

The R9 290X still has a higher maximum power consumption, over 300 W, but the GTX 780 Ti was only about 10 W apart at full load. Max temps are less than 10 °C apart: the 290X runs at 95 °C under full load, the 780 Ti at 89 °C? I only took a glimpse at the numbers. In games optimized for AMD there's only a 2 to 7 FPS difference, and in games optimized for NVIDIA only a 10 to 20 FPS difference. In theory the GTX 780 Ti is supposed to be a 15% performance increase, but the GTX Titan seems to inch closer to it at resolutions above 1600p.

The $699.99 MSRP (I'm thinking more like $749.99 on Newegg, because they need to make a profit) is probably what you're looking at paying on the first day of release. I remember when the GTX 680 first hit the shelves: Newegg jacked the price up from $599.99 towards $699.99, or somewhere in between. Also, take into account that with that price tag NVIDIA users are getting full DX11.2 support. Yeah... if you look at the nitty-gritty, NVIDIA consumers aren't getting a whole lot more back: just a GTX 780 refresh with the full set of Titan cores and some additional perks.

Like the frame-time variance graphs on the GTX 780 Ti: the curve band looks smaller, and the minimum extreme is a little lower than the 290X's in a single-card setup. This is something that should have been seen back in the GTX 680, 690, Titan, and 780. Lower frame times equate to higher FPS, and a small frame-time band equates to less deviation or stalling. AMD is now better at multi-GPU setups, because scaling with the new PCIe-based CrossFireX is roughly 1.8 to 2.0x the FPS; NVIDIA is now, again, the better single-card/GPU solution. It seems like AMD and NVIDIA are playing musical chairs, switching between these two factors.

One other thing: the GTX 780 Ti OCs past 1100 MHz, but the Asus ROG Ares II has a turbo clock of 1100 MHz as a dual-GPU solution, and it can actually OC past 1200 MHz core / 1750 MHz memory with an ASIC quality of 71%... I expected more from the GTX 780 Ti. I'll be expecting more from the R9 290X with a better cooling solution: past the 1250 to 1300 MHz mark, with a higher power envelope than its competitors.

@Btarunr and W1zzard,

You should provide screenshots of GPU-Z in your testing setup. The reason is this: you don't really state it in your write-ups, but a lot of readers are under the assumption that you're testing at PCIe 3.0 x16; on some other sites, they don't. Now, there won't be a difference between PCIe 3.0 x16 and PCIe 2.0 x16 except for the bandwidth, but for your readers I think you should just add the screenshot to show which graphics card is being tested and which PCIe interface it's using. Just something minor to consider. The only site that uses a GPU-Z screenshot during its benches is Legitreviews.com.
 
Joined
Oct 26, 2011
Messages
3,145 (1.07/day)
Processor 8700k Intel
Motherboard z370 MSI Godlike Gaming
Cooling Triple Aquacomputer AMS Copper 840 with D5
Memory TridentZ RGB G.Skill C16 3600MHz
Video Card(s) GTX 1080 Ti
Storage Crucial MX SSDs
Display(s) Dell U3011 2560x1600 + Dell 2408WFP 1200x1920 (Portrait)
Case Core P5 Thermaltake
Audio Device(s) Essence STX
Power Supply AX 1500i
Mouse Logitech
Keyboard Corsair
Software Win10
One other thing: the GTX 780 Ti OCs past 1100 MHz, but the Asus ROG Ares II has a turbo clock of 1100 MHz as a dual-GPU solution, and it can actually OC past 1200 MHz core / 1750 MHz memory with an ASIC quality of 71%... I expected more from the GTX 780 Ti. I'll be expecting more from the R9 290X with a better cooling solution: past the 1250 to 1300 MHz mark, with a higher power envelope than its competitors.
Usually W1zzard does not increase the voltage in his reviews.

Be sure that this card, provided you get a good bin like the sample he reviewed, will hit 1400 MHz with 1.35 V on the core.

But that's kind of moot, because you are comparing frequencies alone without taking architectures and core counts into the equation.

On a side note, the 780 Ti 3GB just came up here at our retailers. I'm kinda on the fence about waiting for the 6GB models; I'd only use more than 3GB with Skyrim, TBH.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
19,808 (3.49/day)
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
You should provide screenshots of GPU-Z in your testing setup. The reason is this: you don't really state it in your write-ups, but a lot of readers are under the assumption that you're testing at PCIe 3.0 x16; on some other sites, they don't. Now, there won't be a difference between PCIe 3.0 x16 and PCIe 2.0 x16 except for the bandwidth, but for your readers I think you should just add the screenshot to show which graphics card is being tested and which PCIe interface it's using. Just something minor to consider. The only site that uses a GPU-Z screenshot during its benches is Legitreviews.com.
the oc page shows a gpuz screenshot, and we do test at x16 3.0, of course
 
Joined
Dec 22, 2011
Messages
2,962 (1.02/day)
System Name Zimmer Frame Rates
Processor Intel i7 920 @ Stock speeds baby
Motherboard EVGA X58 3X SLI
Cooling True 120
Memory Corsair Vengeance 12GB
Video Card(s) Palit GTX 980 Ti Super JetStream
Storage Of course
Display(s) Crossover 27Q 27" 2560x1440
Case Antec 1200
Audio Device(s) Don't be silly
Power Supply XFX 650W Core
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 10
Benchmark Scores Epic
I'm sure with their 512-bit bus they will do fine at higher resolutions.

Personally, I just think that talk about 4K is still very premature.

In making their decision, people should stick to what matters now, and realize $300 is a lot of money and they are not using 4K.

I find people bringing up unimportant points for the sake of arguing pointless.

...
Oh I absolutely agree, I'm just a little bemused that on one hand people argue $300 savings, whilst using benchmarks done on $2800+ monitors to prove their point.
 

nem

Joined
Oct 22, 2013
Messages
165 (0.07/day)
Location
Cyberdyne CPU Sky Net
Point #1
NVIDIA is shitting in the face of their customers yet again... I don't know. Kepler's big Titan launched with its 2880 CUDA cores not fully unlocked; NVIDIA deliberately didn't unlock it, so as not to compete with the (fully unlocked) Quadro K6000, and in doing so disappointed those who bought the Titan as the best the company could offer... pfff ¬ ¬


Point #2
Kepler won't improve much more, but the 290X's performance will go up with better drivers, so it might end up a technical tie.

Power consumption is almost equal between the 290X and 780 Ti:

TechPowerUp .com/reviews/NVIDIA/GeForce_GTX_780_Ti/25.html


Point #3
There's still no actual stock... they haven't even begun to sell it XD

noticias3d .com/articulo.asp?idarticulo=1873&pag=30


Point #4
Mantle has yet to show what it can do; what if a simple R9 290 Pro gives the GTX 780 Ti a thrashing?

Point #5
Kepler is CPU-dependent, so if you don't have an i7 4770, better forget the FPS you see in the reviews XD
 
Joined
May 21, 2011
Messages
660 (0.21/day)
System Name Tiger1-Workstation
Processor Intel XEON E3-1275V2 / E3-1230V3
Motherboard ASUS SABERTOOTH Z77 / AsRock H87 Performance
Cooling Corsair H80i Watercooling
Memory 32GB Corsair Dominator Platinum 2400
Video Card(s) Inno3D GTX 780 Ti
Storage 2TB SSD(4X OCZ vertex 4 256GB LSI RAID0 + Crucial M550 1TB)
Display(s) 2x Dell U3011 30" IPS
Case Silverstone Raven 03
Audio Device(s) Xonar Essence STX--> Xonar Essence One --> SPL Auditor -->Hivi X6
Power Supply Corsair AX860i Platinum
Software Windows 8.1 Enterprise
Oh I absolutely agree, I'm just a little bemused that on one hand people argue $300 savings, whilst using benchmarks done on $2800+ monitors to prove their point.
It's pretty obvious that they come here not to actually argue a sensible point but to criticize by any means necessary; it doesn't matter if it actually makes sense or not.
 
Joined
Oct 26, 2011
Messages
3,145 (1.07/day)
Processor 8700k Intel
Motherboard z370 MSI Godlike Gaming
Cooling Triple Aquacomputer AMS Copper 840 with D5
Memory TridentZ RGB G.Skill C16 3600MHz
Video Card(s) GTX 1080 Ti
Storage Crucial MX SSDs
Display(s) Dell U3011 2560x1600 + Dell 2408WFP 1200x1920 (Portrait)
Case Core P5 Thermaltake
Audio Device(s) Essence STX
Power Supply AX 1500i
Mouse Logitech
Keyboard Corsair
Software Win10
Oh I absolutely agree, I'm just a little bemused that on one hand people argue $300 savings, whilst using benchmarks done on $2800+ monitors to prove their point.
Also, taking the price tag out of the equation, 4K just isn't worth purchasing yet.

The current monitor offerings suck overall; I would never pick up the ASUS monitor, not even for 1K €, considering what it offers.

Go take a look at the AnandTech review: it doesn't deliver on color (I admit I'm a bit picky on the subject) or responsiveness (that much latency could cause issues even in single-player games), and it's a bloody tiled display.

What 4K needs is a native 4K panel that doesn't use MST over DP and does 4K at 60 Hz without two streams; then I'd be tempted to purchase one.

Oh, I forgot the new HDMI, that's important too.
 

HTC

Joined
Apr 1, 2008
Messages
3,731 (0.88/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
Has anybody done this test or review with Ultra Low Power State (ULPS) set to 0 in the registry? Personally, I think the down-throttling is due to the fact that the card doesn't need to push higher core frequencies or GPU loads to do the same amount of work.

In CrossFireX, in a lot of current games, the first GPU won't push full GPU load or core frequency because it isn't necessary to get the output needed to finish drawing frames. In other games that are more intensive, both GPUs will push 100% at full stock frequencies, at 95 °C tops.

Disabling ULPS would probably prove whether people are making a big deal out of the throttling anomalies for nothing, or whether there is merit to the claims. I'm leaning towards the possibility that it's just NVIDIA consumers blowing something simple and insignificant out of proportion...

About the GTX 780 Ti: nice card, but in some ways I feel as if NVIDIA kicked its consumers in the balls, after they bought the GTX Titan for $1069 and $1099 with only part of its 2880 CUDA cores enabled, and the K6000 with a whopping price tag of almost $4,000.00 for 6GB more VRAM, 64-bit floating-point precision, and the small minor additions that come with workstation cards.

The R9 290X still has a higher maximum power consumption, over 300 W, but the GTX 780 Ti was only about 10 W apart at full load. Max temps are less than 10 °C apart: the 290X runs at 95 °C under full load, the 780 Ti at 89 °C? I only took a glimpse at the numbers. In games optimized for AMD there's only a 2 to 7 FPS difference, and in games optimized for NVIDIA only a 10 to 20 FPS difference. In theory the GTX 780 Ti is supposed to be a 15% performance increase, but the GTX Titan seems to inch closer to it at resolutions above 1600p. The $699.99 MSRP (I'm thinking more like $749.99 on Newegg, because they need to make a profit) is probably what you're looking at paying on the first day of release. I remember when the GTX 680 first hit the shelves: Newegg jacked the price up from $599.99 towards $699.99, or somewhere in between. Also, take into account that with that price tag NVIDIA users are getting full DX11.2 support. Yeah... if you look at the nitty-gritty, NVIDIA consumers aren't getting a whole lot more back: just a GTX 780 refresh with the full set of Titan cores and some additional perks. Like the frame-time variance graphs on the GTX 780 Ti: the curve band looks smaller, and the minimum extreme is a little lower than the 290X's in a single-card setup. This is something that should have been seen back in the GTX 680, 690, Titan, and 780. Lower frame times equate to higher FPS, and a small frame-time band equates to less deviation or stalling. AMD is now better at multi-GPU setups, because scaling with the new PCIe-based CrossFireX is roughly 1.8 to 2.0x the FPS; NVIDIA is now, again, the better single-card/GPU solution. It seems like AMD and NVIDIA are playing musical chairs, switching between these two factors.
I think you missed what I'm trying to know.

With the R9 290X:

1 - take any benchmark you like that's able to push the card so that it throttles a lot @ stock settings (everything, fan included) and try to get the card's average speed throughout the test
2 - set the default speed of the card to the value you found in point #1
3 - the card will now throttle way less and, in theory, it should produce the same result as with everything @ stock

If throttling lots of times makes it slower than throttling a few times in the above scenario, then the fact that it throttles so much due to the shoddy cooler is actually hampering performance: that's what I want to know.


nVidia's approach is different and I just don't know if it's possible to test :(

I sure hope it is possible: I'm very curious to know how the different technologies compare, efficiency-wise!
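The proposed steps 1–3 can be mocked up numerically. A minimal sketch with invented numbers: if every clock transition costs something (a hypothetical re-clocking penalty), an oscillating clock does less work than a fixed clock at the same average; if transitions are free, the totals come out identical, which matches the averaging argument made in the replies.

```python
# Numerical mock-up of the three steps above, with invented numbers.
# If each clock transition wastes cycles (a hypothetical re-clocking
# penalty), an oscillating clock does less work than a fixed clock at
# the same average; if transitions are free, the totals are identical.

def work_done(trace_mhz, transition_penalty_mcycles=0.0):
    """Megacycles executed over a 1 Hz-sampled clock trace, minus a
    penalty for every change in clock speed."""
    total = float(sum(trace_mhz))
    transitions = sum(1 for a, b in zip(trace_mhz, trace_mhz[1:]) if a != b)
    return total - transitions * transition_penalty_mcycles

throttling = [1000, 727] * 50            # card bouncing between two states
avg = sum(throttling) / len(throttling)  # 863.5 MHz average
fixed = [avg] * len(throttling)          # same average, no throttling

print(work_done(fixed))                                       # 86350.0
print(work_done(throttling, transition_penalty_mcycles=1.0))  # 86251.0
```

Whether a real GPU's transitions actually cost anything is exactly the open question; the sketch only shows what a measurable difference would look like.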
 
Joined
May 21, 2011
Messages
660 (0.21/day)
System Name Tiger1-Workstation
Processor Intel XEON E3-1275V2 / E3-1230V3
Motherboard ASUS SABERTOOTH Z77 / AsRock H87 Performance
Cooling Corsair H80i Watercooling
Memory 32GB Corsair Dominator Platinum 2400
Video Card(s) Inno3D GTX 780 Ti
Storage 2TB SSD(4X OCZ vertex 4 256GB LSI RAID0 + Crucial M550 1TB)
Display(s) 2x Dell U3011 30" IPS
Case Silverstone Raven 03
Audio Device(s) Xonar Essence STX--> Xonar Essence One --> SPL Auditor -->Hivi X6
Power Supply Corsair AX860i Platinum
Software Windows 8.1 Enterprise
I think you missed what I'm trying to know.

With the R9 290X:

1 - take any benchmark you like that's able to push the card so that it throttles a lot @ stock settings (everything, fan included) and try to get the card's average speed throughout the test
2 - set the default speed of the card to the value you found in point #1
3 - the card will now throttle way less and, in theory, it should produce the same result as with everything @ stock

If throttling lots of times makes it slower than throttling a few times in the above scenario, then the fact that it throttles so much due to the shoddy cooler is actually hampering performance: that's what I want to know.


nVidia's approach is different and I just don't know if it's possible to test :(

I sure hope it is possible: I'm very curious to know how the different technologies compare, efficiency-wise!
1 and 2 are the same thing: you are taking an average of something, so why should the result be any different? 4+8 is the same as 6+6 [(4+8)/2 is the same as (6+6)/2]
 
Joined
Apr 19, 2011
Messages
2,107 (0.67/day)
Location
So. Cal.
But the NVIDIA chips are also on a much bigger die
Although the count is one thing, it's really about transistor density. AMD's is much higher, and packing those transistors that much closer while keeping them cool is a big achievement
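The density claim checks out on paper. A quick back-of-envelope sketch, using commonly cited public figures for the two chips (from spec databases, not this thread): GK110 at roughly 7.1 billion transistors on ~561 mm², Hawaii at roughly 6.2 billion on ~438 mm².

```python
# Back-of-envelope transistor density, using commonly cited public
# figures (not from this thread): GK110 ~7.1B transistors on ~561 mm^2,
# Hawaii ~6.2B on ~438 mm^2.
chips = {"GK110 (780 Ti)": (7.1e9, 561), "Hawaii (290X)": (6.2e9, 438)}

for name, (transistors, area_mm2) in chips.items():
    density = transistors / area_mm2 / 1e6  # millions of transistors per mm^2
    print(f"{name}: {density:.1f}M transistors/mm^2")
```

That works out to roughly 14M/mm² for Hawaii versus roughly 13M/mm² for GK110, so the smaller die really is packed denser.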

The problem is people have been bringing up 4K benchmarks in order to show the benefits of the 290(x) over the 780 Ti.
I'm sure with their 512-bit bus they will do fine at higher resolutions.

Personally, I just think that talk about 4K is still very premature.

In making their decision, people should stick to what matters now, and realize $300 is a lot of money and they are not using 4K.

I find people bringing up unimportant points for the sake of arguing pointless.
Exactly. I don't say 4K is relevant in the market, but to really see the engineering and processing power needed to push all those pixels is something to revel in, and the technical hurdle being cleared at that die size is something to applaud! ;)
 

HTC

Joined
Apr 1, 2008
Messages
3,731 (0.88/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
1 and 2 are the same thing: you are taking an average of something, so why should the result be any different? 4+8 is the same as 6+6
If the throttling is efficient, then yes. If not, then no.

That's the very thing i want to know: the efficiency!
 
Joined
May 21, 2011
Messages
660 (0.21/day)
System Name Tiger1-Workstation
Processor Intel XEON E3-1275V2 / E3-1230V3
Motherboard ASUS SABERTOOTH Z77 / AsRock H87 Performance
Cooling Corsair H80i Watercooling
Memory 32GB Corsair Dominator Platinum 2400
Video Card(s) Inno3D GTX 780 Ti
Storage 2TB SSD(4X OCZ vertex 4 256GB LSI RAID0 + Crucial M550 1TB)
Display(s) 2x Dell U3011 30" IPS
Case Silverstone Raven 03
Audio Device(s) Xonar Essence STX--> Xonar Essence One --> SPL Auditor -->Hivi X6
Power Supply Corsair AX860i Platinum
Software Windows 8.1 Enterprise
If the throttling is efficient, then yes. If not, then no.

That's the very thing i want to know: the efficiency!
OK, I'm starting to see your point.

What you are saying is that the card running at a constant (slightly) lower temperature is more efficient than doing a cycle of low-to-maximum spikes.
 

HTC

Joined
Apr 1, 2008
Messages
3,731 (0.88/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 2600X
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Nitro+ Radeon RX 480 OC 4 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 19.04 LTS
OK, I'm starting to see your point.

What you are saying is that the card running at a constant (slightly) lower temperature is more efficient than doing a cycle of low-to-high spikes.
Not temperature but speed, but yes: low-to-high spikes in speed.
 
Joined
Dec 31, 2009
Messages
15,776 (4.37/day)
It's a good question, but the switching is so fast I do not think it would make a difference. Not to mention, you lose the peaks too by starting out lower.

Turn the fan up. :)
 
Joined
Oct 26, 2011
Messages
3,145 (1.07/day)
Processor 8700k Intel
Motherboard z370 MSI Godlike Gaming
Cooling Triple Aquacomputer AMS Copper 840 with D5
Memory TridentZ RGB G.Skill C16 3600MHz
Video Card(s) GTX 1080 Ti
Storage Crucial MX SSDs
Display(s) Dell U3011 2560x1600 + Dell 2408WFP 1200x1920 (Portrait)
Case Core P5 Thermaltake
Audio Device(s) Essence STX
Power Supply AX 1500i
Mouse Logitech
Keyboard Corsair
Software Win10