
Comparing 192 bit and 256 bit real world test on Mass Effect 3, need help.

Phusius

New Member
Joined
Mar 23, 2012
Messages
1,316 (0.30/day)
Processor i5-2500k @ 4.5
Motherboard Asus Z68 Deluxe
Cooling Noctua NH-D14
Memory 16GB DDR3 1600
Video Card(s) Sapphire 7950 @ 1200/1475 @ 1175v
Display(s) Dell 23" 2ms
Case Carbide 500r
Audio Device(s) Asus Xonar DG
Alright, I am quite disappointed with my GTX 660 (non-Ti version). AnandTech shows my 6970 and this 660 head to head, so I am wondering: is the 192-bit bus what is slowing me down in Mass Effect 3?

Example: I will do a twitch aiming move to shoot some baddies. The 6970 and 660 both have VSync turned on in game and default settings in the control panels. Settings are all the same.

I twitch shoot with the 660 and my frame rate stutters and drops to 40 fps for a split second (which hurts my aiming, because it really does stutter badly if I turn too fast) just as I am about to shoot, and my 6970 doesn't do this. Is this because of 192-bit vs. 256-bit?


Just purchased a Sapphire 7950, hopefully I have less VRM trouble this time around. Sending the 660 back.
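For context on the bus-width question above: bus width alone doesn't determine memory bandwidth; what matters is bus width times the memory data rate. A quick sketch of the peak-bandwidth arithmetic for the three cards in this thread, using the nominal reference-spec GDDR5 data rates (factory-overclocked cards will differ slightly):

```python
def bandwidth_gbs(bus_bits, data_rate_gtps):
    """Peak memory bandwidth in GB/s: (bus width in bytes) * effective data rate."""
    return bus_bits / 8 * data_rate_gtps

# Reference-spec figures; treat these as approximations for retail cards.
cards = {
    "GTX 660 (192-bit, 6.0 Gbps)": bandwidth_gbs(192, 6.0),
    "HD 6970 (256-bit, 5.5 Gbps)": bandwidth_gbs(256, 5.5),
    "HD 7950 (384-bit, 5.0 Gbps)": bandwidth_gbs(384, 5.0),
}
for name, bw in cards.items():
    print(f"{name}: {bw:.1f} GB/s")
```

So the 660's faster memory clock narrows the gap its narrower bus opens (roughly 144 vs 176 GB/s against the 6970), which is part of why a split-second stutter is more likely a driver/VSync issue than raw bandwidth starvation.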
 

manofthem

WCG-TPU Team All-Star!
Joined
Jan 9, 2011
Messages
10,960 (2.26/day)
Location
Florida
Processor 3900X @ 4.0
Motherboard Asus ROG Strix X570-E
Cooling DeepCool Castle 360EX
Memory G Skill Trident Z Neo 32GB 3600
Video Card(s) RX 5700 XT Pulse
Storage Sabrent Rocket Q 1TB
Display(s) LG 34UC88
Case Thermaltake P3
Power Supply Super Flower Leadex III 750w
Mouse Logitech G900
Keyboard G Skill KM570 MX Silver
Software Windows 10 Pro
Boy, you so silly. You have the most sincerely volatile love/hate relationships with your cards :laugh:
I must say though that I support your 7950 choice :D

Throw up a link to the 7950 you bought.
 

Phusius

New Member
Joined
Mar 23, 2012
Messages
1,316 (0.30/day)
Processor i5-2500k @ 4.5
Motherboard Asus Z68 Deluxe
Cooling Noctua NH-D14
Memory 16GB DDR3 1600
Video Card(s) Sapphire 7950 @ 1200/1475 @ 1175v
Display(s) Dell 23" 2ms
Case Carbide 500r
Audio Device(s) Asus Xonar DG
SAPPHIRE 100352-2L Radeon HD 7950 3GB 384-bit GDDR5. I plan to OC it to a modest 1000/1575 and check the VRMs; if the VRMs get too hot, then a modest 1425 or so on the memory.

Well, I have been thinking about it. A 7950 with 3GB on a 384-bit bus has a lot of oomph behind it, and Sapphire has always been trustworthy to me; in fact, the only two stickers I have on my case are a Noctua one and a Sapphire one from my old 6950/6970 CF setup. I feel it is time to forget this custom cooling business. I thought the Arctic cooler on my 7970 would be great for lowering temps... no one told me it raises your VRM temps to near instability (and yes, I had the heatsinks on properly).

So the 7970 business is done, and I don't see the need to blow money on another one when a 7950 will easily max out all my games, and I can play twitch FPS to my satisfaction without any screen stuttering like my 660 gives me. I am really shocked the 660 does this, because when I turn to aim really slowly everything is a smooth 60 FPS... it just can't handle super fast in-game turning, and that is something I really need since I play twitch FPS a lot.
 

Phusius

New Member
Joined
Mar 23, 2012
Messages
1,316 (0.30/day)
Processor i5-2500k @ 4.5
Motherboard Asus Z68 Deluxe
Cooling Noctua NH-D14
Memory 16GB DDR3 1600
Video Card(s) Sapphire 7950 @ 1200/1475 @ 1175v
Display(s) Dell 23" 2ms
Case Carbide 500r
Audio Device(s) Asus Xonar DG
Separate question:

Adaptive VSync users: does it flicker during in-game cutscenes but work fine otherwise? I turn that off and enable regular VSync, and there is no more screen tearing in Mass Effect 2 or 3 in-game cutscenes... sigh, all these Nvidia technologies are turning out to be useless. :/
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.23/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
Separate question:

Adaptive VSync users: does it flicker during in-game cutscenes but work fine otherwise? I turn that off and enable regular VSync, and there is no more screen tearing in Mass Effect 2 or 3 in-game cutscenes... sigh, all these Nvidia technologies are turning out to be useless. :/

Ever heard the saying "different strokes for different folks"? PC games are sometimes a balancing act to get them running their best on your GPU. If you keep switching between cards, you will notice some games with some drivers work better than others, depending on which settings are turned off/on. Just consider for a moment how many variables change when you swap equipment. Adaptive VSync won't work in some games/instances, and that's the plain truth: it just will not work. Is there a problem with this? No. Did I complain that not every game supported tessellation when I purchased a 5850? No. Did it become a type of standard? Yes. Will adaptive VSync take hold? Maybe. Just expect that bleeding-edge tech isn't always 100% perfect, especially when it's a driver-forced graphical augmentation.

:shadedshu
 
Joined
Nov 9, 2010
Messages
5,654 (1.15/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
I don't think it has anything to do with the 192-bit vs. 256-bit bus, and I think comments on bus width in general have been blown out of proportion by AMD owners. Nvidia has optimized their implementation of GDDR5 quite a bit since the 670, so these new cards can do fine with 192-bit, especially since they're less powerful anyway.

I think it has more to do with differences in drivers and game coding, and with the efficiency of the in-game VSync implementation from one game/driver/GPU to the next. Have you tried experimenting with D3DOverrider? Try using it to force triple buffering, VSync, or both and see if you notice any difference.
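The reason triple buffering is worth trying: with plain double-buffered VSync, a frame that misses the vblank deadline stalls the GPU until the next refresh, so the frame rate collapses to an integer fraction of the refresh rate (the classic 60 → 30 fps drop). A toy model of that effect, very much a simplification of how a real swap chain schedules frames:

```python
import math

REFRESH = 1 / 60  # seconds per vblank on a 60 Hz display

def fps_double_buffered_vsync(render_time):
    """Double buffering + VSync: the GPU stalls until the next vblank,
    so the frame period rounds UP to a whole number of refresh intervals."""
    vblanks = math.ceil(render_time / REFRESH)
    return 1 / (vblanks * REFRESH)

def fps_triple_buffered_vsync(render_time):
    """Triple buffering: a third buffer lets the GPU keep rendering,
    so you see the lesser of the render rate and the refresh rate."""
    return min(1 / render_time, 1 / REFRESH)

# A 20 ms frame (just missing the 16.7 ms deadline):
print(fps_double_buffered_vsync(0.020))  # collapses to 30 fps
print(fps_triple_buffered_vsync(0.020))  # stays at 50 fps
```

That 50-vs-30 fps gap on a frame that only barely misses the deadline is exactly the kind of momentary stutter described in the first post, which is why forcing triple buffering can smooth it out without giving up VSync.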
 

mediasorcerer

New Member
Joined
Sep 15, 2011
Messages
978 (0.21/day)
Location
Coast, Melbourne
System Name THE MEDIAMACHINE
Processor i5-3570k
Motherboard Asus gene v z-77 matx.
Cooling Antec h20 620
Memory 2x4gb g.skill ripjaws z 2400
Video Card(s) h.i.s radeon 7950 reference 3 gb- hooray!!!
Storage samsung 128gb~830 ssd. samsung 500gb hdrive.
Display(s) 22 inch tele.
Case circa 1996 grey rat box with no sides front.until my own is finished
Audio Device(s) inbuilt creative.supreme effects 3
Power Supply thermaltake tt-500w
Software win 7 x64-
Benchmark Scores Coming soon
My suggestion is: ditch the Nvidia card, keep the 7950, and later on grab another one for CrossFire ~ that should handle most of your games plenty well?
 

Phusius

New Member
Joined
Mar 23, 2012
Messages
1,316 (0.30/day)
Processor i5-2500k @ 4.5
Motherboard Asus Z68 Deluxe
Cooling Noctua NH-D14
Memory 16GB DDR3 1600
Video Card(s) Sapphire 7950 @ 1200/1475 @ 1175v
Display(s) Dell 23" 2ms
Case Carbide 500r
Audio Device(s) Asus Xonar DG
My suggestion is: ditch the Nvidia card, keep the 7950, and later on grab another one for CrossFire ~ that should handle most of your games plenty well?

Yeah, I plan to keep my 7950 until it can no longer max my games. Keep in mind, when I play Shogun 2 or the future Rome 2, I have no problem turning off shadows and a few other things; that doesn't take away from the experience for me... so my 7950 should last me quite a long time. ^^
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.23/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
Try turning off adaptive vsync

Mass Effect 3 is not a demanding game at all

Please sir, reread the initial post and all of the posts following. The OP was commenting on his use of proprietary VSync and the bus width of his card. Adaptive VSync was already addressed by the OP, and he admitted this feature did nothing during important points of the game such as the cutscenes. Tearing during cutscenes and a bit of in-game stutter made him think it might be the bus width holding back the smoothness of his visuals. A number of very helpful people chimed in, and the OP seems generally knowledgeable enough to have run the game with and without adaptive VSync.
 