
My Personal Experience

Joined
Jul 25, 2006
Messages
12,130 (1.87/day)
Location
Nebraska, USA
System Name Brightworks Systems BWS-6 E-IV
Processor Intel Core i5-6600 @ 3.9GHz
Motherboard Gigabyte GA-Z170-HD3 Rev 1.0
Cooling Quality case, 2 x Fractal Design 140mm fans, stock CPU HSF
Memory 32GB (4 x 8GB) DDR4 3000 Corsair Vengeance
Video Card(s) EVGA GeForce GTX 1050 Ti 4GB GDDR5
Storage Samsung 850 Pro 256GB SSD, Samsung 860 Evo 500GB SSD
Display(s) Samsung S24E650BW LED x 2
Case Fractal Design Define R4
Power Supply EVGA Supernova 550W G2 Gold
Mouse Logitech M190
Keyboard Microsoft Wireless Comfort 5050
Software W10 Pro 64-bit
Regarding CPU Temperatures - Intel measures temperatures weird. I always kept the max temp on the 9370 under 60°C, and it only got that high when stress testing. The 6700K is safe to 81°C, and that is on the low end of the safety spectrum; the chip itself has a WAY lower TDP and definitely runs cooler in my rig, yet the measured core temps are substantially higher. Anyone care to weigh in on why this is?
It is not that Intel measures temps weird. Long-standing Intel users would probably say AMD measures temps weird.

The problem is, there is no industry standard for where processor makers put sensors, nor is there any industry standard for how those test points are even labeled. So what makes it "weird" is that all the processor makers do it differently.

If they all agreed on a common terminology, it would be so much easier. But one reports "CPU temp", another uses Tcase, and another uses the socket temperature; yet another uses "Core" or "Tjunction". It is no wonder consumers (and often the hardware monitoring programs) get confused.

Even within the same brand, the makers don't put the sensors in the same place.

All that creates confusion before the computer is even turned on! After that, there is no (and can be no) standard for what is "hot". Hot for this CPU may be 65°C but that CPU may have another 10°C headroom.
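
For what it's worth, part of the difference is simply how the numbers are produced. On Intel chips the digital thermal sensor (DTS) reports how far a core sits below TjMax rather than an absolute temperature, and the monitoring tool does the subtraction; AMD's readings come from different sensors and on some chips carry an offset. A minimal sketch with made-up values, just to show the arithmetic:

Code:
# Made-up values, not a real driver: Intel's DTS reports a distance *below*
# TjMax, so monitoring software derives the familiar "core temp" like this.
def intel_core_temp(tjmax_c: int, dts_delta_c: int) -> int:
    """Core temperature in deg C from TjMax and the DTS distance-to-TjMax."""
    return tjmax_c - dts_delta_c

# Hypothetical chip with TjMax = 100 deg C whose DTS reports 35 deg C of
# headroom left: the tool shows 65 deg C.
print(intel_core_temp(100, 35))  # -> 65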

As far as AMD vs Intel, both make great and reliable processors and either will form the foundation for a great computer. And we (consumers) need both companies to continue doing so. While I happen to prefer Intel, that is only because my Ford F150 is better than Chevy and both are better than RAM! ;)
 
Joined
Nov 9, 2008
Messages
2,318 (0.41/day)
Location
Texas
System Name Mr. Reliable
Processor Ryzen R9 5950x
Motherboard MSI Meg X570s Ace Max
Cooling D5 Pump, Singularity Top/Res, 2x360mm EK P rads, EK Magnitude/Alphacool Blocks
Memory 32Gb (4x8Gb) Corsair Dominator Platinum 3600Mhz @ 16/19/20/36 1.35v
Video Card(s) MSI 3080ti with Alphacool Block
Storage 2 x Corsair Force MP400 1TB Nvme; 2 x T-Force Cardea Z340; 2 x Mushkin Reactor 1TB
Display(s) Acer 32" Z321QU 2560x1440; LG 34GP83A-B 34" 3440x1440
Case Lian Li PC-011 Dynamic XL; Synology DS218j w/ 2 x 2TB WD Red
Audio Device(s) SteelSeries Arctis Pro+
Power Supply EVGA SuperNova 850G3
Mouse Razer Basilisk V2
Keyboard Das Keyboard 6; Razer Orbweaver Chroma
Software Windows 10 Pro
It is not that Intel measures temps weird. Long-standing Intel users would probably say AMD measures temps weird.

The problem is, there is no industry standard for where processor makers put sensors, nor is there any industry standard for how those test points are even labeled. So what makes it "weird" is that all the processor makers do it differently.

If they all agreed on a common terminology, it would be so much easier. But one reports "CPU temp", another uses Tcase, and another uses the socket temperature; yet another uses "Core" or "Tjunction". It is no wonder consumers (and often the hardware monitoring programs) get confused.

Even within the same brand, the makers don't put the sensors in the same place.

All that creates confusion before the computer is even turned on! After that, there is no (and can be no) standard for what is "hot". Hot for this CPU may be 65°C but that CPU may have another 10°C headroom.

I am digging into the link Earth threw up. It's a pretty interesting read on TJMax and what it means/how it is read. It is a pretty steep learning curve with the multitude of fundamental differences between the two processor manufacturers.
 

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,268 (2.52/day)
Location
Washington, USA
System Name Veral
Processor 5950x
Motherboard MSI MEG x570 Ace
Cooling Corsair H150i RGB Elite
Memory 4x16GB G.Skill TridentZ
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx + 2x AOC 2425W
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
IIRC, it's because of where the reading is taken. (I've never quite understood this, so it's hard for me to explain.) Intel's temperature is read in the die, while AMD's is read on the package? The way I think of it, Intel's is read closer to the source of the heat.

But haven't we seen that trend in society? How much of a snot were you when you were 20? (Yes @Toothless, I'm talking about you :p) I know I knew everything when I was 18.

I don't think there are many who have a similar experience to discuss. IMHO, most made the switch a long time ago or are holding out for Zen.
Secondly, people today don't seem to be able to participate in a debate. You offered up your experience, and most have responded with 1-2 sentences. Going deeper than that is foreign to most everyone, it seems. Then there's that time thing... :laugh:

For a long time, my son had an i5-650 and I had an X2-4400+. I would have to go back and forth between the two, and that old 4400+ felt just as fast as his. Way-back time machine: IIRC, in Win 3.1 there was a setting (I can't recall where) to adjust how quickly the GUI responded. Therefore, most likely, the speed at which the GUI responds is not determined by the CPU (unless it's really slow/overloaded) but by settings within the OS. I would not be surprised if AMD and Intel have custom "profiles" within OSes, and maybe AMD cached more of the directory tree.
I might be snotty at times, but I still have love for most of you guys. :D
 
Joined
Nov 9, 2010
Messages
5,653 (1.15/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
All that's really going on with Cities is the Intel chip yielding a much higher result due to much higher IPC, which is why AMD is now seeking 40% more performance per core in Zen.

Other than that, the Intel system just has way higher RAM speed, but that may only show as a slight performance increase in very CPU-demanding games. However, you are right to a degree that you'd need to test for it with the same RAM capacity.
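
If you want to split that kind of gain into a RAM part and an IPC/platform part, a rough (and admittedly simplified) way is to re-run the same scene with the new system's memory dialed down to match the old one, then compare the three results. Hypothetical numbers below, just to show the attribution:

Code:
# Hypothetical FPS numbers only - a rough attribution that ignores
# interactions between CPU and memory speed.
fps_old_platform    = 40.0   # old CPU, its native RAM speed
fps_new_matched_ram = 50.0   # new CPU, RAM deliberately slowed to match
fps_new_fast_ram    = 54.0   # new CPU, its native (faster) RAM

total_gain = fps_new_fast_ram - fps_old_platform
cpu_gain   = fps_new_matched_ram - fps_old_platform   # CPU/platform share
ram_gain   = fps_new_fast_ram - fps_new_matched_ram   # RAM-speed share

print(f"CPU/platform share: {cpu_gain / total_gain:.0%}")  # -> 71%
print(f"RAM-speed share:    {ram_gain / total_gain:.0%}")  # -> 29%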
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
You guys wanted more than a generic post, so...

I was a long-time AMD user too. Actually, my first really good CPU was an AMD, an Athlon TB 700 Slot A. After that I had an Athlon 1333 TB (Socket A), an Athlon 64 X2 3800+ S939 (really expensive and not really worth it, but who would've guessed they'd be so slow in games that utilize more than one core), and finally a Phenom II 940 (AM2+). That last one I used for nearly four years, from early 2009 (its release) to the end of 2013; I really liked that CPU and system. I changed when it became apparent that the CPU just couldn't cope with BF4, simply because the bandwidth and raw CPU power weren't there. That's when I changed to an Intel. That was only my 3rd Intel; the 2nd was a Pentium 166 MMX, the first a Pentium 90. So that was like a "big move" for me.
 
Joined
Jan 17, 2010
Messages
12,280 (2.36/day)
Location
Oregon
System Name Juliette // HTPC
Processor Intel i7 9700K // AMD Ryzen 5 5600G
Motherboard ASUS Prime Z390X-A // ASRock B550 ITX-AC
Cooling Noctua NH-U12 Black // Stock
Memory Corsair DDR4 3600 32gb //G.SKILL Trident Z Royal Series 16GB (2 x 8GB) 3600
Video Card(s) ASUS RTX4070 OC// GTX 1650
Storage Samsung 970 EVO NVMe 1Tb, Intel 665p Series M.2 2280 1TB // Samsung 1Tb SSD
Display(s) ASUS VP348QGL 34" Quad HD 3440 x 1440 // 55" LG 4K SK8000 Series
Case Seasonic SYNCRO Q7// Silverstone Granada GD05
Audio Device(s) Focusrite Scarlett 4i4 // HDMI to Samsung HW-R650 sound bar
Power Supply Seasonic SYNCRO 750 W // CORSAIR Vengeance 650M
Mouse Cooler Master MM710 53G
Keyboard Logitech 920-009300 G512 SE
Software Windows 10 Pro // Windows 10 Pro
Listing your system specs is important when discussing a problem. Every manufacturer's hardware has traits that are characteristic of that company. Most OEMs' profit margins are so tight that manufacturers try to pass shoddy hardware on to customers and load their products with "trial software" that can be a nightmare experience. This forum is our voice in the fight for quality hardware that does what we paid for. It's such a powerful voice that representatives from hardware manufacturers watch this forum to see what our experiences are, and to assist with problems when they arise. Sure, you get an occasional "AMD sucks", but really it's a non-issue when you look at the positives that come out of this place. Our passion is our power; don't silence your voice.
 
Joined
Mar 27, 2007
Messages
809 (0.13/day)
Location
Tampa FL
Processor AMD Ryzen 3800x
Motherboard Asus TUF Gaming X-570-Plus (WiFi)
Cooling Corsair Hydro Series H110i
Memory 32 GB Corsair Dominator RGB DDR4 3200 Mhz @3600
Video Card(s) Asus AREZ RX Vega 64 Strix OC 8GB
Storage Samsung 970 EVO Plus 1 TB
Display(s) Asus PB258Q 25" 2560x1440
Case Cooler Master Master Case 5 Pro
Power Supply Corsair RMx 1000
Software Windows 10 Pro 64
I've always been an AMD loyalist for a few reasons: first, they're the underdog and we should fight for the underdog, and second, they are more budget-oriented.
 
Joined
Jan 13, 2016
Messages
660 (0.22/day)
Location
127.0.0.1, London, UK
System Name Warranty Void Mk.IV
Processor AMD Ryzen 5 5600
Motherboard Asus X470-I Strix
Cooling Arctic Freezer 240 + 2x Be Quiet! Pure Wings 2 140mm / Silverstone 120mm Slim
Memory Crucial Ballistix Elite 3600MHz 2x8GB CL16 - Tightened Sub-timings
Video Card(s) EVGA RTX 2080 XC Ultra
Storage WD SN550 / MX300 / MX500
Display(s) AOC CU34G2 / LG 29UM69G-B - Auxilary
Case CM NR200p / Silverstone RVZ03-B
Audio Device(s) Realtek ALC 1220+SupremeFX
Power Supply Corsair CX550M 550W / Silverstone SX650-G 650W
Mouse Logitech G302/G502/G203 / RPG: Corsair Nightsword
Keyboard CM Masterkeys Pro M / Asus Sagaris GK100
VR HMD Oculus Rift S
Software Windows 10 Pro x64 - LTSB
finally a Phenom II 940 (AM2+). That last one I used for nearly four years, from early 2009 (its release) to the end of 2013; I really liked that CPU and system. I changed when it became apparent that the CPU just couldn't cope with BF4, simply because the bandwidth and raw CPU power weren't there.

This. My first actual system that I used for gaming was a Phenom II x4 960T, which I could unlock to a six-core and even overclock beyond 4GHz, but what's the point? The TDP was nearing the 140W mark, instead of the stock 95W. And I didn't see much of a performance increase in a lot of titles, especially ones using the DX9 and DX11 APIs. The rig did well enough, I thought to myself; I built it in Q2 2012, after the releases of a lot of AAA titles.

I'll use Skyrim and Fallout: New Vegas as examples here... Since 2011 I would always look up benchmarks and see Intel processors (from Nehalem onward) performing much faster than their AMD counterparts. The ONLY reason I built that Phenom II rig was for Skyrim at the time and because money was an issue.

So I looked at pretty much every benchmark of Skyrim I could find, compared the results to my own, and thought, "OK, but I still think that surely Intel chips cannot be that much faster than the Phenom II and FX series processors." I proved myself wrong in the only way possible: by actually selling the 990X AM3+ motherboard and the Phenom II X4 960T.

After being fed up with dreadful stuttering in some games (including the Bethesda Game Studios titles), I replaced my Phenom II rig with one built around an Intel i5 2400. It didn't seem like an upgrade so much as a downgrade on paper; remember, the Phenom X6 1605T @ 4GHz that I unlocked was faster in multi-threading, so a lot of more basic tasks in the OS felt better for some reason (placebo? I don't know). Single-thread performance was what interested me. I remember my dad being all skeptical about the i5 2400's performance, saying it couldn't beat his FX 8350 in anything, or would at best run on par.

Skyrim was, get this, twice or almost three times as fast, at stock clocks. My only reaction was "**** yeah! Finally I can play games without ever even reaching 70% usage on this thing when overclocked." The GTX 660 I had at the time could really stretch its legs, and after upgrading to an R9 380, the rig feels like a beast for the money I spent on it.

The motherboard VRMs are so much cooler; I never have to worry about the motherboard getting hot, especially after changing the thermal paste/pads. With a decent GPU I can run any game no problem, but Sandy Bridge is showing its age now, more than ever. I'm starting to think I might have to track down an i7 2600K to take advantage of future technologies that favor multi-tasking more.

I'm still excited about AMD Zen; I want to see the next chapter in what's in store for us consumers, good or bad. Maybe I'll tell the story to future generations, if I'm still alive.

Sorry for the poorly written post. I'm just a newcomer here speaking my mind. It's messed up in there, especially when it's late.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
There are many users here who still use their Phenom IIs, one even coupled with a Fury X - what a crazy guy. It's out of loyalty, I guess, but basically games like Skyrim are the nail in the coffin for Phenoms. That game isn't really into multiple cores. I mean, you could probably play Crysis 3 on it and not lose any FPS, even with a Fury X maybe, but then again, Crysis 3 has a really solid multi-threaded engine, which even scales up to 8 cores.
 
Joined
Jun 28, 2014
Messages
2,388 (0.67/day)
Location
Shenandoah Valley, Virginia USA
System Name Home Brewed
Processor i9-7900X and i7-8700K
Motherboard ASUS ROG Rampage VI Extreme & ASUS Prime Z-370 A
Cooling Corsair 280mm AIO & Thermaltake Water 3.0
Memory 64GB DDR4-3000 GSKill RipJaws-V & 32GB DDR4-3466 GEIL Potenza
Video Card(s) 2X-GTX-1080 SLI & 2 GTX-1070Ti 8GB G1 Gaming in SLI
Storage Both have 2TB HDDs for storage, 480GB SSDs for OS, and 240GB SSDs for Steam Games
Display(s) ACER 28" B286HK 4K & Samsung 32" 1080P
Case NZXT Source 540 & Rosewill Rise Chassis
Audio Device(s) onboard
Power Supply Corsair RM1000 & Corsair RM850
Mouse Generic
Keyboard Razer Blackwidow Tournament & Corsair K90
Software Win-10 Professional
Benchmark Scores yes
I completely prefer my Intel boxes at the moment, but my last AMD PC (FX-9590) was smooth and powerful.
It had a fast SSD in it, and that helped it out a lot. There were two GTX 760 4GB GPUs in it in SLI too.
One thing that it didn't do well was overclock, but it was rock solid at stock speeds.
I gave it to my son to use for his Engineering tasks because I knew that it would work well for him. He loves it so far.

I think that my Intel boxes handle memory a little better. I also think that Intel's data throughput is quicker than AMD's.
If Zen is good, I'll probably buy one of them. It doesn't have to whip Intel's ass either. It just has to be good.
 
Joined
Nov 9, 2008
Messages
2,318 (0.41/day)
Location
Texas
System Name Mr. Reliable
Processor Ryzen R9 5950x
Motherboard MSI Meg X570s Ace Max
Cooling D5 Pump, Singularity Top/Res, 2x360mm EK P rads, EK Magnitude/Alphacool Blocks
Memory 32Gb (4x8Gb) Corsair Dominator Platinum 3600Mhz @ 16/19/20/36 1.35v
Video Card(s) MSI 3080ti with Alphacool Block
Storage 2 x Corsair Force MP400 1TB Nvme; 2 x T-Force Cardea Z340; 2 x Mushkin Reactor 1TB
Display(s) Acer 32" Z321QU 2560x1440; LG 34GP83A-B 34" 3440x1440
Case Lian Li PC-011 Dynamic XL; Synology DS218j w/ 2 x 2TB WD Red
Audio Device(s) SteelSeries Arctis Pro+
Power Supply EVGA SuperNova 850G3
Mouse Razer Basilisk V2
Keyboard Das Keyboard 6; Razer Orbweaver Chroma
Software Windows 10 Pro
All that's really going on with Cities is the Intel chip yielding a much higher result due to much higher IPC, which is why AMD is now seeking 40% more performance per core in Zen.

Other than that, the Intel system just has way higher RAM speed, but that may only show as a slight performance increase in very CPU-demanding games. However, you are right to a degree that you'd need to test for it with the same RAM capacity.

I'm going to pull a pair of sticks out this evening and find out. The performance increase is absolutely crazy, quite literally a 30-40% increase in frame rate. I just really want to know what % of the increase is IPC and what % is RAM.

This. My first actual system that I used for gaming was a Phenom II x4 960T, which I could unlock to a six-core and even overclock beyond 4GHz, but what's the point? The TDP was nearing the 140W mark, instead of the stock 95W. And I didn't see much of a performance increase in a lot of titles, especially ones using the DX9 and DX11 APIs. The rig did well enough, I thought to myself; I built it in Q2 2012, after the releases of a lot of AAA titles...

I always thought my 9370 was more than sufficient. Truly did not realize the dramatic difference in the platforms themselves.

...The ONLY reason I built that Phenom II rig was for Skyrim at the time and because money was an issue...

This may be the main reason I did not switch the last few years. The Sabertooth 990FX I had the 9370 in began its life with a Phenom II 965 that was always "good enough"; then, when I had some extra coin, I spent a bit on a 9370 and then spent some time playing with different cooling. The price of switching the entire platform was cost-prohibitive. The sad thing is that now I know what I was missing all those years.

...Sorry for the poorly written post. I'm just a newcomer here speaking my mind. It's messed up in there, especially when it's late.

I thought it was extremely well written.

I completely prefer my Intel boxes at the moment, but my last AMD PC (FX-9590) was smooth and powerful.

Right after you made the switch, did you happen to notice any difference in the responsiveness of the desktop/file browser?

As I mentioned, the change has really got me thinking about playing with a pair of NV cards. I know performance is better, but is the difference in "visual quality" very dramatic? I know that is a VERY subjective question, but I would like some opinions.

JAT
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
Visual quality should be the same, unless you have more GPU power available and can set the graphical settings higher. Coming from CF 290X, you'd need at least a Titan XP or 2x GTX 1080 to get a real boost and more options available. A single GTX 1080 wouldn't make much difference, if any (at least in games where Crossfire works). The switch from a Crossfire setup to a single GPU would help with frame-time issues nonetheless, so yes, the quality would be higher, at least in games that have problems with Crossfire. Basically it would be a good move, with a lot less power consumption on top (150-200W compared to 500W+).
 
Joined
Aug 2, 2009
Messages
4,013 (0.75/day)
As I mentioned, the change has really got me thinking about playing with a pair of NV cards. I know performance is better, but is the difference in "visual quality" very dramatic? I know that is a VERY subjective question, but I would like some opinions.

JAT

I actually had a very hard time after switching to an NV card; the colors, to my eyes, are much less vibrant than AMD's. Things look fine, just not as striking. It's been several years and a couple of cards later now, so it no longer bothers me as much. Talking to a few other users on this forum, it seems to be a well-known difference between the brands.
 
Joined
Nov 9, 2008
Messages
2,318 (0.41/day)
Location
Texas
System Name Mr. Reliable
Processor Ryzen R9 5950x
Motherboard MSI Meg X570s Ace Max
Cooling D5 Pump, Singularity Top/Res, 2x360mm EK P rads, EK Magnitude/Alphacool Blocks
Memory 32Gb (4x8Gb) Corsair Dominator Platinum 3600Mhz @ 16/19/20/36 1.35v
Video Card(s) MSI 3080ti with Alphacool Block
Storage 2 x Corsair Force MP400 1TB Nvme; 2 x T-Force Cardea Z340; 2 x Mushkin Reactor 1TB
Display(s) Acer 32" Z321QU 2560x1440; LG 34GP83A-B 34" 3440x1440
Case Lian Li PC-011 Dynamic XL; Synology DS218j w/ 2 x 2TB WD Red
Audio Device(s) SteelSeries Arctis Pro+
Power Supply EVGA SuperNova 850G3
Mouse Razer Basilisk V2
Keyboard Das Keyboard 6; Razer Orbweaver Chroma
Software Windows 10 Pro
Visual quality should be the same, unless you have more GPU power available and can set the graphical settings higher. Coming from CF 290X, you'd need at least a Titan XP or 2x GTX 1080 to get a real boost and more options available. A single GTX 1080 wouldn't make much difference, if any (at least in games where Crossfire works). The switch from a Crossfire setup to a single GPU would help with frame-time issues nonetheless, so yes, the quality would be higher, at least in games that have problems with Crossfire. Basically it would be a good move, with a lot less power consumption on top (150-200W compared to 500W+).

I'm definitely ready for a new pair of cards soon, as these 290Xs get hot as hell, but I'm not willing to spend the money on blocks for cards I know I'm about to replace. As far as new cards go, I'm not going to go with a GP104, and I'm a little thrown that even the Pascal Titan is GP102, which should mean the "Ti" model will use it too... It's beginning to look like we will not see a consumer card based on GP100? I'm waiting to see what Vega brings before I make any decisions, and I think right now the cost of GP104- and GP102-based cards is too high. But the switch to Intel has made me want to try every manufacturer of hardware now, just to become familiar with the particular nuances of each.

I actually had a very hard time after switching to an NV card; the colors, to my eyes, are much less vibrant than AMD's. Things look fine, just not as striking. It's been several years and a couple of cards later now, so it no longer bothers me as much. Talking to a few other users on this forum, it seems to be a well-known difference between the brands.

This is what I've also heard, and partially what I was referring to with the question. Glad to hear a first hand account of it...
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
It's beginning to look like we will not see a consumer card based on GP100?
The GP100 has a lot of die space occupied by double-precision shaders and other things that are useless to gamers - basically, the GP102 is a GP100 without those parts, to save space, energy and cost. What you'd want, though, is a fully activated GP102 (3840 shaders), a "Titan X Black" comparable to the original Titan Black. And if Vega is nice, maybe Nvidia will be forced to come out with a "1080 Ti" that has all shaders activated, similar to how the 780 Ti had more shaders activated than the original Titan.
I'm waiting to see what Vega brings before I make any decisions, and I think right now the cost of GP104- and GP102-based cards is too high. But the switch to Intel has made me want to try every manufacturer of hardware now, just to become familiar with the particular nuances of each.
Wise choice.
 
Joined
Oct 22, 2014
Messages
13,210 (3.81/day)
Location
Sunshine Coast
System Name Black Box
Processor Intel Xeon E3-1260L v5
Motherboard MSI E3 KRAIT Gaming v5
Cooling Tt tower + 120mm Tt fan
Memory G.Skill 16GB 3600 C18
Video Card(s) Asus GTX 970 Mini
Storage Kingston A2000 512Gb NVME
Display(s) AOC 24" Freesync 1m.s. 75Hz
Case Corsair 450D High Air Flow.
Audio Device(s) No need.
Power Supply FSP Aurum 650W
Mouse Yes
Keyboard Of course
Software W10 Pro 64 bit
... No worries though. Life is too short to subject yourself to rude and offensive behavior.
It's also too short to waste time misinterpreting what others say and getting huffy over it.
I haven't seen any rudeness in this thread, but your comments are now starting to infuse it.
 

Kanan

Tech Enthusiast & Gamer
Joined
Aug 22, 2015
Messages
3,517 (1.11/day)
Location
Europe
System Name eazen corp | Xentronon 7.2
Processor AMD Ryzen 7 3700X // PBO max.
Motherboard Asus TUF Gaming X570-Plus
Cooling Noctua NH-D14 SE2011 w/ AM4 kit // 3x Corsair AF140L case fans (2 in, 1 out)
Memory G.Skill Trident Z RGB 2x16 GB DDR4 3600 @ 3800, CL16-19-19-39-58-1T, 1.4 V
Video Card(s) Asus ROG Strix GeForce RTX 2080 Ti modded to MATRIX // 2000-2100 MHz Core / 1938 MHz G6
Storage Silicon Power P34A80 1TB NVME/Samsung SSD 830 128GB&850 Evo 500GB&F3 1TB 7200RPM/Seagate 2TB 5900RPM
Display(s) Samsung 27" Curved FS2 HDR QLED 1440p/144Hz&27" iiyama TN LED 1080p/120Hz / Samsung 40" IPS 1080p TV
Case Corsair Carbide 600C
Audio Device(s) HyperX Cloud Orbit S / Creative SB X AE-5 @ Logitech Z906 / Sony HD AVR @PC & TV @ Teufel Theater 80
Power Supply EVGA 650 GQ
Mouse Logitech G700 @ Steelseries DeX // Xbox 360 Wireless Controller
Keyboard Corsair K70 LUX RGB /w Cherry MX Brown switches
VR HMD Still nope
Software Win 10 Pro
Benchmark Scores 15 095 Time Spy | P29 079 Firestrike | P35 628 3DM11 | X67 508 3DM Vantage Extreme
BTW, one reason NV has worse colors compared to AMD may be the driver settings.

In the NV Control Panel, go to Adjust Video Color Settings -> "With the NVIDIA settings" -> "Dynamic Range" -> "Full". It makes a lot of difference, because with "Limited" blacks aren't really black (rather grey), and Limited is the default setting. Compare in a video or movie; the difference is pretty clear.

If you use HDMI, you must do the same under "Change Resolution".
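
For anyone curious why "Limited" looks washed out in the first place: limited ("video") range puts black at code 16 and white at 235, so if the monitor expects full range (0-255) and nothing re-expands the signal, black ends up displayed at roughly 6% brightness instead of true black. A tiny sketch of that math:

Code:
# Limited (video) range uses codes 16-235; full (PC) range uses 0-255.
def limited_to_full(code: int) -> float:
    """Expand a limited-range code value onto the full 0-255 scale."""
    return (code - 16) * 255.0 / (235 - 16)

black = 16
print(f"unexpanded: {black / 255:.1%} brightness")                   # -> 6.3%
print(f"expanded:   {limited_to_full(black) / 255:.1%} brightness")  # -> 0.0%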
 

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,268 (2.52/day)
Location
Washington, USA
System Name Veral
Processor 5950x
Motherboard MSI MEG x570 Ace
Cooling Corsair H150i RGB Elite
Memory 4x16GB G.Skill TridentZ
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx + 2x AOC 2425W
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
BTW, one reason NV has worse colors compared to AMD may be the driver settings.

In the NV Control Panel, go to Adjust Video Color Settings -> "With the NVIDIA settings" -> "Dynamic Range" -> "Full". It makes a lot of difference, because with "Limited" blacks aren't really black (rather grey), and Limited is the default setting. Compare in a video or movie; the difference is pretty clear.

If you use HDMI, you must do the same under "Change Resolution".
Thank you, I needed this.
 
Joined
Jan 1, 2016
Messages
522 (0.17/day)
Location
Beachy Gulf Beach
System Name RadActive Dragon & Black Dragon
Processor i7 5960X & 1090T
Motherboard Gigabyte X99-UD3P & Asus Sabertooth 990FX
Cooling EK EVO & Corsair H50
Memory GSkill TridentZ 32GBs 3200MHz & Vegeance 8GBs 1600MHz
Video Card(s) 2x 980Ti & 960 FTW
Storage Samsung Pro 256GB & Intel 120GB
Case Heavy modded Haf X & NZXT Apollo
Power Supply AX1200 & AX1200i
Software Windows 7 & Ubuntu 15.04
ignore those posts...if it gets out of hand, report THOSE posts/users for moderation help.

But, honestly, this is MOST forums, including TPU, when they get big there is always that type of behavior.

You should join Overclockers.com... a smaller community, just as knowledgeable as here if not more so, and WAY less riff-raff in the forums.

Anyway, I digress. :)
No kidding. This is tame compared to OCN. To save yourself, never go into a news thread on a new GPU... talk about one hell of a craze storm between fanboys.

But back to the OP's main post.

Cities: Skylines loves fast single-thread performance. Since it is a building simulation game, it tends to get CPU-bound pretty fast, before the GPU can be fully used. Some others over on OCN, in the thread for the game, report that once you get a really high population the game will actually start slowing down because of all the calculations it has to do.
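
A toy model of that CPU-bound behavior (all numbers made up): per-agent simulation cost grows with the population while the per-frame time budget stays fixed, so past some point the CPU caps the frame rate no matter what GPU you have.

Code:
# Toy numbers only - illustrates why an agent simulation eventually
# becomes CPU-bound as the population grows.
per_agent_cost_us = 0.08      # pretend CPU cost per simulated citizen
frame_budget_us   = 16_667    # ~60 FPS frame budget

for population in (50_000, 150_000, 300_000):
    sim_time_us = population * per_agent_cost_us
    fps_cap = 1_000_000 / max(sim_time_us, frame_budget_us)
    print(f"pop {population:>7,}: sim {sim_time_us / 1000:5.1f} ms -> ~{fps_cap:.0f} FPS cap")
# pop  50,000: sim   4.0 ms -> ~60 FPS cap
# pop 150,000: sim  12.0 ms -> ~60 FPS cap
# pop 300,000: sim  24.0 ms -> ~42 FPS cap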

About the reason the Intel chip runs hotter: a smaller process and more transistors packed into a small space, so the heat is more concentrated. This is also part of the reason why it is more of a challenge to get higher GHz. I can't remember all the particular reasons for it; I'd have to find that huge debate thread over on OCN on the matter.
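
To put the "more concentrated heat" point in rough numbers (die areas and TDPs below are placeholders, not official figures), it's the heat flux per square millimetre, not the total wattage, that matters for how hot the silicon reads:

Code:
# Placeholder figures only - the point is W/mm^2, not total watts.
def power_density(tdp_watts: float, die_area_mm2: float) -> float:
    return tdp_watts / die_area_mm2

big_old_die   = power_density(125.0, 315.0)   # large 32 nm-class die
small_new_die = power_density(91.0, 122.0)    # small 14 nm-class die

print(f"{big_old_die:.2f} W/mm^2 vs {small_new_die:.2f} W/mm^2")
# -> 0.40 W/mm^2 vs 0.75 W/mm^2: lower TDP, but roughly double the flux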

On the snappy feel of AMD, I can't really say I ever noticed that, since both of my rigs run off SSDs. The only time I ever noticed something being snappier was when I moved my folding rig over to Linux. I currently run a 1090T on an Asus 990FX board in the folding rig. The main rig has a delidded 4770K on an MSI Z87-GD65 board. As for motherboard heat, I've never paid attention to it.

Plus, I can't say anything about GPU usage on AMD or Intel, since I don't fret over small percentages or FPS the majority of the time, as long as it can run the game. Also, I never gamed much on any of the AMD rigs I've had, since I use my current AMD rig for folding and BOINC tasks.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.59/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
I'm going to pull a pair of sticks out this evening and find out. The performance increase is absolutely crazy, quite literally a 30-40% increase in frame rate. I just really want to know what % of the increase is IPC and what % is RAM.



I always thought my 9370 was more than sufficient. Truly did not realize the dramatic difference in the platforms themselves.



This may be the main reason I did not switch the last few years. The Sabertooth 990FX I had the 9370 in began its life with a Phenom II 965 that was always "good enough"; then, when I had some extra coin, I spent a bit on a 9370 and then spent some time playing with different cooling. The price of switching the entire platform was cost-prohibitive. The sad thing is that now I know what I was missing all those years.



I thought it was extremely well written.



Right after you made the switch, did you happen to notice any difference in the responsiveness of the desktop/file browser?

As I mentioned, the change has really got me thinking about playing with a pair of NV cards. I know performance is better, but is the difference in "visual quality" very dramatic? I know that is a VERY subjective question, but I would like some opinions.

JAT

I guess I got lucky with a 125 W chip then, to OC that far.
 