
GPU for Unreal Engine

Joined
Jan 8, 2009
Messages
548 (0.10/day)
System Name AMD RyZen PC
Processor AMD RyZen 5950x
Motherboard ASUS Crosshair VIII Hero 570x WIFI
Cooling Custom Loop
Memory 64GB G.Skill Trident Z DDR4 3200 MHz 14C x4
Video Card(s) Evga 3080 TI
Storage Seagate 8TB + 3TB + 4TB + 2TB external + 512 Samsung 980
Display(s) LG 4K 144Hz 27GN950-B
Case Thermaltake CA-1F8-00M1WN-02 Core X71 Tempered Glass Edition Black
Audio Device(s) XI-FI 8.1
Power Supply EVGA 700W
Mouse Microsoft
Keyboard Microsoft
Software Windows 10 x64 Pro
Hello…

I am currently working on some game development and need a GPU upgrade from my Vega 64.

I have narrowed it down to 2 cards that I can buy… both similarly priced thanks to scalpers… :/

AMD 6900 XT or Nvidia 3080 Ti

Which one do you guys think will be the better choice? I am working on a large landscape and I'm not sure how the 16 GB of VRAM on the 6900 XT will perform compared to the 12 GB on the 3080 Ti.

I do know that when it comes to RT there is no question, and I am planning to use RT.

I have a 750 W PSU. Will that be enough for a 3080 Ti or 6900 XT?

Other specs…

CPU 5950X
RAM 64 GB 3200 MHz CL14
MB ASUS Crosshair VIII Hero X570 WiFi
4 SSDs and 1 M.2
 
Joined
Jun 21, 2021
Messages
2,623 (2.59/day)
System Name daily driver Mac mini M2 Pro
Processor Apple Silicon M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple Silicon M2 Pro (16-core GPU)
Storage Apple proprietary 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Ventura 13 (including latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12. I'm not interested in benchmarking.)
I'll refrain from making any comments about the technical pros and cons of either brand since that is regularly covered by PC & gaming writers and reviewers. I personally own two Radeon GPUs and two GeForce GPUs and none of the four are side-by-side competitors in terms of capabilities. I am happy with all four but I don't ask them to fulfill the same roles.

As for power requirements, you should consult the AMD and Nvidia corporate websites for this information. A brief glance at the Videocardz.net database offers an 850 W recommendation for the Radeon 6900 XT and a 750 W recommendation for the GeForce 3080 Ti, but it would be unwise to proceed without looking for an authoritative answer from AMD and Nvidia themselves, at least for the reference cards. Note that some of the third-party AIB partner cards may recommend even more power, so feel free to consult those companies' websites as well.
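
For a rough sanity check, here is a back-of-the-envelope estimate using typical published board and package power figures, not measurements from any particular build:

    3080 Ti board power:           ~350 W
    5950X package power (load):    ~140 W
    Fans, drives, RAM, board:       ~75 W
    -------------------------------------
    Sustained system load:         ~565 W
    + ~30% margin for transients:  ~735 W

That margin is why a quality 750 W unit is usually cited as the floor for a 3080 Ti build, and why the 6900 XT recommendation gets padded to 850 W despite its lower ~300 W board power.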

Do you care about market penetration? One brand dominates the discrete gaming GPU market. Anyone can review the regular Steam hardware surveys to see which cards are most popular within that store's ecosystem.

One approach is to develop for the weaker hardware. Game developers do this often to capture a larger market. If your game is only playable by two cards, you won't have many sales of your title. If you develop for Nintendo Switch, you will have a 70+ million installed user base plus the opportunity to capture PS4/Xbox One, PS5/Xbox Series X|S and the PC market if you choose to support those other platforms.
 
Joined
Jan 8, 2009
Messages
548 (0.10/day)
System Name AMD RyZen PC
Processor AMD RyZen 5950x
Motherboard ASUS Crosshair VIII Hero 570x WIFI
Cooling Custom Loop
Memory 64GB G.Skill Trident Z DDR4 3200 MHz 14C x4
Video Card(s) Evga 3080 TI
Storage Seagate 8TB + 3TB + 4TB + 2TB external + 512 Samsung 980
Display(s) LG 4K 144Hz 27GN950-B
Case Thermaltake CA-1F8-00M1WN-02 Core X71 Tempered Glass Edition Black
Audio Device(s) XI-FI 8.1
Power Supply EVGA 700W
Mouse Microsoft
Keyboard Microsoft
Software Windows 10 x64 Pro
I'll refrain from making any comments about the technical pros and cons of either brand since that is regularly covered by PC & gaming writers and reviewers. I personally own two Radeon GPUs and two GeForce GPUs and none of the four are side-by-side competitors in terms of capabilities. I am happy with all four but I don't ask them to fulfill the same roles.

As for power requirements, you should consult the AMD and Nvidia corporate websites for this information. A brief glance at the Videocardz.net database offers an 850 W recommendation for the Radeon 6900 XT and a 750 W recommendation for the GeForce 3080 Ti, but it would be unwise to proceed without looking for an authoritative answer from AMD and Nvidia themselves, at least for the reference cards. Note that some of the third-party AIB partner cards may recommend even more power, so feel free to consult those companies' websites as well.

Do you care about market penetration? One brand dominates the discrete gaming GPU market. Anyone can review the regular Steam hardware surveys to see which cards are most popular within that store's ecosystem.

One approach is to develop for the weaker hardware. Game developers do this often to capture a larger market. If your game is only playable by two cards, you won't have many sales of your title. If you develop for Nintendo Switch, you will have a 70+ million installed user base plus the opportunity to capture PS4/Xbox One, PS5/Xbox Series X|S and the PC market if you choose to support those other platforms.

Good points. The game will run on low-end systems as well, but I need to load the whole landscape on my development computer.

The only thing that concerns me is the life of the GPU. Does Nvidia last as long as AMD? I personally know people who have suffered burned-out GPUs with Nvidia, including me… I lost 2 Nvidia cards before switching to AMD 16 years ago…

Anyway, I have ordered an EVGA 3080 Ti. Hope it will be good. At $1,960 it's very expensive, but I am stuck, as UE5 runs like crap on the Vega 64.
 
Joined
Jun 21, 2021
Messages
2,623 (2.59/day)
System Name daily driver Mac mini M2 Pro
Processor Apple Silicon M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple Silicon M2 Pro (16-core GPU)
Storage Apple proprietary 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Ventura 13 (including latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12. I'm not interested in benchmarking.)
I do not know of any statistically significant recent study of GPUs that shows that one brand or the other is considerably better in terms of hardware reliability.

I'm pretty sure that if one were vastly more reliable than the other, it would be heavily mentioned and discussed all over the Internet. Moreover, I'm sure sales, company financial results, and stock prices would also reflect some of this.

Both AMD and Nvidia have deployed SoCs in a variety of non-PC devices (like videogame consoles) that would also reflect hardware reliability. Current and previous generation PlayStations and Xboxes use AMD SoCs. The Nintendo Switch uses an Nvidia SoC.

I am aware that Nvidia did have a spell of poor hardware QA that prompted Apple Inc. to ditch them as a discrete GPU supplier about ten years ago. But worse than the hardware issues themselves was Nvidia's arrogant and dismissive attitude about the problems.

For sure we live in a different PC hardware environment today and no PC hardware vendor has a pristine track record over their lifespan. Companies that were once great have fallen (or disappeared) and we have new leaders.

I do know that some companies that have made mistakes eventually owned up to them, apologized, and moved on. ASUS is one of these; I bought ASUS motherboards in the Nineties and two of my current builds have ASUS mobos. There are other companies that are no longer around. There are a few that appear to be circling the drain.

Anyhow best of luck with your software development.

Disclaimer: I own 2 GeForce graphics cards and 2 Radeon graphics cards and I am satisfied with all four. There are three other computers in the house with Intel integrated graphics (including a Mac mini). Those are also satisfactory in terms of hardware reliability.
 
Joined
Aug 28, 2021
Messages
364 (0.39/day)
not sure how the 16 GB of VRAM on the 6900 XT will perform compared to the 12 GB on the 3080 Ti
It should only make a difference during the optimization stage, and only at larger resolutions. I don't know what resolution you plan to work at, but I don't think it will have an impact until you start crossing 8K.
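
If you want to see how close the project actually gets to either card's VRAM limit, Unreal's texture streaming stats are the quickest check. These are standard UE4/UE5 console commands; the pool size below is an illustrative value, not a recommendation:

    stat streaming               (pool size, required vs. resident texture memory)
    r.Streaming.PoolSize 8192    (cap the streaming pool at ~8 GB to mimic a smaller card)
    stat rhi                     (total video memory the RHI has actually allocated)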

Also, what PSU are you using? A high-end 750 W can be enough; a low-end 750 W will definitely limit you.
I see you picked the 3080 Ti. Great choice, and good luck with your work :)
 
Joined
Jan 8, 2009
Messages
548 (0.10/day)
System Name AMD RyZen PC
Processor AMD RyZen 5950x
Motherboard ASUS Crosshair VIII Hero 570x WIFI
Cooling Custom Loop
Memory 64GB G.Skill Trident Z DDR4 3200 MHz 14C x4
Video Card(s) Evga 3080 TI
Storage Seagate 8TB + 3TB + 4TB + 2TB external + 512 Samsung 980
Display(s) LG 4K 144Hz 27GN950-B
Case Thermaltake CA-1F8-00M1WN-02 Core X71 Tempered Glass Edition Black
Audio Device(s) XI-FI 8.1
Power Supply EVGA 700W
Mouse Microsoft
Keyboard Microsoft
Software Windows 10 x64 Pro
Disclaimer: I own 2 GeForce graphics cards and 2 Radeon graphics cards and I am satisfied with all four. There are three other computers in the house with Intel integrated graphics (including a Mac mini). Those are also satisfactory in terms of hardware reliability.

Thank you guys. I finally got my 3080 Ti…

Not very happy finding out, coming from AMD, that they have the crappiest drivers. I did not realize AMD's drivers were so good/advanced/crash-free.

I did some testing in games with the same settings locked at 60 FPS, and I see 338 W usage compared to my Vega 64 doing the same FPS with the same settings at 165 W or so. And the picture quality is not as crisp as with AMD; everything looks faded compared to AMD. Though I set AA and all kinds of quality stuff, it still does not look as crisp as AMD.
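
(For anyone who wants to reproduce the power numbers: draw and clocks can be logged straight from the driver with nvidia-smi. The flags are standard; the field list is just the set I find useful.)

    nvidia-smi --query-gpu=power.draw,clocks.gr,clocks.mem,utilization.gpu --format=csv -l 1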

Also, there is no fan control in the Nvidia driver. It feels so inferior; the good FPS seems to come at the cost of a blurry, out-of-focus image. If I am looking straight ahead, I can see the textures switching to low quality at the sides of the screen, even in Division 2, which does not support DLSS or any other Nvidia tech. Also, driving around in Forza Horizon 4 I see mountains and things popping in, which never happens with the AMD card. It's as if Nvidia auto-decides draw distance and texture quality for everything, even in Unreal Engine.

I am surprised that people have not noticed all this or never bothered to talk about it. Even at idle, the Vega 64 uses like 14 W and this uses 54 W :/

Is there a way I can install just the Nvidia driver without all the crazy useless software?

For the blurry stuff I am talking about: I tried Display Driver Uninstaller and all the other software, and finally did a clean install of Windows, thinking the AMD driver might be causing issues even after cleaning it so many times in safe mode.

MSI Afterburner keeps crashing Unreal Engine, so I can't use it, and the EVGA Precision X1 software does not have the kind of OSD MSI Afterburner provides. :/ Any solution for that?
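
(In the meantime, UE's built-in stat overlays cover most of what an Afterburner OSD shows. These are standard console commands, toggled from the in-game/editor console:)

    stat fps     (frame rate overlay)
    stat unit    (frame, game, draw and GPU thread times)
    stat gpu     (per-pass GPU timings)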

@cvaldes, have you ever noticed texture and image quality differences between AMD and Nvidia?




It should only make a difference during the optimization stage, and only at larger resolutions. I don't know what resolution you plan to work at, but I don't think it will have an impact until you start crossing 8K.

Also, what PSU are you using? A high-end 750 W can be enough; a low-end 750 W will definitely limit you.
I see you picked the 3080 Ti. Great choice, and good luck with your work :)

I work at 4K on cinematic settings lol
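
(For anyone following along in Unreal: "cinematic" maps to scalability level 4 in recent UE4/UE5 builds. The groups can be pinned from the console, and pinning the screen percentage also rules out dynamic resolution when comparing sharpness between cards; the values below are just examples.)

    sg.TextureQuality 4         (cinematic texture quality)
    sg.ViewDistanceQuality 4    (cinematic draw distance)
    r.ScreenPercentage 100      (no internal upscaling)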
 
Joined
Oct 26, 2018
Messages
194 (0.10/day)
Processor Intel i5-13600KF
Motherboard ASRock Z790 PG Lightning
Cooling NZXT Kraken 240
Memory Corsair Vengeance DDR5 6400
Video Card(s) XFX RX 7800 XT
Storage Samsung 990 Pro 2 TB + Samsung 860 EVO 1TB
Display(s) Dell S2721DGF 165Hz
Case Fractal Meshify C
Power Supply Seasonic Focus 750
Mouse Logitech G502 HERO
Keyboard Logitech G512
I am surprised that people have not noticed all this or never bothered to talk about it.
I am surprised by your opinion. In my experience every UE game looks better on Nvidia, and runs faster on otherwise equal cards.
Maybe take a good look at your global 3D settings in the Nvidia Control Panel.

There is a tool here called NVCleanstall to get rid of the junk in the Nvidia driver package.
If there is a real problem with Afterburner, you can ask about it directly at guru3d.com.

I'd love to see some full size pics of exactly what problems you're seeing in Division 2.
 
Joined
Jan 8, 2009
Messages
548 (0.10/day)
System Name AMD RyZen PC
Processor AMD RyZen 5950x
Motherboard ASUS Crosshair VIII Hero 570x WIFI
Cooling Custom Loop
Memory 64GB G.Skill Trident Z DDR4 3200 MHz 14C x4
Video Card(s) Evga 3080 TI
Storage Seagate 8TB + 3TB + 4TB + 2TB external + 512 Samsung 980
Display(s) LG 4K 144Hz 27GN950-B
Case Thermaltake CA-1F8-00M1WN-02 Core X71 Tempered Glass Edition Black
Audio Device(s) XI-FI 8.1
Power Supply EVGA 700W
Mouse Microsoft
Keyboard Microsoft
Software Windows 10 x64 Pro
I am surprised by your opinion. In my experience every UE game looks better on Nvidia, and runs faster on otherwise equal cards.
Maybe take a good look at your global 3D settings in the Nvidia Control Panel.

There is a tool here called NVCleanstall to get rid of the junk in the Nvidia driver package.
If there is a real problem with Afterburner, you can ask about it directly at guru3d.com.

I'd love to see some full size pics of exactly what problems you're seeing in Division 2.


Ya, I am surprised as well after spending 2k on a 3080 Ti. I don't really care about Afterburner at this point. My concerns are more about texture quality and stuff. Sure, I will post it here. I need some time to do all that work again so I can take screenshots.

Should I try NVCleanstall, as I have already done a clean install of Windows and reinstalled the Studio drivers? Yes, I am not on the Game Ready driver due to the nature of what I am doing.
 
Joined
Jun 21, 2021
Messages
2,623 (2.59/day)
System Name daily driver Mac mini M2 Pro
Processor Apple Silicon M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple Silicon M2 Pro (16-core GPU)
Storage Apple proprietary 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Ventura 13 (including latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12. I'm not interested in benchmarking.)
I've seen plenty of side-by-side comparisons between AMD and Nvidia graphics cards comparing game graphics and I've rarely seen a large difference in pure rasterization image quality between the two.

For sure there are studios that optimize performance for one or the other. There are also large game studios that clearly have access to a large variety of graphics cards from both AMD and Nvidia, as well as console dev kits (often multiple generations like PS4, PS4 Pro, PS5) and will work hard to ensure that their titles perform satisfactorily on multiple platforms and multiple generations of graphics architecture.

There are also studios that do a piss poor job at balancing game code between different platforms (*cough* CDPR *cough*).

Now that RDNA2 has ray tracing, I've seen comparisons between AMD and Nvidia's ray tracing implementations. Same with AMD FSR and Nvidia DLSS. Because AMD's implementations of RT and SS are more recent, they appear to trail Nvidia's efforts. In time, I expect AMD and those who develop with their GPUs to catch up. For sure the fact that AMD tech powers the newest game consoles will encourage competent developers to master these.

If a particular title has large image quality differences between AMD and Nvidia GPUs, I would be more inclined to attribute that to the familiarity of the developer with GPU architectures rather than the architectures themselves since there are clearly titles that are well executed on both.

If there were glaring shortcomings in one or the other, they would clearly have been well noted by PC reviewers, game reviewers, and various pundits. It would probably show up in both companies' fiscal results as well.
 
Joined
Jan 8, 2009
Messages
548 (0.10/day)
System Name AMD RyZen PC
Processor AMD RyZen 5950x
Motherboard ASUS Crosshair VIII Hero 570x WIFI
Cooling Custom Loop
Memory 64GB G.Skill Trident Z DDR4 3200 MHz 14C x4
Video Card(s) Evga 3080 TI
Storage Seagate 8TB + 3TB + 4TB + 2TB external + 512 Samsung 980
Display(s) LG 4K 144Hz 27GN950-B
Case Thermaltake CA-1F8-00M1WN-02 Core X71 Tempered Glass Edition Black
Audio Device(s) XI-FI 8.1
Power Supply EVGA 700W
Mouse Microsoft
Keyboard Microsoft
Software Windows 10 x64 Pro
I've seen plenty of side-by-side comparisons between AMD and Nvidia graphics cards comparing game graphics and I've rarely seen a large difference in pure rasterization image quality between the two.

For sure there are studios that optimize performance for one or the other. There are also large game studios that clearly have access to a large variety of graphics cards from both AMD and Nvidia, as well as console dev kits (often multiple generations like PS4, PS4 Pro, PS5) and will work hard to ensure that their titles perform satisfactorily on multiple platforms and multiple generations of graphics architecture.

There are also studios that do a piss poor job at balancing game code between different platforms (*cough* CDPR *cough*).

Now that RDNA2 has ray tracing, I've seen comparisons between AMD and Nvidia's ray tracing implementations. Same with AMD FSR and Nvidia DLSS. Because AMD's implementations of RT and SS are more recent, they appear to trail Nvidia's efforts. In time, I expect AMD and those who develop with their GPUs to catch up. For sure the fact that AMD tech powers the newest game consoles will encourage competent developers to master these.

If a particular title has large image quality differences between AMD and Nvidia GPUs, I would be more inclined to attribute that to the familiarity of the developer with GPU architectures rather than the architectures themselves since there are clearly titles that are well executed on both.

If there were glaring shortcomings in one or the other, they would clearly have been well noted by PC reviewers, game reviewers, and various pundits. It would probably show up in both companies' fiscal results as well.
Actually, I brought it up because I see this in every game. I will try to do a video capture or something to show the difference once I get some time.
 
Joined
Jun 21, 2021
Messages
2,623 (2.59/day)
System Name daily driver Mac mini M2 Pro
Processor Apple Silicon M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple Silicon M2 Pro (16-core GPU)
Storage Apple proprietary 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Ventura 13 (including latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12. I'm not interested in benchmarking.)
I will reiterate that I have seen many comparisons by PC and videogame reviewers between AMD and Nvidia, and the image quality results are quite comparable. In fact, the rasterized image quality is so close that most reviewers focus on quantitative measurements like FPS or qualitative assessments like ray tracing aesthetics or super-sampling prowess.

It's worth pointing out that you are a longtime AMD proponent. It would be completely understandable if you have not yet figured out how to best configure your new GeForce card especially because the two control panels are vastly different and you've owned the GeForce card for less than a week.

One thing I cannot speak to is the fact that you are using the Studio driver. All of the videogame image comparisons I have seen use Nvidia's Game Ready drivers, not their Studio drivers.

Remember that Unreal Engine -- as a videogame engine -- is designed to be run using Nvidia's Game Ready drivers as part of the preferred software environment.

In the context of a writer analyzing videogame images, I don't recall ever reading an article where the author chose to use the Studio driver. I don't recall a game developer recommending players install the Studio driver either.

Without a doubt the big game studios test their titles' gold master release candidates with the Game Ready driver because that's what the typical videogamer is going to install.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,866 (3.00/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
I’d go with NVIDIA as it has much better RT performance. Also, both cards have been reviewed by TPU, so don’t take anyone’s word for it; just get the objective truth from there. After that, it would be a good idea to read other reviews from reputable sites.
 
Joined
Jul 2, 2021
Messages
120 (0.12/day)
System Name Black Diamond
Processor AMD Ryzen 5 3600X 6-Core Processor 3.80 GHz
Motherboard msi a320 a pro max
Cooling Default
Memory 32 GB ddr4 3200mhz
Video Card(s) Rtx 2070, 86 F no extra fans or stuff.
Storage nvme samsung (the fastest one) 1tb
Display(s) Some sceptre monitor and a tv
Case rosewill
Power Supply rosewill
Mouse Verbatim
Keyboard Magic wings
Software Lots
Benchmark Scores IDk
Bro, srsly, people don't know... don't buy something if you don't know what it is. RTX is better; get a higher-wattage power supply. Solved, close this.
 
Joined
Jan 8, 2009
Messages
548 (0.10/day)
System Name AMD RyZen PC
Processor AMD RyZen 5950x
Motherboard ASUS Crosshair VIII Hero 570x WIFI
Cooling Custom Loop
Memory 64GB G.Skill Trident Z DDR4 3200 MHz 14C x4
Video Card(s) Evga 3080 TI
Storage Seagate 8TB + 3TB + 4TB + 2TB external + 512 Samsung 980
Display(s) LG 4K 144Hz 27GN950-B
Case Thermaltake CA-1F8-00M1WN-02 Core X71 Tempered Glass Edition Black
Audio Device(s) XI-FI 8.1
Power Supply EVGA 700W
Mouse Microsoft
Keyboard Microsoft
Software Windows 10 x64 Pro
Ya, I am surprised as well after spending 2k on a 3080 Ti. I don't really care about Afterburner at this point. My concerns are more about texture quality and stuff. Sure, I will post it here. I need some time to do all that work again so I can take screenshots.

Should I try NVCleanstall, as I have already done a clean install of Windows and reinstalled the Studio drivers? Yes, I am not on the Game Ready driver due to the nature of what I am doing.

You are correct. The GeForce Experience profiles were messing with all the quality settings for extra FPS. :/ I installed the driver only, with PhysX, used the control panel from the Windows Store instead, and I do not see the texture glitch anymore. But I have only tested it with Division 2; I have to do it with the rest. However, after I removed all the profile stuff, FPS has dropped, and now when I get into the main building in Division 2 the GPU usage goes to 0 and it runs at like 3 FPS :/ but as soon as I come out it's fixed. I have not found a solution for that yet.

I’d go with NVIDIA as it has much better RT performance. Also, both cards have been reviewed by TPU, so don’t take anyone’s word for it; just get the objective truth from there. After that, it would be a good idea to read other reviews from reputable sites.

Bro, srsly, people don't know... don't buy something if you don't know what it is. RTX is better; get a higher-wattage power supply. Solved, close this.


Thanks, but I think you guys missed my posts saying that I already got it and have been discussing the issues I am facing. But ya, thanks anyway.
 
Joined
Jun 21, 2021
Messages
2,623 (2.59/day)
System Name daily driver Mac mini M2 Pro
Processor Apple Silicon M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple Silicon M2 Pro (16-core GPU)
Storage Apple proprietary 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Ventura 13 (including latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12. I'm not interested in benchmarking.)
The GeForce Experience profiles were messing with all the quality settings for extra FPS. :/
If you are referring to GeForce Experience, yes, the Game Ready drivers can be installed without it. Just download the standalone installer package; it will give you the choice of installing just the drivers or the drivers + GeForce Experience. You may wish to do a driver cleanup using Display Driver Uninstaller first.

GeForce Experience attempts to tweak a user's card settings to something that the majority of gamers will find satisfactory. It's there to reduce the amount of time an individual needs to spend tweaking settings on their own to balance quality and performance.

AMD's Radeon Software has the same functionality under the Gaming tab but it doesn't kick in automatically -- at least the way that I have it configured.
 
Joined
Jan 8, 2009
Messages
548 (0.10/day)
System Name AMD RyZen PC
Processor AMD RyZen 5950x
Motherboard ASUS Crosshair VIII Hero 570x WIFI
Cooling Custom Loop
Memory 64GB G.Skill Trident Z DDR4 3200 MHz 14C x4
Video Card(s) Evga 3080 TI
Storage Seagate 8TB + 3TB + 4TB + 2TB external + 512 Samsung 980
Display(s) LG 4K 144Hz 27GN950-B
Case Thermaltake CA-1F8-00M1WN-02 Core X71 Tempered Glass Edition Black
Audio Device(s) XI-FI 8.1
Power Supply EVGA 700W
Mouse Microsoft
Keyboard Microsoft
Software Windows 10 x64 Pro
If you are referring to GeForce Experience, yes, the Game Ready drivers can be installed without it. Just download the standalone installer package; it will give you the choice of installing just the drivers or the drivers + GeForce Experience. You may wish to do a driver cleanup using Display Driver Uninstaller first.

GeForce Experience attempts to tweak a user's card settings to something that the majority of gamers will find satisfactory. It's there to reduce the amount of time an individual needs to spend tweaking settings on their own to balance quality and performance.

AMD's Radeon Software has the same functionality under the Gaming tab but it doesn't kick in automatically -- at least the way that I have it configured.

Thanks, yup, I see that.

Now the issue is, without GeForce Experience I am having issues with Division 2. The solution on the web is to change the game from DX12 to DX11, which I do not like.

The game starts lagging once I enter the White House or certain buildings. When I look at the clock speed of the GPU in MSI Afterburner, it's stuck at 210/405 MHz as if I were on the desktop... it's like the GPU thinks I stopped gaming or something, and then it kicks back up once I am out of the building. No clue how to fix that.
 
Joined
Feb 20, 2021
Messages
80 (0.07/day)
Location
United States
Processor Intel i7 2600k @ Stock
Motherboard Asrock Z68 Pro 3
Cooling Corsair A70
Memory 16GB DDR3
Video Card(s) Nvidia Geforce GTX 970
Storage 1x Samsung 860 Evo 250GB, 2x WD 1TB HDD
Display(s) Samsung 2253BW
Case Silverstone RL06
Power Supply Corsair RM850x
The game starts lagging once I enter the White House or certain buildings. When I look at the clock speed of the GPU in MSI Afterburner, it's stuck at 210/405 MHz as if I were on the desktop... it's like the GPU thinks I stopped gaming or something, and then it kicks back up once I am out of the building. No clue how to fix that.

Have you tried setting the power management mode in the Nvidia Control Panel to "Prefer Maximum Performance"? I'm pretty sure it prevents the card from reducing clocks while a 3D application is running. If that doesn't work, then it might be the game being funky.
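
(A quick way to confirm whether the card really is dropping into an idle power state mid-game is to watch the P-state and clocks from the command line. The query fields are standard nvidia-smi; on recent drivers you can also pin the graphics clock as an experiment, which needs an elevated prompt, and the clock range below is purely illustrative.)

    nvidia-smi --query-gpu=pstate,clocks.gr,power.draw --format=csv -l 1
    nvidia-smi -lgc 1400,1860    (temporarily lock the GPU clock range as a test)
    nvidia-smi -rgc              (restore default clock behaviour)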
 
Joined
Feb 20, 2019
Messages
7,194 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
As a game developer I'd say that you probably want to stick to AMD since that's what Sony and Microsoft's consoles have been running for two generations + refreshes now.

Game engines are optimised for consoles first, since that's a higher-profit market and easier to troubleshoot due to the limited permutations of hardware. We had guys running Navi GPUs and their compiled output would work great on a wide range of hardware, including integrated graphics. The one guy who insisted he had to have a 2080 Ti for the extra VRAM and RT support ended up just making extra work for himself, because nothing would run the way he wanted it to on any non-RTX hardware.

Stay away from Nvidia cards unless you intend to completely ignore all of their proprietary features and Nvidia dev utils.

Disclaimer:
I'm not a game developer and barely really understand what the guys in my office are doing with Unreal/Unity. I just get called in to troubleshoot their stuff when it doesn't work.


One thing I cannot speak to is the fact that you are using the Studio driver.
Oh, I see you've already bought the GeForce.
Well, from experience the Studio driver is not good. It might work for Adobe or Autodesk, but for game engine stuff it was a disaster - it just seemed hopelessly out of date for the latest versions of the software. Use the Game Ready drivers without GeForce Experience, for sure.

Also, if you're having issues after running Vega, use DDU in safe mode and do a clean install. I Ctrl+F'd this thread and there was not a single mention of DDU, which exists precisely because this is a very real and very common problem. Even if that doesn't solve your issue, at least you can rule out driver conflicts by using DDU and installing fresh.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,866 (3.00/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Have you tried setting the power management mode in the Nvidia Control Panel to "Prefer Maximum Performance"? I'm pretty sure it prevents the card from reducing clocks while a 3D application is running. If that doesn't work, then it might be the game being funky.
I found that it made quite a difference on my GTX 580 and 780 Ti to put them on max performance. However, on my 2080, there's almost no difference. I say try it and see; it can't hurt and will just take a few minutes of his time.
 