
Three New NVIDIA Tools Help Devs Quickly Debug and Speed Up Games

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,349 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Today's top video games use complex programming and rendering techniques that can take months to create and tune in order to deliver the image quality and silky-smooth frame rates that gamers demand. Thousands of developers worldwide, including members of Blizzard Entertainment, Crytek, Epic Games, and Rockstar Games, rely on NVIDIA development tools to create console and PC video games. Today, NVIDIA has expanded its award-winning development suite with three new tools that vastly speed up this development process, keeping projects on track and costs under control.

The new tools, which are available now, include:
  • PerfHUD 6: a graphics debugging and performance analysis tool for DirectX 9 and 10 applications.
  • FX Composer 2.5: an integrated development environment for fast creation of real-time visual effects.
  • Shader Debugger: helps debug and optimize shaders written in HLSL, CgFX, and COLLADA FX Cg in DirectX and OpenGL.


"These new tools reinforce our deep and longstanding commitment to help game developers fulfill their vision," said Tony Tamasi, vice president of technical marketing for NVIDIA. "Creating a state-of-the-art video game is an incredibly challenging task technologically, which is why we invest heavily in creating powerful, easy-to-use video game optimization and debugging tools for creating console and PC games."

More Details on the New Tools

PerfHUD 6 is a new and improved version of NVIDIA's graphics debugging and performance analysis tool for DirectX 9 and 10 applications. PerfHUD is widely used by the world's leading game developers to debug and optimize their games. This new version includes comprehensive support for optimizing games for multiple GPUs using NVIDIA SLI technology, powerful new texture visualization and override capabilities, an API call list, dependency views, and much more. In a recent survey, more than 300 PerfHUD 5 users reported an average speedup of 37% after using PerfHUD to tune their applications.

"Spore relies on a host of graphical systems that support a complex and evolving universe. NVIDIA PerfHUD provides a unique and essential tool for in-game performance analysis," said Alec Miller, Graphics Engineer at Maxis. "The ability to overlay live GPU timings and state helps us rapidly diagnose, fix, and then verify optimizations. As a result, we can simulate rich worlds alongside interactive gameplay. I highly recommend PerfHUD because it is so simple to integrate and to use."

FX Composer 2.5 is an integrated development environment for fast creation of real-time visual effects. FX Composer 2.5 can be used to create shaders for HLSL, CgFX, and COLLADA FX Cg in DirectX and OpenGL. This new release features an improved user interface, DirectX 10 Support, ShaderPerf with GeForce 8 and 9 Series support, visual models and styles, and particle systems.
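For a sense of what gets authored in such a tool, here is a minimal HLSL effect (.fx) sketch — written for this article, not taken from the release — showing a basic transform plus Lambert diffuse technique. All parameter names and annotations are illustrative:

```hlsl
// Illustrative minimal .fx effect: world-view-projection transform
// plus simple Lambert (N.L) diffuse shading.
float4x4 WorldViewProj : WorldViewProjection;  // bound by the host tool/app
float3   LightDir     = normalize(float3(0.5, 1.0, 0.5));
float4   DiffuseColor = float4(0.8, 0.8, 0.8, 1.0);

struct VS_IN  { float4 pos : POSITION; float3 nrm : NORMAL; };
struct VS_OUT { float4 pos : POSITION; float3 nrm : TEXCOORD0; };

VS_OUT mainVS(VS_IN i)
{
    VS_OUT o;
    o.pos = mul(i.pos, WorldViewProj);  // object space -> clip space
    o.nrm = i.nrm;                      // pass the normal through (no skinning)
    return o;
}

float4 mainPS(VS_OUT i) : COLOR
{
    // Diffuse term, clamped to zero for surfaces facing away from the light
    float ndotl = saturate(dot(normalize(i.nrm), LightDir));
    return DiffuseColor * ndotl;
}

technique Lambert
{
    pass p0
    {
        VertexShader = compile vs_2_0 mainVS();
        PixelShader  = compile ps_2_0 mainPS();
    }
}
```

Tools in this class typically bind annotated parameters such as `WorldViewProjection` automatically; the rest is plain Shader Model 2.0 HLSL.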

As longer, more complex shaders become pervasive, debugging shaders has become more of a challenge for developers. To assist developers with this task, NVIDIA introduces the brand-new NVIDIA Shader Debugger, a plug-in for FX Composer 2.5 that enables developers to inspect their code while seeing shader variables applied in real time on their geometry. The Shader Debugger can be used to debug HLSL, CgFX, and COLLADA FX Cg shaders in both DirectX and OpenGL.

The NVIDIA Shader Debugger is the first product in the NVIDIA Professional Developer Tools lineup. These are new tools directed at professional developers who need more industrial-strength capabilities and support. For example, the NVIDIA Shader Debugger will run on leading GPUs from all vendors.

In addition to the free versions available for non-commercial use, some of the new tools are subject to a license fee, but are priced to be accessible to developers. Existing free tools (such as FX Composer, PerfHUD, Texture Tools, and SDKs) will not be affected; they will continue to be available to all developers at no cost. Shader Debugger pricing information is available at www.shaderdebugger.com.

NVIDIA encourages developers to visit its developer web site here and its developer tools forums here.


PCpraiser100

New Member
Joined
Jul 17, 2008
Messages
1,062 (0.18/day)
System Name REBEL R1
Processor Core i7 920
Motherboard ASUS P6T
Cooling Stock
Memory 6GB OCZ GOLD TC LV Kit 1866MHz@1.65V 9-9-9-24
Video Card(s) Two Sapphire HD 5770 Vapor-X Xfire'd and OC'd (920/1330)
Storage Seagate 7200.11 500GB 32MB
Case Antec Three Hundred
Audio Device(s) ASUS Xonar D1 PCI Sound Card
Power Supply OCZ StealthXStream 500W
Software Windows 7 Ultimate 64-bit
Benchmark Scores 16585 Performance Score on 3DMark Vantage
Any room for the red team?

And the chances of optimized games for ATI would be......?
 

Nkd

Joined
Sep 15, 2007
Messages
364 (0.06/day)
This is one reason most games run better on Nvidia hardware from the beginning and tend to have better minimum fps: it takes ATI at least a few driver updates, sometimes a year, to squeeze the full performance out of a game (just like Cat 8.6 and COD4). I have an HD 4870 right now, but Nvidia seems to have ten times better developer relations than ATI; ATI always relies on its driver team. AMD needs to step up its game. That is why Nvidia ends up getting more customers: developer relations.
 

Joined
Nov 13, 2007
Messages
10,228 (1.70/day)
Location
Austin Texas
Processor 13700KF Undervolted @ 5.6/ 5.5, 4.8Ghz Ring 200W PL1
Motherboard MSI 690-I PRO
Cooling Thermalright Peerless Assassin 120 w/ Arctic P12 Fans
Memory 48 GB DDR5 7600 MHZ CL36
Video Card(s) RTX 4090 FE
Storage 2x 2TB WDC SN850, 1TB Samsung 960 prr
Display(s) Alienware 32" 4k 240hz OLED
Case SLIGER S620
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse Xlite V2
Keyboard RoyalAxe
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
This is one reason most games run better on Nvidia hardware from the beginning and tend to have better minimum fps: it takes ATI at least a few driver updates, sometimes a year, to squeeze the full performance out of a game (just like Cat 8.6 and COD4). I have an HD 4870 right now, but Nvidia seems to have ten times better developer relations than ATI; ATI always relies on its driver team. AMD needs to step up its game. That is why Nvidia ends up getting more customers: developer relations.

Yep... you hit that nail on the head.
 

Nkd

Well, DX10.1 won't make much of a difference. Even with the DX10.1 patch, the GTX 280 performs just like the HD 4870; DX10.1 is only good for AA performance. At least, that has been the case for Assassin's Creed.
 

btarunr

You'll soon know. DX10.1 addresses several AA-related issues with DX10. If a DX10.1 game engine churns out more fps than plain DX10, NV's developer relations would be useless there. The hypothetical gains are huge.
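For context, one concrete DX10.1 addition behind those AA claims is per-sample access to multisampled surfaces, which makes custom resolves practical. A hypothetical Shader Model 4.x sketch (illustrative only; reading multisampled depth and running the pixel shader per-sample are the 10.1-specific parts):

```hlsl
// Hypothetical custom MSAA resolve: the shader reads the individual
// sub-samples of a multisampled texture and averages them itself,
// instead of relying on the fixed-function resolve.
Texture2DMS<float4, 4> SceneMS;   // 4x multisampled scene color

float4 ResolvePS(float4 pos : SV_Position) : SV_Target
{
    int2 coord = int2(pos.xy);    // pixel coordinate of this fragment
    float4 sum = 0;
    [unroll]
    for (int s = 0; s < 4; ++s)
        sum += SceneMS.Load(coord, s);  // fetch each sub-sample
    return sum / 4.0;                   // simple box-filter resolve
}
```

Doing the resolve in the shader is what lets an engine apply tone mapping or edge detection per sample, which is where the DX10.1 AA-quality and AA-performance arguments come from.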
 

PCpraiser100

This is one reason most games run better on Nvidia hardware from the beginning and tend to have better minimum fps: it takes ATI at least a few driver updates, sometimes a year, to squeeze the full performance out of a game (just like Cat 8.6 and COD4). I have an HD 4870 right now, but Nvidia seems to have ten times better developer relations than ATI; ATI always relies on its driver team. AMD needs to step up its game. That is why Nvidia ends up getting more customers: developer relations.

Ya really, luckily the HL2 series is one of my favorite games, as the Source Engine has scalable performance optimized for both sides of the generation :toast:. Both companies have their unique strengths and weaknesses, which sets them apart :slap:, so why can't they just compete fairly on neutral game engines that don't give a $h** about either side, so we all know who is the best on the block? I know for a fact that ATI will bust this whole graphics propaganda bull$h**, especially given the position they are in. So by the looks of it, because Microsoft french-kisses Nvidia, they will replace the DirectX that ATI is really getting good at and start with DirectX 11 announcements. Nvidia, you are a snitch SOB living a lie because you can't handle the truth: you just bribe your way out of situations.
 
why can't they just compete fairly on neutral game engines that don't give a $h** about either side, so we all know who is the best on the block?... Nvidia, you are a snitch SOB living a lie because you can't handle the truth: you just bribe your way out of situations.

It's not unfair at all... Nvidia pours tons of money and support into game development. ATI doesn't have that money, so they pour it into designing better hardware. ATI cards still run great. Look at Crysis: it's optimized for Nvidia, but it runs amazingly on the 4870.

DX10 is not that special; I'm glad Microshaft is moving to DX11...

Speaking of which, @ bta: is DX11 going to be backwards compatible with DX10? Does it fall back to DX10 if the card doesn't support it, like DX10.1 does?

EDIT: although the crap they pulled with the Assassin's Creed patch makes them deserve the "snitch SOB" label. :laugh:
 
Last edited:

PCpraiser100

It's not unfair at all... Nvidia pours tons of money and support into game development. ATI doesn't have that money, so they pour it into designing better hardware. ATI cards still run great. Look at Crysis: it's optimized for Nvidia, but it runs amazingly on the 4870.

DX10 is not that special; I'm glad Microshaft is moving to DX11...

Speaking of which, @ bta: is DX11 going to be backwards compatible with DX10? Does it fall back to DX10 if the card doesn't support it, like DX10.1 does?

In some cases you're right, phanbuey, but there are some ways Nvidia cuts corners. My friend used to own a 6600GT and decided to play BF2 on full settings (BTW, BF2 is an Nvidia game), and when we played it at full settings a big freeze suddenly occurred, and then we saw that all the textures had dropped to lower quality. Huge bust for Nvidia. And did you know that Crysis could be "optimized" on the 8800GT just by renaming a certain config file? Another bust. Then I played HL2: Episode One and CS: Source on my friend's borrowed computer, which had a 7900GTX, and in those games the HDR bloom was on steroids. But when I looked closely at the lit-up props, the reflections and textures were not that pixelated compared to the other props around me (full settings at 1600x1200, by the way). Another bust for Nvidia, as when I moved away from the HDR areas my framerates dropped by 30 fps.
Now, onto DX11: I am kinda glad they are developing it, but from a performance point of view it worries me that, because Nvidia is the closer partner, they will be watching the API's development closely. And since DX11 is one or two years away, Nvidia has got itself a bit of a head start in getting its GTX-whatever-the-heck series ready for DX11 games.
 
Joined
Feb 27, 2007
Messages
50 (0.01/day)
Location
Huntington, NY
System Name Home PC
Processor AMD Ryzen 7 1700
Motherboard ASRock Fatal1ty X370 Gaming K4 AM4
Cooling AMD Wraith Spire
Memory 16 GB Corsair Vengeance PC3000 DDR4
Video Card(s) PowerColor RED DRAGON Radeon RX Vega 56
Storage Samsung 850 Evo 1TB, Crucial MX300 500GB
Display(s) Dell S2719DGF 1440p
Case Phanteks Enthoo Pro Series PH-ES614P
Audio Device(s) Onboard
Power Supply SeaSonic M12II 620 Bronze
Mouse Logitech G9X
Keyboard Dell
Software Windows 10 Pro
This is one reason most games run better on Nvidia hardware from the beginning and tend to have better minimum fps: it takes ATI at least a few driver updates, sometimes a year, to squeeze the full performance out of a game (just like Cat 8.6 and COD4). I have an HD 4870 right now, but Nvidia seems to have ten times better developer relations than ATI; ATI always relies on its driver team. AMD needs to step up its game. That is why Nvidia ends up getting more customers: developer relations.

And where are you getting your facts from?
Here is a list of AMD/ATI development tools for Radeon products:

  • AMD Tootle (Triangle Order Optimization Tool)
  • ATI Compress
  • CubeMapGen
  • GPU Mesh Mapper
  • GPU PerfStudio
  • GPU ShaderAnalyzer
  • Normal Mapper
  • OpenGL ES 2.0 Emulator
  • RenderMonkey™
  • The Compressonator
  • AMD Stream™
  • HLSL2GLSL

More info on each can be found here: http://developer.amd.com/gpu/Pages/default.aspx
 

panchoman

Sold my stars!
Joined
Jul 16, 2007
Messages
9,595 (1.57/day)
Processor Amd Athlon X2 4600+ Windsor(90nm) EE(65W) @2.9-3.0 @1.45
Motherboard Biostar Tforce [Nvidia] 550
Cooling Thermaltake Blue Orb-- bunch of other fans here and there....
Memory 2 gigs (2x1gb) of patriot ddr2 800 @ 4-4-4-12-2t
Video Card(s) Sapphire X1950pro Pci-E x16 @stock@stock on stock
Storage Seagate 7200.11 250gb Drive, WD raptors (30/40) in Raid 0
Display(s) ANCIENT 15" sony lcd, bought it when it was like 500 bucks
Case Apevia X-plorer blue/black
Audio Device(s) Onboard- Why get an sound card when you can hum??
Power Supply Antec NeoHe 550-manufactured by seasonic -replacement to the discontinued smart power series
Software Windows XP pro SP2 -- vista is still crap
Wonder what PerfHUD will tell Crytek :p
 

PCpraiser100

And where are you getting your facts from?
Here is a list of AMD/ATI development tools for Radeon products:

  • AMD Tootle (Triangle Order Optimization Tool)
  • ATI Compress
  • CubeMapGen
  • GPU Mesh Mapper
  • GPU PerfStudio
  • GPU ShaderAnalyzer
  • Normal Mapper
  • OpenGL ES 2.0 Emulator
  • RenderMonkey™
  • The Compressonator
  • AMD Stream™
  • HLSL2GLSL

More info on each can be found here: http://developer.amd.com/gpu/Pages/default.aspx

I wonder if Valve is using any of these tools?

I heard Postal 3 is going to be powered by the Source engine, and it has wide environments, so these tools might help keep ATI on the playing field.
 
Joined
Jun 20, 2007
Messages
3,942 (0.64/day)
System Name Widow
Processor Ryzen 7600x
Motherboard AsRock B650 HDVM.2
Cooling CPU : Corsair Hydro XC7 }{ GPU: EK FC 1080 via Magicool 360 III PRO > Photon 170 (D5)
Memory 32GB Gskill Flare X5
Video Card(s) GTX 1080 TI
Storage Samsung 9series NVM 2TB and Rust
Display(s) Predator X34P/Tempest X270OC @ 120hz / LG W3000h
Case Fractal Define S [Antec Skeleton hanging in hall of fame]
Audio Device(s) Asus Xonar Xense with AKG K612 cans on Monacor SA-100
Power Supply Seasonic X-850
Mouse Razer Naga 2014
Software Windows 11 Pro
Benchmark Scores FFXIV ARR Benchmark 12,883 on i7 2600k 15,098 on AM5 7600x
In some cases you're right, phanbuey, but there are some ways Nvidia cuts corners. My friend used to own a 6600GT and decided to play BF2 on full settings (BTW, BF2 is an Nvidia game), and when we played it at full settings a big freeze suddenly occurred, and then we saw that all the textures had dropped to lower quality. Huge bust for Nvidia. And did you know that Crysis could be "optimized" on the 8800GT just by renaming a certain config file? Another bust. Then I played HL2: Episode One and CS: Source on my friend's borrowed computer, which had a 7900GTX, and in those games the HDR bloom was on steroids. But when I looked closely at the lit-up props, the reflections and textures were not that pixelated compared to the other props around me (full settings at 1600x1200, by the way). Another bust for Nvidia, as when I moved away from the HDR areas my framerates dropped by 30 fps.
Now, onto DX11: I am kinda glad they are developing it, but from a performance point of view it worries me that, because Nvidia is the closer partner, they will be watching the API's development closely. And since DX11 is one or two years away, Nvidia has got itself a bit of a head start in getting its GTX-whatever-the-heck series ready for DX11 games.


Seriously, hush and take it elsewhere.

I don't see the point of moving on to DX11 until they sort out DX10. Over half of the 'issues' we face with DX10 are due to a lack of development exposure. If DX11 is just going to be a rehash of the core architecture introduced with DX10, then OK, but if it storms in with new pipes, that's just going to screw things up.
 
Joined
Apr 21, 2008
Messages
5,250 (0.90/day)
Location
IRAQ-Baghdad
System Name MASTER
Processor Core i7 3930k run at 4.4ghz
Motherboard Asus Rampage IV extreme
Cooling Corsair H100i
Memory 4x4G kingston hyperx beast 2400mhz
Video Card(s) 2X EVGA GTX680
Storage 2X Crusial M4 256g raid0, 1TbWD g, 2x500 WD B
Display(s) Samsung 27' 1080P LED 3D monitior 2ms
Case CoolerMaster Chosmos II
Audio Device(s) Creative sound blaster X-FI Titanum champion,Creative speakers 7.1 T7900
Power Supply Corsair 1200i, Logitch G500 Mouse, headset Corsair vengeance 1500
Software Win7 64bit Ultimate
Benchmark Scores 3d mark 2011: testing
Already, most games support Nvidia.
 
Joined
Feb 26, 2007
Messages
850 (0.14/day)
Location
USA
This is one reason most games run better on Nvidia hardware from the beginning and tend to have better minimum fps: it takes ATI at least a few driver updates, sometimes a year, to squeeze the full performance out of a game (just like Cat 8.6 and COD4). I have an HD 4870 right now, but Nvidia seems to have ten times better developer relations than ATI; ATI always relies on its driver team. AMD needs to step up its game. That is why Nvidia ends up getting more customers: developer relations.
Very True!
Ya really, luckily the HL2 series is one of my favorite games, as the Source Engine has scalable performance optimized for both sides of the generation :toast:. Both companies have their unique strengths and weaknesses, which sets them apart :slap:, so why can't they just compete fairly on neutral game engines that don't give a $h** about either side, so we all know who is the best on the block? I know for a fact that ATI will bust this whole graphics propaganda bull$h**, especially given the position they are in. So by the looks of it, because Microsoft french-kisses Nvidia, they will replace the DirectX that ATI is really getting good at and start with DirectX 11 announcements. Nvidia, you are a snitch SOB living a lie because you can't handle the truth: you just bribe your way out of situations.
LoL, you complain about Nvidia for working with MS on DX11, but ATI was the one working with MS on DX10.1. Difference being?
If Nvidia is smart enough to help developers out then they deserve to have companies willing to work with them . . .
In some cases you're right, phanbuey, but there are some ways Nvidia cuts corners. My friend used to own a 6600GT and decided to play BF2 on full settings (BTW, BF2 is an Nvidia game), and when we played it at full settings a big freeze suddenly occurred, and then we saw that all the textures had dropped to lower quality. Huge bust for Nvidia. And did you know that Crysis could be "optimized" on the 8800GT just by renaming a certain config file? Another bust. Then I played HL2: Episode One and CS: Source on my friend's borrowed computer, which had a 7900GTX, and in those games the HDR bloom was on steroids. But when I looked closely at the lit-up props, the reflections and textures were not that pixelated compared to the other props around me (full settings at 1600x1200, by the way). Another bust for Nvidia, as when I moved away from the HDR areas my framerates dropped by 30 fps.
Now, onto DX11: I am kinda glad they are developing it, but from a performance point of view it worries me that, because Nvidia is the closer partner, they will be watching the API's development closely. And since DX11 is one or two years away, Nvidia has got itself a bit of a head start in getting its GTX-whatever-the-heck series ready for DX11 games.
Perhaps ATI should start working with MS on this as well then?
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
25,846 (3.79/day)
Location
Alabama
System Name Rocinante
Processor I9 14900KS
Motherboard EVGA z690 Dark KINGPIN (modded BIOS)
Cooling EK-AIO Elite 360 D-RGB
Memory 64GB Gskill Trident Z5 DDR5 6000 @6400
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 500GB 980 Pro | 1x 1TB 980 Pro | 1x 8TB Corsair MP400
Display(s) Odyssey OLED G9 G95SC
Case Lian Li o11 Evo Dynamic White
Audio Device(s) Moondrop S8's on Schiit Hel 2e
Power Supply Bequiet! Power Pro 12 1500w
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Akko Crystal Blues
VR HMD Quest 3
Software Windows 11
Benchmark Scores I dont have time for that.
I can't wait to play with this stuff!!!
 
Joined
Jun 16, 2008
Messages
3,175 (0.55/day)
Location
Brockport, NY
System Name Is rly gud
Processor Intel Core i5 11600kf
Motherboard Asus Prime Z590-V ATX
Memory (48GB total) 16GB (2x8GB) Crucial Ballistix Sport 3000MHZ and G. Skill Ripjaws 32GB 3200MHZ (2x16GB)
Video Card(s) GIGABYTE RTX 3060 12GB
Storage 1TB MSI Spatium M370 NVMe M.2 SSD
Display(s) 32" Viewsonic 4k, 34" Samsung 3440x1440, XP Pen Creative Pro 13.3
Power Supply EVGA 600 80+ Gold
VR HMD Meta Quest Pro, Tundra Trackers
Software Windows 10
I bet that as Nvidia finally moves on to DX10.1, RV800 will be announced as DX11. XD!!
I'm holding out for the 5000 series. If the 5950 doesn't outperform the 4870X2 (which sounds hard to beat), then I'll get a cheap 4870X2 next year.
 
Joined
Dec 28, 2006
Messages
4,378 (0.69/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional
Don't you remember, the 5900 did beat the 4800 :p
 