
NVIDIA Unveils GeForce 361.75 WHQL Game Ready Drivers

Joined
Oct 25, 2005
Messages
193 (0.03/day)
Location
Long Island, NY
Processor 9700K
Motherboard Asrock Z390 Phantom Gaming-ITX/ac
Cooling Alpenfohn Black Ridge
Memory 32GB Micron VLP 18ADF2G72AZ-3G2E1
Video Card(s) 3090 FE
Display(s) Samsung G9 NEO
Case Formd T1
Power Supply Corsair SF750
Excellent job NVIDIA. NOT. First it failed three times through GeForce Experience, then it entirely fucked up all the drivers so that Windows 10 installed its shitty driver from July 2015. Then I had to manually remove the new old driver and install this pile of crap again, just to be greeted with this bullshit. NVIDIA, are you fucking kidding me?
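(If anyone else ends up doing the same manual cleanup: a rough sketch, assuming Windows 10, Python, and an elevated prompt, of listing the driver packages staged in the driver store so the leftover NVIDIA one is easy to spot before removing it. pnputil ships with Windows; the NVIDIA filtering below is just for illustration.)

Code:
import subprocess

# Enumerate third-party driver packages in the Windows driver store.
# Newer Windows 10 builds accept "/enum-drivers"; older ones use "pnputil -e".
out = subprocess.run(["pnputil", "/enum-drivers"],
                     capture_output=True, text=True, check=True).stdout

# Print only the entries mentioning NVIDIA, so the stale display driver
# package (and its oemXX.inf published name) is easy to find.
for block in out.split("\n\n"):
    if "NVIDIA" in block:
        print(block)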

View attachment 71469

"Eject GTX 980". WHY!? WHY NVIDIA!? I don't want to have an option to eject god damn ONLY and DEDICATED graphic card in my system. Not only I'll have to watch this god damn icon down there the entire time, I'll by mistake eject ONLY graphic card when I'll want to eject god damn USB drive and lose image. This is just lazy and stupid.

There's been no need to eject USB drives since Windows 7 (maybe it was XP SP2, I forget). Delayed writes are disabled on removable drives by default.
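(Side note for the curious: the reason ejecting stopped mattering is that Windows defaults removable drives to a "quick removal" policy with write caching off, and any program that really cares about its data can force its own writes through anyway. A minimal sketch in Python, with a made-up path:)

Code:
import os

def write_and_flush(path, data):
    # Write data and push it through the OS write cache to the device.
    with open(path, "wb") as f:
        f.write(data)
        f.flush()             # drain Python's userspace buffer
        os.fsync(f.fileno())  # ask the OS to commit to the physical disk

# Hypothetical removable-drive path, for illustration only.
write_and_flush(r"E:\backup.bin", b"important data")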
 
Joined
Jul 19, 2006
Messages
43,587 (6.72/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD m.2
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Headphone Amp.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Tester84
Software Windows 11
Okay... who's the first one :nutkick:

Reporting in while I am at work.
Everything installed normally with no issues. The few games I play are working as they should.
 
Joined
Dec 18, 2011
Messages
42 (0.01/day)
System Name TBD
Processor Core i7-4790K @Stock
Motherboard Asrock Z97-Pro4
Cooling Hyper 212 Evo
Memory G Skill Ares 3 8 GB (2x4GB)
Video Card(s) MSi RX480 8 GB Gaming X
Storage Samsung 850 250 Gb SSD, WB Blue 2TB, WD Blue 1TB
Display(s) Benq E2440 1080p
Case NZXT S340
Audio Device(s) Onboard
Power Supply Antec 500W
Software Windows 7 SP1 64 Bit
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
There's been no need to eject USB drives since Windows 7 (maybe it was XP SP2, I forget). Delayed writes are disabled on removable drives by default.

You're missing the point entirely. And yes, I do prefer to safely remove some devices, like a portable HDD, because it spins down before I unplug it; it doesn't like being unplugged while it's spinning. Not to mention the USB icon was an indication that I had a USB device plugged in. Now this shit is there the entire time, so I either have to look at the USB ports to see if anything is in them or click the damn thing to see the list. Stupid.
 
Joined
Jan 13, 2011
Messages
219 (0.05/day)
View attachment 71469
"Eject GTX 980". WHY!? WHY NVIDIA!? I don't want to have an option to eject god damn ONLY and DEDICATED graphic card in my system. Not only I'll have to watch this god damn icon down there the entire time, I'll by mistake eject ONLY graphic card when I'll want to eject god damn USB drive and lose image. This is just lazy and stupid.
This is part of the new BETA support for external GPUs; however, this message will be removed in the next driver.
https://forums.geforce.com/default/...eedback-thread-1-27-16-/post/4791013/#4791013
 
Joined
Jan 27, 2015
Messages
1,065 (0.32/day)
System Name loon v4.0
Processor i7-11700K
Motherboard asus Z590TUF+wifi
Cooling Custom Loop
Memory ballistix 3600 cl16
Video Card(s) eVga 3060 xc
Storage WD sn570 1tb(nvme) SanDisk ultra 2tb(sata)
Display(s) cheap 1080&4K 60hz
Case Roswell Stryker
Power Supply eVGA supernova 750 G6
Mouse eats cheese
Keyboard warrior!
Benchmark Scores https://www.3dmark.com/spy/21765182 https://www.3dmark.com/pr/1114767
Never had an idle problem, I'm not getting either game, and I'm back on W7, so is there any reason to grab this?

Like a performance increase in MGSV:TPP... oh wait, that's fine already.
 
Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Never had an idle problem, I'm not getting either game, and I'm back on W7, so is there any reason to grab this?

Like a performance increase in MGSV:TPP... oh wait, that's fine already.

You appear to have answered your own question.

Pesky nVidia and their superb day one support! *shakes fist*
 
Joined
Feb 23, 2008
Messages
1,064 (0.18/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)

Stick with Win7 and a Titan X

PCWorld - Rise of the Tomb Raider (PC) review impressions: Gorgeous game, ugly stutter
PCWorld said:
UPDATE, 2:00 PM Pacific: I’ve installed Nvidia’s Game Ready Drivers and it helped a bit, but didn’t completely eliminate the stuttering. The Geothermal Valley continues to display precipitous drops in frame rate, falling from around 100 down to 55. Tweaking the Level of Detail down a notch gained me back five frames (bringing it to a slightly-stuttery 60), but be aware that even a high-powered rig might show quite a bit of slowdown in these larger areas regardless of how the rest of the game runs.

PCGamer - Rise of the Tomb Raider review
PCGamer said:
While my 970 GTX couldn't keep up with the demands of running every option at maximum—dropping to a stutter during cutscenes and set pieces—a few sensible reductions had it running smoothly and consistently at 60 frames per second.
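(To put the PCWorld numbers in perspective: frame time is the reciprocal of frame rate, so a dip from ~100 fps to ~55 fps nearly doubles how long each frame sits on screen, which is why it reads as stutter. A quick sketch in Python:)

Code:
def frame_time_ms(fps):
    # Frame time in milliseconds is the reciprocal of frames per second.
    return 1000.0 / fps

# The drops PCWorld describes: around 100 fps falling to around 55 fps.
for fps in (100, 60, 55):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# 100 fps is 10.0 ms/frame; 55 fps is ~18.2 ms/frame. That ~8 ms swing
# between consecutive frames is what feels like stutter, even though
# 55 fps is perfectly playable on its own.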
 
Last edited:
Joined
Apr 19, 2012
Messages
12,062 (2.75/day)
Location
Gypsyland, UK
System Name HP Omen 17
Processor i7 7700HQ
Memory 16GB 2400Mhz DDR4
Video Card(s) GTX 1060
Storage Samsung SM961 256GB + HGST 1TB
Display(s) 1080p IPS G-SYNC 75Hz
Audio Device(s) Bang & Olufsen
Power Supply 230W
Mouse Roccat Kone XTD+
Software Win 10 Pro
Joined
Nov 18, 2010
Messages
7,125 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
It runs beautifully on AMD GPUs. Doesn't have CrossFire support, but a 290X is bang on for max settings at 1080p.

Is it just me, or does the hair look like a sponge?

And AMD cards have snow in the hair and NVIDIA cards don't, is that true?
 
Joined
Apr 19, 2012
Messages
12,062 (2.75/day)
Location
Gypsyland, UK
System Name HP Omen 17
Processor i7 7700HQ
Memory 16GB 2400Mhz DDR4
Video Card(s) GTX 1060
Storage Samsung SM961 256GB + HGST 1TB
Display(s) 1080p IPS G-SYNC 75Hz
Audio Device(s) Bang & Olufsen
Power Supply 230W
Mouse Roccat Kone XTD+
Software Win 10 Pro
Is it just me, or does the hair look like a sponge?

And AMD cards have snow in the hair and NVIDIA cards don't, is that true?

Hair looks great for me; I have the hair effects on, but not set to NVIDIA processing. As for snow I have no idea, I have snow :D
 
Joined
Dec 7, 2014
Messages
175 (0.05/day)
Location
Tokyo, Japan
System Name Workstation
Processor Intel Core-i9 9960x
Motherboard Asus WS X299 SAGE
Cooling Custom
Memory Crucial Ballistix 128 GB (8 x BL2K16G36C16U4B)
Video Card(s) EVGA GTX 1080 Ti FTW3
Storage Intel Optane 905P (SSDPED1D015TAX1), Sabrent Rocket 2 TB, 4 x HGST DC HC530
Display(s) NEC PA272W-BK-SV, Eizo CG2700S-BK
Case Custom
Audio Device(s) Asus Essence STX II, Sony WH-1000XM
Power Supply Seasonic PRIME Platinum 1200 W
Mouse Logicool G502 Proteus Core
Keyboard Logicool G810
Software Windows 10 Pro 64-bit, several Adobe software
I have a GTX 980 and I can report I have no problems.
I only installed the driver and PhysX, without 3D, audio, and Experience.
I do not have Rise of the Tomb Raider or The Division, but Witcher 3 and Fallout 4 run well.

But I have not purchased Rise of the Tomb Raider yet. Should I buy it now or wait until a later date? I am interested in playing the game but I have doubts. @RCoon? @rtwjunkie? Can anybody give me some advice?
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
But I have not purchased Rise of the Tomb Raider yet. Should I buy it now or wait until a later date?
I think @RCoon's review of it goes up today in a few hours, or on Friday. You could wait till then.
 
Joined
Apr 19, 2012
Messages
12,062 (2.75/day)
Location
Gypsyland, UK
System Name HP Omen 17
Processor i7 7700HQ
Memory 16GB 2400Mhz DDR4
Video Card(s) GTX 1060
Storage Samsung SM961 256GB + HGST 1TB
Display(s) 1080p IPS G-SYNC 75Hz
Audio Device(s) Bang & Olufsen
Power Supply 230W
Mouse Roccat Kone XTD+
Software Win 10 Pro
I have a GTX 980 and I can report I have no problems.
I only installed the driver and PhysX, without 3D, audio, and Experience.
I do not have Rise of the Tomb Raider or The Division, but Witcher 3 and Fallout 4 run well.

But I have not purchased Rise of the Tomb Raider yet. Should I buy it now or wait until a later date? I am interested in playing the game but I have doubts. @RCoon? @rtwjunkie? Can anybody give me some advice?

I think @RCoon's review of it goes up today in a few hours, or on Friday. You could wait till then.

Review goes up tomorrow at 9 AM PST for both Bombshell and Tomb Raider. That said, if you can't be arsed to wait for that, Tomb Raider gets a definitive thumbs up from me. It's a bit heavy on QTEs and scripted events, but even so, it's much better than the previous title, and even that was relatively good.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Tomb Raider gets a definitive thumbs up from me

Good to hear! Even if it has more QTEs than the last one, that puts it more in line with the previous three (LAU), which is fine with me.
 
Joined
Apr 19, 2012
Messages
12,062 (2.75/day)
Location
Gypsyland, UK
System Name HP Omen 17
Processor i7 7700HQ
Memory 16GB 2400Mhz DDR4
Video Card(s) GTX 1060
Storage Samsung SM961 256GB + HGST 1TB
Display(s) 1080p IPS G-SYNC 75Hz
Audio Device(s) Bang & Olufsen
Power Supply 230W
Mouse Roccat Kone XTD+
Software Win 10 Pro
Good to hear! Even if it has more QTEs than the last one, that puts it more in line with the previous three (LAU), which is fine with me.

The main story is like 10 hours' worth of content. Within that there are around 5-10 QTEs, so they're fairly spread out. I completed the game and it says I've got just under 40% of the whole content remaining.
 
Joined
Jan 29, 2016
Messages
128 (0.04/day)
System Name Ryzen 5800X-PC / RyzenITX (2nd system 5800X stock)
Processor AMD Ryzen 7 5800X (atx) / 5800X itx (soon one pc getting 5800X3D upgrade! ;)
Motherboard Gigabyte X570 AORUS MASTER (ATX) / X570 I Aorus Pro WiFi (ITX)
Cooling AMD Wrath Prism Cooler / Alphenhone Blackridge (ITX)
Memory OLOY 4000Mhz 16GB x 2 (32GB) DDR4 4000 Mhz CL18, (22,22,22,42) 1.40v AT & ITX PC's (2000 Fclk)
Video Card(s) AMD Radeon RX 6800 XT (ATX) /// AMD Radeon RX 6700 XT 12GB GDDR6 (ITX)
Storage (Sys)Sammy 970EVO 500GB & SabrentRocket 4.0+ 2TB (ATX) | SabrentRocket4.0+ 1TB NVMe (ITX)
Display(s) 30" Ultra-Wide 21:9 200Hz/AMD FREESYNC 200hz/144hz LED LCD Montior Connected Via Display Port (x2)
Case Lian Li Lancool II Mesh (ATX) / Velkase Velka 7 (ITX)
Audio Device(s) Realtek HD ALC1220 codec / Onboard HD Audio* (BOTH) w/ EQ settings
Power Supply 850w (Antec High-Current Gamer) HC-850 PSU (80+ gold certified) ATX) /650Watt Thermaltake SFX (ITX)
Mouse Logitech USB Wireless KB & MOUSE (Both Systems)
Keyboard Logitech USB Wireless KB & MOUSE (Both Systems)
VR HMD Oculus Quest 2 - 128GB - Standalone + Oculus link PC
Software Windows 10 Home x64bit 2400 /BOTH SYSTEMS
Benchmark Scores CPUZ - ATX-5800X (ST:670) - (MT: 6836.3 ) CPUZ - ITX -5800X (ST:680.2) - (MT: 7015.2) ??? same CPU?
NVIDIA (I call them nShitia; *sarcastic* I have an nShitia Crapforce GTCrapX 970si) LOL

F*cking NVIDIA. I call them nShitia, and I've hated NVIDIA ever since they bought out 3Dfx Interactive Inc. I loved 3Dfx graphics accelerators. I've seen people argue over NVIDIA, ATI/AMD, SiS, S3 chips (like the S3 ViRGE DX 4MB accelerator or the S3 Savage 3D), Matrox, PowerVR, etc.

From Wikipedia:

In the PC world, notable failed first tries for low-cost 3D graphics chips were the S3 ViRGE, ATI Rage, and Matrox Mystique. These chips were essentially previous-generation 2D accelerators with 3D features bolted on. Many were even pin-compatible with the earlier-generation chips for ease of implementation and minimal cost. Initially, performance 3D graphics were possible only with discrete boards dedicated to accelerating 3D functions (and lacking 2D GUI acceleration entirely) such as the PowerVR and the 3Dfx Voodoo. However, as manufacturing technology continued to progress, video, 2D GUI acceleration and 3D functionality were all integrated into one chip. Rendition's Verite chipsets were among the first to do this well enough to be worthy of note. In 1997, Rendition went a step further by collaborating with Hercules and Fujitsu on a "Thriller Conspiracy" project which combined a Fujitsu FXG-1 Pinolite geometry processor with a Vérité V2200 core to create a graphics card with a full T&L engine years before Nvidia's GeForce 256. This card, designed to reduce the load placed upon the system's CPU, never made it to market.
OpenGL appeared in the early '90s as a professional graphics API, but originally suffered from performance issues which allowed the Glide API to step in and become a dominant force on the PC in the late '90s.[30] However, these issues were quickly overcome and the Glide API fell by the wayside. Software implementations of OpenGL were common during this time, although the influence of OpenGL eventually led to widespread hardware support. Over time, a parity emerged between features offered in hardware and those offered in OpenGL. Direct X became popular among Windows game developers during the late 90s. Unlike OpenGL, Microsoft insisted on providing strict one-to-one support of hardware. The approach made Direct X less popular as a standalone graphics API initially, since many GPUs provided their own specific features, which existing OpenGL applications were already able to benefit from, leaving Direct X often one generation behind. (See: Comparison of OpenGL and Direct3D.)
Over time, Microsoft began to work more closely with hardware developers, and started to target the releases of Direct X to coincide with those of the supporting graphics hardware. Direct3D 5.0 was the first version of the burgeoning API to gain widespread adoption in the gaming market, and it competed directly with many more-hardware-specific, often proprietary graphics libraries, while OpenGL maintained a strong following. Direct3D 7.0 introduced support for hardware-accelerated transform and lighting (T&L) for Direct3D, while OpenGL had this capability already exposed from its inception. 3D accelerator cards moved beyond being just simple rasterizers to add another significant hardware stage to the 3D rendering pipeline. The Nvidia GeForce 256 (also known as NV10) was the first consumer-level card released on the market with hardware-accelerated T&L, while professional 3D cards already had this capability. Hardware transform and lighting, both already existing features of OpenGL, came to consumer-level hardware in the '90s and set the precedent for later pixel shader and vertex shader units which were far more flexible and programmable.

I'm even building an old AMD K7 system: an Athlon Thunderbird (T-Bird) 750 MHz Slot A CPU with a 3Dfx Voodoo 5 5500 64MB AGP 2X and two 3Dfx Voodoo2 2000 PCI cards in SLI.

If I'm wrong about parts of this, I'm at least right that the dual-GPU / dual-card setup came from 3Dfx and ATI Technologies. ATI's Rage Fury MAXX was the first ATI graphics card with two GPU chips on a single board, while 3Dfx Interactive Inc. was the first 3D accelerator company to design a dual-card system for increasing 3D performance: SLI, which stands for Scan-Line Interleave, connected two 3Dfx Voodoo2s (either the 8MB or 12MB accelerators) to boost 3D performance (toy sketch below). 3Dfx eventually produced the Voodoo 5 5500 64MB (32MB of SGRAM or DDR per VSA-100 GPU), first on the PCI interface and ultimately as an AGP version as well; I own the AGP version. 3Dfx Interactive's last 3D accelerator was, or would have been, the Voodoo 5 6000 AGP with four VSA-100 GPU chips on a single card, but it needed an external power supply and I believe 3Dfx was having a lot of problems with it. It would have had 32MB of SGRAM or first-gen DDR VRAM per GPU, 128MB in total for all four (the Voodoo 5 5500 had 64MB for its two VSA-100s), and would probably have been the first 3D accelerator with 128MB of VRAM, even though the four GPUs ran in SLI. I'd still love to have a Voodoo 5 6000 just for the hell of it, whether the card worked or not. Point is, NVIDIA didn't invent SLI, and multi-GPU wasn't invented by NVIDIA or ATI/AMD either.
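(To make the name concrete: a toy sketch of the idea behind Scan-Line Interleave, where each card rasterizes alternating lines of the frame and the output combines them. All names below are invented for illustration; a real Voodoo2 pair did this in hardware.)

Code:
WIDTH, HEIGHT = 8, 6

def render_line(gpu_id, y):
    # Stand-in for a real rasterizer: each GPU "fills" its scanline with
    # its own id so the interleaving is visible in the printed frame.
    return [gpu_id] * WIDTH

def compose_frame():
    frame = []
    for y in range(HEIGHT):
        gpu = y % 2  # even scanlines -> GPU 0, odd scanlines -> GPU 1
        frame.append(render_line(gpu, y))
    return frame

for row in compose_frame():
    print("".join(str(px) for px in row))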

Yeah, they invented their own versions: ATI/AMD called their form of it CrossFire / CrossFire X, and nShitia's (NVIDIA's) SLI is just called SLI, which they got when they bought out 3Dfx Interactive. That's why I despised NVIDIA in the first place, back around 1999-2002. All NVIDIA kept was the SLI tech; they completely discarded 3Dfx's Glide API (Glide2x/Glide3x), which had its problems, but I still loved the quality and performance I got from Glide and the 3Dfx Voodoo cards. My first 3Dfx card was a Creative Labs Voodoo Banshee, the first 3Dfx part with both 3D and 2D hardware acceleration on a single GPU die. The Banshee was pretty much a Voodoo2 (the Voodoo2's two pixel pipeline chips and its single texel unit chip, or vice versa; I may have that backwards) built into a single 3Dfx chip with full 2D hardware acceleration, clocked higher than the original Voodoo2. I can't remember if the Banshee ran at 100 MHz or 166 MHz, but I think it was 100 MHz, while the Voodoo2's pixel/texel pipeline chips ran at 90 MHz, so the Banshee was an excellently improved Voodoo2 without the need for a separate 2D accelerator and a pass-through VGA cable.


But my point is, I FUCKING despise NVIDIA, and I'll always call them nShitia. Any other NVIDIA haters like my codename for them?

And I'm not an AMD/ATI fanboy; AMD products are just the only hardware I can afford. They're way cheaper than Intel or nShitia hardware and give performance that's good enough for the games I play. If I need more performance I do a slight overclock on my FX-8150 CPU and my Radeon R9 290 GPU. Right now my FX-8150 is clocked at 4.0 GHz with no Turbo Core and all four modules enabled for eight cores (two cores per module), which performs well. AMD just needs to finally outperform Intel CPUs with its Zen cores; I can't wait to try an AMD Zen FX processor. Can't we buy engineering samples of Zen, damnit... OK, and peace out.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
NVIDIA (I call them nShitia; *sarcastic* I have an nShitia Crapforce GTCrapX 970si) LOL

F*cking NVIDIA. I call them nShitia, and I've hated NVIDIA ever since they bought out 3Dfx Interactive Inc. [...]

Wow, that's awesome how you joined up and appear to have jumped in just to write an entire rambling book about how much you hate Nvidia. Good job. :rolleyes:

Normally I'd welcome someone to TPU with their first post, but a hate-filled sewage-piece that's hard as hell to read kind of just took that welcoming incentive away.
 
Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
I think Nvidia must have stolen his first born child or something. :laugh:
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.65/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
NVIDIA (I call them nShitia; *sarcastic* I have an nShitia Crapforce GTCrapX 970si) LOL

F*cking NVIDIA. I call them nShitia, and I've hated NVIDIA ever since they bought out 3Dfx Interactive Inc. [...]

Nvidia and AMD both do crappy things sometimes, but I would like to point out that Nvidia makes a nice profit and its shareholders profit from that, whereas AMD goes into the red by hundreds of millions of dollars and owes more than it is worth. Its share price has dropped from about $40 to about $2 in the last 10 years, and it could go bankrupt in 2019 if it can't come up with 600 million dollars to repay just one of its big debts.

Also, one more thing.....

 
Joined
Aug 20, 2007
Messages
20,787 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
Adam Krazispeed, that was almost artful in how bad it was. Should I... thank it?
 
Joined
Feb 8, 2012
Messages
3,013 (0.68/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Baracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
Adam Krazispeed, that was almost artful in how bad it was. Should I... thank it?
You just did, without actually doing it... Your charisma has increased; rest to level up.
 
Joined
Sep 29, 2013
Messages
97 (0.03/day)
Processor Intel i7 4960x Ivy-Bridge E @ 4.6 Ghz @ 1.42V
Motherboard x79 AsRock Extreme 11.0
Cooling EK Supremacy Copper Waterblock
Memory 65.5 GBs Corsair Platinum Kit @ 666.7Mhz
Video Card(s) PCIe 3.0 x16 -- Asus GTX Titan Maxwell
Storage Samsung 840 500GBs + OCZ Vertex 4 500GBs 2x 1TB Samsung 850
Audio Device(s) Soundblaster ZXR
Power Supply Corsair 1000W
Mouse Razer Naga
Keyboard Corsair K95
Software Zbrush, 3Dmax, Maya, Softimage, Vue, Sony Vegas Pro, Acid, Soundforge, Adobe Aftereffects, Photoshop
In other news, it seems like the new driver screwed up V-Ray RT for me in some manner. Any rendering with light cache GI just produces blotches of black on the 3D models. I thought it was an issue with the diffuse maps, but there's nothing wrong with them. In addition, it works perfectly fine in ActiveShade mode. Gee NVIDIA, why did you have to break my GPU-rendering features? You know, one of those things that made you supposedly better than AMDerps. Now I have to go back to rendering on the CPU. Smooth move, bro-dumbo! The prior driver was working just fine till you had to f*** it up...

I don't really play many PC games, so I don't have much to gripe about there. The driver gave me zero issues during installation...
 