
NVIDIA GeForce 390.65 Driver with Spectre Fix Benchmarked in 21 Games

Joined
Sep 22, 2017
Messages
889 (0.37/day)
The Meltdown and Spectre vulnerabilities have been making many headlines lately. So far, security researchers have identified three variants. Variant 1 (CVE-2017-5753) and Variant 2 (CVE-2017-5715) are Spectre, while Variant 3 (CVE-2017-5754) is Meltdown. According to their security bulletin, NVIDIA has no reason to believe that their display driver is affected by Variant 3. In order to strengthen security against Variants 1 and 2, the company released their GeForce 390.65 driver earlier today, so NVIDIA graphics card owners can sleep better at night.
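To make the distinction more concrete, here is a minimal, hypothetical C sketch of a Variant 1 (bounds-check bypass) gadget, modeled on the pattern described in the public Spectre write-up; the arrays and constants are purely illustrative and are not taken from NVIDIA's driver or any real code:

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical Variant 1 (CVE-2017-5753) gadget, for illustration only. */
    uint8_t array1[16];
    size_t  array1_size = 16;
    uint8_t array2[256 * 4096];   /* probe array: one cache line per possible byte value */

    void victim_function(size_t x)
    {
        if (x < array1_size) {
            /* The branch above can be predicted "taken" even for an out-of-bounds x.
             * During speculation, array1[x] is read and used to index array2, leaving
             * a cache footprint that an attacker can later measure (e.g. Flush+Reload)
             * to recover the out-of-bounds byte. */
            volatile uint8_t tmp = array2[array1[x] * 4096];
            (void)tmp;
        }
    }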

Experience tells us that some software patches come with performance hits, whether we like it or not. We were more than eager to find out if this was the case with NVIDIA's latest GeForce 390.65 driver. Therefore, we took to the task of benchmarking this revision against the previous GeForce 388.71 driver in 21 different games at the 1080p, 1440p, and 4K resolutions. We even threw in an Ethereum mining test for good measure. Our test system is powered by an Intel Core i7-8700K processor overclocked to 4.8 GHz, paired with G.Skill Trident-Z 3866 MHz 16 GB memory on an ASUS Maximus X Hero motherboard. We're running the latest BIOS, which includes fixes for Spectre, and Windows 10 64-bit with Fall Creators Update, fully updated, which includes the KB4056891 Meltdown Fix.


We grouped all 21 games, each at three resolutions, into a single chart. Each entry on the X axis represents a single test, showing the percentage difference between the old and the new driver. Negative values indicate a performance decrease with today's driver; positive values indicate a performance gain.
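For reference, each per-test value in the chart is a simple relative delta; the short C sketch below shows the calculation and the averaging, using made-up FPS numbers rather than our actual results:

    #include <stdio.h>

    /* Percentage difference between old and new driver for one test.
     * Negative = new driver is slower, positive = new driver is faster. */
    static double percent_delta(double fps_old, double fps_new)
    {
        return (fps_new - fps_old) / fps_old * 100.0;
    }

    int main(void)
    {
        /* Hypothetical example numbers for three tests. */
        double old_fps[] = { 61.2, 143.8, 95.0 };
        double new_fps[] = { 61.4, 144.1, 94.8 };
        double sum = 0.0;
        int n = 3;

        for (int i = 0; i < n; i++) {
            double d = percent_delta(old_fps[i], new_fps[i]);
            sum += d;
            printf("test %d: %+.2f%%\n", i + 1, d);
        }
        printf("average: %+.2f%%\n", sum / n);
        return 0;
    }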

Cryptominers can rest assured that the new GeForce 390.65 driver won't affect their profits negatively. Our testing shows zero impact in Ethereum mining. With regard to gaming, there is no significant difference in performance either. The new driver actually gains a little bit of performance on average over the previous version (+0.32%). The results hint at some undocumented small performance gains in Wolfenstein 2 and F1 2017; the other games are nearly unchanged. Even if we exclude those two titles, the performance difference is still +0.1%. The variations you see in the chart above are due to random run-to-run effects and the limited precision of taking measurements in Windows. For the kind of testing done in our VGA reviews, we typically expect a 1-2% margin of error between benchmark runs, even when using the same game, at identical settings, on the same hardware.

View at TechPowerUp Main Site
 
Joined
Oct 2, 2004
Messages
13,791 (1.94/day)
I don't get the point of this. The flaws and the fixes for them affect CPU performance, not GPU performance. So, what's the point of testing GPUs? Unless the test compares pre- and post-patch CPUs in combination with pre- and post-patch drivers.
 
Joined
Oct 16, 2013
Messages
41 (0.01/day)
Processor i7 4930k
Motherboard Rampage IV Extreme
Cooling Thermalright HR-02 Macho
Memory 4 X 4096 MB G.Skill DDR3 1866 9-10-9-26
Video Card(s) Gigabyte GV-N780OC-3GD
Storage Crucial M4 128GB, M500 240GB, Samsung HD103SJ 1TB
Display(s) Planar PX2710MW 27" 1920x1080
Case Corsair 500R
Power Supply RAIDMAX RX-1200AE
Software Windows 10 64-bit
I don't get the point of this. The flaws and the fixes for them affect CPU performance, not GPU performance. So, what's the point of testing GPUs? Unless the test compares pre- and post-patch CPUs in combination with pre- and post-patch drivers.
NVIDIA is patching its GPU drivers, not CPUs made by other vendors. NVIDIA does market the GPGPU capability of its products, so technically sensitive information may be stored in GPU cache. I guess NVIDIA just wants to show its commitment to the market; the actual chance of being targeted should be minimal. Who's going to use a GPU to process anything related to their password?
 
Joined
May 4, 2011
Messages
633 (0.13/day)
System Name Smooth-Operator
Processor AMD Ryzen 7 3800x
Motherboard Asrock x570 Taichi
Cooling AMD Wraith Prism
Memory 2x16GB 3200MHz CL16@CL14 DDR4
Video Card(s) Sapphire Radeon RX 580 8GB NITRO+
Storage 2x4TB WD HGST 7K6 7200RPM 256MB
Display(s) Samsung S24E370DL 24" IPS Freesync 75Hz
Case Fractal Design Focus G Window Blue
Audio Device(s) Creative X-Fi Titanium PCIe x1
Power Supply Corsair HX850 80+ Platinum
Mouse Gigabyte Aorus M3
Keyboard Zalman ZM-K300M
Software Windows 10 x64 Enterprise/Ubuntu Budgie amd64
I don't get the point of this. The flaws and the fixes for them affect CPU performance, not GPU performance. So, what's the point of testing GPUs? Unless the test compares pre- and post-patch CPUs in combination with pre- and post-patch drivers.
Exactly my thoughts. How is a GPU driver supposed to fix CPU-related problems, especially CPU architecture flaws? IMO this line in the changelog is PR BS, very much in NVIDIA's style. The closest thing I can recall is MFAA, an "AA" mode which "improves" MSAA, and for some reason MFAA's release happened to coincide with exactly the driver where users started to see more aliasing across many games.
 
Joined
May 2, 2014
Messages
40 (0.01/day)
System Name The Veil of Yamineko (Upcoming Build)
Processor Intel Core "Devil's Canyon" i7-4790k @ 4.0 Ghz (4 CPUs, 8 Threads)
Motherboard ASRock Z97 Extreme4
Cooling 1 x Noctua NH D14 CPU Cooler, 2 x Noctua Case Fans (140mm)
Memory 16 GB DDR3 SDRAM @ 1600 MHz
Video Card(s) EVGA GTX 780 Ti Dual Classified 3GB GDDR5 384-bit
Storage 1 x 2TB SATA Drive @ 7200RPM, 1 x 256 GB SSD
Display(s) AOC 21.5" LED @ 1080p 16:9
Case Corsair 500R (White)
Power Supply Corsair 850W
Software Windows 7 Professional
I don't get the point of this. The flaws and the fixes for them affect CPU performance, not GPU performance. So, what's the point of testing GPUs? Unless the test compares pre- and post-patch CPUs in combination with pre- and post-patch drivers.

They might not really be affected by this, but they're working with other companies to iron out performance issues.

  • NVIDIA has determined that these exploits (or other similar exploits that may arise) do not affect GPU computing, so their hardware is mostly immune. They will be working with other companies to update device drivers to help mitigate any CPU performance issues, and they are evaluating their ARM-based SoCs (Tegra).
Source: https://www.androidcentral.com/meltdown-spectre (about halfway)
 
Joined
Jun 13, 2012
Messages
1,316 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
I don't get the point of this. The flaws and the fixes for them affect CPU performance, not GPU performance. So, what's the point of testing GPUs? Unless the test compares pre- and post-patch CPUs in combination with pre- and post-patch drivers.
If the CPU is busy handling other tasks instead of running the game and driver, it can affect fps a bit. If you ever play a game and stream at the same time, you'll notice that when OBS is encoding and using the CPU, you do lose a bit of fps if the game could otherwise use all of the CPU on its own. There is only so much NVIDIA can do to optimize the driver to limit the amount of CPU impact, though. If the Spectre update can cost 30% (and who knows who made that claim; as far as I know no source was quoted), you could in theory lose 20-30% fps.

Unless something has changed since the story I read where they tested Doom with a GTX 1060 vs. an RX 480: with a high-end CPU both cards were close, with the 480 a hair ahead I think, but as they went down the line to slower CPUs the GTX 1060's performance stayed consistent whereas the RX 480 lost a bit of fps. That was a while ago, and I'm sure AMD has worked on it and improved things, but it does show that a GPU driver, if not optimized, can be hurt a bit by the CPU being slowed down.
 
Last edited:
Joined
Dec 30, 2010
Messages
2,082 (0.43/day)
I don't get the point of this. The flaws and the fixes for them affect CPU performance, not GPU performance. So, what's the point of testing GPUs? Unless the test compares pre- and post-patch CPUs in combination with pre- and post-patch drivers.

I remember a guy once found an exploit on NVIDIA video cards. He was browsing a porn site, and when he closed it, it turned out that the GPU didn't really clear the memory holding what he had been watching.

In other words, everything you do on your computer gets stored in VRAM. It wasn't cleared the moment you closed or minimized the application.

I'm sure this is one of the fixes, among others. If you were banking or doing anything with codes or sensitive information, I'm sure that gets saved in VRAM as well; the patch is written to prevent stealing data from one instance to another in VRAM.
 
Joined
Sep 26, 2012
Messages
856 (0.20/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling Noctua NH-D15S, 7 x Noctua NF-A14 industrialPPC IP67 2000RPM
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 Strix
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Alienware AW3821DW, Wacom Cintiq Pro 15
Case Fractal Design Torrent
Audio Device(s) Topping A90/D90 MQA, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply ASUS THOR 1600T
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + OpenSUSE MicroOS
I don't get the point of this. The flaws and the fixes for them affect CPU performance, not GPU performance. So, what's the point of testing GPUs? Unless the test compares pre- and post-patch CPUs in combination with pre- and post-patch drivers.

Spectre allows for application snooping. Whilst the exploit is hardware-based, an application vendor can help mitigate the ability for Spectre to snoop on it by changing how it protects itself.
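One concrete example of an application "changing how it protects itself" is inserting a speculation barrier after a bounds check, so the CPU cannot speculatively execute the dependent load with a bad index. A minimal sketch, assuming an x86 target and a compiler that provides _mm_lfence(); the table and function names are hypothetical:

    #include <stddef.h>
    #include <stdint.h>
    #include <immintrin.h>   /* _mm_lfence() */

    uint8_t table[256];
    size_t  table_size = 256;

    uint8_t read_checked(size_t index)
    {
        if (index >= table_size)
            return 0;
        /* Serialize execution here so the load below cannot run speculatively with
         * an out-of-bounds index. This is one of the Variant 1 software mitigations;
         * compilers and kernels also use index masking to achieve the same effect. */
        _mm_lfence();
        return table[index];
    }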
 
Joined
Mar 31, 2012
Messages
828 (0.19/day)
Location
NL
System Name SIGSEGV
Processor INTEL i7-7700K | AMD Ryzen 2700X
Motherboard QUANTA | ASUS Crosshair VII Hero
Cooling Air cooling 4 heatpipes | Corsair H115i | Noctua NF-A14 IndustrialPPC Fan 3000RPM
Memory Micron 16 Gb DDR4 2400 | GSkill Ripjaws 32Gb DDR4 3200 3400(OC) 14-14-14-34 @1.38v
Video Card(s) Nvidia 1060 6GB | Gigabyte 1080Ti Aorus
Storage 1TB 7200/256 SSD PCIE | ~ TB | 970 Evo
Display(s) 15,5" / 27"
Case Black & Grey | Phanteks P400S
Audio Device(s) Realtek
Power Supply Li Battery | Seasonic Focus Gold 750W
Mouse g402
Keyboard Leopold|Ducky
Software LinuxMint KDE |UBUNTU | Windows 10 PRO
Benchmark Scores i dont care about scores
I don't get the point of this. The flaws and the fixes for them affect CPU performance, not GPU performance. So, what's the point of testing GPUs? Unless the test compares pre- and post-patch CPUs in combination with pre- and post-patch drivers.

LOL. /jokesonthissiteisbeyondofmyimagination
 
Joined
Feb 14, 2012
Messages
2,304 (0.52/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
Exactly my thoughts. How is a GPU driver supposed to fix CPU-related problems, especially CPU architecture flaws?

The GPU driver runs with privilege, and by recoding key indirect branches it closes a side-channel data leak.
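For anyone wondering what "recoding key indirect branches" means in practice: the usual Variant 2 software mitigation is a retpoline, where the compiler rewrites indirect calls and jumps into a thunk that traps misdirected speculation in a harmless loop. Below is a hedged C sketch of an ordinary indirect call plus, in a comment, the thunk it is roughly rewritten into; the compiler flags are real (GCC's -mindirect-branch=thunk, Clang's -mretpoline), but NVIDIA's actual driver changes are not public:

    #include <stdio.h>

    /* An ordinary indirect call through a function pointer. Under Spectre Variant 2
     * (CVE-2017-5715), the branch predictor can be trained so that this call is
     * speculatively steered to an attacker-chosen gadget. */
    typedef int (*op_fn)(int);

    static int add_one(int x) { return x + 1; }

    int run(op_fn op, int v)
    {
        return op(v);   /* typically compiles to an indirect "call *%rax" on x86-64 */
    }

    int main(void)
    {
        printf("%d\n", run(add_one, 41));
        return 0;
    }

    /* Built with a retpoline-aware compiler (e.g. gcc -O2 -mindirect-branch=thunk
     * or clang -mretpoline), the indirect call is rewritten into roughly this thunk,
     * so misdirected speculation spins in the pause/lfence loop instead of reaching
     * an attacker-controlled target:
     *
     *         call  set_up_target
     *     capture_spec:
     *         pause
     *         lfence
     *         jmp   capture_spec
     *     set_up_target:
     *         mov   %rax, (%rsp)
     *         ret
     */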
 
Joined
Sep 15, 2011
Messages
6,457 (1.41/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
If you look at the graph closely, the biggest performance impact is in those games that are CPU-bound ;)
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.61/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
They are covering their rears is all, not a big deal.
 
Joined
Dec 22, 2011
Messages
3,890 (0.87/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
If you look at the graph closely, the biggest performance impact is in those games that are CPU-bound ;)

I think the CPU duopoly needs a shake-up; both of them are dragging their feet and holding the industry back.

Raja will save the day.
 
Joined
Aug 2, 2011
Messages
1,448 (0.31/day)
Processor Ryzen 9 7950X3D
Motherboard MSI X670E MPG Carbon Wifi
Cooling Custom loop, 2x360mm radiator,Lian Li UNI, EK XRes140,EK Velocity2
Memory 2x16GB G.Skill DDR5-6400 @ 6400MHz C32
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra OC Scanner core +750 mem
Storage MP600 2TB,960 EVO 1TB,XPG SX8200 Pro 1TB,Micron 1100 2TB,OCZ Vector 512GB,1.5TB Caviar Green
Display(s) Acer X34S, Acer XB270HU
Case LianLi O11 Dynamic White
Audio Device(s) Logitech G-Pro X Wireless
Power Supply EVGA P3 1200W
Mouse Logitech G502 Lightspeed
Keyboard Logitech G512 Carbon w/ GX Brown
VR HMD HP Reverb G2 (V2)
Software Win 11
No benchmarks should be taking place until the microcode updates come out from Intel/Motherboard manufacturers. Until then these fixes are minimal.
 
Joined
Apr 12, 2013
Messages
6,728 (1.68/day)
I don't get the point of this. The flaws and the fixes for them affect CPU performance, not GPU performance. So, what's the point of testing GPUs? Unless the test compares pre- and post-patch CPUs in combination with pre- and post-patch drivers.
We don't know how Spectre & Meltdown affect GPUs; I assume that if there's a fix, then there's also an exploitable hole somewhere that must be patched. The Meltdown patch is a software fix, so there might be more ways than one to exploit that vulnerability. Perhaps GPUs are also vulnerable, I dunno? Spectre is the bigger problem; there could be more exploits (aside from Variants 1 & 2) of that vulnerability in the future.
 
Joined
Apr 12, 2013
Messages
6,728 (1.68/day)
Yes, you assume, incorrectly.
https://meltdownattack.com/
Read, Learn. Stop spreading misinformation. GPU's are not affected by these vulnerabilities.
First of all, stop pretending you know everything about Spectre or Meltdown.
The Project Zero team devised an attack by reading Intel's manuals; they also looked at the rowhammer attack and took it as a stepping stone to make this work!
Not only that, there were four independent teams who found it, all within a year of the rowhammer demo in 2016! Stop, read, learn -
TRIPLE MELTDOWN: HOW SO MANY RESEARCHERS FOUND A 20-YEAR-OLD CHIP FLAW AT THE SAME TIME
 
Last edited:
Joined
Oct 30, 2008
Messages
1,901 (0.34/day)
Processor 5930K
Motherboard MSI X99 SLI
Cooling WATER
Memory 16GB DDR4 2132
Video Card(s) EVGAY 2070 SUPER
Storage SEVERAL SSD"S
Display(s) Catleap/Yamakasi 2560X1440
Case D Frame MINI drilled out
Audio Device(s) onboard
Power Supply Corsair TX750
Mouse DEATH ADDER
Keyboard Razer Black Widow Tournament
Software W10HB
Benchmark Scores PhIlLyChEeSeStEaK
There has always been a JavaScript vulnerability; it runs on your GPU... It says Spectre Fix in the thread title.
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
First of all, stop pretending you know everything about Spectre or Meltdown.
I'm not and I don't. However, I've read enough from the people who discovered and researched the problem to know that your conclusions and the misinformation you are stating are incorrect.
The Project Zero team devised an attack by reading Intel's manuals; they also looked at the rowhammer attack and took it as a stepping stone to make this work!
How does that apply to a GPU? IIRC, rowhammer was mitigated several years ago with software updates, firmware revisions and hardware redesigns. Not really a problem itself nor part of these problems.
Stop, read, learn -
Cute..
Twitter? Not something any researcher takes seriously.
TRIPLE MELTDOWN: HOW SO MANY RESEARCHERS FOUND A 20-YEAR-OLD CHIP FLAW AT THE SAME TIME
Interesting, but not relevant to the context of this discussion. Again, GPUs are not vulnerable to Meltdown or Spectre. GPU processors work in a different way than a standard CPU does. That's what makes them so useful in the tasks they are made for.
There has always been a JavaScript vulnerability; it runs on your GPU...
LOL! Now that was funny!
 
Joined
Sep 26, 2012
Messages
856 (0.20/day)
Location
Australia
System Name ATHENA
Processor AMD 7950X
Motherboard ASUS Crosshair X670E Extreme
Cooling Noctua NH-D15S, 7 x Noctua NF-A14 industrialPPC IP67 2000RPM
Memory 2x32GB Trident Z RGB 6000Mhz CL30
Video Card(s) ASUS 4090 Strix
Storage 3 x Kingston Fury 4TB, 4 x Samsung 870 QVO
Display(s) Alienware AW3821DW, Wacom Cintiq Pro 15
Case Fractal Design Torrent
Audio Device(s) Topping A90/D90 MQA, Fluid FPX7 Fader Pro, Beyerdynamic T1 G2, Beyerdynamic MMX300
Power Supply ASUS THOR 1600T
Mouse Xtrfy MZ1 - Zy' Rail, Logitech MX Vertical, Logitech MX Master 3
Keyboard Logitech G915 TKL
VR HMD Oculus Quest 2
Software Windows 11 + OpenSUSE MicroOS
I'm not and I don't. However, I've read enough from the people who discovered and researched the problem to know that your conclusions and the misinformation you are stating are incorrect.

How does that apply to a GPU? IIRC, rowhammer was mitigated several years ago with software updates, firmware revisions and hardware redesigns. Not really a problem itself nor part of these problems.

Cute..

Twitter? Not something any researcher takes seriously.

Interesting, but not relevant to the context of this discussion. Again, GPUs are not vulnerable to Meltdown or Spectre. GPU processors work in a different way than a standard CPU does. That's what makes them so useful in the tasks they are made for.

LOL! Now that was funny!

Sigh, the GPU itself as a hardware layer hasn't been shown to be vulnerable; however, the driver that runs in software space can be affected by Spectre (or Meltdown), as the driver's memory space sits in the kernel, and thus the GPU can ultimately be affected.
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
Sigh, the GPU itself as a hardware layer hasn't been shown to be vulnerable; however, the driver that runs in software space can be affected by Spectre (or Meltdown), as the driver's memory space sits in the kernel, and thus the GPU can ultimately be affected.
That's an interesting and intriguing point! However, because of the way both of these vulnerabilities work, the GPU is not directly vulnerable, and an attack would likely be so difficult to pull off that it would be a waste of an attacker's time and effort to attempt.
 
Joined
Apr 12, 2013
Messages
6,728 (1.68/day)
How does that apply to a GPU? IIRC, rowhammer was mitigated several years ago with software updates, firmware revisions and hardware redesigns. Not really a problem itself nor part of these problems.
My bad, it's a timing side-channel attack on KASLR, demoed in 2016 ~
https://www.blackhat.com/docs/us-16...Layout-Randomization-KASLR-With-Intel-TSX.pdf
Cute..

Twitter? Not something any researcher takes seriously.
So you've got nothing to counter & you choose to shut your eyes & ears? The tweet tells us about possible side-channel attacks that could work on x86-64.
https://lwn.net/Articles/738975/
https://gruss.cc/files/kaiser.pdf

Interesting, but not relevant to the context of this discussion. Again, GPU's are not vulnerable to Meltdown or Spectre. GPU processors work in different way than that of a standard CPU. That's what makes them so useful in the tasks they are made for.
Side channel attacks could affect every piece of hardware out there. Also what did Nvidia patch if there's nothing to patch in there, explain that?

edit - You must've also missed unified memory in CUDA then, starting with CUDA 6 IIRC?
https://devblogs.nvidia.com/parallelforall/unified-memory-in-cuda-6/
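For context on that point, here is a minimal sketch of CUDA unified memory using only the host-side runtime C API (link against cudart); the device kernel that would dereference the same pointer is left as a comment so the snippet stays plain C, and the sizes and values are hypothetical:

    #include <stdio.h>
    #include <cuda_runtime.h>   /* CUDA runtime C API */

    int main(void)
    {
        float *data = NULL;
        size_t n = 1 << 20;

        /* One allocation, visible to both the CPU and the GPU through the same
         * pointer (unified memory, available since CUDA 6). */
        if (cudaMallocManaged((void **)&data, n * sizeof(float),
                              cudaMemAttachGlobal) != cudaSuccess) {
            fprintf(stderr, "cudaMallocManaged failed\n");
            return 1;
        }

        for (size_t i = 0; i < n; i++)   /* CPU writes... */
            data[i] = 1.0f;

        /* ...a __global__ kernel (omitted here) could now be launched on the very
         * same pointer; after cudaDeviceSynchronize() the CPU reads the results
         * back through that same address, with no explicit cudaMemcpy. */
        cudaDeviceSynchronize();
        printf("data[0] = %f\n", data[0]);

        cudaFree(data);
        return 0;
    }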
 
Last edited:
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
My bad, its a timing side channel attack on KASLR, demoed in 2016 ~
https://www.blackhat.com/docs/us-16...Layout-Randomization-KASLR-With-Intel-TSX.pdf
Ok, now that's a bit different. KASLR is a very specific and complex attack. It's not easily carried out to begin with. Not sure how it will relate to Meltdown and Spectre, but the complexity would likely become exponential.
EDIT, I was thinking about something else when I saw that name. It seems KASLR and Kaiser are one and the same. However, this still doesn't change the fact that GPUs are not directly affected by MLTDWN&SPCTR.
So you've got nothing to counter & you choose to shut your eyes & ears.
Twitter is a convoluted mess most of the time and I will not waste my time with it.
Didn't really read up on Kaiser too much, as it seemed easily fixed and somewhat limited to the Linux sector. But I did gloss over that PDF and am not seeing the connection between it, Meltdown, Spectre, and GPUs.
Side channel attacks could affect every piece of hardware out there.
Perhaps, but they are notoriously difficult to pull off. Most attackers either won't or can't successfully harvest usable data from such an attack. The best most could hope for is to crash the target system.
Also what did Nvidia patch if there's nothing to patch in there, explain that?
IIRC, Nvidia's latest patch release had nothing to do with MLTDN&SPCTR. Do you have a link? Google is giving me nothing..
 
Last edited:
Joined
Apr 12, 2013
Messages
6,728 (1.68/day)
Ok, now that's a bit different. KASLR is a very specific and complex attack. It's not easily carried out to begin with. Not sure how it will relate to Meltdown and Spectre, but the complexity would likely become exponential.
That KASLR demo was what led to Project Zero's discovery; it isn't as complex if there's a hardware (design) flaw. This is a developing situation, so I can't say if we'll see more Spectre or Meltdown variants. My assumption is that the GPU could be exposed in more ways than one; like Camm said ~ drivers, for instance, are vulnerable.

The GPU uarch might not have the same Spectre or Meltdown vulnerability, but we don't know if there are similar design flaws which could in theory affect them, discoverable simply by studying the architecture in detail.
Starting From Zero

How did Horn independently stumble on the notion of attacking speculative execution in Intel's chips? As he tells it, by reading the manual.

In late April of last year, the 22-year-old hacker—whose job at Project Zero was his first out of college—was working in Zurich, Switzerland, alongside a coworker, to write a piece of processor-intensive software, one whose behavior they knew would be very sensitive to the performance of Intel's chips. So Horn dived into Intel's documentation to understand how much of the program Intel's processors could run out-of-order to speed it up.

He soon saw that for one spot in the code he was working on, the speculative execution quirks Intel used to supercharge its chip speed could lead to what Horn describes as a "secret" value being accidentally accessed, and then stored in the processor's cache. "In other words, [it would] make it possible for an attacker to figure out the secret," Horn writes in an email to WIRED. "I then realized that this could—at least in theory—affect more than just the code snippet we were working on, and decided to look into it."
This is what is important: a design flaw enabled four different teams to find the same vulnerabilities. So in essence it's just a matter of looking at the uarch long enough & hard enough. I'm not passing any judgement, but I'm not ruling it out either.
IIRC, Nvidia's latest patch release had nothing to do with MLTDN&SPCTR. Do you have a link? Google is giving me nothing..
From TPU ~
Security Update
Fixed CVE-2017-5753: Computer systems with microprocessors utilizing speculative execution and branch prediction may allow unauthorized disclosure of information to an attacker with local user access via a side-channel analysis.
 
Last edited: