News Posts matching "NVIDIA"


Pay $160 for the AREZ Sticker: The Mess GPP Landed AIC Partners and Consumers in

The same exact graphics cards, made by the same exact manufacturer, in the same exact factory, with the only difference being the "AREZ Strix" branding, priced a whopping USD $160 apart - that's the kind of mess NVIDIA's GPP (GeForce Partner Program) left in its wake. Newegg lists the ASUS ROG Strix Radeon RX Vega 64 (STRIX-RXVEGA64-O8G-GAMING) graphics card at USD $589.99. This card was made before ASUS decided to re-brand its AMD Radeon graphics cards under the AREZ Strix brand, a change necessitated by NVIDIA GPP. The post-rebrand AREZ Strix Radeon RX Vega 64 (AREZ RXVEGA64-O8G-GAMING) is priced at $749.99 on the same site - a whopping $160 premium for what is basically a sticker. Just to make sure this isn't a discrepancy between the various sellers on Newegg's marketplace, we also post screenshots confirming that both listings are "sold and shipped by Newegg" (and not a marketplace partner).

We noticed this anomaly on Newegg last week (the week of 9th July), and initially dismissed it as a listing error that the retailer would resolve in a couple of days. The week passed, and the listings didn't change. NVIDIA triggered a strong backlash with the language of its GeForce Partner Program (GPP), which implicitly forced its AIC (add-in card) partners to keep their well-established gaming hardware brands (e.g., ROG, Aorus, MSI Gaming) exclusive to GeForce GTX graphics cards, compelling them to re-brand their AMD Radeon products (and stripping those products of well-established brands, thereby putting AMD at a disadvantage). NVIDIA eventually cancelled GPP, but not before the likes of ASUS and MSI committed changes to their product stacks. AREZ is the Frankenstein's monster that was too late to abort, and which now threatens to rip off uninformed consumers.

An Anthem for SLI: BioWare's New Universe in 60 FPS 4K Run on Two NVIDIA GeForce GTX 1080 Ti GPUs

Lo and behold: SLI working properly. That was my first reaction while reading up on this potential news piece (which, somewhat breaking the fourth wall, actually did end up as one). I'm likely not alone in that thought; it's been a while since we last heard of any relevant dual graphics card configuration delivering a performance improvement, as developers seem to be throwing dreams of any "Explicit Multi-GPU" tech out of the window. This slight deviation from the news story aside, though: Anthem needed two of the world's fastest GPUs running in tandem to deliver a 4K, 60 FPS experience.

Let's Go Driverless: Daimler, Bosch Select NVIDIA DRIVE for Robotaxi Fleets

(Editor's Note: NVIDIA continues to spread its wings in the AI and automotive markets, where it has rapidly become the de facto player. While the company's gaming products have certainly been the ones to project its image - and profits - and allowed it to become one of the world's leading tech companies, it's hard to argue against AI and datacenter accelerators having become one of the chief profit drivers for the company. The company's vision for Level 4 and Level 5 autonomous driving and the future of our connected cities is an inspiring one, straight out of yesterday's science fiction. Here's hoping the human mind, laws, and city design efforts accompany these huge technological leaps - or at least don't strangle them too much.)

Press a button on your smartphone and go. Daimler, Bosch and NVIDIA have joined forces to bring fully automated and driverless vehicles to city streets, and the effects will be felt far beyond the way we drive. While the world's billion cars travel 10 trillion miles per year, most of the time these vehicles are sitting idle, taking up valuable real estate while parked. And when driven, they are often stuck on congested roadways. Mobility services will solve these issues plaguing urban areas, capture underutilized capacity and revolutionize the way we travel.

AMD Beats NVIDIA's Performance in the Battlefield V Closed Alpha

A report via PCGamesN points to some... interesting performance positioning when it comes to NVIDIA and AMD offerings. Battlefield V is being developed by DICE in collaboration with NVIDIA, but it seems there's some sand in the gears of performance improvements as of now. I say this because according to the report, AMD's RX 580 8 GB graphics card (the only red GPU to be tested) bests NVIDIA's GTX 1060 6 GB... by quite a considerable margin at that.

The performance difference across both 1080p and 1440p scenarios (with Ultra settings) hovers around the 30% mark, and as has usually been the case, AMD's offerings better NVIDIA's when a change of renderer - to DX12 - is made: AMD's cards teeter between consistent and slightly worse performance under DX12, but NVIDIA's GTX 1060 consistently delivers worse performance. Perhaps we're witnessing some remnants of AMD's old collaboration efforts with DICE? Still, it's too early to cry foul right now - performance will likely only improve between now and the October 19th release date.

Lenovo to Update Legion Y530 with GeForce GTX 1160

Lenovo isn't mincing words about NVIDIA's upcoming GeForce 11-series graphics processors being part of its future GPU options for desktops and notebooks. LaptopMedia reports that the company is planning to make the mid-range GeForce GTX 1160 an option for its Legion Y530 15-inch gaming notebook. The Y530 features a 15.6-inch IPS display with Full HD (1920 x 1080 pixels) resolution, with a 144 Hz option for this panel. Back to the GTX 1160: LaptopMedia seems to confirm that the GPU will feature 6 GB of dedicated memory. If NVIDIA is doubling memory with this generation, the 6 GB model could be the successor to the GTX 1060 3 GB, with a better-endowed 12 GB GTX 1160 likely to succeed the GTX 1060 6 GB. It could be an action-packed 2H-2018 for PC graphics.

NVIDIA Releases GeForce 398.46 Hotfix Drivers

NVIDIA today released the GeForce 398.46 Hotfix drivers. These kinds of driver releases are expeditiously rolled out to address glaring bugs, usually ones affecting gameplay in major titles. The 398.46 Hotfix drivers primarily address the issue of random black textures in "Wolfenstein II: The New Colossus" from publisher Bethesda Softworks. This game uses the Vulkan API and takes advantage of mega-textures. As a reminder, the driver is only available for 64-bit versions of Windows 10. Grab the driver from the link below.
DOWNLOAD: NVIDIA GeForce 398.46 Hotfix Driver

NVIDIA "GT104" Based GeForce GTX 1180 Surfaces on Vietnamese Stores

A Vietnamese online store has put up the first listing of a GeForce GTX 1180 based ASUS ROG Strix graphics card. The store even put out some specifications of the card, beginning with it being based on the "GT104" silicon from the "Turing" series. With "Turing," NVIDIA appears to be forking its GPU architectures into chips that feature DPFP (double-precision floating point) units and Tensor cores, and those that lack both (featuring only SPFP cores). "Turing" is probably a fork of "Volta" that lacks both DPFP CUDA cores and Tensor cores, and sticks to the cheaper GDDR6 memory architecture, while "Volta" based GPUs, such as the TITAN V, implement pricier HBM2 memory.

Among the specifications of the GeForce GTX 1180 are 3,584 CUDA cores, and 16 GB of GDDR6 memory across a 256-bit wide memory interface. The memory is clocked at 14 Gbps (GDDR6-effective), which works out to 448 GB/s of memory bandwidth. Pre-launch prices, just like most pre-launch specifications, tend to be bovine excrement - in this case a little over USD $1,500 - and aren't really relevant. What is interesting, however, is the availability date of September 28.
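For reference, the bandwidth figure follows directly from the listed data rate and bus width. Here is a minimal sketch of that arithmetic in Python, illustrative only and using the rumored numbers above:

```python
# Theoretical GDDR6 bandwidth = per-pin data rate (Gb/s) x bus width (bits) / 8 bits-per-byte
def memory_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Return theoretical peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

# Rumored GTX 1180 configuration: 14 Gb/s GDDR6 on a 256-bit bus
print(memory_bandwidth_gb_s(14, 256))  # 448.0
```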

Due to Reduced Demand, Graphics Card Prices to Decline 20% in July - NVIDIA Postponing Next Gen Launch?

DigiTimes, citing "sources from the upstream supply chain," is reporting an expected decrease in graphics card pricing for July. This move comes as a way for suppliers to reduce the inventory previously piled up in expectation of continued demand from cryptocurrency miners and gamers in general. It's the economic system at work, with its strengths and weaknesses: now that demand has waned, the somewhat speculative price increases of yore are being axed by suppliers to spur demand. This also acts as a countermeasure to an eventual flow of graphics cards from ceasing-to-be miners into the second-hand market, which would place further downward pressure on retailers' new products.

Alongside this expected 20% retail price drop for graphics cards, revenue estimates for major semiconductor manufacturer TSMC and its partners are being revised below previously-projected values, as demand for graphics and ASIC chips is further reduced. DigiTimes' sources say that the worldwide graphics card market now has an inventory of several million units that is proving hard to move (perhaps because the products are already ancient by the usual hardware tech timeframes), and that NVIDIA has around a million GPUs still pending distribution. Almost as an afterthought, DigiTimes also adds that NVIDIA has decided to postpone the launch of its next-gen products (both 12 nm and, by extension, 7 nm) until supply returns to safe levels.

Lenovo Blurts Out GeForce "11-series"

A Lenovo spokesperson inadvertently disclosed that NVIDIA's next GeForce GTX consumer lineup will follow the "11-series" numbering sequence (e.g., GTX 1180), laying to rest rumors that it could follow the "20-series" (e.g., GTX 2080) naming convention. Speaking with Brainbean at the company's E3 booth (in a video published last week), the spokesperson was shown describing the company's Legion Cube gaming desktops, which ship with a GeForce GTX 1060 graphics card in their base variant. The spokesperson is then heard saying that down the road, the company would expand its graphics options to include the GeForce "11-series."

This should mean that the company will follow a predictable launch cycle: introducing its next-generation graphics architecture with the GeForce GTX 1180 and possibly GTX 1170; following those up with the smaller GTX 1160 and GTX 1150; and, in the months that follow, releasing the big GTX 1180 Ti. The spokesperson hints at the likelihood of Lenovo adding these as options to its Legion desktops by "Fall 2018," which could mean a DIY channel launch by late Summer or early Fall.
The video follows.

On The Coming Chiplet Revolution and AMD's MCM Promise

With Moore's Law pronounced to be in its death throes, traditional monolithic die designs are becoming increasingly expensive to manufacture. It's no secret that both AMD and NVIDIA have been exploring an MCM (Multi-Chip-Module) approach as a way of moving away from monolithic dies towards much more manageable "chiplet" designs. AMD has already achieved this in different ways with its Zen line of CPUs (two CPU modules of four cores each, linked via the company's Infinity Fabric interconnect), and with its R9 Fury and Vega graphics cards, which take another approach by packaging memory and the graphics processing die on the same silicon base - an interposer.

NVIDIA Releases GeForce 398.36 WHQL Drivers

NVIDIA today released GeForce 398.36 WHQL software. These drivers come game-ready for Ubisoft's upcoming "The Crew 2." They also bring new and updated SLI profiles for titles such as Dark Souls Remastered, Hand of Fate 2, Need for Speed Payback, and Super Mega Baseball 2, as well as a 3D Vision profile for "Output Zero" (rated Good). Fixed issues include Pascal graphics cards crashing spontaneously in Gears of War 4, G-SYNC remaining active after exiting a game, games crashing when launched in Surround mode, and several others.

DOWNLOAD: NVIDIA GeForce 398.36 WHQL

Shuttle Introduces DH02U 1.3-liter Mini-PC with GeForce GTX 1050

Shuttle's product family of 1.3-liter PCs has a powerful new addition. As the first model in this format, the DH02U no longer relies solely on the graphics performance of its soldered-on processor. For the first time, the CPU is complemented by an NVIDIA GeForce GTX 1050 graphics chip with 4 GB of memory. This means the DH02U is fast enough for fluid 3D visualization and can drive, via HDMI 2.0b, four high-resolution 4K monitors at 60 Hz.
  • Suitable for ambient temperatures of up to 50 °C
  • Support for up to four 4K monitors
  • Available in two versions: with Intel Celeron or Intel Core i5 processor

Revised NVIDIA Reviewers NDA Raises Eyebrows: Our Thoughts

An "attack on journalism" exclaims German tech publication Heise.de, on NVIDIA's latest non-disclosure agreement (NDA), a document tech journalists and reviewers have to sign in order to receive graphics card samples and information from NVIDIA. The language of this NDA, released verbatim to the web by Heise, provides a glimpse of what terms reviewers agree to, in order to write launch-day reviews of new products. NDAs are sort of like the EULA you agree to before installing software. There are NDAs for even little things like new thermal pastes, and reviewers end up signing dozens of them each year. Over time, it becomes second nature for reviewers to not publish before a date prescribed by the manufacturer, NDA or not.

The spirit of an NDA is: "we are giving you information/a sample in good faith, don't post your review before date/time/timezone." Such an NDA casts no aspersions on the credibility of the review since it doesn't dictate how the review should be, or what it should say. It doesn't say "don't post your review before we approve what you wrote." NVIDIA samples usually ship with a PDF titled "reviewer's guide," which only politely suggests to reviewers something along the lines of "here's our cool new graphics card that's capable of playing this game at that resolution with these settings, just don't test it on something like Linux with Nouveau drivers, because that either won't work or won't show what our card is truly capable of." Heise's close inspection of the latest NDA by NVIDIA suggests to them that NVIDIA is mandating positive reviews now. We disagree.

NVIDIA GV102 Prototype Board With GDDR6 Spotted, Up to 525 W Power Delivery. GTX 1180 Ti?

Reddit user 'dustinbrooks' has posted a photo of a prototype graphics card design that is clearly made by NVIDIA and "tested by a buddy of his that works for a company that tests NVIDIA boards". Dustin asked the community what he was looking at, which of course got tech enthusiasts interested.

The card is clearly made by NVIDIA, as indicated by the markings near the PCI-Express x16 slot connector. Also visible are three PCI-Express 8-pin power inputs and a huge VRM setup with four fans. Unfortunately the GPU in the center of the board is missing, but it should be GV102, the successor to GP102, since GDDR6 support is needed. The twelve GDDR6 memory chips located around the GPU's footprint are marked D9WCW, which decodes to MT61K256M32JE-14:A. These are Micron-made 8 Gbit GDDR6 chips, specified for a 14 Gb/s data rate and operating at 1.35 V. With twelve chips, this board has a 384-bit memory bus and 12 GB of VRAM. The memory bandwidth at the 14 Gbps data rate is a staggering 672 GB/s, which conclusively beats the 484 GB/s that Vega 64 and the GTX 1080 Ti offer.
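Treating the listed figures as a worked example, the bus width, capacity, bandwidth, and even the headline power figure all follow from simple arithmetic; here is a quick illustrative sketch using only the numbers quoted above:

```python
# Board-level memory specs derived from twelve D9WCW (8 Gbit, x32, 14 Gb/s) GDDR6 chips
chips = 12
density_gbit = 8        # capacity per chip
io_width_bits = 32      # interface width per chip
data_rate_gbps = 14     # per-pin data rate

bus_width_bits = chips * io_width_bits                   # 384-bit bus
capacity_gb = chips * density_gbit / 8                   # 12 GB of VRAM
bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8     # 672 GB/s

# Power delivery: three 8-pin connectors (150 W each) plus the PCIe slot (75 W)
max_power_w = 3 * 150 + 75                               # 525 W

print(bus_width_bits, capacity_gb, bandwidth_gb_s, max_power_w)
```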

TSMC is Ramping Up 7nm Production, 5nm Next Year

At their technology symposium in Taipei, TSMC CEO C.C. Wei made remarks dismissing speculation that the company's 7 nanometer yield rate was not as good as expected. Rather, the company is ramping up production capacity quickly, from 10.5 million wafers in 2017 to 12 million wafers in 2018 - an increase of roughly 14 percent. TSMC plans more than 50 chip design tape-outs on 7 nm in 2018, with the majority of them for AI, GPU and crypto applications, followed by 5G and application processors.

Most of their orders for the 7 nanometer node come from big players like AMD, Bitmain, NVIDIA and Qualcomm. Apple's A12 processor for upcoming iPhones is also a major driver for TSMC's 7 nanometer growth. These orders will be fulfilled in early 2019, so it'll be a bit longer before we have 7 nm processors for the masses.

Next-gen 5 nanometer risk production will kick off next year, followed by mass production in late 2019 or early 2020. The company will invest as much as USD 25 billion in new production facilities for this process node.

With Summit, US Regains Leadership from China in TOP500 Supercomputers Listing

We previously covered in more depth the fact that the US was gearing up to overtake China's Sunway TaihuLight, then the world's fastest supercomputer, with its Summit machine, built in collaboration between IBM (with its water-cooled Power Systems AC922 nodes with 24-core processors and 96 processing threads) and NVIDIA (GV100 GPUs).

Now, this US dream has finally come to pass, and in a big way - Summit delivers more than double the performance of China's poster child, coming in at 200 PetaFLOPS of computing power. Summit boasts 27,648 Volta Tensor Core GPUs and 9,216 CPUs within its 5,600 square feet. The supercomputer consumes 15 MW of power (the site where it's deployed can deliver up to 20 MW), which is on par with China's Sunway - but remember, it more than doubles the peak PetaFLOPS, from 93 to 200. A good step in the battle for supercomputer supremacy, but China still holds a growing lead in the number of systems it has registered with the TOP500.
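To put those figures in rough perspective, dividing the quoted throughput by the GPU count and by the power draw gives a back-of-the-envelope sense of per-accelerator performance and efficiency. A quick sketch using only the numbers quoted above (and assuming the ~15 MW figure applies to both machines):

```python
# Back-of-the-envelope math from the figures quoted in the article
summit_pflops = 200
taihulight_pflops = 93
summit_gpus = 27_648
power_mw = 15  # roughly quoted for both systems

tflops_per_gpu = summit_pflops * 1000 / summit_gpus                   # ~7.2 TFLOPS per Volta GPU
summit_gflops_per_w = summit_pflops * 1e6 / (power_mw * 1e6)          # ~13.3 GFLOPS per watt
taihulight_gflops_per_w = taihulight_pflops * 1e6 / (power_mw * 1e6)  # ~6.2 GFLOPS per watt

print(tflops_per_gpu, summit_gflops_per_w, taihulight_gflops_per_w)
```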

NVIDIA G-Sync HDR Module Adds $500 to Monitor Pricing

PCPer had the opportunity to disassemble the ASUS ROG Swift PG27UQ, a 27-inch 4K 144 Hz G-Sync HDR monitor, and found that the G-Sync module is a newer version than the one used on first-generation G-Sync monitors (which, of course, do not support 4K / 144 Hz / HDR). The module is powered by an FPGA made by Altera (Intel-owned since 2015). The exact model is the Arria 10 GX 480, a high-performance 20-nanometer FPGA that provides enough bandwidth and LVDS pins to process the data stream.

The FPGA sells in low quantities for $2,000 at Digi-Key and Mouser. Assuming that NVIDIA buys thousands, PCPer suggests that the price of this chip alone will add $500 to monitor cost. The BOM cost is further increased by 3 GB of DDR4 memory on the module. With added licensing fees for G-SYNC, this explains why these monitors are so expensive.
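PCPer's numbers are estimates, but the rough bill-of-materials reasoning is easy to lay out. Here is a hedged sketch; the FPGA figure is PCPer's volume estimate from above, while the DRAM, board, and licensing values are purely illustrative assumptions, not confirmed prices:

```python
# Rough per-monitor cost adder for the G-Sync HDR module.
# Only the FPGA figure comes from the article (PCPer's volume estimate);
# the remaining values are illustrative assumptions, not confirmed prices.
fpga_volume_price = 500.0   # PCPer's estimated volume price for the Arria 10 GX 480
ddr4_3gb_cost     = 25.0    # assumed: 3 GB of DDR4 on the module
board_and_misc    = 20.0    # assumed: PCB, passives, assembly
gsync_license     = 50.0    # assumed: per-unit G-SYNC licensing fee

module_adder = fpga_volume_price + ddr4_3gb_cost + board_and_misc + gsync_license
print(f"Estimated per-monitor cost adder: ~${module_adder:.0f}")
```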

NVIDIA's Next Gen GPU Launch Held Back to Drain Excess, Costly Built-up Inventory?

We've previously touched upon whether or not NVIDIA should launch its 1100 or 2000 series of graphics cards ahead of any new product from AMD. At the time, I wrote that I only saw benefits to that approach: earlier time to market -> satisfaction of upgrade itches and entrenchment as the only latest-gen manufacturer -> higher prices in the absence of competition -> ability to respond by lowering prices after amassing a war-chest of profits. However, reports of a costly NVIDIA mistake in overestimating demand for its Pascal GPUs lend some other shades to the whole equation.

Write-offs in inventory are costly (just ask Microsoft), and apparently, NVIDIA has found itself having miscalculated: overestimating gamers' and miners' demand for its graphics cards. When it comes to gamers, NVIDIA's Pascal graphics cards have been available in the market for two years now - it's relatively safe to say that the majority of gamers who needed higher-performance graphics cards have already taken the plunge. As for miners, the cryptocurrency market contraction (and other factors) has led to a taper-off of graphics card demand for this particular workload. The result? NVIDIA's demand overestimation has led, according to Seeking Alpha, to a "top three" Taiwan OEM returning 300,000 GPUs to NVIDIA, and to "aggressively" increased GDDR5 buying orders from the company, suggesting an excess stock of GPUs that need to be made into boards.

NVIDIA CEO Jensen Huang Gives Away 20 "CEO Edition TITAN V" To Titans of the AI Industry

During the Computer Vision and Pattern Recognition conference in Salt Lake City on Wednesday, NVIDIA CEO Jensen Huang pulled a PR stunt eased by NVIDIA's entrenched position in the AI-acceleration market. The new TITAN V graphics cards being given away under the "CEO Edition" tag feature more than double the amount of HBM2 memory - 32 GB, compared to the original TITAN V's 12 GB.

This is, essentially, a Titan-branded Quadro GV100 accelerator, made all the more exotic by the limited-edition branding. It features the GV100 graphics processor - a large chip with a die area of 815 mm² and 21.1 billion transistors - with 5,120 shading units, 320 texture mapping units and 128 ROPs. Also included are 640 tensor cores, which help accelerate machine learning applications.

NVIDIA's Next-Gen Graphics Cards to Launch in Q3 2018, Breadcrumb Trail Indicates

We the media and you the enthusiasts always get a jolt every time a high-profile launch is announced - or even hinted at. And few product launches are as enthusing as those of new, refined graphics card architectures - the possibilities for extra performance, bang-for-buck improvements, mid-tier performance that belonged in last generation's halo products - it's all a mix of merriment and expectation, even if it sometimes tastes a little sour.

Adding to the previous breadcrumbs neatly laid out regarding NVIDIA's Hot Chips presentation on a new "Next Generation mainstream GPU", the source for yet another piece of bread that would make Gretel proud is Power Logic, a fan supplier for numerous AIB partners (company representative holding an EVGA graphics card below), which has recently said it expects "Q3 orders to be through the roof". Such an increase in demand usually means AIB partners are stocking up on materials to build sufficient stock for new product launches, and it does fall in line with the NVIDIA Hot Chips presentation in August. Q3 starts in July, though, and while the supply-chain timings are unknown, it seems somewhat tight for a July product launch that coincides with the increased fan orders.

NVIDIA Joins S&P 100 Stock Market Index

With tomorrow's opening bell, NVIDIA will join the Standard & Poor's S&P 100 index, replacing Time Warner. The spot NVIDIA is taking was freed up by the merger of Time Warner with AT&T. This marks a monumental moment for the company, as membership in the S&P 100 is reserved for only the largest and most important corporations in the US. From the tech sector, the list comprises illustrious names such as Apple, Amazon, Facebook, Alphabet (Google), IBM, Intel, Microsoft, Netflix, Oracle, PayPal, Qualcomm and Texas Instruments.

NVIDIA's stock has seen massive gains over the last few years, thanks to the company delivering record quarter after record quarter. Recent developments have transformed it from a mostly gaming-GPU manufacturer into a company leading in the fields of GPU compute, AI and machine learning. This of course inspires investors, so NVIDIA stock has been highly sought after, now sitting above 265 USD, which puts the company's worth at over 160 billion USD. Congratulations!

Gigabyte Announces Availability of Two New, Smaller NVIDIA GTX 1050 3 GB Graphics Cards

Gigabyte has announced two new SKUs that join its previous interpretation of NVIDIA's GeForce GTX 1050 3 GB graphics card. Adding (or maybe subtracting, on account of it being smaller) to the lineup is the new GTX 1050 3 GB D5, which features a shorter PCB at 172 mm x 113 mm x 30 mm - 19 mm shorter than the original 3 GB OC model (191 mm long, 111 mm wide, and 36 mm in height). The reduced footprint means there's now a single 90 mm fan (instead of two 80 mm fans) cooling the GPU - but this is such a lean GPU that that should pose no problem. Additionally, the clocks are slightly lower on this card, adhering to NVIDIA's reference 1392 MHz base and 1518 MHz boost. Connectivity-wise, there's 1x DVI-D, 1x HDMI 2.0b and 1x DisplayPort 1.4 (up to three simultaneous displays are supported).

The other, more interesting model is the GTX 1050 OC Low Profile 3G. As the name implies, this is a low-profile graphics card, best suited for space-starved enclosures. Coming in at 167 mm x 68.9 mm x 37 mm, it's as small as this type of card has ever been in Gigabyte's lineup. The low-profile version should deliver higher performance than the D5, since it clocks in at 1404 MHz base (1544 MHz boost) in the default Game Mode, while OC Mode brings the clocks to 1430 MHz and 1569 MHz respectively. Likely owing to expected HTPC usage for these graphics cards, Gigabyte has added an extra HDMI port to this card, bringing connectivity to 1x DVI-D, 2x HDMI 2.0b, and 1x DisplayPort 1.4.

The Cyberpunk 2077 E3 Demo Ran on a Modern, Yet Achievably-Specced PC

(Update: It has come to light that the Cyberpunk 2077 E3 demo actually ran at 1080p, not 4K, as previously speculated in this story.)

Cyberpunk 2077 is likely one of the most highly anticipated videogames in recent times, due in no small part - well, due specifically - to CD Projekt Red's pedigree as a developer. To say that any "projekt" the Polish team chose to tackle would be met with silly levels of expectation is likely correct - few developers have matched their stratospheric level of improvement, time and again, with every new game release.

While the E3 demo shown during Microsoft's press conference was met with extreme enthusiasm, there was some level of fear as well, due to the developers' choice to tackle the Cyberpunk universe from a first-person perspective instead of the third-person one they've perfected over the years. But after all is said and done, a demo is a demo, and the gaming press has been much more vocal about the closed-doors gameplay experience they were offered.

Lenovo Reveals ThinkPad P52 with Xeon Hexa-Core CPU and 128 GB of RAM

Lenovo recently announced its latest ThinkPad P52 mobile workstation, designed for 3D rendering, content creation, and AI simulations. The laptop can be equipped with an 8th-generation Intel Core or Xeon hexa-core processor. What's amazing about the Lenovo ThinkPad P52 is its ability to house up to 128 GB of DDR4 memory and 6 TB of storage. The P52 flaunts a 15.6-inch 4K display with 10-bit color depth, 100% Adobe RGB color gamut coverage, and 400 nits of brightness. The laptop's graphics duties are delegated to a high-end mobile NVIDIA Quadro P3200 GPU with 6 GB of GDDR5 memory.

The ThinkPad P52 comes with two Thunderbolt 3 ports alongside three USB-A 3.1 Gen 1 ports, an HDMI 2.0 port, and a mini DisplayPort 1.4. It also comes with a Smart Card slot and an integrated 4-in-1 card reader. Connectivity includes a conventional RJ45 Gigabit Ethernet port, an Intel Wireless-AC 9560 802.11ac adapter with integrated Bluetooth 5.0, and a Fibocom 4G LTE (Cat 9) modem. The ThinkPad P52 isn't only fast; it's probably also one of the most secure mobile workstations to date. Lenovo implemented various security measures into the P52, such as TPM 2.0, an IR camera, a fingerprint reader, and ThinkShutter. The manufacturer didn't disclose pricing for the ThinkPad P52; however, we'll find out soon, as the laptop should be available by late June.

NVIDIA Has a DisplayPort Problem Which Only a BIOS Update Can Fix

NVIDIA "Maxwell" and "Pascal" graphics architectures introduced support for modern display connectivity to keep up with the breakneck pace at which display resolutions are scaling up. The two introduce support for DisplayPort 1.4 and 1.3, however the implementation is less than perfect. Some of the newer monitors that leverage DisplayPort 1.4 or 1.3 standards don't function as designed on "Maxwell" (GeForce GTX 900 series) and "Pascal" (GeForce 10-series) graphics cards, with users reporting a range of bugs from blank screens until the operating system loads, to frozen boot sequences.

Unfortunately, these issues cannot be fixed by driver updates, and require graphics card BIOS updates. Luckily, you won't be at the mercy of lethargic AIC partners looking to limit their warranty claims by going slow on BIOS updates, or NVFlash rocket-science. NVIDIA released a tool which can detect if your graphics card needs the update, and then updates the BIOS for you, from within Windows. The app first unloads your driver, and flashes your graphics card BIOS (a process which must not be interrupted, lest you end up with an expensive brick).

Update: We have confirmation that the tool is intended for both reference-design and custom-design graphics cards.
DOWNLOAD: NVIDIA Graphics Firmware Update Tool for DisplayPort 1.3 and 1.4 Displays