News Posts matching "NVIDIA"


NVIDIA Settles Class-Action Lawsuit Over GTX 970 Memory

NVIDIA has settled a 2015 class-action lawsuit accusing it of misrepresenting the amount of memory on GeForce GTX 970 graphics cards. The company has agreed to pay buyers of the card USD $30 per card, and to cover the legal fees of the class, amounting to $1.3 million. The company, however, did not specify how much money it has set aside for the payout, nor whether it will compensate only those buyers who constitute the class (i.e. buyers in the U.S., since that's as far as the court's jurisdiction reaches) or the thousands of GTX 970 buyers worldwide.

"The settlement is fair and reasonable and falls within the range of possible approval," attorneys for the proposed Class said in the filing. "It is the product of extended arms-length negotiations between experienced attorneys familiar with the legal and factual issues of this case and all settlement class members are treated fairly under the terms of the settlement." The class alleged that NVIDIA falsified the amount of memory a GeForce GTX 970 GPU can really use, when an investigation found that it could only address 3.5 GB of it properly. NVIDIA CEO Jen-Hsun Huang apologized to buyers about the issue and promised that it would never happen again.

Source: TopClassActions

NVIDIA Releases GeForce 369.00 Beta with Latest OpenGL Extensions

NVIDIA released the GeForce 369.00 Beta drivers, featuring three new OpenGL extensions published by the OpenGL Architecture Review Board (ARB) and NVIDIA in 2016. These include "ARB_gl_spirv," which works on the NVIDIA "Kepler" architecture and above; "EXT_window_rectangles," which requires the NVIDIA "Fermi" architecture and above; and the homebrew "NVX_blend_equation_advanced_multi_draw_buffers," which requires the NVIDIA "Pascal" architecture. These OpenGL updates also ship for the Linux platform via the 367.36.02 drivers. Grab the drivers from the links below.
DOWNLOAD: NVIDIA GeForce 369.00 Beta
Image Credit: DigitalTrends
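As a rough illustration, the hedged sketch below (assuming the 'glfw' and 'PyOpenGL' Python packages, a compatibility-profile context, and a driver such as this 369.00 beta that exposes the new extensions) checks which of the three extension names the current OpenGL context advertises:

```python
# Minimal sketch: list which of the newly shipped extensions the current driver
# exposes. Assumes the 'glfw' and 'PyOpenGL' packages and a compatibility-profile
# context (GLFW's default); the extension names are the ones from the article.
import glfw
from OpenGL.GL import glGetString, GL_EXTENSIONS

WANTED = [
    "GL_ARB_gl_spirv",                                    # Kepler and newer
    "GL_EXT_window_rectangles",                           # Fermi and newer
    "GL_NVX_blend_equation_advanced_multi_draw_buffers",  # Pascal only
]

if not glfw.init():
    raise RuntimeError("GLFW initialization failed")
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)   # an off-screen context is enough
window = glfw.create_window(64, 64, "ext-check", None, None)
glfw.make_context_current(window)

available = set(glGetString(GL_EXTENSIONS).decode().split())
for name in WANTED:
    print(f"{name}: {'supported' if name in available else 'missing'}")

glfw.terminate()
```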

NVIDIA Launches Maxed-out GP102 Based Quadro P6000

Late last week, NVIDIA announced the TITAN X Pascal, its fastest consumer graphics offering targeted at gamers and PC enthusiasts. The TITAN X Pascal's reign as the fastest single-GPU graphics card could be short-lived, however, as NVIDIA has announced a Quadro product based on the same "GP102" silicon that maxes out its on-die resources. The new Quadro P6000, announced at SIGGRAPH alongside the GP104-based Quadro P5000, features all 3,840 CUDA cores physically present on the chip.

Besides its 3,840 CUDA cores, the P6000 offers peak FP32 (single-precision floating point) performance of up to 12 TFLOP/s. The card also features 24 GB of GDDR5X memory across the chip's 384-bit wide memory interface. The Quadro P5000, on the other hand, features 2,560 CUDA cores, up to 8.9 TFLOP/s FP32 performance, and 16 GB of GDDR5X memory across a 256-bit wide memory interface. It's interesting to note that neither card features full FP64 (double-precision) machinery; that is cleverly relegated to NVIDIA's HPC product line, the Tesla P-series.
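The quoted FP32 figures follow from the standard peak-throughput formula of two floating-point operations per CUDA core per clock. As a rough, non-official sketch, the snippet below works backwards from the quoted TFLOP/s numbers to the boost clocks they imply (the formula is standard; the resulting clocks are estimates, not announced specifications):

```python
# Peak FP32 throughput = CUDA cores * 2 ops per clock * clock speed.
# Working backwards from the quoted TFLOP/s figures to the implied boost clocks
# (estimates only; these are not official NVIDIA clock specifications).
def implied_boost_clock_ghz(tflops, cuda_cores, ops_per_clock=2):
    return tflops * 1e12 / (cuda_cores * ops_per_clock) / 1e9

print(f"Quadro P6000: ~{implied_boost_clock_ghz(12.0, 3840):.2f} GHz boost")  # ~1.56 GHz
print(f"Quadro P5000: ~{implied_boost_clock_ghz(8.9, 2560):.2f} GHz boost")   # ~1.74 GHz
```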

NVIDIA Accelerates Volta to May 2017?

Following the surprise TITAN X Pascal launch slated for 2nd August, it looks like NVIDIA's product development cycle is running on steroids, with reports emerging that the company has accelerated the debut of its next-generation "Volta" architecture to May 2017, on the sidelines of next year's GTC. The architecture was originally scheduled to make its debut in 2018.

Much like "Pascal," the "Volta" architecture could first debut with HPC products, before moving on to the consumer graphics segment. NVIDIA could also retain the 16 nm FinFET+ process at TSMC for Volta. Stacked on-package memory such as HBM2 could be more readily available by 2017, and could hit sizable volumes towards the end of the year, making it ripe for implementation in high-volume consumer products.


Source: WCCFTech

NVIDIA Announces the GeForce GTX TITAN X Pascal

In a show of shock and awe, NVIDIA today announced its flagship graphics card based on the "Pascal" architecture, the GeForce GTX TITAN X Pascal. Market availability of the card is scheduled for August 2, 2016, priced at US $1,199. Based on the 16 nm "GP102" silicon, this graphics card is endowed with 3,584 CUDA cores spread across 28 streaming multiprocessors, 224 TMUs, 96 ROPs, and a 384-bit GDDR5X memory interface, holding 12 GB of memory.

The core is clocked at 1417 MHz, with 1531 MHz GPU Boost, and the 10 Gbps memory churns out 480 GB/s of memory bandwidth. The card draws power from a combination of 6-pin and 8-pin PCIe power connectors; the GPU's TDP is rated at 250 W. NVIDIA claims that the GTX TITAN X Pascal is up to 60 percent faster than the GTX TITAN X (Maxwell), and up to 3 times faster than the original GeForce GTX TITAN.
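The 480 GB/s figure follows directly from the quoted memory specs via the usual bandwidth formula (bus width in bytes multiplied by the per-pin data rate). A small sketch using only numbers quoted in these posts (the GTX 1060 line anticipates the specs covered further down):

```python
# Memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# Inputs are the figures quoted in the articles; the formula itself is generic.
def memory_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return (bus_width_bits / 8) * data_rate_gbps

print(f"GTX TITAN X Pascal: {memory_bandwidth_gbs(384, 10):.0f} GB/s")  # 480 GB/s
print(f"GeForce GTX 1060:   {memory_bandwidth_gbs(192, 8):.0f} GB/s")   # 192 GB/s
```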

PNY Unveils its GeForce GTX 1060 Graphics Card

PNY today unveiled its GeForce GTX 1060 graphics card. Based on a custom-design PCB, the card sticks to NVIDIA's reference clock speeds of 1506 MHz core, 1709 MHz GPU Boost, and 8 Gbps memory. It features a custom aluminium fin-stack cooling solution with a pair of 70 mm fans, and draws power from a single 8-pin PCIe power connector. The card is expected to be priced around $279.

MSI Announces its GeForce GTX 1060 Series

As the world's most popular GAMING graphics card vendor, MSI is proud to introduce brand new graphics cards based on NVIDIA's new Pascal GPU with fierce new looks and supreme performance to match. Featuring the same high build quality as its bigger brothers, the MSI GeForce GTX 1060 GAMING X 6G is kept cool by the impressive TWIN FROZR VI thermal design allowing for higher core and memory clock speeds for increased performance in games.

The famous shapes of the eye-catching TWIN FROZR cooler are intensified by a fiery red GAMING glow piercing through the cover, while the MSI GAMING dragon RGB LED on the side can be set to any of 16.8 million colors to match your mood or build. A completely new custom 6-phase PCB design using Military Class 4 components with an 8-pin power connector enables higher overclocking performance to push your graphics card to the max. The classy matte black solid metal backplate gives the card more structural strength and provides a nice finishing touch.

EVGA Announces its GeForce GTX 1060 Series

The EVGA GeForce GTX 1060 graphics card is loaded with innovative new gaming technologies, making it the perfect choice for the latest high-definition games. Powered by NVIDIA Pascal - the most advanced GPU architecture ever created - the GeForce GTX 1060 delivers brilliant performance that opens the door to virtual reality and beyond.

These cards also feature EVGA ACX cooling technology, which once again brings new and exciting features to the award-winning EVGA ACX cooler design. SHP provides increased heatpipe and copper contact area for cooler operation, along with an optimized fan curve for even quieter gaming. Of course, ACX coolers also feature optimized swept fan blades, double ball bearings, and an extremely low-power motor, delivering more airflow with less power and unlocking additional power for the GPU.

ASUS Announces the GeForce GTX 1060 STRIX Graphics Card

ASUS Republic of Gamers (ROG) today announced the Strix GeForce GTX 1060, an all-new VR-ready gaming graphics card with ultra-fast gaming performance, advanced cooling and reliability, and personalized styling. Powered by the latest NVIDIA GeForce GTX 1060 graphics processing unit (GPU), clocked at 1873 MHz in OC mode, ROG Strix GeForce GTX 1060 delivers up to 5%-faster performance in 3DMark Fire Strike Extreme and 6.5%-faster gaming performance in Doom.

ROG Strix GeForce GTX 1060 is packed with exclusive ASUS technologies, including DirectCU III with a patented triple wing-blade 0dB fan designed to deliver maximum airflow for 30%-cooler and three-times (3X) quieter performance, and ASUS FanConnect, which features GPU-controlled fan headers to connect to system fans for targeted supplemental cooling. Industry-exclusive Auto-Extreme technology with Super Alloy Power II components ensures premium quality and reliability.

NVIDIA GeForce GTX 1060 Now Available

NVIDIA announced availability of the GeForce GTX 1060 graphics card. Targeted at the Radeon RX 480, the GTX 1060 is priced at USD $249; NVIDIA's own Founders Edition (reference) card, however, is priced at $299 and is available exclusively from the company's website. The GTX 1060 is based on the new 16 nm GP106 silicon, featuring 1,280 CUDA cores, 80 TMUs, 48 ROPs, and a 192-bit GDDR5 memory interface, holding 6 GB of memory.

The core on the GTX 1060 is clocked at 1506 MHz, with a maximum GPU Boost frequency of 1709 MHz, and the 8 Gbps memory puts out 192 GB/s of memory bandwidth. The card draws power from a single 6-pin PCIe power connector, as the chip's TDP is rated at just 120 W. You get most of the features NVIDIA introduced with the "Pascal" architecture, but the biggest omission is the lack of NVIDIA SLI support; even custom-design cards will lack it. NVIDIA's add-in card (AIC) partners will launch their offerings today, alongside the Founders Edition SKUs.

Futuremark Releases 3DMark Time Spy DirectX 12 Benchmark

Futuremark released the latest addition to the 3DMark benchmark suite, the new "Time Spy" benchmark and stress-test. All existing 3DMark Basic and Advanced users have limited access to "Time Spy," while existing 3DMark Advanced users have the option of unlocking its full feature-set with an upgrade key priced at US $9.99. The price of 3DMark Advanced for new users has been revised from $24.99 to $29.99, as new 3DMark Advanced purchases include the fully-unlocked "Time Spy." Futuremark also announced limited-period offers, lasting until 23rd July, in which the "Time Spy" upgrade key for existing 3DMark Advanced users can be had for $4.99, and the 3DMark Advanced Edition (minus "Time Spy") for $9.99.

Futuremark 3DMark "Time Spy" has been developed with inputs from AMD, NVIDIA, Intel, and Microsoft, and takes advantage of the new DirectX 12 API. For this reason, the test requires Windows 10. The test almost exponentially increases the 3D processing load over "Fire Strike," by leveraging the low-overhead API features of DirectX 12, to present a graphically intense 3D test-scene that can make any gaming/enthusiast PC of today break a sweat. It can also make use of several beyond-4K display resolutions.



DOWNLOAD: 3DMark with TimeSpy v2.1.2852

NVIDIA Releases GeForce 368.81 WHQL Virtual Reality Game Ready Drivers

NVIDIA today shipped out the GeForce 368.81 WHQL drivers, which are optimized for a suite of VR-ready games. To begin with, the drivers come with optimizations for NVIDIA's VR Funhouse (a tech demonstrator that takes advantage of Simultaneous Multi-Projection), "Everest VR," "Raw Data" (a hotly anticipated AAA VR title that made waves at E3), "Obduction," and "The Assembly."
DOWNLOAD: NVIDIA GeForce 368.81 WHQL for Windows 10 64-bit | Windows 10 32-bit | Windows 8/7/Vista 64-bit | Windows 8/7/Vista 32-bit

TechPowerUp GPU-Z 1.9.0 Released

TechPowerUp today released the latest version of GPU-Z, the popular graphics subsystem information, diagnostic, and monitoring utility that no enthusiast can leave home without. With the latest version 1.9.0, GPU-Z is out of "beta." We chose 1.9.0 over the more obvious 1.0.0 as it presents better continuity with the preceding 0.8.x releases, and averts the confusion of 1.0.0 (read 1.0) somehow sounding older.

Version 1.9.0 adds support for the NVIDIA GeForce GTX 1060, GTX 940MX, and GT 740 (GK107). On machines with GeForce "Pascal" GPUs, it can also tell whether you have an SLI HB bridge or a classic SLI bridge installed, so you know if your pre-built OEM machine has cut costs. GPU-Z can now reliably extract the video BIOS from AMD "Polaris" GPUs such as the Radeon RX 480, and improvements were made to the way it reads the engine clock on these GPUs. All communication between GPU-Z and TechPowerUp servers (such as voluntary BIOS uploads, validations, etc.) now happens over secure HTTPS.

DOWNLOAD: TechPowerUp GPU-Z 1.9.0 | GPU-Z 1.9.0 ASUS ROG Themed
The change-log follows.

DOOM with Vulkan Renderer Significantly Faster on AMD GPUs

Over the weekend, Bethesda shipped the much-awaited update to "DOOM" that lets the game take advantage of the Vulkan API. A performance investigation by ComputerBase.de comparing the game's Vulkan renderer to its default OpenGL renderer reveals that Vulkan benefits AMD GPUs far more than it does NVIDIA ones. At 2560 x 1440, an AMD Radeon R9 Fury X running Vulkan is 25 percent faster than a GeForce GTX 1070 running Vulkan; with the OpenGL renderer on both GPUs, the R9 Fury X is 15 percent slower than the GTX 1070. Vulkan increases the R9 Fury X's frame-rates over OpenGL by a staggering 52 percent! Similar performance trends were noted at 1080p. Find the review in the link below.
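Taken together, the quoted percentages pin down the relative standing of all four configurations. A quick sketch that normalizes everything to the GTX 1070 under OpenGL (only the ratios from the article are used; ComputerBase.de's absolute frame-rates are not reproduced here):

```python
# Normalize to GTX 1070 / OpenGL = 100 and derive the other three data points
# from the percentages quoted in the article (ratios only, approximate).
gtx1070_opengl = 100.0
fury_x_opengl  = gtx1070_opengl * (1 - 0.15)  # Fury X is 15% slower under OpenGL
fury_x_vulkan  = fury_x_opengl * 1.52         # Vulkan lifts the Fury X by 52%
gtx1070_vulkan = fury_x_vulkan / 1.25         # Fury X (Vulkan) is 25% ahead of GTX 1070 (Vulkan)

for name, score in [("GTX 1070  / OpenGL", gtx1070_opengl),
                    ("R9 Fury X / OpenGL", fury_x_opengl),
                    ("GTX 1070  / Vulkan", gtx1070_vulkan),
                    ("R9 Fury X / Vulkan", fury_x_vulkan)]:
    print(f"{name}: {score:.0f}")
# Implied by the ratios: the GTX 1070 itself gains only about 3% from Vulkan here.
```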

Source: ComputerBase.de

Palit Intros the GeForce GTX 1060 JetStream Series Graphics Cards

Palit Microsystems Ltd, the leading graphics card manufacturer, releases the latest Pascal-architecture Palit GeForce GTX 1060 Series. With its innovative new gaming technologies, the GeForce GTX 1060 is the perfect choice for the latest high-definition games.

The Palit GeForce GTX 1060 Super JetStream 6GB has 1280 cores and ships with a 1847 MHz Boost clock and 8 GHz (GDDR5-effective) memory clocks. The Palit GeForce GTX 1060 6GB series provides 3X the VR gaming performance of previous-generation GPUs. Equipped with the JetStream cooler and an optimized product design, the Palit GTX 1060 JetStream series offers the best graphics benchmark performance and great acoustics for PC users.

ZOTAC Intros the Super Compact GeForce GTX 1060 Series

ZOTAC International, a global manufacturer of innovation, is pleased to change the playing field of graphics cards once more with a new member of the ZOTAC GeForce GTX 10 Series. The ZOTAC GeForce GTX 1060 series is the latest addition within the NVIDIA Pascal architecture, with superior performance that can drive the VR experience.

The ZOTAC GeForce GTX 1060 series will be available in Mini and AMP Edition variants, both super-compact, short-length cards. The Mini brings raw gaming power to smaller builds without compromising thermal performance or noise levels. This is made possible by a single wide 90 mm fan, aided by direct GPU contact and a carefully designed aluminum heatsink that provides even heat dissipation, all within a 6.85-inch length.

Inno3D Intros its GeForce GTX 1060 Lineup

Inno3D, a leading manufacturer of high-end hardware components and computer utilities, introduces its new family of Inno3D GeForce GTX 1060 graphics cards. As with its flagship line-up, Inno3D will launch its Gaming OC and flagship iChiLL series simultaneously. Based on NVIDIA's new Pascal architecture, the new Inno3D GeForce GTX 1060 graphics cards deliver outstanding performance in the latest games, demanding applications, and Virtual Reality.

NVIDIA GeForce GTX 1060 Founders Edition PCB Pictured

Here's one of the first pictures of the NVIDIA GeForce GTX 1060 Founders Edition (reference) PCB. The PCB is about two-thirds the length of the actual card, and despite that, it's pretty barren. Power is drawn from a 6-pin PCIe power connector; this connector isn't on the PCB itself, however, but on a receptacle towards the end of the cooler. NVIDIA designed it this way in response to complaints that on cards with a PCB shorter than the cooler, the power connector ends up in the middle of the card, where it would also block the illuminated GeForce GTX logo along the top.

The 6-pin PCIe power receptacle connects to the card at large solder points. This approach has one downside: if you want to change the cooler (to, say, an aftermarket air cooler), you will have to deal with that ugly cabling. The card uses a simple 3+1 phase VRM to power the GPU, whose TDP is rated at just 120 W. The GP106 GPU is neighbored by six 8 Gbps GDDR5 memory chips populating its 192-bit memory bus. There's no SLI support. Display outputs include three DisplayPort 1.4 connectors, and one each of HDMI 2.0b and DVI.

NVIDIA Announces the GeForce GTX 1060, 6 GB GDDR5, $249

NVIDIA today announced its third desktop consumer graphics card based on the "Pascal" architecture, the GeForce GTX 1060. NVIDIA aims to strike a price-performance sweet spot by pricing this card aggressively at US $249 (MSRP), with its reference "Founders Edition" variant priced at $299. To make sure two of these cards at $500 don't cannibalize the $599-699 GTX 1080, NVIDIA didn't give this card even 2-way SLI support. Retail availability of the cards will commence on 19th July, 2016. NVIDIA claims that the GTX 1060 performs on par with the previous-generation GeForce GTX 980.

The GeForce GTX 1060 is based on the new 16 nm "GP106" silicon, the company's third ASIC based on this architecture after the GP100 and GP104. It features 1,280 CUDA cores spread across ten streaming multiprocessors, 80 TMUs, 48 ROPs, and a 192-bit wide GDDR5 memory interface, holding 6 GB of memory. The card draws power from a single 6-pin PCIe power connector, as the GPU's TDP is rated at just 120 W. The core is clocked up to 1.70 GHz, and the memory at 8 Gbps, at which it belts out 192 GB/s of memory bandwidth. Display outputs include three DisplayPort 1.4 connectors, one HDMI 2.0b, and one DVI.

NVIDIA Releases the GeForce 368.69 WHQL Drivers

NVIDIA released the latest version of its GeForce drivers. Version 368.69 WHQL comes game-ready for "DiRT Rally VR," one of the first AAA titles to make VR a key gameplay and experience component. The drivers also add SLI profiles for "Armored Warfare," "iRacing Motorsport Simulator," "Lost Ark," and "Tiger Knight." Interestingly, they do not include GeForce Experience 3.0 (new UI, mandatory login), and instead bundle version 2.11 of the app. Grab the drivers from the links below.
DOWNLOAD: NVIDIA GeForce 368.69 WHQL for Windows 10 64-bit | Windows 10 32-bit | Windows 8/7/Vista 64-bit | Windows 8/7/Vista 32-bit

NVIDIA GeForce Experience Gets UI Update, Won't Work Without Login

NVIDIA released a major update to its GeForce Experience app, which significantly changes the user interface (UI). The new GeForce Experience 3.0 is being shipped as a public beta, and is currently not part of an NVIDIA driver installer. Its UI now has two key sections: one deals with game-setting optimization, and the other lets users access NVIDIA GeForce features such as Ansel, GameStream, driver updates, etc. Perhaps the biggest change is that having an online account with NVIDIA is no longer optional if you want to use GeForce Experience.

NVIDIA uses this account to store your game settings and other preferences in the cloud, so they're portable between all your devices, which could be useful if you're a PC enthusiast who frequently changes hardware. On the flip side, GeForce Experience becomes another app that dials home each time you start your PC, impacting start-up speed. The new UI does make things more organized, and catalogs your games much like a DRM client such as Origin or Steam would. You don't need GeForce Experience to use NVIDIA graphics cards, though; the app is still optional, and can be unchecked in the "Custom install" screen of the GeForce driver installer.

Source: Tom's Hardware, NVIDIA

Microsoft Refines DirectX 12 Multi-GPU with Simple Abstraction Layer

Microsoft is sparing no effort in promoting DirectX 12 native multi-GPU as the go-to multi-GPU solution for game developers, obsoleting proprietary technologies like SLI and CrossFire. The company recently announced that it is making it easier for game developers to take advantage of multiple GPUs without as much hand-written code as is required today. This involves a new hardware abstraction layer that simplifies the process of pooling multiple GPUs in a system, letting developers bypass coding directly against the API's Explicit Multi-Adapter (EMA) mode.

This is the first major step by Microsoft since its announcement that DirectX 12, in theory, supports true mixed multi-adapter configurations. The company stated that it will release the new abstraction layer as part of a comprehensive framework in its GitHub repository, with two sample projects: one that takes advantage of the new multi-GPU tech, and one that does not. With this code to study, game developers face a much gentler learning curve and get a template for implementing multi-GPU in their DirectX 12 projects with minimal effort. With this, Microsoft is supporting game developers in implementing API-native multi-GPU, even as GPU manufacturers have stated that while their GPUs will support EMA, the onus will be on game developers to keep their games optimized.
Source: GitHub

NVIDIA GeForce GTX 1060 3DMark Firestrike Performance Revealed

A Chinese PC bulletin board member with access to a GeForce GTX 1060 sample put it through 3DMark Firestrike (standard) and 3DMark Firestrike Ultra. The card was tested on a machine powered by a Core i7-6700K processor. The screenshots, particularly the GPU-Z screenshot, reveal something fascinating: it looks like the rumors of NVIDIA launching two distinct SKUs of the GTX 1060 could be true. The driver reports the GPU name as "GeForce GTX 1060 6GB." Mentioning the memory amount in the name string is unusual for NVIDIA; in this case, it could point to the possibility of a 6 GB SKU alongside another with 3 GB of memory.

Moving on to the business end of the story, the card's 3DMark Firestrike scores are 11,225 points for the standard test, and 3,014 points for Firestrike Ultra. This isn't significantly faster than the Radeon RX 480 8 GB. Here are some 3DMark Firestrike numbers for the RX 480. NVIDIA is expected to launch the GeForce GTX 1060 later this month.


Source: XFastest

NVIDIA to Unveil GeForce GTX TITAN P at Gamescom

NVIDIA is preparing to launch its flagship graphics card based on the "Pascal" architecture, the so-called GeForce GTX TITAN P, at the 2016 Gamescom, held in Cologne, Germany, between 17-21 August. The card is expected to be based on the GP100 silicon, and could come in two variants - 16 GB and 12 GB - which differ in memory bus width as well as memory size. The 16 GB variant could feature four HBM2 stacks over a 4096-bit memory bus, while the 12 GB variant could feature three HBM2 stacks and a 3072-bit bus. This approach is identical to the way NVIDIA carved out its Tesla P100-based PCIe accelerators from the same ASIC. The cards' TDP could be rated between 300 W and 375 W, drawing power from two 8-pin PCIe power connectors.
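The two rumored configurations fall straight out of HBM2's per-stack figures. A quick sketch, assuming a 1024-bit interface per stack (a property of HBM2) and 4 GB of capacity per stack (as on the Tesla P100):

```python
# HBM2 exposes a 1024-bit interface per stack; the rumored TITAN P variants
# simply differ in stack count. Capacity assumes 4 GB per stack, as on Tesla P100.
def hbm2_config(stacks, gb_per_stack=4, bus_bits_per_stack=1024):
    return stacks * gb_per_stack, stacks * bus_bits_per_stack

for stacks in (4, 3):
    capacity_gb, bus_bits = hbm2_config(stacks)
    print(f"{stacks} stacks -> {capacity_gb} GB over a {bus_bits}-bit bus")
# 4 stacks -> 16 GB over a 4096-bit bus
# 3 stacks -> 12 GB over a 3072-bit bus
```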

The GTX TITAN P isn't the only high-end graphics card NVIDIA has planned for gamers and PC enthusiasts; the company is also working on the GP102 silicon, positioned between the GP104 and the GP100. This chip could lack the FP64 CUDA cores found on the GP100 silicon, and feature up to 3,840 CUDA cores of the same kind found on the GP104. The GP102 is also expected to feature a simpler 384-bit GDDR5X memory interface. NVIDIA could base the GTX 1080 Ti on this chip.
Source: VR World. Many thanks to okidna for the tip.

NVIDIA GeForce GTX 1060 Doesn't Support SLI? Reference PCB Difficult to Mod

Here are some more technical pictures of the NVIDIA GeForce GTX 1060 reference-design board, which reveal quite a few details about the card. The biggest revelation is that the card completely lacks SLI bridge fingers. We wonder if NVIDIA has innovated a bridge-less SLI for this card, although we find that unlikely given the amount of effort the company has put into marketing the SLI HB bridge, and the reason SLI needs a bridge in the first place. The Radeon RX 480, meanwhile, supports 4-way CrossFireX.

Next up, the PCB is shorter than the card itself, with NVIDIA's unique new reference cooler making the card about 50% longer than its PCB. NVIDIA listened to feedback about shorter PCBs pushing power connectors towards the middle of the card, and came up with a unique design in which the card's sole 6-pin PCIe power connector is located where you want it (towards the end), with internal high-current wires soldered to the PCB. Neato? Think again. What if you want to change the cooler, or maybe use a water-block? Prepare to deal with six insulated wires sticking out of the PCB and running into that PCIe power receptacle. The rear PCB shot also seems to confirm the 192-bit memory bus, given that some memory-chip pads are vacant, lacking the SMT components a memory chip would need.
Source: PurePC.pl