News Posts matching #GTX 1660


MSI Officially Enters the Business Laptop Market With New Logo

Micro-Star International (MSI), the technology and laptop giant, debuted its brand new "Business & Productivity" lineup: the Summit, Prestige, and Modern series. The company also revealed a new, minimalistic, and modern logo for the lineup. This bold new direction is a major milestone for MSI.

The new line also includes one of the first laptops certified on Intel's Evo platform, the Prestige 14 Evo, which indicates advanced efficiency and better mobility. On the gaming side, the company also unveiled the world's thinnest 15-inch gaming laptop, the Stealth 15M. All powered by the latest 11th Gen Intel processors, the new lineup provides unprecedented efficiency and performance in the face of the remote-working trend brought on by COVID-19.

BIOSTAR Expands Extreme Gaming Graphics Card Line with GeForce GTX 16-series GPUs

BIOSTAR has been a late entrant to the custom-design graphics card scene with its recent launch of two Radeon RX 5000 series products under the Extreme Gaming product line. The company expanded it with a pair of GeForce GTX 16-series products, the VN1665XF69 based on GeForce GTX 1660, and the VN1655XF41 based on the GTX 1650. The Extreme Gaming GTX 1660 sticks to NVIDIA reference clock speeds of 1785 MHz GPU Boost, and 8 Gbps (GDDR5-effective) memory. It features a dual-fan, dual-slot aluminium fin-stack heatsink. The Extreme Gaming GTX 1650, too, sticks to NVIDIA reference clocks for the SKU - 1665 MHz GPU Boost, and 8 Gbps memory. It uses a simpler aluminium mono-block heatsink that's ventilated by a pair of 70 mm fans. It's likely that both cards will be sold by BIOSTAR at close-to-reference pricing.

AMD Radeon RX 5600 XT Features 2,304 Stream Processors

AMD's upcoming Radeon RX 5600 XT graphics card features the exact same stream processor count as the $350 RX 5700, according to a leaked specs sheet of an AIB partner's custom-design graphics card. With a stream processor count of 2,304, it's safe to assume that the RX 5600 XT is based on the same 7 nm "Navi 10" silicon as the RX 5700 series. What sets the RX 5600 XT apart from the RX 5700, besides lower clock speeds, is the memory subsystem, which is severely stripped down. The Radeon RX 5600 XT will be equipped with 6 GB of GDDR6 memory across a 192-bit wide memory interface. What's more, the memory ticks at 12 Gbps, compared to 14 Gbps on the RX 5700 series.

With these specs, the RX 5600 XT has 288 GB/s of memory bandwidth at its disposal, the same as NVIDIA's GeForce GTX 1660 Ti. In contrast, with 8 GB of 256-bit GDDR6 running at 14 Gbps, the RX 5700 enjoys 448 GB/s. The specs sheet suggests that AMD has also dialed down the engine clock speeds (GPU clocks) a bit, with up to 1620 MHz boost, up to 1460 MHz gaming, and 1235 MHz base. With these specs, it's highly likely that the RX 5600 XT outperforms the GTX 1660 Ti and gets close to the RTX 2060. It all boils down to pricing. The RX 5500 XT is a decent GTX 1650-series alternative, but its price looks lukewarm thanks to NVIDIA's aggressive product-stack management, which saw its partners lower prices of the GTX 1660 and GTX 1660 Super. It would be interesting to see if AMD can outfox NVIDIA in the sub-$300 market.
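The bandwidth figures quoted here follow directly from memory data rate and bus width. A minimal sketch of that arithmetic (the helper function is a hypothetical illustration; the numbers are the ones cited in this article):

```python
def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps) times
    bus width (bits), divided by 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

# Figures from the article:
rx_5600_xt = memory_bandwidth_gbs(12, 192)  # 288.0 GB/s
rx_5700 = memory_bandwidth_gbs(14, 256)     # 448.0 GB/s
```

The same formula yields the GTX 1660 Ti's 288 GB/s (12 Gbps over a 192-bit bus), which is why the two cards end up tied on bandwidth.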

AMD RX 5600 XT Poised to Offer Vega 56-like Performance, Possible Specs Rumored

AMD's upcoming RX 5600 XT will bring a much-needed performance increase over the current baseline RX 5500 series, slotting smoothly between it and the mainstream, high-performance RX 5700 and RX 5700 XT. New benchmarks spotted by Videocardz place AMD's upcoming graphics card (which could feature 6 GB of VRAM, with higher capacities likely to be offered as well) some 35% ahead of the RX 5500, and on the overall performance level of AMD's RX Vega 56. That card debuted at $399 and currently performs 8% to 15% better than NVIDIA's GTX 1660 SUPER, exactly where AMD would want the RX 5600 XT's performance to land.

Other details come courtesy of another publication, where Igor Wallossek over at Igor's Lab says that AMD could be looking at harvesting the Navi 10 dies that power the company's RX 5700 XT and RX 5700 by disabling one of four Asynchronous Compute Engines (ACEs). These ACEs are found two apiece on each of Navi's Shader Engines (SEs). Disabling one ACE and its subordinate hardware from the full Navi 10's 40 RDNA compute units, 2,560 stream processors (SPs), 160 texture mapping units (TMUs), and 64 render output units (ROPs) would result in an RX 5600 XT with 30 RDNA CUs, 1,920 SPs, 120 TMUs, 48 ROPs, and an expected 3 MB of L2 cache. AMD could be looking to position the RX 5600 XT in the $249 price range, since top-tier RX 5500 XT cards tend to go for $200.

AMD Radeon RX 5500 To Launch Come December 12th

According to a source cited by China's Ithome, AMD has contacted AIB partners with regards to launch plans for the company's RX 5500, the mainstream graphics card based on Navi 14. For now, there is still no news on any RX 5500 XT graphics cards from the company - whether such a SKU is being prepared for a later launch is still unclear. The launch date of December 12th is in line with previous release expectations, and should be a full launch with multiple AIB partners releasing their solutions.

The RX 5500 has been tested as a competitor to NVIDIA's GTX 1660 graphics card, replacing AMD's RX 570, RX 580, and RX 590 in the product stack. The Navi 14 chip the RX 5500 is based on is built on TSMC's 7 nm manufacturing technology, is configured with 22 RDNA compute units (1,408 stream processors), and features a 128-bit wide GDDR6 memory bus. VRAM-wise, it will be available with either 4 GB or 8 GB of memory running at 14 Gbps data-rate, yielding 224 GB/s of memory bandwidth. GPU clocks are listed as 1670 MHz "Gaming" and 1845 MHz "Boost". Typical board power is rated at 110 W, with a single 8-pin PCIe power input being enough to deliver the required power, save for some more exotic AIB designs.

AMD Radeon RX 5500 (OEM) Tested, Almost As Fast as RX 580

German publication Heise.de got its hands on a Radeon RX 5500 (OEM) graphics card and put it through its test bench. The numbers show exactly what caused NVIDIA to refresh its entry level with the GeForce GTX 1650 Super and the GTX 1660 Super. In Heise's testing, the RX 5500 was found to match the previous-generation RX 580 and NVIDIA's current-gen GTX 1660 (non-Super). Compared to a factory-overclocked RX 580 NITRO+ and GTX 1660 OC, the RX 5500 yielded similar 3DMark Fire Strike performance, scoring 12,111 points against 12,744 points for the RX 580 NITRO+ and 12,525 points for the GTX 1660 OC.

The card was put through two other game tests at 1080p, "Shadow of the Tomb Raider" and "Far Cry 5." In SoTR, the RX 5500 put out 59 fps, slightly behind the 65 fps of the RX 580 NITRO+ and the 69 fps of the GTX 1660 OC. In "Far Cry 5," it scored 72 fps, which again is within reach of the 75 fps of the RX 580 NITRO+ and the 85 fps of the GTX 1660 OC. It's important to once again note that the RX 580 and GTX 1660 in this comparison are factory-overclocked cards, while the RX 5500 is ticking at stock speeds. Heise also did some power testing, and found the RX 5500 to have a lower idle power draw than the GTX 1660 OC, at 7 W compared to 10 W for the NVIDIA card and 12 W for the RX 580 NITRO+. Gaming power draw is also similar to the GTX 1660, with the RX 5500 pulling 133 W compared to the GTX 1660's 128 W. This short test shows that the RX 5500 is in the same league as the RX 580 and GTX 1660, and explains why NVIDIA made its recent product-stack changes.

NVIDIA Announces Financial Results for Third Quarter Fiscal 2020

NVIDIA today reported revenue for the third quarter ended Oct. 27, 2019, of $3.01 billion compared with $3.18 billion a year earlier and $2.58 billion in the previous quarter. GAAP earnings per diluted share for the quarter were $1.45, compared with $1.97 a year ago and $0.90 in the previous quarter. Non-GAAP earnings per diluted share were $1.78, compared with $1.84 a year earlier and $1.24 in the previous quarter.

"Our gaming business and demand from hyperscale customers powered Q3's results," said Jensen Huang, founder and CEO of NVIDIA. "The realism of computer graphics is taking a giant leap forward with NVIDIA RTX. "This quarter, we have laid the foundation for where AI will ultimately make the greatest impact. We extended our reach beyond the cloud, to the edge, where GPU-accelerated 5G, AI and IoT will revolutionize the world's largest industries. We see strong data center growth ahead, driven by the rise of conversational AI and inference."

AMD Radeon RX 5500 Marketing Sheets Reveal a bit More About the Card

Marketing material for AMD's upcoming Radeon RX 5500 mid-range graphics cards leaked to the web, providing insights into the product's positioning in AMD's stack. The October 2019-dated document lists the card's specifications, performance relative to a competing NVIDIA product, and general guidance on what experience to expect from it. To begin with, the RX 5500 desktop graphics card is based on the 7 nm "Navi 14" silicon, and is configured with 22 RDNA compute units, amounting to 1,408 stream processors. The chip features a 128-bit wide GDDR6 memory bus, which is paired with either 4 GB or 8 GB of memory running at 14 Gbps data-rate, yielding 224 GB/s of memory bandwidth. Its GPU clocks are listed as 1670 MHz "gaming" and 1845 MHz boost; the company didn't mention nominal clocks. Typical board power is rated at 110 W, and a single 8-pin PCIe power input is deployed on the reference-design board.

The second slide is where things get very interesting. AMD tabled its product stack, and the RX 570, RX 580, and RX 590 are missing, even as the RX 560 isn't. This is probably a sign of AMD phasing out the Polaris-based 1080p cards in the very near future, replacing them with the RX 5500, and possibly a better-endowed "RX 5500 XT," if rumors of the "Navi 14" featuring more CUs are to be believed. What is surprising about this whole presentation, though, is that only the "RX 5500" is listed, with the "XT" nowhere in sight. Let's hope the XT version gets released further down the road. In the product stack, the RX 5500 is interestingly still being compared to the GeForce GTX 1650, with no mention of the GTX 1660; this document was probably made before the GTX 1660 Super launched. A different slide provides guidance on what kind of experiences to expect from the various cards, rated N/A, good, better, or excellent. According to it, the RX 5500 should provide "excellent" AAA gaming at 1080p, fairly smooth gaming at high settings (graded "better"), "excellent" e-Sports gaming, and "better" 1440p gaming. The card is also "excellent" at all non-gaming graphics tasks, such as watching 4K video, photo/video creator work, game streaming at any resolution, and general desktop use.

Inno3D Announces New Gaming OC X2 and Twin X2 OC RGB Graphics Cards

INNO3D, a leading manufacturer of pioneering high-end multimedia components and various innovations, is thrilled to announce a new range of INNO3D fans, including the TWIN X2 OC RGB and GAMING OC X2, while also adopting the popular TWIN X2 and COMPACT on existing GPUs. So which GPU will get which fan? Take a look at the list below.

Our engineers were at the drawing board with the task of designing two new fans to target customers with specific requirements when purchasing their graphics card. High on the list of requests was the need for the INNO3D GTX 16 series to have RGB, so that even less hardcore gamers can enjoy and marvel at the colour-cycling display when playing their favourite games. It is not all form and no function, far from it - the RGB cooler has dual 9 cm fans with the best balance of noise and cooling performance. The cooler houses a big heatsink with 3 heatpipes for efficient heat dissipation, built on an 8-layer PCB with an 8-pin power input for stable overclocking. All this in a relatively small form factor with a length of just 22 cm.

MSI Announces New GeForce GTX 16 SUPER Series Graphics Cards

As the world's most popular GAMING graphics card vendor, MSI is proud to announce its brand-new graphics card line-up based on NVIDIA's Turing architecture with outstanding performance. Equipped with excellent thermal solutions, MSI GeForce GTX 16 SUPER series are designed to provide higher core and memory clock speeds for increased performance in games.

MSI's GAMING series delivers the top-notch in-game and thermal performance that gamers have come to expect from MSI. With a solid and sharp design, VENTUS XS provides a great balance of strong dual-fan cooling and outstanding performance. The AERO ITX is a great option for gamers looking to bring Turing power into a small form factor build. With this comprehensive line-up there is plenty of choice for any build. Both the MSI GeForce GTX 1660 SUPER and GeForce GTX 1650 SUPER will have GAMING, VENTUS XS and AERO ITX models with various differences.

NVIDIA GeForce GTX 1660 SUPER Launching October 29th, $229 With GDDR6

NVIDIA's GeForce GTX 1660 SUPER, the first non-raytracing-capable Turing-based SUPER graphics card from the company, is set to drop on October 29th. Contrary to other SUPER releases, though, the GTX 1660 SUPER won't feature a new GPU chip brought down from the performance tier above. This means it will make use of the same TU116-300 as the GTX 1660, with 1408 CUDA cores, not the 1536 CUDA-core count of the GTX 1660 Ti. Instead, NVIDIA has increased performance of this SUPER model by endowing it with GDDR6 memory.

The new GDDR6 memory ticks at 14 Gbps, which gives the card an edge even over the pricier GTX 1660 Ti. When all is said and done, the GTX 1660 SUPER will feature memory bandwidth in the range of 336 GB/s, significantly more than the GTX 1660 Ti's 288 GB/s, and a huge differentiating factor from the 192 GB/s of the GTX 1660. Of course, fewer CUDA-core resources compared to the GTX 1660 Ti mean it should still deliver lower performance than that graphics card. This justifies its price tag of $229 - $20 higher than the GTX 1660, but $50 less than the GTX 1660 Ti.

AMD to Unveil Radeon RX 5500 on October 7

It turns out that the Radeon RX 5500 is arriving a lot sooner than expected, with VideoCardz reporting an October 7th product launch for the card. It's also being reported that the SKU will launch as the Radeon RX 5500 XT, with board partner GIGABYTE being ready with half a dozen custom-design cards, all of them with 8 GB of memory. In a separate report, VideoCardz also confirmed that the RX 5500 series will be based on the latest "Navi" family of GPUs that use the company's latest RDNA architecture, and will be built on the 7 nm silicon fabrication process. What's more, the RX 5500 will reportedly use 8 GB of modern GDDR6 memory across a 128-bit wide memory bus. A WCCFTech report predicts the RX 5500 (XT) will feature 22 RDNA compute units, which works out to 1,408 stream processors.

With these specs, we can see where AMD is going with the RX 5500 (XT). The company wants a viable successor to the Radeon RX 580 or even the RX 590, which it can sell around the $200-250 price-range, competing with a spectrum of NVIDIA GPUs, including the GeForce GTX 1650 and the GTX 1660. The card would target 1080p AAA gaming with high-thru-ultra settings, and 1080p eSports gaming at high refresh-rates. NVIDIA is already preparing a response to the RX 5500 in the form of the GTX 1650 Super and the GTX 1660 Super, which come with beefed up specs.

NVIDIA Announces Financial Results for First Quarter Fiscal 2020

NVIDIA today reported revenue for the first quarter ended April 28, 2019, of $2.22 billion compared with $3.21 billion a year earlier and $2.21 billion in the previous quarter. GAAP earnings per diluted share for the quarter were $0.64, compared with $1.98 a year ago and $0.92 in the previous quarter. Non-GAAP earnings per diluted share were $0.88 compared with $2.05 a year earlier and $0.80 in the previous quarter.

"NVIDIA is back on an upward trajectory," said Jensen Huang, founder and CEO of NVIDIA. "We've returned to growth in gaming, with nearly 100 new GeForce Max-Q laptops shipping. And NVIDIA RTX has gained broad industry support, making ray tracing the standard for next-generation gaming.

TechPowerUp Releases GPU-Z v2.19.0

TechPowerUp today released the latest version of GPU-Z, the definitive graphics subsystem information and diagnostic utility. Version 2.19.0 adds support for several new GPUs, improves user experience, and fixes bugs. To begin with, support is added for AMD Ryzen 3000-series "Picasso" iGPUs, besides NVIDIA GeForce GTX 1650, GTX 1650 Mobile, GTX 1660 Mobile, GTX 1660 Ti Mobile, GeForce MX250, and the TU117-B revision. Transistor counts were added for GeForce MX230 and GP108 chips. AMD Radeon Pro series graphics cards get a proper logo display.

TechPowerUp GPU-Z 2.19.0 also improves support for EVGA iCX technology, with better detection and improved accuracy. We've added the ability to detect support for DirectX Raytracing (DXR), Variable Rate Shading, WDDM 2.6 (requires Windows 10 1903), Shader Model 6.5, and Tiled Resources Tier 4 in the Advanced panel. The tab now also lists new DirectX 12 capabilities incrementally rolled out through Windows 10 1803 and 1809. The ASIC Quality readout will now only display on GPUs that support the read-out. Fixes include faster startup on devices with AMD PowerXpress, a crash when no known cards are detected and driver info is sought by a mouse-hover, a startup crash on Windows XP machines, and the correct display of GK210 silicon for the Tesla K80.
DOWNLOAD: TechPowerUp GPU-Z 2.19.0

The change-log follows.

NVIDIA Extends DirectX Raytracing (DXR) Support to Many GeForce GTX GPUs

NVIDIA today announced that it is extending DXR (DirectX Raytracing) support to several GeForce GTX graphics models beyond its GeForce RTX series. These include the GTX 1660 Ti, GTX 1660, GTX 1080 Ti, GTX 1080, GTX 1070 Ti, GTX 1070, and GTX 1060 6 GB. The GTX 1060 3 GB and lower "Pascal" models don't support DXR, nor do older generations of NVIDIA GPUs. NVIDIA has implemented real-time raytracing on GPUs without specialized components such as RT cores or tensor cores, by essentially implementing the rendering path through shaders, in this case, CUDA cores. DXR support will be added through a new GeForce graphics driver later today.

The GPU's CUDA cores now have to handle BVH traversal, intersection, reflection, and refraction calculations. The GTX 16-series chips have an edge over "Pascal" despite lacking RT cores, as the "Turing" CUDA cores support concurrent INT and FP execution, allowing more work to be done per clock. NVIDIA in a detailed presentation listed the kinds of real-time ray-tracing effects enabled by the DXR API, namely reflections, shadows, advanced reflections and shadows, ambient occlusion, global illumination (unbaked), and combinations of these. The company put out detailed performance numbers for a selection of GTX 10-series and GTX 16-series GPUs, and compared them to RTX 20-series SKUs that have specialized hardware for DXR.
Update: Article updated with additional test data from NVIDIA.

NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Update

NVIDIA had their customary GTC keynote ending mere minutes ago, and it was one of the longer keynotes clocking in at nearly three hours in length. There were some fascinating demos and features shown off, especially in the realm of robotics and machine learning, as well as new hardware as it pertains to AI and cars with the all-new Jetson Nano. It would be fair to say, however, that the vast majority of the keynote was targeting developers and researchers, as usually is the case at GTC. However, something came up in between which caught us by surprise, and no doubt is a pleasant update to most of us here on TechPowerUp.

Following AMD's claims of software-based real-time ray tracing in games, and Crytek's Neon Noir real-time ray tracing demo for both AMD and NVIDIA GPUs, it makes sense in hindsight that NVIDIA would bring rudimentary DXR ray tracing support to older hardware that lacks RT cores. In particular, a drivers update next month will enable DXR support on 10-series Pascal-microarchitecture graphics cards (GTX 1060 6 GB and higher), as well as the newly announced GTX 16-series Turing-microarchitecture GPUs (GTX 1660, GTX 1660 Ti). The announcement comes with a caveat letting people know not to expect RTX-level support (think a lower number of ray traces, and possibly no secondary/tertiary effects), and this DXR mode will only be supported in the Unity and Unreal game engines for now. More to come, with details past the break.

Palit Unveils its GeForce GTX 1660 Lineup

Palit Microsystems Ltd, the leading graphics card manufacturer, releases the new NVIDIA Turing architecture GeForce GTX 16 series in the Palit GeForce product line-up: the GeForce GTX 1660 Dual OC, Dual, StormX OC, and StormX.

The Palit GeForce GTX 1660 is built on the latest NVIDIA Turing architecture and performs great at 120 FPS, making it an ideal model for eSports gaming titles. It can also deliver amazing performance and image quality while livestreaming to Twitch or YouTube. Like its big brother, the GeForce GTX 1660 utilizes the "TU116" Turing GPU that's been carefully architected to balance performance, power, and cost. TU116 includes all of the new Turing Shader innovations that improve performance and efficiency, including support for Concurrent Floating Point and Integer Operations, a Unified Cache Architecture with larger L1 cache, and Adaptive Shading.

GIGABYTE Unveils its GeForce GTX 1660 Graphics Card Series

GIGABYTE, the world's leading premium gaming hardware manufacturer, today announced the latest GeForce GTX 1660 graphics cards powered by NVIDIA Turing architecture. GIGABYTE launched 3 graphics cards - GeForce GTX 1660 GAMING OC 6G, GeForce GTX 1660 GAMING 6G, GeForce GTX 1660 OC 6G. These GeForce GTX 1660 graphics cards not only use overclocked GPUs certified by GIGABYTE but are also built with GIGABYTE cooling systems for game enthusiasts pursuing extreme performance and the best gaming experience.

GIGABYTE releases the GeForce GTX 1660 GAMING OC 6G graphics card for users who prefer triple-fan solutions with the new GeForce GTX 1660. In addition to GIGABYTE's patented "Alternate Spinning" function, the unique blade fan design effectively enhances airflow, which is split by the triangular fan edge and guided smoothly through the 3D stripe curve on the fan surface. Equipped with 3 pure copper composite heat-pipes that directly touch the GPU, it can dissipate heat from the GPU quickly, keeping the graphics card working at a low temperature for higher performance and product stability. Moreover, the GAMING OC is built with an extremely durable overclocking design that provides over-temperature protection and load balancing for each MOSFET; together with Ultra Durable certified chokes and capacitors, this delivers excellent performance and a longer system life.

MSI Reveals New GeForce GTX 1660 Series Graphics Cards

As the world's most popular GAMING graphics card vendor, MSI is proud to announce its new graphics card line-up based on the new GeForce GTX 1660 GPU, the latest addition to the NVIDIA Turing GTX family.

The GeForce GTX 1660 utilizes the "TU116" Turing GPU that's been carefully architected to balance performance, power, and cost. TU116 includes all of the new Turing Shader innovations that improve performance and efficiency, including support for Concurrent Floating Point and Integer Operations, a Unified Cache Architecture with larger L1 cache, and Adaptive Shading.

EVGA Rolls Out the GeForce GTX 1660 for Zero Compromise Gaming

The EVGA GeForce GTX 1660 Graphics Cards are powered by the all-new NVIDIA Turing architecture to give you incredible new levels of gaming realism, speed, power efficiency, and immersion. With the EVGA GeForce GTX 1660 Graphics Cards you get the best gaming experience with next generation graphics performance, ice cold cooling and advanced overclocking features with the all new EVGA Precision X1 software.

The first HDB fan on an NVIDIA graphics card optimizes airflow, increases cooling performance, and reduces fan noise by 15% compared to the sleeve-bearing fans more commonly used on graphics cards. The special upraised 'E' pattern on the enlarged fan allows a further reduction in noise level of 4%. With a brand new layout, a completely new codebase, new features and more, the new EVGA Precision X1 software is faster, easier and better than ever.

ZOTAC Unveils its GeForce GTX 1660 Series

ZOTAC Technology, a global manufacturer of innovation, is pleased to expand the GeForce GTX 16 series with the ZOTAC GAMING GeForce GTX 1660 series featuring GDDR5 memory and the NVIDIA Turing Architecture.

Founded in 2017, ZOTAC GAMING is the pioneer movement that comes forth from the core of the ZOTAC brand that aims to create the ultimate PC gaming hardware for those who live to game. It is the epitome of our engineering prowess and design expertise representing over a decade of precision performance, making ZOTAC GAMING a born leading force with the goal to deliver the best PC gaming experience. The logo shows the piercing stare of the robotic eyes, where behind it, lies the strength and future technology that fills the ego of the undefeated and battle experienced.

NVIDIA Launches the GeForce GTX 1660 6GB Graphics Card

NVIDIA today launched the GeForce GTX 1660 6 GB graphics card, its successor to the immensely popular GTX 1060 6 GB. With prices starting at $219.99, the GTX 1660 is based on the same 12 nm "TU116" silicon as the GTX 1660 Ti launched last month, with fewer CUDA cores and a slower memory interface. NVIDIA carved the GTX 1660 out by disabling 2 out of 24 "Turing" SMs on the TU116, resulting in 1,408 CUDA cores, 88 TMUs, and 48 ROPs. The company is using 8 Gbps GDDR5 memory instead of 12 Gbps GDDR6, which makes the memory sub-system 33 percent slower. The GPU is clocked at 1530 MHz, with 1785 MHz boost, marginally higher than the GTX 1660 Ti. The GeForce GTX 1660 is a partner-driven launch, meaning there won't be any reference-design cards, although NVIDIA made sure every AIC partner has at least one product selling at the baseline price of $219.99.
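The shader counts quoted here fall out of Turing's per-SM resources; a minimal sketch of the derivation, assuming the standard Turing figures of 64 CUDA cores and 4 TMUs per SM (these per-SM numbers are public architecture details, not stated in this article):

```python
# Per-SM resources on NVIDIA "Turing" (assumed public figures,
# not taken from this article).
CUDA_CORES_PER_SM = 64
TMUS_PER_SM = 4

def tu116_config(active_sms: int) -> dict:
    """Shader configuration for a TU116 part with the given number of
    enabled SMs. ROPs are tied to the memory partitions rather than
    SMs, so SM disabling leaves the ROP count (48) unchanged."""
    return {"cuda_cores": active_sms * CUDA_CORES_PER_SM,
            "tmus": active_sms * TMUS_PER_SM}

full_tu116 = tu116_config(24)  # GTX 1660 Ti: 1536 cores, 96 TMUs
gtx_1660 = tu116_config(22)    # 2 SMs disabled: 1408 cores, 88 TMUs
```

Disabling 2 of the 24 SMs is what yields the GTX 1660's 1,408 CUDA cores and 88 TMUs cited above.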

Read TechPowerUp Reviews: Zotac GeForce GTX 1660 | EVGA GeForce GTX 1660 XC Ultra | Palit GeForce GTX 1660 StormX OC | MSI GTX 1660 Gaming X

Update: We have updated our GPU database with all GTX 1660 models announced today, so you can easily get an overview over what has been released.

COLORFUL Debuts iGame GeForce GTX 1660 Ultra 6G Graphics Card

Colorful Technology Company Limited, professional manufacturer of graphics cards, motherboards and high-performance storage solutions, is proud to announce the latest graphics card for gamers from the iGame series. The COLORFUL iGame GeForce GTX 1660 Ultra 6G features NVIDIA Turing shaders to bring games to life like never before, made possible by the evolution to 12 nm: more cores, more shaders, more power.

The new COLORFUL iGame GeForce GTX 1660 Ultra 6G is an excellent upgrade for gamers coming from previous-generation graphics cards who want to experience the new generation of mainstream gaming. The iGame GeForce GTX 1660 Ultra 6G delivers excellent performance for its class, as well as great cooling and build quality, for gamers who want a high-quality card for their gaming systems.

EVGA and GIGABYTE GeForce GTX 1660 Graphics Cards Pictured

Here are some of the first pictures of EVGA's and GIGABYTE's upcoming GeForce GTX 1660 graphics cards reportedly slated for launch later this week. It should come as no surprise that these cards resemble the companies' GTX 1660 Ti offerings, since they're based on the same 12 nm "TU116" silicon, with fewer CUDA cores. The underlying PCBs could be slightly different as the GTX 1660 uses older generation 8 Gbps GDDR5 memory instead of 12 Gbps GDDR6. The "TU116" silicon is configured with 1,408 CUDA cores out of the 1,536 physically present; the memory amount is 6 GB, across a 192-bit wide memory bus. The GTX 1660 baseline price is reportedly USD $219, and the card replaces the GTX 1060 6 GB from NVIDIA's product stack.

EVGA is bringing two designs to the market, a short-length triple-slot card with a single fan; and a more conventional longer card with 2-slot, dual-fan design. The baseline "Black" card could be offered in the shorter design; while the top-tier XC Ultra could be exclusive to the longer design. GIGABYTE, on the other hand, has two designs, a shorter-length dual-fan; and a longer-length triple-fan. Both models are dual-slot. The baseline SKU will be restricted to the shorter board design, while premium Gaming OC SKUs could come in the longer board design.

Details on GeForce GTX 1660 Revealed Courtesy of MSI - 1408 CUDA Cores, GDDR5 Memory

Details on NVIDIA's upcoming mainstream GTX 1660 graphics card have been revealed, which will help put its graphics-crunching prowess up to scrutiny. The new graphics card from NVIDIA slots in below the recently released GTX 1660 Ti (which provides roughly 5% better performance than NVIDIA's previous GTX 1070 graphics card) and above the yet-to-be-released GTX 1650.

The 1408 CUDA cores in this design amount to a 9% reduction in computing cores compared to the GTX 1660 Ti, but most of the savings (and performance impact) likely come from the 6 GB of 8 Gbps GDDR5 memory this card is outfitted with, compared to the GTX 1660 Ti's GDDR6 implementation. The amount of cut GPU resources is so low that we imagine these chips won't come from harvesting defective dies as much as from actually fusing off working CUDA cores on the TU116 chip. Using GDDR5 is still cheaper than GDDR6 (for now), and this also avoids straining the GDDR6 supply (if that was ever a concern for NVIDIA).