News Posts matching "GTX 1080"


EVGA Announces SC17 1080 Gaming Laptop

The EVGA SC17 1080 G-SYNC Gaming Laptop has arrived. Featuring a 4K-ready IPS panel with NVIDIA G-SYNC technology, this high-performance laptop was meticulously crafted from the ground up for hardcore gamers, performance enthusiasts, and overclockers alike. Breaking away from the conventional brick form factor, a unique in-house, EVGA-designed power supply delivers up to 240 watts of power when needed, without sacrificing function or aesthetics.

The SC17 1080 pairs an unlocked Intel Core i7-7820HK CPU with an NVIDIA GeForce GTX 1080 that can be overclocked to offer the performance you always wanted from a gaming laptop. With performance and overclocking in mind, the EVGA SC17 1080 G-SYNC Gaming Laptop features a BIOS with mouse support to give you complete control over all aspects of performance, voltage, and advanced settings to customize your gaming machine. A Clear CMOS button directly on the chassis helps you recover from an unstable overclock, and custom fan-curve control keeps your laptop cool and quiet. This is the world's first TRUE overclocking laptop.

Everything AMD Launched Today: A Summary

It has been a huge weekend of product announcements and launches from AMD, which expanded not just its client computing CPU lineup on both ends, but also expanded its Radeon graphics cards family with both client- and professional-segment graphics cards. This article provides a brief summary of everything AMD launched or announced today, with their possible market-availability dates.

AMD RX Vega First Pricing Information Leaked in Sweden - "Feels Wrong"

Nordic Hardware is running a piece in which they affirm that their sources in the Swedish market have confirmed some retailers have already received the first pricing information for AMD's upcoming RX Vega graphics cards. This preliminary information places the Radeon RX Vega's price-tag at around 7,000 SEK (~$850) excluding VAT. Things take a turn towards the ugly when we take into account that this isn't even the final retail price for consumers: add in VAT and the retailer's own margin, and prospective pricing comes to about 9,000 SEK (~$1,093). Pricing isn't fixed, however, as it varies between manufacturers and models (as we all know too well), and the current figure is solely a reference ballpark.

There is a possibility that the final retail prices will differ from these quoted ones, and if the latest performance benchmarks are vindicated, they really should. However, Nordic Hardware quotes their sources as saying these prices set the boundary for "real and final", and that the sentiment among Swedish retailers is that the pricing "feels wrong". For reference, NVIDIA's GTX 1080 Ti currently retails at around 8,000 SEK (~$971) including VAT, while the GTX 1080, which RX Vega has commonly been trading blows with, retails for around 5,600 SEK (~$680) at the minimum. This should go without saying, but prepare your body for the injection of a NaCl solution.
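For a rough sanity check on how the pre-VAT figure turns into the quoted consumer price, assuming Sweden's standard 25% VAT rate (the retailer margin below is our own back-calculation, not a figure from Nordic Hardware):

```python
# Reconstructing the quoted Swedish pricing (all figures approximate).
pre_vat_sek = 7000                      # reported price excluding VAT
vat_rate = 0.25                         # Sweden's standard VAT rate (our assumption)

incl_vat = pre_vat_sek * (1 + vat_rate)     # 8750 SEK
implied_margin = 9000 / incl_vat - 1        # ~2.9% left for the retailer

print(f"incl. VAT: {incl_vat:.0f} SEK, implied retailer margin: {implied_margin:.1%}")
```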

Source: Nordic Hardware

AMD's RX Vega Low Key Budapest Event: Vega Pitted Against GTX 1080

On the first stop of AMD's two-continent-spanning RX Vega tour (which really only spans three locations), the company pitted its upcoming RX Vega graphics card (we expect this to be its flagship offering) against NVIDIA's GTX 1080. The event itself was pretty subdued, and there was not much to see when it comes to the RX Vega graphics card - literally. Both it and the GTX 1080 were enclosed inside PC towers, with event-goers not allowed to even catch a glimpse of the piece of AMD hardware that has most approximated a unicorn in recent times.

The Vega-powered system also made use of a Ryzen 7 processor, and the cards were running Battlefield 1 (or Sniper Elite 4; there's lots of discussion going on about that, but the first image below does show a first-person view) on nondescript monitors, one supporting FreeSync, the other G-Sync. The monitors' model names were covered by cloth so that users weren't able to tell which system was running which graphics card, though due to ASUS' partnership in the event, both were (probably) of ASUS make. The resolution used was 3440 x 1440, which should mean over 60 FPS on the GTX 1080 on Ultra. Users who attended the event reported that one of the systems lagged slightly in one portion of the demo, though we can't confirm which one (and I'd say that was AMD's intention.)

Steam Survey Update: It's All About Quad-cores, NVIDIA and Windows 10

An update to the Steam survey results is always worth noting, especially given the tremendous growth Valve's online store service has seen recently. And it seems that in the Steam gaming world at least, quad-core CPUs, NVIDIA graphics cards, and Windows 10 reign supreme.

Windows 10 64-bit is the most used operating system, with 50.33% of the survey. That the second most used Windows OS is the steady, hallmark Windows 7 shouldn't come as a surprise, though it now holds just 32.05% of the market. OS X has a measly 2.95% of the grand total, while Linux comes in at an even lower 0.72%. While AMD processor submissions may have increased in other surveys, those numbers aren't reflected in Steam's: AMD's processor share has decreased from 21.89% in February to just 19.01% as of June, even though the company's Ryzen line of CPUs has been selling like hotcakes. Quad-core CPUs are the most used at the time of the survey, at 52.06%, while the next highest share still belongs to dual-core CPUs, at 42.23%.

AMD RX Vega Reportedly Beats GTX 1080; 5% Performance Improvement per Month

New benchmarks of an RX Vega engineering-sample video card have surfaced. There have been quite a few benchmarks for this card already, which shows up under the 687F:C1 identifier. The new, GTX 1080-beating result (against a Gaming X version, so a factory-overclocked card) comes courtesy of 3DMark 11, with the 687F:C1 RX Vega delivering 31,873 points in its latest appearance (versus 27,890 in its first). Since the clock speed of the 687F:C1 RX Vega has remained the same throughout this benchmark history, I think it's fair to say these improvements have come purely from driver and/or firmware-level performance work.
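The headline's "5% per month" figure roughly checks out if we assume the two scores are about three months apart (our assumption; the exact dates of the benchmark entries aren't given here):

```python
first, latest = 27890, 31873            # reported 3DMark 11 scores for 687F:C1

total_gain = latest / first - 1         # ~14.3% overall improvement

# Compound monthly rate over an assumed ~3-month window:
months = 3
monthly_gain = (latest / first) ** (1 / months) - 1   # ~4.6% per month

print(f"total: {total_gain:.1%}, per month: {monthly_gain:.1%}")
```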

New Performance Benchmarks of AMD's Vega Frontier Edition Surface

You probably took a long, hard look at our article covering a determined user's experience with his new Vega Frontier Edition. Now, courtesy of PCPer, and charitable soul Ekin at Linus Tech Tips, we have some more performance benchmarks of AMD's latest (non-gaming-specific) graphics card.

Starting with 2560x1440, let's begin with the good news: in what seems to be the best performance scenario we've seen so far, the Vega Frontier Edition stands extremely close to NVIDIA's GTX 1080 Ti in Fallout 4. It trails it by about 10 FPS for most of the test, and even surpasses it at some points. These numbers should be taken with a grain of salt with regard to the RX Vega consumer cards: performance on those models will probably be higher than the Frontier Edition's results. And for AMD's sake, it had better be, because in all other tests, the Frontier Edition somewhat disappoints. It's beaten by NVIDIA's GTX 1070 in Grand Theft Auto V, mirrors its performance in The Witcher 3, and delivers slightly higher performance than the GTX 1070 in Hitman and Dirt Rally (albeit lower than the GTX 1080.)

NVIDIA "Pascal" Based Mining GPU Lineup Detailed

GPU-accelerated crypto-currency mining poses a threat to the consumer graphics industry, yet the revenues it brings to GPU manufacturers are hard to turn away. The more graphics cards are bought up by crypto-currency miners, the fewer are left for gamers, the actual target audience of graphics cards. This is particularly bad for AMD, as more of its Radeon graphics cards end up with miners instead of gamers; which means game developers can no longer read AMD's GPU market-share as a reliable trigger to allocate developer resources to optimizing their games for AMD architectures.

To combat this, both AMD and NVIDIA are introducing graphics cards designed specifically for crypto-currency mining. These cards are built to a cost, lack display outputs, and have electrical and cooling designs meant for 24/7 operation, even if they don't live up to the durability standards of real enterprise-segment graphics cards, such as the Radeon Pro or Quadro series. NVIDIA's "Pascal" GPU architecture is inherently weaker than AMD's "Polaris" and older Graphics Core Next architectures at Ethereum mining, owing in part to Pascal's lack of industry-standard asynchronous compute. This didn't deter NVIDIA from carving out a lineup of crypto-mining SKUs based on its existing "Pascal" GPUs. These include the NVIDIA P104 series based on the "GP104" silicon (on which the GTX 1080 and GTX 1070 are based), and the P106 series based on the "GP106" silicon (the GTX 1060 series is based on this chip). NVIDIA hasn't tapped its larger "GP102" or smaller "GP107" chips, yet.

HP Announces a New Line of OMEN Gaming PCs

Today, HP Inc. unleveled the playing field with the launch of an entirely new, cutting-edge and comprehensive portfolio of OMEN by HP gaming products. Re-designed and re-engineered from the ground up, the new lineup gives esports athletes and competitive gamers the edge and confidence needed to perform at the highest global level. Every inch, inside and out, of the new OMEN PCs, displays and accessories is packed with features designed to target the needs of gamers around the world, instantly changing the game like never before.

The OMEN X Compact Desktop provides a factory-overclocked GPU from NVIDIA and a versatile form-factor, creating the ability to dock and undock quickly for gaming in any room, or attach the desktop to a backpack accessory for an unparalleled, untethered VR experience. The addition of a backpack accessory to the OMEN X Compact Desktop adds a new dimension of flexibility to the platform, and by expanding the OMEN accessory lineup with a new OMEN mechanical keyboard, a mouse with weight customization, a headset with cushioned ear cups to reduce background noise and mousepads designed for precision, HP is bringing attention to every aspect of the gaming experience.

ZOTAC Shows Off the Mek Gaming PC

ZOTAC broke new ground with its first tower-type SFF gaming PC, the ZOTAC Mek. This is one of the first ZOTAC PCs that isn't brick or box-shaped, and competes with your game console or the likes of Falcon Northwest Tiki in looks. It comes in white and black with blue LED accents. A sliding door up front covers the power button, status LEDs (ring-shaped), a pair of USB 3.0 type-A, and HDA jacks. Under the hood is some serious gaming hardware - an Intel Core i7-7700 quad-core processor, 16 GB of dual-channel DDR4 memory, and GeForce GTX 1080 graphics. Also featured is a 240 GB M.2 NVMe SSD. Powering it all is a 450W SFX power-supply.

AMD Announces Radeon Vega Frontier Edition - Not for Gamers

Where is Vega? When is it launching? At AMD's Financial Analyst Day 2017, Raja Koduri addressed the speculation of the past few weeks and brought us an answer: the Radeon Vega Frontier Edition is the first iteration of Vega, aimed at data scientists, immersion engineers and product designers. It will be released in the second half of June for AMD's "pioneers". That wording, that the Vega Frontier Edition will be released in the second half of June, means AMD still technically releases Vega within 1H 2017... It's just not the consumer, gaming Vega version of the chip. This could unfortunately signify an after-June release time-frame for consumer GPUs based on the Vega micro-architecture.

This news comes as a disappointment to all gamers who have been hoping for Vega for gaming, because it reminds us of what happened with dual Fiji: a promising design which ended up unsuitable for gaming and was thus marketed to content creators as the Radeon Pro Duo, with little success. But there is still hope: it just looks like we really will have to wait for Computex 2017 to see some measure of detail on Vega's gaming prowess.

Entire AMD Vega Lineup Reportedly Leaked - Available on June 5th?

Reports are doing the rounds regarding alleged AMD insiders having "blown the whistle", so to speak, on the company's upcoming Vega graphics cards. This leak also points towards retail availability of Vega cards on the 5th of June, which lines up nicely with AMD's May 31st Computex press conference. An announcement there, followed by market availability at the beginning of the next week, does sound like something that would happen in a new product launch.

On to the meat and bones of this story: three different SKUs have been leaked, of which no details are currently known apart from their naming and pricing. AMD's Vega line-up starts off with the RX Vega Core graphics card, which is reportedly going to retail for $399. This graphics card is going to sell at a higher price than NVIDIA's GTX 1070, which should mean higher performance. Higher pricing with merely competitive performance really wouldn't stir any pot of excitement, so higher performance is the most logical guess. The $399 price sits well clear of AMD's RX 580, though it does mean there is space for another SKU to be thrown into the mix at a later date, perhaps at $329, though I'm just speculating on AMD's apparent pricing gap at this point.

NVIDIA To Launch New GTX 1070, GTX 1080 GPUs on the Mobile Market

NVIDIA is apparently working on new iterations of the GTX 1070 and GTX 1080 GPUs for the mobile market. These new parts should come with lower clocks than the parts currently on the market, allowing system builders to reduce the profile and overall thickness of their laptops whilst still keeping a powerful graphics card at their heart.

The new GTX 1080 is the chip with the most details floating about, with some captures from NotebookCheck showing all 2,560 CUDA cores enabled, but lower clocks making for much-restrained power consumption. The 1290 MHz base clock (with an unknown boost value as of this point) points to a power consumption of just 110 W (compared to 165 W on the 1556 MHz base-clock GTX 1080; the new GTX 1070 should feature a TDP of 90 W compared to its previous, fully-powered variant's 120 W.) This naturally means a slower GPU - the new, revised GTX 1080 scored 17,000 points in 3DMark, whereas usual implementations of the card score in the vicinity of 21,000. The change in power envelope, however, would enable new notebooks, such as the showcased Acer Predator 700, to deliver more performance than some comparably thick laptops of the last generation. In the leaked images, the 18.9 mm-thick, GTX 1080-powered machine scores around 600 points more than some previous-generation, 29 mm laptops.
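Taking the reported numbers at face value, the revised part trades raw performance for a sizable efficiency gain. A back-of-the-envelope comparison using the scores and TDPs above:

```python
# Reported 3DMark scores and TDPs for the two mobile GTX 1080 variants.
standard = {"score": 21000, "tdp_w": 165}
revised = {"score": 17000, "tdp_w": 110}

std_ppw = standard["score"] / standard["tdp_w"]   # ~127 points per watt
rev_ppw = revised["score"] / revised["tdp_w"]     # ~155 points per watt

efficiency_gain = rev_ppw / std_ppw - 1           # ~21% better perf/W
print(f"~{efficiency_gain:.0%} better performance per watt")
```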

Source: NotebookCheck, Videocardz.com

NVIDIA to Give Away Three VR Games with GeForce GTX + Oculus Bundle

NVIDIA is giving away three VR games with bundles of Oculus Rift VR headset, Oculus Touch controller, and qualifying GeForce GTX graphics cards. Game codes to three of the hottest VR titles, "The Unspoken," "SUPERHOT VR," and "Wilson's Heart" will be given away for free when you buy bundles of the Rift VR headset, Touch controller, with GeForce GTX 1060, GTX 1070, GTX 1080, or GTX 1080 Ti graphics cards. The bundles will be sold exclusively through Amazon and Newegg.

On the special promotion pages of these stores, you can match an Oculus Rift headset and Touch controller with an applicable GeForce GTX graphics card of your choice. A typical GeForce GTX 1060 6 GB + Oculus Rift + Touch controller bundle is priced around US $850, a GTX 1070 based bundle around $980, a GTX 1080 based bundle around $1,090, and a GTX 1080 Ti based bundle around $1,300.

ZOTAC Intros the GeForce GTX 1080 AMP! Extreme with 11 Gbps GDDR5X Memory

ZOTAC today introduced the GeForce GTX 1080 AMP! Extreme with faster 11 Gbps GDDR5X memory (model: ZT-P1080I-10P). The card is nearly identical to the GTX 1080 AMP! Extreme the company launched its GTX 1080 lineup with, but features 10% faster memory, which is factory-overclocked even further. The card ships with clock speeds of 1771 MHz core, 1911 MHz GPU Boost, and 11.2 GHz (GDDR5X-effective) memory, against NVIDIA reference speeds of 1607/1733/10008 MHz (core/boost/memory).

The ZOTAC GeForce GTX 1080 AMP! Extreme 11 Gbps features a custom-design PCB with a strong VRM that draws power from a pair of 8-pin PCIe power connectors, to support the massive factory-overclock. It features a triple-slot IceStorm cooling solution by ZOTAC, which features a trio of 100 mm fans that ventilate a large dual-stack aluminium fin heatsink. The company didn't reveal pricing, although it is expected that this card replaces the older 10 Gbps memory-equipped AMP! Extreme.

Inno3D Announces GeForce GTX 1080 11 Gbps and GTX 1060 9 Gbps Graphics Cards

INNO3D, a leading manufacturer of high-end hardware components and computer utilities, expands its portfolio with the new GeForce GTX 1080 11 Gbps and GTX 1060 9 Gbps graphics cards. The new entries will become available in the premium iChiLL range of products.

The new INNO3D GeForce GTX 1080 11 Gbps and GTX 1060 9 Gbps iChiLL gaming cards draw everything out of NVIDIA's Pascal architecture and let nothing go to waste. All power and cooling efforts are perfectly tuned, making them the perfect choice to run the latest games, the most demanding design suites and, last but not least, Virtual Reality applications.

ASUS Intros Revised GTX 1080 STRIX and GTX 1060 6GB STRIX with Faster Memory

ASUS today introduced revised versions of its GeForce GTX 1080 (non-Ti) STRIX and 6 GB GTX 1060 STRIX OC Edition graphics cards, featuring faster memory, as promised by NVIDIA during its GTX 1080 Ti launch. The GTX 1080 STRIX OC now comes with 11 Gbps GDDR5X memory, and the 6 GB GTX 1060 STRIX OC with 9 Gbps GDDR5. To avoid bait-and-switch complaints from the retail market, these cards are clearly distinguished from their 10 Gbps and 8 Gbps siblings, both in the model numbers and in the prominent GPU SKU branding. The GTX 1080 STRIX OC is labelled "ROG-STRIX-GTX1080-O8G-11GBPS," and the GTX 1060 6 GB STRIX OC "GTX1060-O6G-9GBPS."

The two cards use revised GDDR5X and GDDR5 memory chips that can sustain these higher data rates, aided by improvements on the memory-controller end by NVIDIA. At 11 Gbps, the GTX 1080 now has a memory bandwidth of 352 GB/s, while the GTX 1060 gets 216 GB/s with 9 Gbps memory over its 192-bit wide memory bus. The new GTX 1080 STRIX OC also comes with the "max-contact" heatsink base the company introduced with its GTX 1080 Ti STRIX. The company didn't reveal pricing.
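The quoted bandwidth figures follow directly from the per-pin data rate and each card's memory bus width (256-bit on the GTX 1080, 192-bit on the GTX 1060):

```python
def gddr_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin rate times bus width, converted to bytes."""
    return data_rate_gbps * bus_width_bits / 8

print(gddr_bandwidth_gbs(11, 256))  # GTX 1080 @ 11 Gbps -> 352.0 GB/s
print(gddr_bandwidth_gbs(9, 192))   # GTX 1060 @ 9 Gbps  -> 216.0 GB/s
```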

Vega Shows Up Beating a GTX 1080 in CompuBench, But Hold the Hypetrain

The Vega-based line of AMD GPUs is definitely a big unknown at this point, so any sightings or benchmarks of it are highly sought after by the rumor mill. Well, here is another one to add to your pile of rumor material, folks: a benchmark of an AMD card has surfaced on CompuBench that bests even the GTX 1080.

Why hold the hype?

There are two obvious issues. One, this is a compute-only benchmark, and has little relevance to the average gamer. Two, in the same benchmark, a GTX 980 Ti also beats the GTX 1080. Stranger yet, the GTX 1080 is also beaten by its little brother, the GTX 1070. Take this one with a grain of salt, for the obvious reasons. It won't stop the hypetrain from using this info to its own end, but maybe you can avoid being smashed by it by using some critical thinking.

NVIDIA's AIC Partners to Launch GTX 1080, 1060 With Faster GDDR5, GDDR5X Memory

At their GDC event yesterday, NVIDIA announced a change to how partners are able to outfit their GTX 1080 and GTX 1060 6 GB models with video memory. Due to improvements in process and scaled-down costs, NVIDIA has decided to allow partners to purchase 11 Gbps GDDR5X (up from 10 Gbps) and 9 Gbps GDDR5 (up from 8 Gbps) memory from them, to pair with the GTX 1080 and GTX 1060 6 GB, respectively. These are to be sold by NVIDIA's AIB partners as overclocked cards, and don't represent a change to the official specifications of either graphics card. With this move, NVIDIA aims to give partners more flexibility in choosing memory speeds and carving out different models of the same graphics card, with varying degrees of overclock - something which was particularly hard to do on conventional 10 Gbps-equipped GTX 1080s, which showed atypically low memory overclocking headroom.

On NVIDIA's Tile-Based Rendering

Looking back on NVIDIA's GDC presentation, perhaps one of the most interesting aspects approached was the implementation of tile-based rendering in NVIDIA's post-Maxwell architectures. This is an adaptation of approaches typical of mobile graphics rendering, which keep that segment's specific need for power efficiency in mind - and if you'll "member", "Maxwell" was NVIDIA's first graphics architecture publicly touted for its "mobile first" design.

This approach essentially divides the screen into tiles, and then rasterizes the frame on a per-tile basis. 16x16 and 32x32 pixels are the usual tile sizes, but both Maxwell and Pascal can dynamically assess the required tile size for each frame, changing it on the fly according to the complexity of the scene. This ensures that the data being processed has a much smaller footprint than that of the full frame - small enough that NVIDIA can keep it in a much smaller pool of memory (essentially, the L2 cache), dynamically filling and flushing the available cache until the full frame has been rendered. This means the GPU doesn't have to access larger, slower memory pools as much, which primarily reduces the load on the VRAM subsystem (freeing VRAM bandwidth for other tasks), whilst simultaneously accelerating rendering. At the same time, a tile-based approach lends itself pretty well to the nature of GPUs - these are easily parallelized operations, with the GPU able to tackle many independent tiles simultaneously, depending on the available resources.
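As a highly simplified sketch of the idea (not NVIDIA's actual implementation; the tile size, stand-in shader, and buffering here are purely illustrative):

```python
# Toy illustration of tile-based rasterization: the frame is split into
# small tiles, each tile is shaded entirely in a small local buffer
# (standing in for on-chip cache), then written back to main memory once.
WIDTH, HEIGHT, TILE = 64, 48, 16   # real hardware picks tile size per frame

def shade(x: int, y: int) -> int:
    """Stand-in fragment shader: any per-pixel function would do."""
    return (x * 31 + y * 17) % 256

framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]

for ty in range(0, HEIGHT, TILE):
    for tx in range(0, WIDTH, TILE):
        # Shade the whole tile in a small local buffer, touching the
        # large framebuffer only once per tile on write-back.
        tile_buf = [[shade(tx + x, ty + y) for x in range(TILE)]
                    for y in range(TILE)]
        for y in range(TILE):
            framebuffer[ty + y][tx:tx + TILE] = tile_buf[y]

# Every tile is independent of the others, which is why real hardware
# can process many of them in parallel.
```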

NVIDIA Announces DX12 Gameworks Support

NVIDIA has announced DX12 support for their proprietary GameWorks SDK, including some new exclusive effects such as "Flex" and "Flow." Most interestingly, NVIDIA claims that simulation effects get a massive boost from Async Compute, nearly doubling performance on a GTX 1080 with that style of effects. Obviously, Async Compute is a DX12-exclusive technology. The performance gains in an area where NVIDIA is normally perceived not to do so well are indeed encouraging, even if only within their exclusive ecosystem. Whether GCN-powered cards will see similar gains when running GameWorks titles remains to be seen.

NVIDIA Cuts Price of its GeForce GTX 1080 Graphics Card: $499

In the wake of its $699 GeForce GTX 1080 Ti enthusiast-segment graphics card launch, NVIDIA lowered the price of its previous GeForce "Pascal" flagship, the GeForce GTX 1080, which is now down to USD $499 from its launch price of $599 (custom-design baseline pricing). The GTX 1070 is untouched for now, at its $349 baseline. This launch should turn up the heat on AMD and its HBM-powered Radeon RX Vega pricing, much in the same way the GTX 980 Ti steered pricing of the Radeon R9 Fury X.

Micron's Outlook for the Future of Memory: GDDR6, QuantX in 2017

After finally reaching mature yields (comparable to those of planar NAND processes), Micron's first-generation, 32-layer 3D NAND has grown increasingly prominent in the company's NAND output. Now, the company is looking to ramp up production of its (currently sampling) 64-layer 3D NAND, promising "meaningful output" by the end of December 2017, and targeting an 80% increase in total GB per wafer and a 30% decrease in production costs.

When it comes to graphics-subsystem memory, Micron is looking to transition its 20 nm production to a "1x nm" (most likely 16 nm) node in a bid to improve cost per GB by around 20%, with 16 nm GDDR5 memory to be introduced later this year. GDDR5X volume, meanwhile, is expected to grow significantly to satisfy bandwidth-hungry uses in GPUs (like NVIDIA's GTX 1080 and, potentially, the upcoming 1080 Ti) and networking, with GDDR6 memory being introduced by the end of 2017 or early 2018. The company is still mum on actual consumer products based on its interpretation of 3D XPoint under the QuantX brand, though work is already under way on the second- and third-generation specifications of this memory, with Micron planning a hitherto unknown (in significance and product type) presence in the consumer market by the end of this year.

NVIDIA Makes it Tougher to Trade Bundled Games

NVIDIA and AMD have, over the past five years, popularized giving away AAA game releases with their graphics cards, through game codes that can be redeemed on DRM platforms such as Steam. The two have their own internal pricing with game publishers, which makes giving away $60 (retail value) games with $400 graphics cards tolerable to their bean counters. To consumers, these games made for great tradable commodities - a sort of "discount coupon," even. Say you don't want to play the included games, already have them, or bought two graphics cards and have one game code to spare: you had the ability to give away, trade, or even sell those game codes. NVIDIA is about to change that.

With its latest game bundle that lets you choose from "For Honor" and Tom Clancy's "Ghost Recon: Wildlands" on purchase of new GeForce GTX 1070 or GTX 1080 graphics cards, NVIDIA changed the game-redemption method. You first need NVIDIA's GeForce Experience app installed and logged in. The app then one-time redeems the game of your choice after verifying that you have a graphics card participating in the offer (i.e. a GTX 1070 or GTX 1080). The app doesn't appear to check the cards' serial numbers, but rather whether the right GPU is installed in the machine. After redeeming the game, however, you are free to uninstall GeForce Experience, or even change your graphics card. The game is handled by the DRM platform its developers intended (Steam, UPlay, Origin, etc.). We tested how game code trading works under the new system.

NVIDIA Announces New Game Bundle, Requires GeForce Experience to Activate

A graphics card-game bundle is always a reason to cheer, though not this time, if you loathe GeForce Experience. The company's latest "Prepare for Battle" game bundle lets you choose between two of the season's hottest game releases - "For Honor," and Tom Clancy's "Ghost Recon: Wildlands" on purchase of new GeForce GTX 1080 or GTX 1070 graphics cards. The only catch here is that you need GeForce Experience to redeem or activate your free game. This further requires you to create a GeForce Experience login, which adds to the list of startup apps, as GeForce Experience dials home to sign-in and sync your game settings.