
NVIDIA Details "Pascal" Some More at GTC Japan

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,283 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
NVIDIA revealed more details of its upcoming "Pascal" GPU architecture at the Japanese edition of the GPU Technology Conference (GTC). The architecture is designed to nearly double performance-per-Watt over the current "Maxwell" architecture by implementing the latest memory and fabrication tech. This begins with stacked HBM2 (high-bandwidth memory 2). The top "Pascal" based product will feature four 4 GB HBM2 stacks, totaling 16 GB of memory, with a combined memory bandwidth of 1 TB/s. Internally, bandwidth can reach as high as 2 TB/s. The chip itself will support up to 32 GB of memory, so enterprise variants (Quadro, Tesla) could max out that capacity, while the consumer GeForce variant is expected to serve up 16 GB.
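
For the curious, the 1 TB/s figure squares with standard HBM2 per-stack numbers. A minimal sketch of the arithmetic, assuming the JEDEC-style configuration of a 1024-bit interface per stack at 2 Gbit/s per pin (NVIDIA did not confirm actual clocks in this presentation):

```cpp
#include <cstdio>

int main() {
    // Assumed figures (standard HBM2, not confirmed by NVIDIA here):
    // 1024-bit interface per stack, 2 Gbit/s per pin, four stacks.
    const double pins_per_stack = 1024.0;
    const double gbit_per_pin   = 2.0;   // Gbit/s per pin
    const int    stacks         = 4;

    double gb_per_stack = pins_per_stack * gbit_per_pin / 8.0;  // -> 256 GB/s
    double total        = gb_per_stack * stacks;                // -> 1024 GB/s

    std::printf("per stack: %.0f GB/s, total: %.0f GB/s (~1 TB/s)\n",
                gb_per_stack, total);
}
```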

It's also becoming clear that NVIDIA will build its "Pascal" chips on the 16 nm FinFET process (AMD will build its next-gen chips on a more advanced 14 nm process). NVIDIA is also introducing a new interconnect called NVLink, which will change the way the company builds dual-GPU graphics cards. Currently, dual-GPU cards are essentially two graphics cards on a common PCB, with PCIe bandwidth from the slot shared by a bridge chip and an internal SLI bridge connecting the two GPUs. With NVLink, the two GPUs will be interconnected with an 80 GB/s bi-directional data path, letting each GPU directly address memory controlled by the other. This should greatly improve memory management in games that take advantage of newer APIs such as DirectX 12 and Vulkan, and prime the graphics card for higher display resolutions. NVIDIA is expected to launch its first "Pascal" based products in the first half of 2016.
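
To picture what "each GPU directly addressing memory controlled by the other" means on the software side, here is a minimal sketch using CUDA's existing peer-to-peer API, which today rides over PCIe; on Pascal, NVLink would presumably just become a faster transport underneath. The calls are standard CUDA runtime functions, not anything Pascal-specific:

```cpp
#include <cuda_runtime.h>
#include <cstdio>

int main() {
    int canAccess = 0;
    // Ask whether GPU 0 can map GPU 1's memory into its address space.
    cudaDeviceCanAccessPeer(&canAccess, 0, 1);
    if (!canAccess) { std::printf("no peer access between GPU 0 and 1\n"); return 1; }

    float *buf1 = nullptr;
    cudaSetDevice(1);
    cudaMalloc(&buf1, 1 << 20);          // allocate 1 MB on GPU 1

    cudaSetDevice(0);
    cudaDeviceEnablePeerAccess(1, 0);    // GPU 0 may now dereference buf1
    // Kernels launched on GPU 0 can read/write buf1 directly; the
    // interconnect (PCIe today, NVLink on Pascal) services the accesses.

    cudaDeviceDisablePeerAccess(1);
    cudaSetDevice(1);
    cudaFree(buf1);
    return 0;
}
```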



 
Joined
Sep 15, 2007
Messages
3,944 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
Another $1k card, folks. Hell, with that much RAM, maybe more.
 
Joined
Oct 10, 2009
Messages
929 (0.18/day)
System Name Desktop | Laptop
Processor AMD Ryzen 7 5800X3D | Intel Core i7 7700HQ
Motherboard MAG X570S Torpedo Max| Neptune KLS HM175
Cooling Corsair H100x | Twin fan, fin stack & heat pipes
Memory 32GB G.Skill F4-3600C16-8GVK @ 3600MHz / 16-16-16-36-1T | 16GB DDR4 @ 2400MHz / 17-17-17-39-2T
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra | GTX 1050 Ti 4GB
Storage Kingston KC3000 1TB + Kingston KC3000 2TB + Samsung 860 EVO 1TB | 970 Evo 500GB
Display(s) 32" Dell G3223Q (2160p @ 144Hz) | 17" IPS 1920x1080P
Case Fractal Meshify 2 Compact | Aspire V Nitro BE
Audio Device(s) ifi Audio ZEN DAC V2 + Focal Radiance / HyperX Solocast
Power Supply Super Flower Leadex V Platinum Pro 1000W | 150W
Mouse Razer Viper Ultimate | Logitech MX Anywhere 2
Keyboard Razer Huntsman V2 Optical (Linear Red)
Software Windows 11 Pro x64
Joined
Nov 4, 2005
Messages
11,655 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
With 16 GB of RAM available to each GPU, what the hell are they planning on putting in the rest of the memory that it would need access to more?

We are at the point of needing 3 GB of vmem with good memory management, and 4 GB for textures that may not be optimized, not 16 GB with piss-poor management. 8 GB would be enough, with better timings to reduce latency.
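
For a sense of scale (my own back-of-the-envelope numbers, not anything from the article), render targets are small change next to textures, which is why texture quality settings dominate VRAM use:

```cpp
#include <cstdio>

int main() {
    // Hypothetical numbers for illustration: a 4K deferred renderer
    // with 5 full-screen render targets at RGBA8 (4 bytes/pixel).
    const double pixels  = 3840.0 * 2160.0;
    const double targets = 5.0, bytesPerPx = 4.0;
    double rt_mb = pixels * targets * bytesPerPx / (1024 * 1024);

    // One 4096x4096 BC7-compressed texture (1 byte/texel, +33% for mips).
    double tex_mb = 4096.0 * 4096.0 * 1.33 / (1024 * 1024);

    std::printf("4K render targets: ~%.0f MB\n", rt_mb);   // ~158 MB
    std::printf("one 4K texture:    ~%.0f MB\n", tex_mb);  // ~22 MB; hundreds add up
    return 0;
}
```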
 
Joined
Oct 2, 2004
Messages
13,791 (1.94/day)
Keeping developers lazy since, I don't know, too long. The only good thing about consoles is that devs actually have to code games well in order to make them look epic and run well at the same time. But on PC, who cares: just smack in an extra 8 GB of RAM and a graphics card with twice the memory, and a sloppily coded game will run kinda fine because of it...
 
Joined
Oct 10, 2009
Messages
929 (0.18/day)
System Name Desktop | Laptop
Processor AMD Ryzen 7 5800X3D | Intel Core i7 7700HQ
Motherboard MAG X570S Torpedo Max| Neptune KLS HM175
Cooling Corsair H100x | Twin fan, fin stack & heat pipes
Memory 32GB G.Skill F4-3600C16-8GVK @ 3600MHz / 16-16-16-36-1T | 16GB DDR4 @ 2400MHz / 17-17-17-39-2T
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra | GTX 1050 Ti 4GB
Storage Kingston KC3000 1TB + Kingston KC3000 2TB + Samsung 860 EVO 1TB | 970 Evo 500GB
Display(s) 32" Dell G3223Q (2160p @ 144Hz) | 17" IPS 1920x1080P
Case Fractal Meshify 2 Compact | Aspire V Nitro BE
Audio Device(s) ifi Audio ZEN DAC V2 + Focal Radiance / HyperX Solocast
Power Supply Super Flower Leadex V Platinum Pro 1000W | 150W
Mouse Razer Viper Ultimate | Logitech MX Anywhere 2
Keyboard Razer Huntsman V2 Optical (Linear Red)
Software Windows 11 Pro x64
Keeping developers lazy since, I don't know, too long. The only good thing about consoles is that devs actually have to code games well in order to make them look epic and run well at the same time. But on PC, who cares: just smack in an extra 8 GB of RAM and a graphics card with twice the memory, and a sloppily coded game will run kinda fine because of it...

Do you have a reference to such an occurrence? Maybe it's just for the next wave of insane 4K optimized games? Or perhaps the 16GB will be reserved for that whole Titan X-style scenario and the more reasonably priced SKUs will have 8GB?
 
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Do you have a reference to such an occurrence? Maybe it's just for the next wave of insane 4K optimized games? Or perhaps the 16GB will be reserved for that whole Titan X-style scenario and the more reasonably priced SKUs will have 8GB?
That's probably closer to the truth, I think. Two HBM2 stacks for a performance/mainstream product would greatly reduce overall cost (final assembly). The whole Pascal presentation seems to be an adjunct to the Supercomputing Conference (SC15) just held in Texas, so I'm figuring the thrust of the Pascal presentations is more toward Tesla than GeForce.
NVIDIA will build its "Pascal" chips on the 16 nm FinFET process (AMD will build its next-gen chips on a more advanced 14 nm process).
Oh?
1. Can't say I've seen anything that actually backs up your assertion that 14LPP is more advanced than 16FF+/16FFC.
2. I also haven't seen it confirmed anywhere that AMD will tap GloFo exclusively for GPUs. Somewhere in the last 7-8 weeks, "could" has turned into "will". Some sources seem to think that GloFo will be tapped for lower-end GPUs and Zen, with TSMC tasked with producing the larger GPUs.
 
Joined
Feb 11, 2009
Messages
5,389 (0.98/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> ... nope still the same :'(
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
It's that NVLink crap that has me worried.
Instead of, oh I don't know, having a single damn GPU capable of running games at 4K @ 120 Hz, they seem to focus and plan on us getting two...

I already find it massively BS with the current gen of cards: 600-1000 euros for a card that feels outdated right away....

AC Unity at a mere 1080p with a freaking $1,000 Titan X only manages 48 fps...
BF4 at 4K, same $1,000 card... a mere 41 fps....

I could go on. If I'm laying down that kind of cash for the latest and greatest GPU, it had better damn well run at least current ffing games at the highest settings and then some.
It's ridiculous they ask you to put down that kind of money twice.
I mean, I'm not even talking about the more demanding, more advanced games of the future; I'm talking about the here and now, and those cards cannot do that?

I feel we should demand better as consumers, tbh, but yeah, that focus on better SLI using NVLink... not a good thing to focus on, imo.
 
Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Do you have a reference to such an occurrence?
The latest "The Longest Journey" (built on the Unity engine, methinks).


1. Can't say I've seen anything that actually backs up your assertion that 14LPP is more advanced than 16FF+/16FFC.

Actually, it's the opposite as far as iThings go: some CPUs are manufactured on Samsung's 14 nm process and some on TSMC's 16 nm, and the former consume more power.
 
Joined
Aug 2, 2012
Messages
1,760 (0.41/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15
Memory Crucial Ballistix 3200/C16 4x8GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Fractal Design Define 7
Audio Device(s) Creative Sound Blaster ZXR, AKG K601 Headphones
Power Supply Seasonic Fanless TX-700
Mouse Logitech G500s
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games runs smooth.
Joined
Sep 7, 2011
Messages
2,785 (0.61/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
It's that NVLink crap that has me worried.
Why? Are you planning on building a supercomputing cluster?
I feel we should demand better as consumers, tbh, but yeah, that focus on better SLI using NVLink... not a good thing to focus on, imo.
Sorry to bust up your indignation tirade, but since it isn't aimed at gaming SLI but at workloads that are more intensive on system-bus bandwidth - notably the exascale computing initiative with IBM, Cray, and Mellanox - that really shouldn't be a problem.
 
Joined
Feb 11, 2009
Messages
5,389 (0.98/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> ... nope still the same :'(
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Why? Are you planning on building a supercomputing cluster?

Sorry to bust up your indignation tirade, but since it isn't aimed at gaming SLI but at workloads that are more intensive on system-bus bandwidth - notably the exascale computing initiative with IBM, Cray, and Mellanox - that really shouldn't be a problem.

If you think this will not find its way ASAP into gaming...
It's sort of how the internet originated in military development; everything seems to start with military or space development and then finds its way to the consumer.
Sure, this is for industry first, but yeah, it will make its way to gamers soon enough.

And even if it does not, the statement still stands: I hate this focus on and "need" for dual-GPU setups to get anything decent going.

Also, on a side note... dear gawd, what a presentation...
 
Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
Did they fix async compute with Pascal? (Or was Pascal already designed and "taped out" when the scandal started?)

Did they fix the VR preemption problem? Nvidia VR preemption "possibly catastrophic"

Did they add FreeSync (adaptive sync) compatibility?


And hopefully we get 980 Ti performance in a 970 mini/Nano/or even smaller form factor. :)
And does anybody know yet the actual (not paper) launch date for Arctic Islands products?
 
Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
Joined
May 19, 2009
Messages
1,818 (0.33/day)
Location
Latvia
System Name Personal \\ Work - HP EliteBook 840 G6
Processor 7700X \\ i7-8565U
Motherboard Asrock X670E PG Lightning
Cooling Noctua DH-15
Memory G.SKILL Trident Z5 RGB Black 32GB 6000MHz CL36 \\ 16GB DDR4-2400
Video Card(s) ASUS RoG Strix 1070 Ti \\ Intel UHD Graphics 620
Storage 2x KC3000 2TB, Samsung 970 EVO 512GB \\ OEM 256GB NVMe SSD
Display(s) BenQ XL2411Z \\ FullHD + 2x HP Z24i external screens via docking station
Case Fractal Design Define Arc Midi R2 with window
Audio Device(s) Realtek ALC1150 with Logitech Z533
Power Supply Corsair AX860i
Mouse Logitech G502
Keyboard Corsair K55 RGB PRO
Software Windows 11 \\ Windows 10
1 TB/s?
Well, damn...
 
Joined
Oct 2, 2004
Messages
13,791 (1.94/day)
Did they fix async compute with Pascal? (Or was Pascal already designed and "taped out" when the scandal started?)

Did they fix the VR preemption problem? Nvidia VR preemption "possibly catastrophic"

Did they add FreeSync compatibility?


And hopefully we get 980 Ti performance in a 970 mini/Nano/or even smaller form factor. :)
And does anybody know yet the actual (not paper) launch date for Arctic Islands products?

It's not a scandal. Maxwell 2 is capable of async compute with a limited queue size, with minimal performance drop. Once you exceed that queue size, performance starts dropping. Radeon cards have a much larger async queue and usually don't hit the ceiling. It's not really known whether games need queues beyond what Maxwell 2 can handle...
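
For anyone wondering what those "queues" look like from the software side, the rough analogue in CUDA is submitting independent work into separate streams; whether the hardware runs them concurrently or time-slices them is exactly the Maxwell-vs-GCN question above. A minimal sketch (the kernel and sizes are made up for illustration):

```cpp
#include <cuda_runtime.h>

__global__ void busywork(float *p, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) p[i] = p[i] * 1.0001f + 0.5f;
}

int main() {
    const int n = 1 << 20;
    const int kQueues = 8;                 // 8 independent work queues
    cudaStream_t streams[kQueues];
    float *bufs[kQueues];

    for (int q = 0; q < kQueues; ++q) {
        cudaStreamCreate(&streams[q]);
        cudaMalloc(&bufs[q], n * sizeof(float));
        // Each launch goes to its own stream; whether these actually run
        // concurrently or get serialized depends on the GPU's scheduler.
        busywork<<<(n + 255) / 256, 256, 0, streams[q]>>>(bufs[q], n);
    }
    cudaDeviceSynchronize();

    for (int q = 0; q < kQueues; ++q) {
        cudaFree(bufs[q]);
        cudaStreamDestroy(streams[q]);
    }
    return 0;
}
```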
 
Joined
Jun 13, 2012
Messages
1,316 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000mhz
Video Card(s) Asus Dual Geforce RTX 4070 Super ( 2800mhz @ 1.0volt, ~60mhz overlock -.1volts. 180-190watt draw)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
Did they fix async compute with Pascal? (Or was Pascal already designed and "taped out" when the scandal started?)

Did they add FreeSync compatibility?

Kinda hard to support a tech that was AMD-locked for so many years and didn't get added to DX12 till the last minute, which was after Maxwell 2 was final. FreeSync is an AMD-locked software solution (go read AMD's own FAQs before you try to call me a fanboy); adaptive sync is the standard, and FreeSync uses that standard in a proprietary way. Old Radeon 7000 cards can do adaptive sync but not FreeSync. Pascal being taped out doesn't mean it's the final chip; it's a prototype that can very well change and isn't the final design. But NVIDIA does have a 5-month gap. Hope that doesn't mean a 5-6 month gap between AMD's and NVIDIA's chips, but it could be less if AMD cuts corners, which they will have to, and that's likely not gonna be a good idea.
 
Joined
Jul 9, 2015
Messages
3,413 (1.07/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
FreeSync is an AMD-locked software solution (go read AMD's own FAQs before you try to call me a fanboy);
adaptive sync is the standard, and FreeSync uses that standard in a proprietary way.

1) G-Sync is as locked down as it gets (to the "nope, won't license it to anyone" point).
2) Adaptive sync is THE ONLY standard; there is no "FreeSync" standard.
3) Nothing stops any manufacturer out there from using adaptive sync (DP 1.2a); no need to involve AMD or any of its "FreeSync" stuff.
 
Joined
Dec 28, 2012
Messages
3,475 (0.85/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
Do you have a reference to such an occurrence? Maybe it's just for the next wave of insane 4K optimized games? Or perhaps the 16GB will be reserved for that whole Titan X-style scenario and the more reasonably priced SKUs will have 8GB?
Batman: Arkham Knight and Black Ops III come to mind.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,866 (3.00/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
I'm really looking forward to that unified memory architecture and the elimination of SLI problems.
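
Worth noting: CUDA has exposed a software flavor of unified memory since version 6.0 (cudaMallocManaged); what Pascal is slated to add is hardware page-faulting and migration behind the same pointer. A minimal sketch with today's API:

```cpp
#include <cuda_runtime.h>
#include <cstdio>

__global__ void scale(float *p, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) p[i] *= 2.0f;
}

int main() {
    const int n = 1024;
    float *data = nullptr;
    // One pointer valid on both CPU and GPU; the runtime migrates pages.
    cudaMallocManaged(&data, n * sizeof(float));

    for (int i = 0; i < n; ++i) data[i] = 1.0f;   // touched on the CPU
    scale<<<(n + 255) / 256, 256>>>(data, n);     // touched on the GPU
    cudaDeviceSynchronize();                      // required before CPU reads

    std::printf("data[0] = %.1f\n", data[0]);     // back on the CPU: 2.0
    cudaFree(data);
    return 0;
}
```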
 
Joined
Aug 13, 2010
Messages
5,380 (1.08/day)
ARMA and Battlefield also love super-high-res textures.
Basically, many non-single-player, non-"GPU benchmark" scenarios just love having much more VRAM.
 
Joined
Apr 19, 2011
Messages
2,198 (0.46/day)
Location
So. Cal.
(AMD will build its next-gen chips on a more advanced 14 nm process).

Oh?
1. Can't say I've seen anything that actually backs up your assertion that 14LPP is more advanced than 16FF+/16FFC.
2. I also haven't seen it confirmed anywhere that AMD will tap GloFo exclusively for GPUs. Somewhere in the last 7-8 weeks, "could" has turned into "will". Some sources seem to think that GloFo will be tapped for lower-end GPUs and Zen, with TSMC tasked with producing the larger GPUs.

Yeah, btarunr came out of left field with that snippet; as soon as I read it, it was WTF o_O.
Thanks for that cleanup.

NVIDIA is expected to launch its first "Pascal" based products in the first half of 2016.
Do we have confirmation of this? Sure, it could see deliveries starting for HPC customer initiatives first (exascale/IBM, Cray, etc.), then professional products (Tesla/Quadro), while GeForce use of GP100 should be a ways out.
 

FreedomEclipse

~Technological Technocrat~
Joined
Apr 20, 2007
Messages
23,314 (3.77/day)
Location
London,UK
System Name Codename: Icarus Mk.VI
Processor Intel 8600k@Stock -- pending tuning
Motherboard Asus ROG Strixx Z370-F
Cooling CPU: BeQuiet! Dark Rock Pro 4 {1xCorsair ML120 Pro|5xML140 Pro}
Memory 32GB XPG Gammix D10 {2x16GB}
Video Card(s) ASUS Dual Radeon™ RX 6700 XT OC Edition
Storage Samsung 970 Evo 512GB SSD (Boot)|WD SN770 (Gaming)|2x 3TB Toshiba DT01ACA300|2x 2TB Crucial BX500
Display(s) LG GP850-B
Case Corsair 760T (White)
Audio Device(s) Yamaha RX-V573|Speakers: JBL Control One|Auna 300-CN|Wharfedale Diamond SW150
Power Supply Corsair AX760
Mouse Logitech G900
Keyboard Duckyshine Dead LED(s) III
Software Windows 10 Pro
Benchmark Scores (ノಠ益ಠ)ノ彡┻━┻
Batman: Arkham Knight and Black Ops III come to mind.

It's amazing how people were so hyped about BLOPS III - it was all over the internet, and now it's like it has completely faded into obscurity. Nobody talks about it anymore.


But then again, Fallout 4 happened, so even though Activision were the first to get their game out, it's Bethesda that's getting all the pussy.

:::EDIT:::

Oh, and not to forget about Battlefront, of course, which has already been available in many countries apart from the UK. People are busy playing one of those two games.
 
Joined
Apr 17, 2014
Messages
228 (0.06/day)
System Name GSYNC
Processor i9-10920X
Motherboard EVGA X299-FTW
Cooling Custom water loop: D5
Memory G.Skill RipJawsZ 16GB 2133mhz 9-11-10-28
Video Card(s) (RTX2080)
Storage OCZ vector, samsung evo 950, Intel M.2 1TB SSD's
Display(s) ROG Swift PG278Q, Acer Z35 and Acer XB270H (NVIDIA G-SYNC)
Case 2x Corsair 450D, Corsair 540
Audio Device(s) sound blaster Z
Power Supply EVGA SuperNOVA 1300 G2 Power
Mouse Logitech proteus G502
Keyboard Corsair K70R cherry red
Software WIN10 Pro (UEFI)
Benchmark Scores bench score are for people who don't game.
Hell yeah! Bring it! Excited for Pascal!
 
Joined
Apr 16, 2015
Messages
306 (0.09/day)
System Name Zen
Processor Ryzen 5950X
Motherboard MSI
Cooling Noctua
Memory G.Skill 32G
Video Card(s) RX 7900 XTX
Storage never enough
Display(s) not OLED :-(
Keyboard Wooting One
Software Linux
Kinda hard to support a tech that was AMD-locked for so many years and didn't get added to DX12 till the last minute, which was after Maxwell 2 was final. FreeSync is an AMD-locked software solution (go read AMD's own FAQs before you try to call me a fanboy); adaptive sync is the standard, and FreeSync uses that standard in a proprietary way. Old Radeon 7000 cards can do adaptive sync but not FreeSync. Pascal being taped out doesn't mean it's the final chip; it's a prototype that can very well change and isn't the final design. But NVIDIA does have a 5-month gap. Hope that doesn't mean a 5-6 month gap between AMD's and NVIDIA's chips, but it could be less if AMD cuts corners, which they will have to, and that's likely not gonna be a good idea.
1) G-Sync is as locked down as it gets (to the "nope, won't license it to anyone" point).
2) Adaptive sync is THE ONLY standard; there is no "FreeSync" standard.
3) Nothing stops any manufacturer out there from using adaptive sync (DP 1.2a); no need to involve AMD or any of its "FreeSync" stuff.

So much misinformation.

Adaptive sync IS FreeSync.

FreeSync is the brand name for an adaptive synchronization technology for LCD displays that support a dynamic refresh rate aimed at reducing screen tearing.[2] FreeSync was initially developed by AMD in response to NVidia's G-Sync. FreeSync is royalty-free, free to use, and has no performance penalty.[3] As of 2015, VESA has adopted FreeSync as an optional component of the DisplayPort 1.2a specification.[4] FreeSync has a dynamic refresh rate range of 9–240 Hz.[3] As of August 2015, Intel also plan to support VESA's adaptive-sync with the next generation of GPU.[5]

https://en.wikipedia.org/wiki/FreeSync
 