
NVIDIA Develops Tile-based Multi-GPU Rendering Technique Called CFR

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,355 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
NVIDIA remains invested in multi-GPU technology, specifically SLI over NVLink, and has developed a new multi-GPU rendering technique that appears to be inspired by tile-based rendering. Implemented at the single-GPU level, tile-based rendering has been one of NVIDIA's many secret sauces for improved performance since its "Maxwell" family of GPUs. 3DCenter.org discovered that NVIDIA is working on a multi-GPU adaptation of it, called CFR, which could be short for "checkerboard frame rendering" or "checkered frame rendering." The method is already quietly deployed in current NVIDIA drivers, although it is not documented for developers to implement.

In CFR, the frame is divided into tiny square tiles, like a checkerboard. Odd-numbered tiles are rendered by one GPU, and even-numbered ones by the other. Unlike AFR (alternate frame rendering), in which each GPU's dedicated memory holds a copy of all the resources needed to render the frame, methods like CFR and SFR (split frame rendering) optimize resource allocation. CFR also purportedly exhibits less micro-stutter than AFR. 3DCenter also detailed the features and requirements of CFR. To begin with, the method is only compatible with DirectX (including DirectX 12, 11, and 10), not OpenGL or Vulkan. For now it's "Turing"-exclusive, since NVLink is required (its bandwidth is probably needed to virtualize the tile buffer). Tools like NVIDIA Profile Inspector allow you to force CFR on, provided the other hardware and API requirements are met. It still has many compatibility problems, and remains practically undocumented by NVIDIA.
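To make the checkerboard idea above concrete, here is a minimal, purely illustrative C++ sketch of how square tiles might be assigned alternately to two GPUs. The tile size, data structures, and parity-based assignment rule are assumptions for illustration; NVIDIA has not documented how CFR actually partitions or schedules tiles.

```cpp
// Hypothetical sketch of a checkerboard tile split across two GPUs.
// Not NVIDIA's implementation; tile size and structures are assumptions.
#include <algorithm>
#include <cstdint>
#include <vector>

struct Tile {
    uint32_t x, y;          // top-left corner in pixels
    uint32_t width, height; // tile dimensions (edge tiles may be clipped)
};

// Divide a frame into square tiles and assign them alternately:
// tiles where (column + row) is even go to GPU 0, odd ones to GPU 1.
std::vector<std::vector<Tile>> checkerboard_split(uint32_t frame_w,
                                                  uint32_t frame_h,
                                                  uint32_t tile_size)
{
    std::vector<std::vector<Tile>> per_gpu(2);
    for (uint32_t ty = 0; ty * tile_size < frame_h; ++ty) {
        for (uint32_t tx = 0; tx * tile_size < frame_w; ++tx) {
            Tile t{tx * tile_size, ty * tile_size,
                   std::min(tile_size, frame_w - tx * tile_size),
                   std::min(tile_size, frame_h - ty * tile_size)};
            per_gpu[(tx + ty) % 2].push_back(t); // checkerboard parity
        }
    }
    return per_gpu;
}
```

In this toy split, a 2560x1440 frame with hypothetical 256-pixel tiles would yield 60 tiles, 30 per GPU, interleaved across the whole screen rather than split into two halves as in classic SFR.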



View at TechPowerUp Main Site
 
Joined
Aug 20, 2007
Messages
20,759 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
This was explained back when Crossfire and SLI were making their debut, IIRC. It's not exactly new, is it? Or am I missing something?

In all cases I recall, these techniques sucked because they mandated that each GPU still had to render the complete scene geometry, and only helped with fill rate.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,355 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
This was explained back when Crossfire and SLI were making their debut, IIRC. It's not exactly new, is it? Or am I missing something?

In all cases I recall, these techniques sucked because they mandated that each GPU still had to render the complete scene geometry, and only helped with fill rate.

Yeah, I too had a lot of deja vu writing this, and had a long chat with W1zzard. Maybe it's some kind of TBR extrapolation for multi-GPU which they finally got right.
 
Joined
Aug 20, 2007
Messages
20,759 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
Yeah, I too had a lot of deja vu writing this, and had a long chat with W1zzard. Maybe it's some kind of TBR extrapolation for multi-GPU which they finally got right.

I sometimes swear they are selling us the same darn tech with new buzzwords...

Maybe the matrix is just glitching again...
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,029 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
It seems they are leveraging their (single-GPU) tiled-rendering hardware in the silicon to split up the image for CFR, possibly with non-50/50 splits that could dynamically change at runtime to spread the load better.
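A minimal sketch of what such a dynamic, non-50/50 split could look like (purely an assumption, not based on anything NVIDIA has documented): nudge each GPU's share of tiles every frame, based on how long each GPU took on the previous frame.

```cpp
// Hypothetical load-balancing sketch, not NVIDIA's driver logic:
// shift a small share of tiles toward the GPU that finished earlier
// on the previous frame, so the split drifts away from 50/50 over time.
#include <algorithm>
#include <array>

struct SplitRatio {
    float gpu0 = 0.5f; // fraction of tiles for GPU 0
    float gpu1 = 0.5f; // fraction of tiles for GPU 1 (gpu0 + gpu1 == 1.0)
};

SplitRatio rebalance(SplitRatio current, const std::array<float, 2>& prev_frame_ms)
{
    const float step = 0.02f; // hypothetical damping factor to avoid oscillation
    if (prev_frame_ms[0] > prev_frame_ms[1]) {
        current.gpu0 -= step; // GPU 0 was the bottleneck: give it fewer tiles
    } else if (prev_frame_ms[1] > prev_frame_ms[0]) {
        current.gpu0 += step; // GPU 1 was the bottleneck: give GPU 0 more tiles
    }
    current.gpu0 = std::clamp(current.gpu0, 0.1f, 0.9f); // never starve a GPU
    current.gpu1 = 1.0f - current.gpu0;
    return current;
}
```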
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Did either AMD or Nvidia manage to get dynamic splitting to work reliably? As far as I remember, all the attempts were eventually abandoned because the solutions came with their own set of problems, primarily uneven frame times and stuttering.

Single-GPU tiled-rendering hardware presumably works with tiles of a static size, but playing around with the tile count per GPU might work?
 

silentbogo

Moderator
Staff member
Joined
Nov 20, 2013
Messages
5,473 (1.44/day)
Location
Kyiv, Ukraine
System Name WS#1337
Processor Ryzen 7 3800X
Motherboard ASUS X570-PLUS TUF Gaming
Cooling Xigmatek Scylla 240mm AIO
Memory 4x8GB Samsung DDR4 ECC UDIMM
Video Card(s) Inno3D RTX 3070 Ti iChill
Storage ADATA Legend 2TB + ADATA SX8200 Pro 1TB
Display(s) Samsung U24E590D (4K/UHD)
Case ghetto CM Cosmos RC-1000
Audio Device(s) ALC1220
Power Supply SeaSonic SSR-550FX (80+ GOLD)
Mouse Logitech G603
Keyboard Modecom Volcano Blade (Kailh choc LP)
VR HMD Google dreamview headset(aka fancy cardboard)
Software Windows 11, Ubuntu 20.04 LTS
I'm wondering if that new tile-based technique will introduce artifacts in the picture, just like with tearing in SFR?
 
Joined
Jun 28, 2016
Messages
3,595 (1.26/day)
I'm wondering if that new tile-based technique will introduce artifacts in the picture, just like with tearing in SFR?
Not when doing RTRT, which is likely the reason they're developing this (and mostly for game streaming services, not local GPUs).
Did either AMD or Nvidia manage to get dynamic splitting to work reliably? As far as I remember, all the attempts were eventually abandoned because the solutions came with their own set of problems, primarily uneven frame times and stuttering.
Well, actually this is a problem that RTRT solves automatically.
In legacy game rendering techniques the input consists of instructions that must be run. There's little control over time: the GPU has to complete (almost) everything, or there's no image at all.
So the rendering time is a result (not a parameter), and each frame has to wait for the last tile.

In RTRT, frame rendering time (i.e. the number of rays) is the primary input parameter. It's not relevant how you split the frame. This is perfectly fine:

[Attached image: 1574328651433.png]
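A toy sketch of that argument (a simplification, not an actual renderer): with ray tracing, each GPU can simply be handed the number of rays that fits the frame-time target and its own measured throughput, so the screen-space split between GPUs stops being what determines frame time; quality scales with how many rays actually landed instead of one GPU stalling the frame while it finishes its region.

```cpp
// Illustrative only: in ray tracing the ray count is an input parameter,
// so each GPU gets a ray budget derived from the frame-time target and
// its own measured throughput, independent of how the screen is divided.
#include <cstdint>

struct RayBudget {
    uint64_t gpu0_rays;
    uint64_t gpu1_rays;
};

// target_ms: frame-time budget; *_rays_per_ms: measured per-GPU throughput.
RayBudget budget_for_frame(double target_ms,
                           double gpu0_rays_per_ms,
                           double gpu1_rays_per_ms)
{
    return { static_cast<uint64_t>(target_ms * gpu0_rays_per_ms),
             static_cast<uint64_t>(target_ms * gpu1_rays_per_ms) };
}
```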
 
Joined
Dec 3, 2012
Messages
613 (0.15/day)
Processor Intel i9 9900K @5Ghz 1.32vlts
Motherboard Gigabyte Z390 Aorus Pro Wi-Fi
Cooling BeQuiet Dark Rock 4
Memory 32GB Corsair Vengeance Pro DDR4 3200Mhz (16-18-18-36)
Video Card(s) Nvidia RTX 3080 Founders Edition
Storage 512GB Gigabyte Aorus NVMe (Boot) 1TB Crucial NVMe (Games)
Display(s) LG UK850 27in 4K Freesync/G-Sync/HDR 600
Case Fractal Design Meshify C Windowed (Dark Tint)
Audio Device(s) Corsair HS70 Special Edition Wireless Headphones & 7.1 Sound
Power Supply Corsair RMx 850w Gold
Mouse HyperX Pulsefire Surge RGB
Keyboard HyperX Alloy Elite Mechanical RGB (Cherry Red)
Software Windows 10 Home
So, Nvidia are now using the technique that UK-based PowerVR developed, a company that Nvidia effectively forced out of the PC GPU market with their dirty tricks in the early 2000s... :rolleyes:
 
Joined
Jun 28, 2016
Messages
3,595 (1.26/day)
So, Nvidia are now using the technique that UK-based PowerVR developed, a company that Nvidia effectively forced out of the PC GPU market with their dirty tricks in the early 2000s... :rolleyes:
Tile-based rendering is a straightforward, natural approach. It's commonly used in non-gaming rendering engines (you see it happening in Cinebench). PowerVR didn't invent it. They may have just been the first to implement it in hardware.
 
Joined
Jul 15, 2006
Messages
977 (0.15/day)
Location
Malaysia
Processor AMD Ryzen 7 5700G
Motherboard Gigabyte B450M-S2H
Cooling Scythe Kotetsu Mark II
Memory 2 x 16GB SK Hynix OEM DDR4-3200 @ 3666 18-20-18-36
Video Card(s) Colorful RTX 2060 SUPER 8GB
Storage 250GB WD BLACK SN750 M.2 + 4TB WD Red Plus + 4TB WD Purple
Display(s) AOpen 27HC5R 27" 1080p 165Hz
Case COUGAR MX440 Mesh RGB
Audio Device(s) Creative X-Fi Titanium HD + Kurtzweil KS-40A bookshelf
Power Supply Corsair CX750M
Mouse Razer Deathadder Essential
Keyboard Cougar Attack2 Cherry MX Black
Software Windows 10 Pro 22H1 x64
ATi implemented a CrossFire mode called Super Tiling back in the early X800/X850 days. Though it required a 'Master' card with a dedicated compositing engine to combine the input from the two cards, plus a dongle.


 
Joined
Jun 3, 2010
Messages
2,540 (0.50/day)
Don't overthink it. We needed SLI in DirectX 12, and now we have it. The trick is running render targets separately despite having to apply equal post-process weights across the whole screen, which is why it has been difficult to scale up SFR performance. Since there is no unified dynamic lighting load in RTX mode, this might work.
 
Joined
Jun 2, 2017
Messages
7,905 (3.15/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
A question for the community: would a VBIOS update be enough to enable CrossFire on the 5700 cards?
 
Joined
Jun 5, 2009
Messages
211 (0.04/day)
Location
Germany
System Name Steam Deck LCD
Processor AMD Van Gogh 4-Core 8-Threads
Motherboard Stock
Cooling Stock + MX4 Thermal Paste
Memory 16GB DDR5 5500
Video Card(s) AMD Van Gogh 8 CUs
Storage 512GB NVMe + 512GB MicroSD
Display(s) 7" 800p | LG 34" Ultrawide + 27" both 75Hz | 58" UHD TV
Case Stock + Airflow Backplate (5C less and a lot less noise)
Audio Device(s) Jabra Elite 65T Bluetooth Headphones | 50W Sound Bar
Power Supply Stock
Mouse Logitech MX-Master 3 Bluetooth | PS4 controller (because gyro)
Keyboard Logitech MX-Keys Bluetooth
Software SteamOS
Here's a crazy idea: why not work with M$/AMD to optimize DX12/Vulkan? Hell, Vulkan has an open-source SDK; it doesn't even need special cooperation with anyone.
Also, back when DX12 was launched there was a lot of hype about how well it would perform with multi-GPU setups using async technologies (independent chips & manufacturers): https://wccftech.com/dx12-nvidia-amd-asynchronous-multigpu/
Seems like everyone forgot about it...
 
Joined
Nov 6, 2016
Messages
1,573 (0.58/day)
Location
NH, USA
System Name Lightbringer
Processor Ryzen 7 2700X
Motherboard Asus ROG Strix X470-F Gaming
Cooling Enermax Liqmax Iii 360mm AIO
Memory G.Skill Trident Z RGB 32GB (8GBx4) 3200Mhz CL 14
Video Card(s) Sapphire RX 5700XT Nitro+
Storage Hp EX950 2TB NVMe M.2, HP EX950 1TB NVMe M.2, Samsung 860 EVO 2TB
Display(s) LG 34BK95U-W 34" 5120 x 2160
Case Lian Li PC-O11 Dynamic (White)
Power Supply BeQuiet Straight Power 11 850w Gold Rated PSU
Mouse Glorious Model O (Matte White)
Keyboard Royal Kludge RK71
Software Windows 10
Would this have anything to do with MCM GPUs? I hope AMD beats Nvidia to an MCM GPU (multiple GPU chiplets, not just a GPU and HBM). I'm not an AMD fanboy in the least; I just dislike Nvidia and want them to get cut down to size like Intel has been, solely because Intel getting their ass whooped has benefited consumers, and the same happening to Nvidia would probably benefit us all.
 

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
16,054 (2.26/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/5za05v
So, Nvidia are now using the technique that UK-based PowerVR developed, a company that Nvidia effectively forced out of the PC GPU market with their dirty tricks in the early 2000s... :rolleyes:
Wow, I thought people had forgotten about them. Nvidia was also trashing them back then and saying tile based rendering sucked...
 
Joined
Oct 4, 2017
Messages
695 (0.29/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
Would this have anything to do with MCM GPUs?

That's exactly what I thought! I don't see them revamping SLI after killing it; to me this has more to do with future MCM designs, and it might be a good indication that an MCM-based gaming GPU from Nvidia is closer than most of us believe.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Nvidia was also trashing them back then and saying tile based rendering sucked...
:)

Edit:
For a bit of background, this was a presentation to OEMs.
Kyro was technically new and interesting, but as an actual gaming GPU on desktop cards it sucked, due to both spotty support and lackluster performance. It definitely had its bright moments, but they were too few and far between. PowerVR could not develop their tech fast enough to compete with Nvidia and ATi at the time.
PowerVR itself carried on just fine; the same architecture series was (or is) a strong contender in mobile GPUs.
 
Last edited:

64K

Joined
Mar 13, 2014
Messages
6,104 (1.65/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
I see MCM as the way forward, not just another version of SLI. For one thing, buying two cards must cost more than a single MCM card that could rival the performance of multi-GPU. Granted, the MCM card will cost more than a regular GPU, but with SLI you have to buy two of everything: 2 PCBs, 2 GPUs, 2 sets of VRAM, 2 of all the components on the PCB, 2 shrouds, 2 coolers, 2 boxes, etc.
 
Last edited:

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.59/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
They are trying to justify the rtx lineup lol
 
Joined
Apr 8, 2010
Messages
992 (0.19/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
Also, back when DX12 was launched there was a lot of hype about how well it would perform with multi-GPU setups using async technologies (independent chips & manufacturers): https://wccftech.com/dx12-nvidia-amd-asynchronous-multigpu/
Seems like everyone forgot about it...
With good reason. In order for it to really work, the programmer would need to optimize every time for a specific system. If I were writing some GPGPU software for a solution I'm selling bundled with a computer (whose hardware I get to specify), it could be worth the effort. For games that can run on any combination of rendering hardware? Eh, no thanks. The world is just much simpler when we only have to think about two identical GPUs to balance the workload equally. Even then we get cans of worms thrown at our faces from time to time.
 
Joined
Nov 4, 2005
Messages
11,681 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
It seems they are leveraging their (single-GPU) tiled-rendering hardware in the silicon to split up the image for CFR, possibly with non-50/50 splits that could dynamically change at runtime to spread the load better.


A good idea, but with complexity: how is full-screen AA processed if only half the resources are on each card?

Or could this possibly be a Zen-like chiplet design to save money and reduce losses on the newest node?

With NVLink as the fabric for communication, if only half the resources are actually required (maybe I'm out in left field here), put 12 GB or 6 GB on each chiplet and interleave the memory.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
A good idea, but with complexity: how is full-screen AA processed if only half the resources are on each card?
Or could this possibly be a Zen-like chiplet design to save money and reduce losses on the newest node?
With NVLink as the fabric for communication, if only half the resources are actually required (maybe I'm out in left field here), put 12 GB or 6 GB on each chiplet and interleave the memory.
AA would probably be one of the postprocessing methods done at the end of rendering a frame.

You can't get away with shared memory like that. You are still going to need a sizable portion of the assets accessible by both/all GPUs. Any memory far away from the GPU is evil, and even a fast interconnect like NVLink won't replace local memory. GPUs are very bandwidth-constrained, so sharing memory access through something like Zen 2's IO die is not likely to work on GPUs at this time. With a big HBM cache for each GPU, maybe, but that is effectively still each GPU having its own VRAM :)

Chiplet design has been the end goal for a while, and all the GPU makers have been trying their hand at it. So far, unsuccessfully. As @Apocalypsee already noted, even tiled distribution of work is not new.
 

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
16,054 (2.26/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/5za05v
:)

Edit:
For a bit of background, this was a presentation to OEMs.
Kyro was technically new and interesting, but as an actual gaming GPU on desktop cards it sucked, due to both spotty support and lackluster performance. It definitely had its bright moments, but they were too few and far between. PowerVR could not develop their tech fast enough to compete with Nvidia and ATi at the time.
PowerVR itself carried on just fine; the same architecture series was (or is) a strong contender in mobile GPUs.

It wasn't that bad. I tested the cards myself at the time; in fact, I'm still on good terms with Imagination Technologies' PR director, who at the time used to come by the office with things to test. But yes, they did have driver issues, which was one of the big flaws, but performance wasn't as terrible as that old Nvidia presentation makes it out to be.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.59/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
It wasn't that bad. I tested the cards myself at the time; in fact, I'm still on good terms with Imagination Technologies' PR director, who at the time used to come by the office with things to test. But yes, they did have driver issues, which was one of the big flaws, but performance wasn't as terrible as that old Nvidia presentation makes it out to be.

Of course nv will commit libel just like intel.
 