
1024 bit memory bus!!!

jbunch07

OK, so I was thinking the other day...
If ATI used two 3870 cores (256-bit bus each) for a combined bus of 512-bit on the X2,
I wonder what it would be like if they used the 2900's 512-bit bus and combined them for a 1024-bit bus... how sweet would this be...

I know it's not practical, but hey, a guy can dream, right! :rolleyes:
 
By comparing the results of the 38x0 and the 2900 you can see that the memory interface doesn't mean much. This is because their graphics chips are not starving for memory bandwidth. That's why ATi reduced the bus width from 512-bit to 256-bit. nVidia did the same, although their cards are more bandwidth hungry. It's a compromise between a small performance penalty and a much cheaper PCB.
With a 1024-bit memory interface, you'd only end up with a more than twice as expensive and very complex PCB that would bring almost zero performance improvement over 512-bit.
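A quick back-of-the-envelope check (a rough sketch; the memory clocks below are approximate reference figures, so treat them as assumptions):

[CODE]
# Peak bandwidth = (bus width in bytes) * (effective data rate).
# Memory clocks are approximate reference values, not exact retail specs.
def bandwidth_gb_s(bus_width_bits, data_rate_mt_s):
    return (bus_width_bits / 8) * data_rate_mt_s / 1000.0

print(bandwidth_gb_s(512, 1656))  # HD 2900 XT: 512-bit, ~828 MHz GDDR3 (1656 MT/s) -> ~106 GB/s
print(bandwidth_gb_s(256, 2250))  # HD 3870:    256-bit, ~1125 MHz GDDR4 (2250 MT/s) -> ~72 GB/s
[/CODE]

Even with roughly two thirds of the 2900 XT's raw bandwidth, the 3870 performs about the same, which is the point above.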
 
And 2x512-bit doesn't exactly make it 1024-bit. It's still 2x512.

Just like the GX2/X2: they're both 256-bit x2 and 512 MB x2, as opposed to 512-bit with 1024 MB.
 
As said, both nVidia and AMD are shader starved, though nVidia is less shader starved right now, as AMD's cards operate as a 64-shader core most of the time.
 
I think nVidia's got the right idea clocking the shaders really high and having fewer of them. Makes for some kickass fun overclocking too :D
 
And 2x512-bit doesn't exactly make it 1024-bit. It's still 2x512.

Just like the GX2/X2: they're both 256-bit x2 and 512 MB x2, as opposed to 512-bit with 1024 MB.
Well, effectively the 2x512-bit is making it a 1024-bit, but with the frame buffer reduced to half.
 
Lol, it doesn't matter if you have 8x256-bit; you think it's a full 256-bit? No, it's 16x16-bit sub-interfaces, and broken down further. 2x512-bit would perform the same as a 1024-bit. Someone please correct me if I'm wrong; I'm sure someone will try :P
 
Fuckin' funny beginning to a rather pointless thread.

^^
 
agreed. closed.
 
OK, so I was thinking the other day...
If ATI used two 3870 cores (256-bit bus each) for a combined bus of 512-bit on the X2,
I wonder what it would be like if they used the 2900's 512-bit bus and combined them for a 1024-bit bus... how sweet would this be...

I know it's not practical, but hey, a guy can dream, right! :rolleyes:

That's not how it works. It's two GPUs, each with its own memory sub-system and its own 512 MB of memory on a 256-bit wide memory bus. An HD 3870 X2 has 1024 MB of memory for sure, but it's 2x 512 MB... in other words, 512 MB of memory mirrored just like any other multi-GPU setup... again depending on the modes, like tiling, AFR, etc.
 
Lol, it doesn't matter if you have 8x256-bit; you think it's a full 256-bit? No, it's 16x16-bit sub-interfaces, and broken down further. 2x512-bit would perform the same as a 1024-bit. Someone please correct me if I'm wrong; I'm sure someone will try :P
It makes a difference when both cores are fetching the same data redundantly. It really isn't the same as a single unified core with a broader bus.
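A crude way to picture why the mirroring hurts (purely illustrative numbers, and it ignores modes like tiling where the data isn't fully duplicated):

[CODE]
# With mirrored memory, each GPU fetches its own copy of the same assets,
# so 2x512-bit of raw width doesn't behave like one 1024-bit bus.
data_per_frame_gb = 0.25                        # made-up amount of data touched per frame

traffic_single_1024bit = data_per_frame_gb      # one GPU, one copy, data read once
traffic_dual_2x512bit = 2 * data_per_frame_gb   # same data read once per GPU

print(traffic_dual_2x512bit / traffic_single_1024bit)  # -> 2.0x the memory traffic
[/CODE]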
 
What's awesome is that in the near future an el-cheapo 128-bit bus will outperform a typical 256-bit bus used on modern high-end cards like nVidia's 9 series. Say hello to 4GHz+ GDDR5. ;)

Exciting times ahead...
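A quick sanity check on that claim (the per-pin data rates are assumptions for illustration, not official specs):

[CODE]
# Peak bandwidth = (bus width in bytes) * (effective data rate per pin).
# Assumed rates: ~2.2 Gbps for 256-bit GDDR3 on a 9-series-class card,
# 4.5 Gbps for early "4GHz+" GDDR5.
def bw_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bw_gb_s(256, 2.2))  # 256-bit GDDR3 -> ~70 GB/s
print(bw_gb_s(128, 4.5))  # 128-bit GDDR5 -> ~72 GB/s
[/CODE]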
 
What's awesome is that in the near future an el-cheapo 128-bit bus will outperform a typical 256-bit bus used on modern high-end cards like nVidia's 9 series. Say hello to 4GHz+ GDDR5. ;)

Exciting times ahead...

makes me wanna wet my pants! :D
 
Well, 256-bit is practically free these days and doesn't cost much at all over 128-bit or 64-bit.

I think the next low-end cards will be GDDR3-equipped 128-bit bus cards.

As for memory, the reason they use the memory like that is to conserve bandwidth. The crossbar memory controller was very much needed.
 
The only thing I see that's bad about GDDR4 or GDDR5 over GDDR3 is that right now some of them have higher latency, but I guess if the clock outweighs the timings then it's not really a problem!
 
candle_86,
Surely, those GDDR3 chips are not free. $5 won't get you even a single GDDR3 chip...

jbunch07,
Latency = cycle time × timings
When frequency goes up, cycle time goes down. GDDR5 doubles the frequencies, so the latency cycles can be doubled too without increasing the absolute latency, which is measured in nanoseconds. And it's the absolute latency that matters, not the number of latency cycles.
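To put numbers on it (the clocks and CAS values below are made up purely to show the scaling):

[CODE]
# Absolute latency (ns) = cycle time (ns) * latency cycles.
def latency_ns(command_clock_mhz, cas_cycles):
    cycle_time_ns = 1000.0 / command_clock_mhz
    return cycle_time_ns * cas_cycles

print(latency_ns(1000, 10))  # "GDDR3-like": 1 GHz, CL10 -> 10 ns
print(latency_ns(2000, 20))  # "GDDR5-like": 2 GHz, CL20 -> 10 ns, same absolute latency
[/CODE]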
 
As said, both nVidia and AMD are shader starved, though nVidia is less shader starved right now, as AMD's cards operate as a 64-shader core most of the time.

So true. Maybe if they had enough shaders, the 512-bit bus would help more, but heat and higher voltage/power usage seem to be the problem.
 
The 2900 XT had that wonderful 512-bit stuff, and look at what happened to it in terms of power slurping :rolleyes: ...although the 80nm core was mostly the cause of that.
 
Megasty,
And as a 1GHz GDDR3 chip can draw ~5W peak, it does indeed add up when there are 16 chips total...
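(For scale, 16 chips × ~5 W works out to roughly 80 W peak just for the memory, taking that per-chip figure at face value.)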

btw, R600 is 80nm. ;)
 
Megasty,
And as a 1GHz GDDR3 chip can draw ~5W peak, it does indeed add up when there are 16 chips total...

btw, R600 is 80nm. ;)

Geh, I have the comparison pic with the 3870 & X2 right on my desktop :laugh:

I think screwing around with the slow GX2 & Crysis has fried my brain. BTW, it's 5 hrs and I'm still not done yet; it's doing 2-8 fps on the Assault level :mad:
 
Geh, I have the comparison pic with the 3870 & X2 right on my desktop :laugh:

I think screwing around with the slow GX2 & Crysis has fried my brain. BTW, it's 5 hrs and I'm still not done yet; it's doing 2-8 fps on the Assault level :mad:

at what settings?
 