
NVIDIA GeForce 4XX Series Discussion

Status
Not open for further replies.
Joined
Jul 19, 2006
Messages
43,587 (6.72/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD m.2
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Headphone Amp.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Tester84
Software Windows 11
Did you realise that the Crossfire setup is consuming 25 W more than a single HD 5870 in that graph? That's measured at the AC source, so that means the second card is consuming 25 W, despite what AMD says.

I suppose it keeps the fan spinning. :ohwell:
 
Joined
Feb 21, 2008
Messages
4,985 (0.84/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
Ok, so even if we assume the second card uses 0 W when idle (I doubt it will be that low, but let's assume).

The power savings would only be about $31 over the course of a year. Again, that's nothing anyone buying these cards is going to be concerned with.
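
For reference, the back-of-the-envelope arithmetic behind a figure like that looks as follows (a minimal sketch; the 25 W delta, 24/7 idle time, and ~$0.14/kWh rate are my own assumptions, not numbers from the review):

```python
# Rough yearly cost of ~25 W of extra idle draw.
# Assumptions (mine): the card idles 24/7 and electricity is ~$0.14/kWh.
extra_watts = 25
hours_per_year = 24 * 365                            # 8760 h
rate_usd_per_kwh = 0.14

kwh_per_year = extra_watts * hours_per_year / 1000   # 219 kWh
cost_per_year = kwh_per_year * rate_usd_per_kwh      # ~$30.66

print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
```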

LOL Pays for toilet paper for a month! :laugh:
 

wahdangun

Guest
Did you realise that the Crossfire setup is consuming 25 W more than a single HD 5870 in that graph? That's measured at the AC source, so that means the second card is consuming 25 W, despite what AMD says.

Hey, I didn't say it consumes 0 (zero) watts. I'm just saying the second card uses less power (like a hibernating state), and I was just quoting what Tom's Hardware says. And still, the GTX 295 consumes 29 watts more than CF HD 5870.

And besides, Tom's Hardware knows what they're doing. They're professional benchers, after all, and know more about technical design than we do.
 
Joined
Feb 21, 2008
Messages
4,985 (0.84/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
Hey, I didn't say it consumes 0 (zero) watts. I'm just saying the second card uses less power (like a hibernating state), and I was just quoting what Tom's Hardware says. And still, the GTX 295 consumes 29 watts more than CF HD 5870.

And besides, Tom's Hardware knows what they're doing. They're professional benchers, after all, and know more about technical design than we do.


We are comparing GPUs from two different realms, though. Nvidia's offering hasn't shown up yet; it's only a matter of days after ATi's release. Nvidia's current offering that is out is older tech.
 

wahdangun

Guest
But the fact is, NVIDIA hasn't shown anything yet: no early benches, no performance graphs, not even a leaked photo from a random Chinese website.

So how do we compare the HD 5870? Are you saying we should just use our imagination and compare it with rumored GT300 specs?
 
Joined
Feb 21, 2008
Messages
4,985 (0.84/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
But the fact is, NVIDIA hasn't shown anything yet: no early benches, no performance graphs, not even a leaked photo from a random Chinese website.

So how do we compare the HD 5870? Are you saying we should just use our imagination and compare it with rumored GT300 specs?

It would be to have a bit more patience than under a week. :laugh:
 

Deleted member 24505

Guest
I just found this, is it real or fake?
http://translate.google.com/translate?hl=en&sl=auto&tl=en&u=http://news.mydrivers.com/1/143/143284.htm&prev=hp&rurl=translate.google.com

Here's another Asian site with the same pic on it:
http://cd.qq.com/a/20090831/000124.htm
 
Last edited by a moderator:

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.24/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Did you realise that the Crossfire setup is consuming 25 W more than a single HD 5870 in that graph? That's measured at the AC source, so that means the second card is consuming 25 W, despite what AMD says.

Very nice catch, I didn't even notice that. If you consider an 80% PSU efficiency, that's about 20 W...

So is AMD lying here? It seems the second card doesn't use any less power than the first.
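
To spell out that efficiency correction (a minimal sketch; the 80% efficiency is the rough assumption above, not a measured value for any particular PSU):

```python
# Convert a power delta measured at the wall (AC) into the DC power
# actually drawn by the card, given an assumed PSU efficiency.
def wall_to_dc(ac_watts: float, psu_efficiency: float = 0.80) -> float:
    return ac_watts * psu_efficiency

print(wall_to_dc(25))  # 25 W at the wall -> ~20 W at the card
```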

But the fact is, NVIDIA hasn't shown anything yet: no early benches, no performance graphs, not even a leaked photo from a random Chinese website.

So how do we compare the HD 5870? Are you saying we should just use our imagination and compare it with rumored GT300 specs?

And how long before the HD 5870 did we see anything more than rumor? When did the first pictures appear? A week before launch? When did we get solid specs that weren't rumors? A week before launch?

Comparing it to the current tech on the market is one thing, but assuming that nVidia won't release anything else and running out into the streets claiming an ATi victory is stupid and ignorant, yet that is what most are doing with the HD 5870.

It beat last-generation tech! Let's have a parade!

The problem is, IMO, beating the previous generation's tech isn't good enough. It needs to be able to compete with the current-generation tech; this has been ATi's failure since RV600, and the reason the consumer has to pay outrageous prices for high-end tech.
 

cooler

New Member
Joined
Nov 6, 2007
Messages
124 (0.02/day)
Could we stop bashing/praising ATI or Nvidia? All this fanboy talk gets tiring.

So let's hear what this GT3xx can do.
 
Joined
May 19, 2007
Messages
7,662 (1.24/day)
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5
Very nice catch, I didn't even notice that. If you consider an 80% PSU efficiency, that's about 20 W...

So is AMD lying here? It seems the second card doesn't use any less power than the first.



And how long before the HD 5870 did we see anything more than rumor? When did the first pictures appear? A week before launch? When did we get solid specs that weren't rumors? A week before launch?

Comparing it to the current tech on the market is one thing, but assuming that nVidia won't release anything else and running out into the streets claiming an ATi victory is stupid and ignorant, yet that is what most are doing with the HD 5870.

It beat last-generation tech! Let's have a parade!

The problem is, IMO, beating the previous generation's tech isn't good enough. It needs to be able to compete with the current-generation tech; this has been ATi's failure since RV600, and the reason the consumer has to pay outrageous prices for high-end tech.



 
Joined
Feb 21, 2008
Messages
4,985 (0.84/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional


I think the 5870 is currently the best offering for the money in a single GPU. But we don't know for how long.

Give the green team time to show their cards. Who knows what they have to offer? Both made solid offerings last generation, so assuming Nvidia won't this time would be silly. Competition saves us money and helps push technology forward.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.

That's very interesting and weird at the same time. We can take some info from that picture, and we could take more if it wasn't so confusing. Let me explain. According to this picture of the GT200:

http://img.techpowerup.org/090927/gt200b.jpg

We can relate some shapes with the actual units like this:

http://img.techpowerup.org/090927/gt200a.jpg
http://img.techpowerup.org/090927/gt300a.jpg

1- Shader Processor SIMD cluster. In G92 it had 16 SP (2x8), in GT200 it had 24 (8x3), and in GT300 it should have 32 (4x8). In GT300 we can see 16 such clusters, and that would amount to 32x16 = 512 SP. Everything seems OK.

2- Texture Units. In G80, G92 and GT200 they are always next to the SPs, and each consists of 8 texture processors. In GT200, 8x10 = 80. In GT300 we see 8x16 = 128 (OK according to leaked specs). But is that correct? We can see many other units with the same appearance all around (marked with dark green), in the middle and at the left end of the chip.

3- Raster Units. The yellow rectangle should contain 8 of them in GT200 (4x8 = 32), and I'd suppose the same for GT300. But that way we would be talking about 10 clusters, or 80 ROPs. That is, unless some of those are not ROPs and are something else, because all of them have a unit similar to a Texture Cluster next to them.

4- Memory: caches and registers. In GT300 there are many areas between the different units that could be caches too; GT300 is much less organized than GT200, probably to save space.

5- That must be the setup engine and thread dispatch processor. While we see two different recognizable shapes in GT200, in GT300 we can see 5.

So what do you guys think those units that I marked in blue and dark green are? IMO the units that look like the one circled in blue are all the same thing, and they are very similar to the ROPs in GT200, but they are next to what look like Texture Units (dark green), and there are too many of them, too. Share your thoughts.
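
To put the counting argument from points 1 and 2 in one place, here is the arithmetic under the two cluster widths being debated (a sketch; the per-cluster SP counts are the rumored/assumed values, not confirmed specs):

```python
# Total shader count = (SPs per SIMD cluster) x (number of clusters).
# The cluster count (16) is read off the die shot; the width per
# cluster is the open question.
clusters_gt300 = 16

for sp_per_cluster in (24, 32):   # GT200-style width vs. rumored GT300 width
    print(f"{sp_per_cluster} SP/cluster x {clusters_gt300} clusters "
          f"= {sp_per_cluster * clusters_gt300} SP")
# 24 -> 384 SP, 32 -> 512 SP; texture units: 8 x 16 = 128 either way
```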
 
Joined
Jul 2, 2008
Messages
3,638 (0.63/day)
Location
California
Let's just take a look at the picture, and ignore all the other leaked info...



There are 5 yellow boxes, 5 blue boxes, 3 white boxes, and 3 black boxes (in the center; ignore the top and bottom black ones).

But if you look at the yellow lines that I drew, every yellow box has a set of white, blue, and black boxes.
I don't know much about the technical stuff, but if you look at the picture this way, doesn't it make sense?
 
Last edited:

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.23/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
Kid, you've got a good eye for patterns.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Codename: FERMI

Some more news snippets on the new chip:

The chip supports GDDR5 memory, has billions of transistors, and should be bigger and faster than the Radeon HD 5870. A GX2 dual version of the card is also in the pipeline. It's well worth noting that this is the biggest architectural change since G80, as Nvidia wanted to add a lot of instructions for better parallel computing.

Read the rest at Fudzilla.
 
Joined
Feb 21, 2008
Messages
4,985 (0.84/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
Some more news snippets on the new chip:

The chip supports GDDR5 memory, has billions of transistors, and should be bigger and faster than the Radeon HD 5870. A GX2 dual version of the card is also in the pipeline. It's well worth noting that this is the biggest architectural change since G80, as Nvidia wanted to add a lot of instructions for better parallel computing.

Read the rest at Fudzilla.

I think it will most likely repeat what happened last generation: Nvidia creating a more powerful GPU at a higher price point. Nvidia likes to be king of the hill, but performance per dollar wins over most of us. Therefore, ATi will easily be more competitive on price.

The only reason I buy Nvidia products more often is that I always have driver problems with ATi cards, like the flickering even after calibrating my monitor. It might just be my luck, or maybe I have a more sensitive optic nerve than most? I had those problems with 4-series ATi cards and not with the 3-series or before.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
I think it will most likely repeat what happened last generation: Nvidia creating a more powerful GPU at a higher price point. Nvidia likes to be king of the hill, but performance per dollar wins over most of us. Therefore, ATi will easily be more competitive on price.

I'll +1 that. This is exactly the situation with the 4870 & 4890 cards.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
The only reason I buy Nvidia products more often is that I always have driver problems with ATi cards, like the flickering even after calibrating my monitor. It might just be my luck, or maybe I have a more sensitive optic nerve than most? I had those problems with 4-series ATi cards and not with the 3-series or before.

I don't understand this flickering. The amount you see depends on the refresh rate (on a CRT monitor only), your eyes, and the lighting environment. 60 Hz will look horrible and 85 Hz is generally unnoticeable. 100 Hz is the daddy, but running that at decent resolutions with clarity is beyond most CRTs (those that are still around, anyway). The cards blur out at higher analog bandwidths, too.
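
For a sense of scale, the refresh period is just the inverse of the rate, which is why 60 Hz flicker is obvious on a CRT while 85 Hz generally isn't (simple arithmetic, nothing card-specific):

```python
# Time between CRT refreshes at common rates.
for hz in (60, 85, 100):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per refresh")
# 60 Hz -> 16.7 ms, 85 Hz -> 11.8 ms, 100 Hz -> 10.0 ms
```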
 
Joined
Feb 21, 2008
Messages
4,985 (0.84/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
I don't understand this flickering. The amount you see depends on the refresh rate (on a CRT monitor only), your eyes, and the lighting environment. 60 Hz will look horrible and 85 Hz is generally unnoticeable. 100 Hz is the daddy, but running that at decent resolutions with clarity is beyond most CRTs (those that are still around, anyway). The cards blur out at higher analog bandwidths, too.

Monitor compatibility is not 100% with ATi, to my knowledge. It doesn't have a problem with my high-end graphics CRTs, but my BenQ 1080p LCD and my 65" Samsung DLP have slight flickering in 3D gaming with a couple of different 4850s I have had. The problem was apparent with all driver versions available at the time. My Samsung LCD did fine, though.

The same displays with numerous Nvidia offerings did fine. Like I said, it might have been bad luck, so I'm not saying everybody else encounters it; it obviously doesn't present itself with many displays.


BTW, they might have fixed it with the recent 5-series, because I remember them saying some sort of cycle was shortened to fix the problem I was experiencing and make it unnoticeable. I read it in the recent Tom's Hardware article on the 5870. I will link it when I have time to find it again.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.99/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Monitor compatibility is not 100% with ATi, to my knowledge. It doesn't have a problem with my high-end graphics CRTs, but my BenQ 1080p LCD and my 65" Samsung DLP have slight flickering in 3D gaming with a couple of different 4850s I have had. The problem was apparent with all driver versions available at the time. My Samsung LCD did fine, though.

The same displays with numerous Nvidia offerings did fine. Like I said, it might have been bad luck, so I'm not saying everybody else encounters it; it obviously doesn't present itself with many displays.


BTW, they might have fixed it with the recent 5-series, because I remember them saying some sort of cycle was shortened to fix the problem I was experiencing and make it unnoticeable. I read it in the recent Tom's Hardware article on the 5870. I will link it when I have time to find it again.

Sounds like driver bugs to me; I've seen this sort of thing at work a couple of times (desktop, no games of course). Turning the monitor off and on again cured it, I think; I can't quite remember now. If ATI don't fix them, they deserve to lose customers.

The 5-series thing is that the clock switching between 2D and 3D has been improved, to stop the glitch artifact from being visible, or at least make it less visible. The glitch was due to the use of GDDR5.
 
Joined
Aug 13, 2008
Messages
146 (0.03/day)
Location
Dresden, Germany, Europe, Earth
System Name HTPC
Processor Intel Core i5
Motherboard Asrock ITX
Cooling Water
Memory 8GB
Video Card(s) Radeon R9 290
Storage Samsung 850 EVO 1TB
Display(s) Sharp Aquos 70" TV
Case Coolermaster ITX
Audio Device(s) Radeon Onboard HDMI
Software Windows 10 64bit
Benchmark Scores -242 whatsoever
Some more news snippets on the new chip:

The chip supports GDDR5 memory, has billions of transistors, and should be bigger and faster than the Radeon HD 5870. A GX2 dual version of the card is also in the pipeline. It's well worth noting that this is the biggest architectural change since G80, as Nvidia wanted to add a lot of instructions for better parallel computing.

Read the rest at Fudzilla.

Hmmm, the chip supports GDDR5, who would have thought that? And billions of transistors :banghead:, nice indeed.

Bigger? (Card length and/or chip?) And faster: maybe, and pretty likely, since they have always had the better shader architecture, or at least the more efficient one...

And are they telling me that the GX2 card is a completely new card and not simply 2x GT300? If that's the case, you'll have to pay more than royally for a card like that.

I'll wait and see, since I like both teams in some way: nVidia for their performance and drivers, and ATi for their mostly nice price-performance ratio.
 

wahdangun

Guest
Some more news snippets on the new chip:

The chip supports GDDR5 memory, has billions of transistors, and should be bigger and faster than the Radeon HD 5870. A GX2 dual version of the card is also in the pipeline. It's well worth noting that this is the biggest architectural change since G80, as Nvidia wanted to add a lot of instructions for better parallel computing.

Read the rest at Fudzilla.

What the hell is it supposed to be? GT200 already has billions of transistors, and it's still bigger than the HD 5870. And GT300 must be faster than the HD 5870 :rolleyes: if NVIDIA still wants to compete with ATI.

@DaedalusHelios: I think the ATI HD 5870 uses error correction and a powerful memory controller, and that's why ATI can now lower their GDDR5 frequency and still not suffer from flickering or artifacts (though it makes OC a bit more difficult).

I hope NVIDIA can bring the GT300 ASAP, so I can compare the performance and decide which card I will buy.
 
Last edited by a moderator:

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.23/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
Last year's threads about the next-gen GPUs were so much more thought-provoking... a lot of people would say they were sticking with the company they've used forever, and others waited patiently and could make sense of the reports they were receiving. It seems like people need a translator. Usually, whisper-down-the-lane through 4-5 people makes the information a bit haggard, but come on! It's like people don't quite get the picture even directly from the news source.

GT300 will be DX11, it has a high transistor count (NDA, people), it uses GDDR5, and the architecture has changed significantly. The leaked photos of the chip seem to suggest something to that effect. There are also a number of sources saying nV is still on track for their Nov 27th release and may have some info leaked beforehand to generate hype. While it's speculation at best, they're still saying something, and it's as much as anyone can say without breaking NDA.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,356 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
That's very interesting and weird at the same time. We can take some info from that picture, and we could take more if it wasn't so confusing. Let me explain. According to this picture of the GT200:

http://img.techpowerup.org/090927/gt200b.jpg

We can relate some shapes with the actual units like this:

http://img.techpowerup.org/090927/gt200a.jpg
http://img.techpowerup.org/090927/gt300a.jpg

1- Shader Processor SIMD cluster. In G92 it had 16 SP (2x8), in GT200 it had 24 (8x3), and in GT300 it should have 32 (4x8). In GT300 we can see 16 such clusters, and that would amount to 32x16 = 512 SP. Everything seems OK.

Comparing those two die shots, the alleged GT300 die is holding 384 shaders, not 512. Look at those shader core blocks (the red rectangle you marked on the GT200). There are 10 of those on the GT200, so 240/10 = 24. They bear a strong resemblance to the ones on the alleged GT300 die shot, and there are 16 of those on that die. 16 x 24 = 384.
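
The same inference, spelled out with only the numbers in the post (GT200's known 240 shaders, 10 visible blocks, 16 blocks on the alleged GT300 shot, and the assumption that the blocks are the same width on both dies):

```python
# Infer GT300's shader count from GT200's known total and block count,
# assuming the blocks on both die shots hold the same number of SPs.
gt200_total_sp = 240
gt200_blocks = 10
gt300_blocks = 16

sp_per_block = gt200_total_sp // gt200_blocks   # 24
print(sp_per_block * gt300_blocks)              # 384, not 512
```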
 