
NVIDIA GT300 "Fermi" Detailed

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
41,826 (6.61/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Since they call this card Reactor, I guess it will be as hot as a nuclear reactor under meltdown conditions, aka the GeForce FX 5800.
 
Joined
Aug 9, 2006
Messages
1,065 (0.16/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
Joined
Aug 7, 2007
Messages
2,723 (0.43/day)
Processor i5-7600k
Motherboard ASRock Z170 Pro4
Cooling CM Hyper 212 EVO w/ AC MX-4
Memory 2x8GB DDR4 2400 Corsair LPX Vengeance 15-15-15-36
Video Card(s) MSI Twin Frozr 1070ti
Storage 240GB Corsair Force GT
Display(s) 23" Dell AW2310
Case Corsair 550D
Power Supply Seasonic SS-760XP2 Platinum
Software Windows 10 Pro 64-bit
Still not monopolizing the market, so your whining won't do anything. This is not the thread for GPU war conspiracy theories.

No monopolizing, correct, but they are involved in highly unethical business practices. And if this rumor by Zone is true, forcing a company to not allow a game to utilize a certain DX level just because Nvidia doesn't support it is pretty f'ed up. I could understand if it were an Nvidia-only thing, but it's freakin' DirectX!! The consumer gets f'ed in the end. That is not the way to persuade customers to buy their product over the competition. I'm sure AMD/ATI does the same, but the light doesn't get shed on them because they're the "small guy". I WANT TRANSPARENCY ON BOTH SIDES DAMMIT!!!
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
41,826 (6.61/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Situation is that bad, eh? May I recommend injecting ads into every post on TPU as a signature and banning people who use ad-blockers (like TechReport) :D

screw you dude :nutkick::slap:
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
Is it a rumour that Nvidia bought Ageia and left all Ageia card owners for dead?

That wasn't very many people. Around 10,000 PhysX PPUs were sold. And sometimes a company has to do what's best for the most. Spending as much to support 10,000 cards as you do to support 100+ million GPUs makes no sense at all; it's spending twice for nothing in the big scheme of things. I'm not saying that was good, and it's a pity for those who bought the card, but those who want PhysX acceleration can now have it for free, by just going Nvidia in their next purchase, and all those who already had an Nvidia card got it for free.

Is it a rumour that Nvidia made sure that an ATI + Nvidia (for PhysX) configuration is no longer do-able?

It wasn't doable in every OS, and did Ati want to share QA costs? Would Ati deliver Nvidia the newer unreleased cards, so that Nvidia could test compatibility and create the drivers before they were launched? Or would they have had to wait, with problematic drivers and taking all the blame for badly functioning setups?

Is it a rumour that AA does not work on ATI cards in Batman AA even though it is able to do so?

BS. And it has been discussed to death in other threads. Ati cards don't do that kind of AA, which was added exclusively for Nvidia, paid for by Nvidia and QA'd by Nvidia. It even has the Nvidia AA label written all over it.

The only rumour is that Nvidia made Ubisoft remove DX10.1 support from AC because AC was running too awesome on it with, of course, ATI cards (Nvidia does not support DX10.1)

BS again. If Nvidia didn't want DX10.1 in that game, it would never have released with DX10.1 to begin with. The HD3xxx cards had been released months before the game launched; they already knew how they were going to perform. It just takes a trip to the local store and buying a damn card FFS! :laugh:
 

Atom_Anti

New Member
Joined
Sep 10, 2007
Messages
176 (0.03/day)
Location
Captiva Island, Florida, USA
System Name 3dfx High End PC
Processor Intel Pentium 4 3.06GHz@3.7GHz
Motherboard Abit SA7
Cooling AC Freezer 4
Memory 512MB Patriot DDR600 TCCD
Video Card(s) 3dfx Voodoo 5 5500 AGP 193/193MHz
Storage 80GB 7200RPM Pata
Display(s) 19" LG Wide TFT-LCD
Case Cooler Master
Audio Device(s) integrated
Power Supply 400W
Software Windows Millennium Edition
Benchmark Scores 3DMark2000: 7100 3DMark2001: 4100
Since they call this card Reactor, I guess it will be as hot as a nuclear reactor under meltdown conditions, aka the GeForce FX 5800.

Yeah, yeah, like the Chernobyl reactor did in 1986 :wtf:.
Anyway, there is still no release date, so Nvidia may not be able to make it :rolleyes:. Remember what happened with 3dfx: they were the biggest and most famous 3D VGA maker, but they could not come up with the Rampage soon enough. Maybe Reactor = Rampage :laugh:.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.12/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Rumours?
Is it a rumour that Nvidia bought Ageia and left all Ageia card owners for dead?
Is it a rumour that Nvidia made sure that an ATI + Nvidia (for PhysX) configuration is no longer do-able?
Is it a rumour that AA does not work on ATI cards in Batman AA even though it is able to do so?

The only rumour is that Nvidia made Ubisoft remove DX10.1 support from AC because AC was running too awesome on it with, of course, ATI cards (Nvidia does not support DX10.1)

Ageia was on the brink of bankruptcy when nVidia bought them. If Ageia went under, where would it have left the Ageia card owners then? They should all consider themselves lucky that nVidia bought Ageia and at least continued support for some time longer than what they would have gotten if Ageia was left to die.

Who cares if you can't use PhysX with an ATi card doing the graphics? ATi pretty much assured this when they refused to let nVidia run PhysX natively on ATi hardware. That's right, despite all your whining, you forget that initially, nVidia wanted to make PhysX run natively on ATi hardware, no nVidia hardware required. ATi was the one that shut the door on peaceful PhysX support, not nVidia.

And if you actually paid attention to the Batman issue, you would know a few things. 1.) Enabling AA on ATi cards not only doesn't actually work, it also breaks the game. 2.) nVidia paid for AA to be added to the game; the game engine does not natively support AA. If nVidia hadn't done so, AA would not have been included in the game, so ATi is no worse off. There is no reason that nVidia should allow ATi to use a feature that nVidia spent the money to develop and include in the game. So, it is a false rumor that nVidia paid to have a feature disabled for ATi cards. The truth is that they paid to have the feature added for their cards. There is a big difference between those two.
 
Joined
Jan 26, 2009
Messages
347 (0.06/day)
Location
Jitra, Kedah, Malaysia
System Name KXa DrakeN
Processor Intel Core i7 3770k
Motherboard Asus Maximus IV Extreme-Z Z68 1155
Cooling NZXT Kraken X60
Memory 8GB Ripjaw-X Red 1866mhz
Video Card(s) 2X ASUS GTX780 DCUII OC
Storage Crucial M4 128gb+Intel 320 160gb Gen3+2TB Black WD+1TB Black WD
Display(s) DELL S2740L 27'' LED LCD
Case NZXT Phantom 410
Audio Device(s) ASUS XONAR D2X
Power Supply Corsair HX1050wt
Software Windows 7 Ultimate SP1 64bit
Oh well... I'll stick with my current GPU first... this thing will burn my money :ohwell:
 
Joined
Sep 25, 2007
Messages
5,966 (0.96/day)
Location
New York
Processor AMD Ryzen 9 5950x, Ryzen 9 5980HX
Motherboard MSI X570 Tomahawk
Cooling Be Quiet Dark Rock Pro 4(With Noctua Fans)
Memory 32Gb Crucial 3600 Ballistix
Video Card(s) Gigabyte RTX 3080, Asus 6800M
Storage Adata SX8200 1TB NVME/WD Black 1TB NVME
Display(s) Dell 27 Inch 165Hz
Case Phanteks P500A
Audio Device(s) IFI Zen Dac/JDS Labs Atom+/SMSL Amp+Rivers Audio
Power Supply Corsair RM850x
Mouse Logitech G502 SE Hero
Keyboard Corsair K70 RGB Mk.2
VR HMD Samsung Odyssey Plus
Software Windows 10
So if it's 384-bit with GDDR5, then we are looking at a card with a max of 48 ROPs and 512 shaders.

At least Nvidia is being innovative. I think this card is going to be a monster when it comes out.
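For anyone curious where the 48 comes from, here's a back-of-the-envelope sketch of that estimate. It assumes GT300 keeps the GT200-style arrangement of 64-bit memory controllers with 8 ROPs tied to each one, which is not confirmed; the 512-shader figure is the rumour itself, not something derived from the bus width.

Code:
#include <cstdio>

int main()
{
    // Assumptions (illustrative, not confirmed specs): a 384-bit aggregate bus
    // built from 64-bit memory controllers, with 8 ROPs per controller as on GT200.
    const int bus_width_bits      = 384;
    const int controller_width    = 64;
    const int rops_per_controller = 8;

    const int controllers = bus_width_bits / controller_width;  // 6 controllers
    const int max_rops    = controllers * rops_per_controller;  // 48 ROPs

    std::printf("Memory controllers: %d\n", controllers);
    std::printf("Max ROPs (at 8 per controller): %d\n", max_rops);
    return 0;
}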
 
Joined
Jul 19, 2006
Messages
43,602 (6.53/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD m.2
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Headphone Amp.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Tester84
Software Windows 11
Ageia was on the brink of bankruptcy when nVidia bought them. If Ageia went under, where would it have left the Ageia card owners then? They should all consider themselves lucky that nVidia bought Ageia and at least continued support for some time longer than what they would have gotten if Ageia was left to die.

Who cares if you can't use PhysX with an ATi card doing the graphics? ATi pretty much assured this when they refused to let nVidia run PhysX natively on ATi hardware. That's right, despite all your whining, you forget that initially, nVidia wanted to make PhysX run natively on ATi hardware, no nVidia hardware required. ATi was the one that shut the door on peaceful PhysX support, not nVidia.

And if you actually paid attention to the Batman issue, you would know a few things. 1.) Enabling AA on ATi cards not only doesn't actually work, it also breaks the game. 2.) nVidia paid for AA to be added to the game; the game engine does not natively support AA. If nVidia hadn't done so, AA would not have been included in the game, so ATi is no worse off. There is no reason that nVidia should allow ATi to use a feature that nVidia spent the money to develop and include in the game. So, it is a false rumor that nVidia paid to have a feature disabled for ATi cards. The truth is that they paid to have the feature added for their cards. There is a big difference between those two.

Good point with Ageia, but Ageia isn't the only company. Many of us are still burned by 3dfx. Now that sucked.

I can enable AA in CCC for the Batman demo. It works; performance sucks, but it works. There is no way a game developer should be taking kickbacks for exclusivity in games. It alienates many of their customers. There is also no way you know for sure that they couldn't have added AA for all graphics cards in Batman. Honestly, what? Are game developers going to start charging graphics card companies more money to include the color blue in their games? Ridiculous.

Fact of the matter is, I'm sure if we had a poll, both Nvidia and ATi owners would agree that features should be included for both cards. Continuing down this road will do nothing but make more and more features that should already be in the game exclusive to a particular brand. It needs to stop; it's bad business and, most importantly, bad for the consumer, limiting our choices.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
AFAIK that means that you can just #include C for CUDA and work with C++ like you would with any other library, and the same for Fortran. That's very good for some programmers indeed, but it only works on Nvidia hardware.

I said that, but after reading information in other places, that might be inaccurate. They're saying that it can run C/C++ and Fortran code directly. That means there's no need to deal with CUDA, OpenCL or DX11 compute shaders, because you could just program something in Visual Studio and the code would just work. I don't know if that's true but it would be amazing.
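For context, this is roughly what GPU code looks like today with C for CUDA: you still write and launch a kernel explicitly and manage device memory yourself, which is exactly the part that would disappear if plain C++ really did "just work". A minimal, hypothetical vector-add sketch built with nvcc; the kernel name and sizes are purely illustrative.

Code:
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical kernel: each thread adds one pair of elements.
__global__ void addVectors(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1024;
    const size_t bytes = n * sizeof(float);

    float host_a[1024], host_b[1024], host_c[1024];
    for (int i = 0; i < n; ++i) { host_a[i] = float(i); host_b[i] = 2.0f * i; }

    // Explicit device allocation, copies and kernel launch -- the boilerplate
    // that ordinary C++ code compiled for the CPU never has to spell out.
    float *dev_a, *dev_b, *dev_c;
    cudaMalloc(&dev_a, bytes);
    cudaMalloc(&dev_b, bytes);
    cudaMalloc(&dev_c, bytes);
    cudaMemcpy(dev_a, host_a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dev_b, host_b, bytes, cudaMemcpyHostToDevice);

    addVectors<<<(n + 255) / 256, 256>>>(dev_a, dev_b, dev_c, n);

    cudaMemcpy(host_c, dev_c, bytes, cudaMemcpyDeviceToHost);
    std::printf("c[10] = %f\n", host_c[10]);  // expect 30.0

    cudaFree(dev_a); cudaFree(dev_b); cudaFree(dev_c);
    return 0;
}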
 

Easy Rhino

Linux Advocate
Staff member
Joined
Nov 13, 2006
Messages
15,576 (2.37/day)
Location
Mid-Atlantic
System Name Desktop
Processor i5 13600KF
Motherboard AsRock B760M Steel Legend Wifi
Cooling Noctua NH-U9S
Memory 4x 16 Gb Gskill S5 DDR5 @6000
Video Card(s) Gigabyte Gaming OC 6750 XT 12GB
Storage WD_BLACK 4TB SN850x
Display(s) Gigabye M32U
Case Corsair Carbide 400C
Audio Device(s) On Board
Power Supply EVGA Supernova 650 P2
Mouse MX Master 3s
Keyboard Logitech G915 Wireless Clicky
Software The Matrix
Given Nvidia's past, this will most likely be $100 more than ATI's flagship. Obviously, to the sane person, its performance would have to justify the cost. This time around, though, Nvidia is coming out after ATI, so they may indeed have to keep their prices competitive.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
Good point with Ageia, but Ageia isn't the only company. Many of us are still burned by 3dfx. Now that sucked.

I can enable AA in CCC for the Batman demo. It works; performance sucks, but it works. There is no way a game developer should be taking kickbacks for exclusivity in games. It alienates many of their customers. There is also no way you know for sure that they couldn't have added AA for all graphics cards in Batman. Honestly, what? Are game developers going to start charging graphics card companies more money to include the color blue in their games? Ridiculous.

Fact of the matter is, I'm sure if we had a poll, both Nvidia and ATi owners would agree that features should be included for both cards. Continuing down this road will do nothing but make more and more features that should already be in the game exclusive to a particular brand. It needs to stop; it's bad business and, most importantly, bad for the consumer, limiting our choices.

Stop the bitching already. Fact is that 10+ games have been released using UE3 and none of them had in-game AA. If you wanted AA, you had to enable it in the control panel, whether you had Ati or Nvidia. With this game, Nvidia asked the developer to add AA and helped develop and QA the feature. AMD didn't even contact the developer, so why should they get something they didn't pay for? Should the developers risk their reputation by releasing something that has not had any QA*? Or should Nvidia pay so that the developer did QA on AMD's cards? AMD is deliberately not helping developers, yet they expect to get all the benefits; is that moral? Not in my book. The only reason GPUs are sold is because of games, so helping the ones that are helping you is the natural thing to do.

Regarding your last paragraph, I would say yes, but only if both GPU developers were helping to include the features.

* The feature breaks the game, so instead of the crap about Nvidia disabling the feature, we would be hearing crap about Nvidia paying for the game to crash with AMD cards. This crap will never end.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
41,826 (6.61/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
All I can say is it's hurting their sales, due to the game having the TWIMTBP badge on it and not getting the expected sales from AMD users. With that badge they get paid a little here and there. With business practices like this, it makes me glad I switched to ATI back in 2002.
 
Joined
Feb 18, 2006
Messages
5,147 (0.75/day)
Location
AZ
System Name Thought I'd be done with this by now
Processor i7 11700k 8/16
Motherboard MSI Z590 Pro Wifi
Cooling Be Quiet Dark Rock Pro 4, 9x aigo AR12
Memory 32GB GSkill TridentZ Neo DDR4-4000 CL18-22-22-42
Video Card(s) MSI Ventus 2x Geforce RTX 3070
Storage 1TB MX300 M.2 OS + Games, + cloud mostly
Display(s) Samsung 40" 4k (TV)
Case Lian Li PC-011 Dynamic EVO Black
Audio Device(s) onboard HD -> Yamaha 5.1
Power Supply EVGA 850 GQ
Mouse Logitech wireless
Keyboard same
VR HMD nah
Software Windows 10
Benchmark Scores no one cares anymore lols
And I was wondering why I was seeing L1 and L2 caches on an upcoming video card. I thought I was going crazy hahaha.

This will be VERY interesting. If the price is right, I may skip the 5K series and go GT300 :rockout:

Yeah, I thought that was odd as well. I'm very curious to see the performance of these bad boys. The way things are going, it will be a while before I have the cash for a new card anyway, so I might as well wait and see what both sides have to offer.

All I can say is it's hurting their sales, due to the game having the TWIMTBP badge on it and not getting the expected sales from AMD users. With that badge they get paid a little here and there.

Funny, even when I had my 2900 XT I barely noticed the TWIMTBP badge, either on the case or in the loading of the game. I've yet to find a game that will not work on either side's cards, and I never buy a game because it works better on one than the other. I hope most of you don't either. I buy games that I like for storyline, graphics, gameplay, and replayability. Other reasons make no sense to me.
 
Joined
Jun 18, 2008
Messages
356 (0.06/day)
Processor AMD Ryzen 3 1200 @ 3.7 GHz
Motherboard MSI B350M Gaming PRO
Cooling 2x Dynamic X2 GP-12
Memory 2x4GB GeIL EVO POTENZA AMD PC4-17000
Video Card(s) GIGABYTE Radeon RX 560 2GB
Storage Samsung SSD 840 Series (250GB)
Display(s) Asus VP239H-P (23")
Case Fractal Design Define Mini C TG
Audio Device(s) ASUS Xonar U3
Power Supply CORSAIR CX450
Mouse Logitech G500
Keyboard Corsair Vengeance K65
Software Windows 10 Pro (x64)
Prices will be competitive because they have to be, but nVidia is going to lose big due to the cost of producing this behemoth. And what is their plan for scaling? All of this talk about a high-end chip while they're completely mum about everything else. Are they going to re-use G92 again, or just elect not to compete?

Once Juniper comes out (still scheduled Q4 2009), expect nVidia to shed a few more of their previously-exclusive AIB partners...
 
Joined
Jul 19, 2006
Messages
43,602 (6.53/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD m.2
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Headphone Amp.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Tester84
Software Windows 11
Stop the bitching already. Fact is that 10+ games have been released using UE3 and none of them had in-game AA. If you wanted AA, you had to enable it in the control panel, whether you had Ati or Nvidia. With this game, Nvidia asked the developer to add AA and helped develop and QA the feature. AMD didn't even contact the developer, so why should they get something they didn't pay for? Should the developers risk their reputation by releasing something that has not had any QA*? Or should Nvidia pay so that the developer did QA on AMD's cards? AMD is deliberately not helping developers, yet they expect to get all the benefits; is that moral? Not in my book. The only reason GPUs are sold is because of games, so helping the ones that are helping you is the natural thing to do.

Regarding your last paragraph, I would say yes, but only if both GPU developers were helping to include the features.

* The feature breaks the game, so instead of the crap about Nvidia disabling the feature, we would be hearing crap about Nvidia paying for the game to crash with AMD cards. This crap will never end.

I'm not bitching, you're bitching. :p (You are also walking on thin ice with that comment. I am free to express my views on this forum like anyone else.) I'm stating how I see it. Tell me, almighty one, do you work for Nvidia? Do you know if Nvidia is approaching the developers or is it the other way around? Is Nvidia paying the developers to add features to the games for their cards, or are they throwing developers cash to keep features exclusive to their cards? Unless you work in one of their back rooms, you have no idea.

*Ugh, I'm sounding like a conspiracy theorist. I hate conspiracy theorists. I'll shut up and just not play this game. :toast:
 
Joined
Jun 30, 2008
Messages
31 (0.01/day)
Location
Jamaica
System Name EDGE IIIv2
Processor Intel Core 2 Duo E8400 @ 3Ghz
Motherboard ASUS P5E-VM HDMI
Cooling Stock intel Cooler
Memory G.Skill 2GB DDR2 800Mhz OCZ Technology 2GB DDR2 800Mhz@ 5.5.5.15
Video Card(s) MSi Radeon HD 4770 (Stock)
Storage 500GB Seagate Barracuda & 160GB WD Caviar
Display(s) BenQ FP241WZ 24"
Case Thermaltake LANbox
Audio Device(s) ASUS Xonar D2
Power Supply Zalman ZM600-HP Heatpipe cooled
Benchmark Scores 3D Mark Vantage:P3366 3D Mark 06:8112
Prices will be competitive because they have to be, but nVidia is going to lose big due to the cost of producing this behemoth. And what is their plan for scaling? All of this talk about a high-end chip while they're completely mum about everything else. Are they going to re-use G92 again, or just elect not to compete?

Once Juniper comes out (still scheduled Q4 2009), expect nVidia to shed a few more of their previously-exclusive AIB partners...

NV is just gonna recycle old G92/GT200 parts to fill that gap!!!
 
Joined
Nov 6, 2005
Messages
480 (0.07/day)
Location
Silver Spring, MD
Processor Core i7 4770K
Motherboard Asrock Z87E-ITX
Cooling Stock
Memory 16GB Gskill 2133MHz DDR3
Video Card(s) PNY GeForce GTX 670 2GB
Storage 256GB Corsair M4, 240GB Samsung 840
Display(s) 27" 1440p Achevia Shimian
Case Fractal Node 304
Audio Device(s) Audioquest Dragonfly USB DAC
Power Supply Corsair Builder 600W
Software Windows 7 Pro x64
Native Fortran support!!!!!!




YESSSSSS!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!






:ohwell:
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
I'm not bitching, you're bitching. :p I'm stating how I see it. Tell me, almighty one, do you work for Nvidia? Do you know if Nvidia is approaching the developers or is it the other way around? Is Nvidia paying the developers to add features to the games for their cards, or are they throwing developers cash to keep features exclusive to their cards? Unless you work in one of their back rooms, you have no idea.

Do you know anything about the presumption of innocence? When there's no proof of guilt, the natural reaction is to presume innocence. That's what an honest, unbiased person thinks. And that's what any legal system is based on.

You have as much proof of that happening as I do of it not happening. Do you have any proof except that it runs better on one than the other? So you're right and I'm wrong, because...? Enlighten me.

I know Nvidia is playing fair, because that's what the developers say. When dozens of developers say that Nvidia is approaching them to help them develop, optimize and test their games, my natural reaction is to believe them, because I don't presume that everyone lies. I don't presume that all of them take money under the table. But most importantly, I know that at least one person out of the 100 that form a game development team, out of the 100 developers that have worked under TWIMTBP, would have said something already if something shady were happening.

I've been part of a small developer that did small Java games, and I know how much it costs to optimize and make even a simple game bug-free, so it doesn't surprise me a bit that games optimized for one card run better than on the one they haven't been optimized for*. That's the first thing that makes people jump about TWIMTBP, and since that's what they think and they've been claiming bad behavior based on it, it doesn't surprise me.

* It also happened that one game ran flawlessly on one cellphone and crashed on others, while both should have been able to run the same Java code.
 
Joined
Feb 23, 2008
Messages
1,064 (0.17/day)
Location
Montreal
System Name Aryzen / Sairikiki / Tesseract
Processor 5800x / i7 920@3.73 / 5800x
Motherboard Steel Legend B450M / GB EX58-UDP4 / Steel Legend B550M
Cooling Mugen 5 / Pure Rock / Glacier One 240
Memory Corsair Something 16 / Corsair Something 12 / G.Skill 32
Video Card(s) AMD 6800XT / AMD 6750XT / Sapphire 7800XT
Storage Way too many drives...
Display(s) LG 332GP850-B / Sony w800b / Sony X90J
Case EVOLV X / Carbide 540 / Carbide 280x
Audio Device(s) SB ZxR + GSP 500 / board / Denon X1700h + ELAC Uni-Fi 2 + Senn 6XX
Power Supply Seasonic PRIME GX-750 / Corsair HX750 / Seasonic Focus PX-650
Mouse G700 / none / G602
Keyboard G910
Software w11 64
Benchmark Scores I don't play benchmarks...
...They're saying that it can run C/C++ and Fortran code directly. That means there's no need to deal with CUDA, OpenCL or DX11 compute shaders, because you could just program something in Visual Studio and the code would just work. I don't know if that's true but it would be amazing.

As amazingly awesome as that is, save for some very rare cases I'm quite questioning the utility of this. Besides raising the transistor count (almost wrote trannies there...), what use is that to the gaming population? And I doubt even pros would have much reason to jump on it.

L1 and L2 caches sounds like fun though...
 
Joined
Feb 21, 2008
Messages
5,000 (0.82/day)
Location
NC, USA
System Name Cosmos F1000
Processor Ryzen 9 7950X3D
Motherboard MSI PRO B650-S WIFI AM5
Cooling Corsair H100x, Panaflo's on case
Memory G.Skill DDR5 Trident 64GB (32GBx2)
Video Card(s) MSI Gaming Radeon RX 7900 XTX 24GB GDDR6
Storage 4TB Firecuda M.2 2280
Display(s) 32" OLED 4k 240Hz ASUS ROG Swift PG32UCD
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR RM1000e 1000watt
Mouse G400s Logitech, white Razor Mamba
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
VR HMD Steam Valve Index
Software Win10 Pro, Win11
I'm not bitching, you're bitching. :p (You are also walking on thin ice with that comment. I am free to express my views on this forum like anyone else.) I'm stating how I see it. Tell me, almighty one, do you work for Nvidia? Do you know if Nvidia is approaching the developers or is it the other way around? Is Nvidia paying the developers to add features to the games for their cards, or are they throwing developers cash to keep features exclusive to their cards? Unless you work in one of their back rooms, you have no idea.

*Ugh, I'm sounding like a conspiracy theorist. I hate conspiracy theorists. I'll shut up and just not play this game. :toast:

Erocker, I think his "bitching" response was directed at that portion of the community as a whole. I know you feel a little burned by the trouble you had with PhysX configurations. It really did take extra time to program selective AA into the game; I think we all agree on that, right? With that being said, we know that Nvidia helped fund the creation of the game from the start. So if the developer says, "Nvidia, that money helped so much in making our production budget that we will engineer a special selective AA for your hardware line," is that morally wrong? Do you say Nvidia cannot pay for extra engineering to improve the experience of their own end users? If Nvidia said they would send ham sandwiches to everybody that bought Nvidia cards in the last year, would you say that it's not fair unless they send them to ATi users too?

What it comes down to is that it was a game without selective AA. Nvidia helps develop the game and gets a special feature for their hardware line when running the game. Where is the moral dilemma?
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
As amazingly awesome as that is, save for some very rare cases I'm quite questioning the utility of this. Besides raising the transistor count (almost wrote trannies there...), what use is that to the gaming population? And I doubt even pros would have much reason to jump on it.

L1 and L2 caches sounds like fun though...

Who says that in 2009 a GPU is a gaming-only device? It never has been anyway. Nvidia has the GeForce, Tesla and Quadro brands, and all of them are based on the same chip. As long as the graphics card is competitive, do you care about anything else? You shouldn't. And apart from this, the ability to run C++ code can help in all kinds of applications. Have you never encoded a video? Wouldn't you like to be able to do it 20x faster?

Pros, on the other hand, have 1000s of reasons to jump on something like this.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,909 (3.78/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -080mV PL max @225w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
Shame that bus isn't a full 512 bits wide. They've increased the bandwidth with GDDR5, yet taken some back with a narrower bus. Also, 384 bits has the consequence that the amount of RAM is that odd size like on the 8800 GTX, when it would really be best at a power of two.

GDDR5 effectively doubles the bandwidth in any case (unlike GDDR3). There is no card on the planet that will be remotely hampered by what is effectively 768-bit throughput; really, with GDDR5 there is absolutely no need to go to the additional expense.
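To put rough numbers on that: peak bandwidth is just the bus width in bytes times the per-pin data rate, and GDDR5 moves twice the data per clock that GDDR3 does. A quick sketch with placeholder data rates, since GT300's actual memory clock hadn't been announced; the figures below are only illustrative.

Code:
#include <cstdio>

int main()
{
    // Peak bandwidth (GB/s) = (bus width in bits / 8) * per-pin data rate (GT/s).
    // Data rates are placeholders, roughly in the range of 2008/2009 cards.
    const double gddr3_rate = 2.2;  // GT/s per pin (GDDR3, GTX 280-ish)
    const double gddr5_rate = 4.4;  // GT/s per pin (GDDR5 at a similar base clock)

    const double bw_512bit_gddr3 = (512 / 8) * gddr3_rate;  // ~141 GB/s
    const double bw_384bit_gddr5 = (384 / 8) * gddr5_rate;  // ~211 GB/s

    std::printf("512-bit GDDR3: %.0f GB/s\n", bw_512bit_gddr3);
    std::printf("384-bit GDDR5: %.0f GB/s\n", bw_384bit_gddr5);
    return 0;
}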
 