
NVIDIA GT300 "Fermi" Detailed

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
19,266 (5.04/day)
Likes
4,833
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
#76
Since they call this card Reactor, I guess it will be as hot as a nuclear reactor under meltdown conditions, aka the GF 5800.
 
Joined
Aug 9, 2006
Messages
1,003 (0.24/day)
Likes
156
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
#77
Joined
Aug 7, 2007
Messages
2,655 (0.70/day)
Likes
776
Processor Intel i5-2500k @ 3.8GHz
Motherboard ASRock Z68 Extreme4 Gen3
Cooling CM Hyper 212 EVO w/ AC MX-2
Memory 8GB DDR3 1866 Gskill Snipers 9-10-9-28
Video Card(s) MSI GTX 1060 Gaming X
Storage 240GB Corsair Force GT
Display(s) 23' Dell AW2310
Case Corsair 550D
Audio Device(s) Creative SB X-Fi Fatality Professional
Power Supply Seasonic SS-760XP2 Platinum
Software Windows 7 Pro 64-bit
#78
Still not monopolizing the market, so your whining won't do anything. This is not the thread for GPU war conspiracy theories.
No monopolizing, correct, but they are involved in highly unethical business practices. And if this rumor from Zone is true, forcing a company to not allow a game to utilize a certain DX level just because Nvidia doesn't support it is pretty f'ed up. I could understand if it was an Nvidia-only thing, but it's freakin' DirectX!! The consumer gets f'ed in the end. That is not the way to persuade customers to buy their product over the competition. I'm sure AMD/ATI does the same, but the light doesn't get shed on them because they're the "small guy". I WANT TRANSPARENCY ON BOTH SIDES DAMMIT!!!
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
19,266 (5.04/day)
Likes
4,833
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
#79
Situation is that bad, eh? May I recommend injecting ads into every post on TPU as a signature and banning people who use ad-blockers (like TechReport) :D
screw you dude :nutkick::slap:
 
Joined
Sep 11, 2009
Messages
2,680 (0.89/day)
Likes
693
Location
Reaching your left retina.
#80
Is it a rumour that Nvidia bought Ageia and left all Ageia card owners for dead?
There were not that many people. Around 10,000 PhysX PPUs were sold. And sometimes a company has to do what's best for the most. Spending as much to support 10,000 cards as you do to support 100+ million GPUs makes no sense at all; it's spending twice for nothing in the big scheme of things. I'm not saying that was good, and it's a pity for those who bought the card, but those who want PhysX acceleration can now have it for free by just going Nvidia in their next purchase, and all those who already had an Nvidia card got it for free.

Is it a rumour that Nvidia made sure that an ATI + Nvidia (for PhysX) configuration is no longer do-able?
It wasn't doable in every OS, and did Ati want to share QA costs? Would Ati deliver their newer unreleased cards to Nvidia, so that Nvidia could test compatibility and create the drivers before they launched? Or would they have had to wait, with problematic drivers and taking all the blame for badly functioning setups?

Is it a rumour that AA does not work on ATI cards in Batman AA even though it is able to do so?
BS. And it has been discussed to death in other threads. Ati cards don't do that kind of AA; it was added exclusively for Nvidia, paid for by Nvidia and QA'd by Nvidia. It even has the Nvidia AA label written all over it.

The only rumour is that Nvidia made Ubisoft remove DX10.1 support from AC because AC was running too awesome on it with, of course, ATI cards (Nvidia does not support DX10.1)
BS again. If Nvidia didn't want DX10.1 in that game, it would never have been released with DX10.1 to begin with. The HD3xxx cards had been released months before the game launched; they already knew how those cards were going to perform. It just takes a trip to the local store to buy a damn card FFS! :laugh:
 

Atom_Anti

New Member
Joined
Sep 10, 2007
Messages
176 (0.05/day)
Likes
21
Location
Captiva Isaland, Florida USA
System Name 3dfx High End PC
Processor Intel Pentium 4 3.06GHz@3.7GHz
Motherboard Abit SA7
Cooling AC Freezer 4
Memory 512MB Patriot DDR600 TCCD
Video Card(s) 3dfx Voodoo 5 5500 AGP 193/193MHz
Storage 80GB 7200RPM Pata
Display(s) 19" LG Wide TFT-LCD
Case Cooler Master
Audio Device(s) integrated
Power Supply 400W
Software Windows Millenium Edition
Benchmark Scores 3DMark2000: 7100 3DMark2001: 4100
#81
Since they call this card Reactor, I guess it will be as hot as a nuclear reactor under meltdown conditions, aka the GF 5800.
Yeah, yeah, like the Chernobyl reactor did in 1986 :wtf:.
Anyway, there is still no release date, so Nvidia may not be able to make it :rolleyes:. Remember what happened with 3dfx: they were the biggest and most famous 3D VGA maker, but they could not come up with the Rampage soon enough. Maybe Reactor = Rampage :laugh:.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
24,292 (5.51/day)
Likes
10,402
Location
Indiana, USA
Processor Intel Core i7 8700K@4.8GHz(Quick and dirty)
Motherboard AsRock Z370 Taichi
Cooling Corsair H110i GTX
Memory 32GB Corsair DDR4-3000
Video Card(s) PNY XLR8 GTX1060 6GB
Storage 480GB Crucial MX200 + 2TB Seagate Solid State Hybrid Drive with 128GB OCZ Synapse SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply Corsair HX850
Software Windows 10 Pro x64
#82
Rumours?
Is it a rumour that Nvidia bought Ageia and left all Ageia card owners for dead?
Is it a rumour that Nvidia made sure that an ATI + Nvidia (for PhysX) configuration is no longer do-able?
Is it a rumour that AA does not work on ATI cards in Batman AA even though it is able to do so?

The only rumour is that Nvidia made Ubisoft remove DX10.1 support from AC because AC was running too awesome on it with, of course, ATI cards (Nvidia does not support DX10.1)
Ageia was on the brink of bankruptcy when nVidia bought them. If Ageia went under, where would that have left the Ageia card owners? They should all consider themselves lucky that nVidia bought Ageia and at least continued support for some time longer than what they would have gotten if Ageia was left to die.

Who cares if you can't use PhysX with an ATi card doing the graphics? ATi pretty much assured this when they refused to let nVidia run PhysX natively on ATi hardware. That's right: despite all your whining, you forget that initially nVidia wanted to make PhysX run natively on ATi hardware, no nVidia hardware required. ATi was the one that shut the door on peaceful PhysX support, not nVidia.

And if you actually paid attention to the Batman issue, you would know a few things. 1.) Enabling AA on ATi cards not only doesn't actually work, it also breaks the game. 2.) nVidia paid for AA to be added to the game; the game engine does not natively support AA. If nVidia hadn't done so, AA would not have been included in the game at all, so ATi is no worse off. There is no reason that nVidia should let ATi use a feature that nVidia spent the money to develop and include in the game. So it is a false rumor that nVidia paid to have a feature disabled for ATi cards. The truth is that they paid to have the feature added for their cards. There is a big difference between those two.
 
Joined
Jan 26, 2009
Messages
347 (0.11/day)
Likes
19
Location
Jitra, Kedah, Malaysia
System Name KXa DrakeN
Processor Intel Core i7 3770k
Motherboard Asus Maximus IV Extreme-Z Z68 1155
Cooling NZXT Kraken X60
Memory 8GB Ripjaw-X Red 1866mhz
Video Card(s) 2X ASUS GTX780 DCUII OC
Storage Crucial M4 128gb+Intel 320 160gb Gen3+2TB BLack WD+1TB Black WD
Display(s) DELL S2740L 27'' LED LCD
Case NZXT Phantom 410
Audio Device(s) ASUS XONAR D2X
Power Supply Corsair HX1050wt
Software Windows 7 Ultimate SP1 64bit
#84
Oh well... I'll stick with my current GPU for now. This thing will burn my money :ohwell:
 
Joined
Sep 25, 2007
Messages
5,822 (1.56/day)
Likes
618
Processor Core I7 3770K@4.3Ghz
Motherboard AsRock Z77 Extreme
Cooling Cooler Master Seidon 120M
Memory 12Gb G.Skill Sniper
Video Card(s) MSI GTX 1070
Storage Sandisk SSD + 1TB Seagate Barracuda 7200
Display(s) IPS Asus 26inch
Case Antec 300
Audio Device(s) Xonar DG
Power Supply EVGA Supernova 650 G2
Software Windows 10/Windows 7
#85
So if it's 384-bit GDDR5, then we are looking at a card with a max of 48 ROPs and 512 shaders.

At least Nvidia is being innovative. I think this card is going to be a monster when it comes out.
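For what it's worth, that 48-ROP guess falls out of partition arithmetic: if GT300 keeps a GT200-style layout where each 64-bit memory controller partition carries its own ROP cluster, a 384-bit bus means six partitions. A quick sketch (the helper name and per-partition figures are assumptions for illustration, not confirmed GT300 specs):

```python
def rops_from_bus(bus_width_bits, controller_bits=64, rops_per_partition=8):
    """Estimate ROP count assuming one fixed-size ROP cluster is tied
    to each memory-controller partition (GT200-style layout)."""
    partitions = bus_width_bits // controller_bits
    return partitions * rops_per_partition

print(rops_from_bus(384))  # 6 partitions x 8 ROPs = 48
print(rops_from_bus(512))  # 8 partitions x 8 ROPs = 64
```

This is also why cutting the bus from 512 to 384 bits trims the back end along with the memory size.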
 

erocker

Senior Moderator
Staff member
Joined
Jul 19, 2006
Messages
42,395 (10.17/day)
Likes
18,034
Processor Intel i7 8700k
Motherboard Gigabyte z370 AORUS Gaming 7
Cooling Water
Memory 16gb G.Skill 4000 MHz DDR4
Video Card(s) Evga GTX 1080
Storage 3 x Samsung Evo 850 500GB, 1 x 250GB, 2 x 2TB HDD
Display(s) Nixeus EDG27
Case Thermaltake X5
Power Supply Corsair HX1000i
Mouse Zowie EC1-B
Software Windows 10
#86
Ageia was on the brink of bankruptcy when nVidia bought them. If Ageia went under, where would that have left the Ageia card owners? They should all consider themselves lucky that nVidia bought Ageia and at least continued support for some time longer than what they would have gotten if Ageia was left to die.

Who cares if you can't use PhysX with an ATi card doing the graphics? ATi pretty much assured this when they refused to let nVidia run PhysX natively on ATi hardware. That's right: despite all your whining, you forget that initially nVidia wanted to make PhysX run natively on ATi hardware, no nVidia hardware required. ATi was the one that shut the door on peaceful PhysX support, not nVidia.

And if you actually paid attention to the Batman issue, you would know a few things. 1.) Enabling AA on ATi cards not only doesn't actually work, it also breaks the game. 2.) nVidia paid for AA to be added to the game; the game engine does not natively support AA. If nVidia hadn't done so, AA would not have been included in the game at all, so ATi is no worse off. There is no reason that nVidia should let ATi use a feature that nVidia spent the money to develop and include in the game. So it is a false rumor that nVidia paid to have a feature disabled for ATi cards. The truth is that they paid to have the feature added for their cards. There is a big difference between those two.
Good point with Ageia, but Ageia isn't the only company. Many of us are still burned from 3dfx. Now that sucked.

I can enable AA in CCC for the Batman demo. It works; performance sucks, but it works. There is no way a game developer should be taking kickbacks for exclusivity in games. It alienates many of their customers. There is also no way you know for sure that they couldn't have added AA for all graphics cards in Batman. Honestly, what? Are game developers going to start charging graphics card companies money to include the color blue in their games? Ridiculous.

Fact of the matter is, I'm sure if we had a poll, both Nvidia and ATi owners would agree that features should be included for both cards. Continuing down this road will do nothing but make more and more features that should already be in the game exclusive to a particular brand. It needs to stop; it's bad business and, most importantly, bad for the consumer, limiting our choices.
 
Joined
Sep 11, 2009
Messages
2,680 (0.89/day)
Likes
693
Location
Reaching your left retina.
#87
AFAIK that means you can just #include C for CUDA and work with C++ like you would with any other library, and the same for Fortran. That's very good for some programmers indeed, but it only works on Nvidia hardware.
I said that, but after reading information in other places, that might be inaccurate. They're saying that it can run C/C++ and Fortran code directly. That means there's no need to deal with CUDA, OpenCL or DX11 compute shaders, because you could just program something in Visual Studio and the code would just work. I don't know if that's true, but it would be amazing.
 

Easy Rhino

Linux Advocate
Joined
Nov 13, 2006
Messages
14,405 (3.55/day)
Likes
4,257
System Name VHOST01 | Desktop
Processor i7 980x | i5 7500 Kaby Lake
Motherboard Gigabyte x58 Extreme | AsRock MicroATX Z170M Exteme4
Cooling Prolimatech Megahelams | Stock
Memory 6x4 GB @ 1333 | 2x 8G Gskill Aegis DDR4 2400
Video Card(s) Nvidia GT 210 | Nvidia GTX 970 FTW+
Storage 4x2 TB Enterprise RAID5 |Corsair mForce nvme 250G
Display(s) N/A | Dell 27" 1440p 8bit GSYNC
Case Lian Li ATX Mid Tower | Corsair Carbide 400C
Audio Device(s) NA | On Board
Power Supply SeaSonic 500W Gold | Seasonic SSR-650GD Flagship Prime Series 650W Gold
Mouse N/A | Logitech G900 Chaos Spectrum
Keyboard N/A | Posiden Z RGB Cherry MX Brown
Software Centos 7 | Windows 10
#88
Given nVidia's past, this will most likely be $100 more than ATi's flagship. Obviously, to the sane person, its performance would have to justify the cost. This time around, though, nVidia is coming out after ATi, so they may indeed have to keep their prices competitive.
 
Joined
Sep 11, 2009
Messages
2,680 (0.89/day)
Likes
693
Location
Reaching your left retina.
#89
Good point with Ageia, but Ageia isn't the only company. Many of us are still burned from 3dfx. Now that sucked.

I can enable AA in CCC for the Batman demo. It works; performance sucks, but it works. There is no way a game developer should be taking kickbacks for exclusivity in games. It alienates many of their customers. There is also no way you know for sure that they couldn't have added AA for all graphics cards in Batman. Honestly, what? Are game developers going to start charging graphics card companies money to include the color blue in their games? Ridiculous.

Fact of the matter is, I'm sure if we had a poll, both Nvidia and ATi owners would agree that features should be included for both cards. Continuing down this road will do nothing but make more and more features that should already be in the game exclusive to a particular brand. It needs to stop; it's bad business and, most importantly, bad for the consumer, limiting our choices.
Stop the bitching already. Fact is that 10+ games have been released using UE3 and none of them had in-game AA. If you wanted AA, you had to enable it in the control panel, whether you had Ati or Nvidia. With this game, Nvidia asked the developer to add AA and helped develop and quality-assure the feature. AMD didn't even contact the developer, so should they obtain something they didn't pay for? Should the developers risk their reputation by releasing something that has not had any QA*? Or should Nvidia pay so that the developer could do QA on AMD's cards? AMD is deliberately not helping developers, but they are expecting to get all the benefits; is that moral? Not in my book. The only reason GPUs are sold is games, so helping the ones that are helping you is the natural thing to do.

Regarding your last paragraph, I would say yes, but only if both GPU developers were helping to include the features.

* The feature breaks the game, so instead of the crap about Nvidia disabling the feature, we would be hearing crap about Nvidia paying to make the game crash with AMD cards. This crap will never end.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
19,266 (5.04/day)
Likes
4,833
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
#90
All I can say is it's hurting their sales: a game with the TWIMTBP badge on it doesn't get the expected sales from AMD users. With that badge they get paid a little here and there. Business practices like this make me glad I switched to ATI back in 2002.
 
Joined
Feb 18, 2006
Messages
5,103 (1.18/day)
Likes
1,258
Location
AZ
System Name Thought I'd be done with this by now
Processor i7 4790K 4.4GHZ turbo currently at 4.6GHZ at 1.16v
Motherboard MSI Z97-G55 SLI
Cooling Scythe Mugen 2 rev B (SCMG-2100), stock on gpu's.
Memory 8GB G.SKILL Ripjaws Z Series DDR3 2400MHZ 10-12-12-31
Video Card(s) EVGA GTX 760 Superclocked replaced HIS R9 290 that was artifacting
Storage 1TB MX300 M.2 OS + Games, 4x ST31000524NS in Raid 10 Storage and Backup, external 2tb backup,
Display(s) BenQ GW2255 surprisingly good screen for the price.
Case Raidmax Scorpio 668
Audio Device(s) onboard HD
Power Supply EVGA 750 GQ
Software Windows 10
Benchmark Scores no one cares anymore lols
#91
And I was wondering why I was seeing L1 and L2 caches on an upcoming video card. I thought I was going crazy hahaha.

This will be VERY interesting, if the price is right, I may skip the 5k and go GT300 :rockout:
Yeah, I thought that was odd as well. I'm very curious to see the performance of these bad boys. The way things are going, it will be a while before I have the cash for a new card anyway, so I might as well wait and see what both sides have to offer.

All I can say is it's hurting their sales: a game with the TWIMTBP badge on it doesn't get the expected sales from AMD users. With that badge they get paid a little here and there.
Funny, even when I had my 2900XT I barely noticed the TWIMTBP badge, either on the case or in the loading of the game. I've yet to find a game that will not work on either side's cards, and I never buy a game because it works better on one than the other. I hope most of you don't either. I buy games that I like for storyline, graphics, gameplay, and replayability. Other reasons make no sense to me.
 
Joined
Jun 18, 2008
Messages
356 (0.10/day)
Likes
37
Processor AMD Ryzen 3 1200 @ 3.7 GHz
Motherboard MSI B350M Gaming PRO
Cooling 2x Dynamic X2 GP-12
Memory 2x4GB GeIL EVO POTENZA AMD PC4-17000
Video Card(s) GIGABYTE Radeon RX 560 2GB
Storage Samsung SSD 840 Series (250GB)
Display(s) Asus VP239H-P (23")
Case Fractal Design Define Mini C TG
Audio Device(s) ASUS Xonar U3
Power Supply CORSAIR CX450
Mouse Logitech G500
Keyboard Corsair Vengeance K65
Software Windows 10 Pro (x64)
#92
Prices will be competitive because they have to be, but nVidia is going to lose big due to the cost of producing this behemoth. And what is their plan for scaling? All of this talk about a high-end chip while they're completely mum about everything else. Are they going to re-use G92 again, or just elect not to compete?

Once Juniper comes out (still scheduled Q4 2009), expect nVidia to shed a few more of their previously-exclusive AIB partners...
 

erocker

Senior Moderator
Staff member
Joined
Jul 19, 2006
Messages
42,395 (10.17/day)
Likes
18,034
Processor Intel i7 8700k
Motherboard Gigabyte z370 AORUS Gaming 7
Cooling Water
Memory 16gb G.Skill 4000 MHz DDR4
Video Card(s) Evga GTX 1080
Storage 3 x Samsung Evo 850 500GB, 1 x 250GB, 2 x 2TB HDD
Display(s) Nixeus EDG27
Case Thermaltake X5
Power Supply Corsair HX1000i
Mouse Zowie EC1-B
Software Windows 10
#93
Stop the bitching already. Fact is that 10+ games have been released using UE3 and none of them had in-game AA. If you wanted AA, you had to enable it in the control panel, whether you had Ati or Nvidia. With this game, Nvidia asked the developer to add AA and helped develop and quality-assure the feature. AMD didn't even contact the developer, so should they obtain something they didn't pay for? Should the developers risk their reputation by releasing something that has not had any QA*? Or should Nvidia pay so that the developer could do QA on AMD's cards? AMD is deliberately not helping developers, but they are expecting to get all the benefits; is that moral? Not in my book. The only reason GPUs are sold is games, so helping the ones that are helping you is the natural thing to do.

Regarding your last paragraph, I would say yes, but only if both GPU developers were helping to include the features.

* The feature breaks the game, so instead of the crap about Nvidia disabling the feature, we would be hearing crap about Nvidia paying to make the game crash with AMD cards. This crap will never end.
I'm not bitching, you're bitching. :p (You are also walking on thin ice with that comment. I am free to express my views on this forum like anyone else.) I'm stating how I see it. Tell me, almighty one, do you work for Nvidia? Do you know if Nvidia is approaching the developers or if it's the other way around? Is Nvidia paying the developers to add features to the games for their cards, or are they throwing developers cash to keep features exclusive to their cards? Unless you work in one of their back rooms, you have no idea.

*Ugh, I'm sounding like a conspiracy theorist. I hate conspiracy theorists. I'll shut up and just not play this game. :toast:
 
Joined
Jun 30, 2008
Messages
31 (0.01/day)
Likes
8
Location
Jamaica
System Name EDGE IIIv2
Processor Intel Core 2 Duo E8400 @ 3Ghz
Motherboard ASUS P5E-VM HDMI
Cooling Stock intel Cooler
Memory G.Skill 2GB DDR2 800Mhz OCZ Technology 2GB DDR2 800Mhz@ 5.5.5.15
Video Card(s) MSi Radeon HD 4770 (Stock)
Storage 500GB Seagate Barracuda & 160GB WD Caviar
Display(s) BenQ FP241WZ 24"
Case Thermaltake LANbox
Audio Device(s) ASUS Xonar D2
Power Supply Zalman ZM600-HP Heatpipe cooled
Benchmark Scores 3D Mark Vantage:P3366 3D Mark 06:8112
#94
Prices will be competitive because they have to be, but nVidia is going to lose big due to the cost of producing this behemoth. And what is their plan for scaling? All of this talk about a high-end chip while they're completely mum about everything else. Are they going to re-use G92 again, or just elect not to compete?

Once Juniper comes out (still scheduled Q4 2009), expect nVidia to shed a few more of their previously-exclusive AIB partners...
NV is just gonna recycle old G92/GT200 parts to fill that gap!!!
 
Joined
Nov 6, 2005
Messages
480 (0.11/day)
Likes
38
Location
Silver Spring, MD
Processor Core i7 4770K
Motherboard Asrock Z87E-ITX
Cooling Stock
Memory 16GB Gskill 2133MHz DDR3
Video Card(s) PNY GeForce GTX 670 2GB
Storage 256GB Corsair M4, 240GB Samsung 840
Display(s) 27" 1440p Achevia Shimian
Case Fractal Node 304
Audio Device(s) Audioquest Dragonfly USB DAC
Power Supply Corsair Builder 600W
Software Windows 7 Pro x64
#95
Native Fortran support!!!!!!




YESSSSSS!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!






:ohwell:
 
Joined
Sep 11, 2009
Messages
2,680 (0.89/day)
Likes
693
Location
Reaching your left retina.
#96
I'm not bitching, you're bitching. :p I'm stating how I see it. Tell me, almighty one, do you work for Nvidia? Do you know if Nvidia is approaching the developers or if it's the other way around? Is Nvidia paying the developers to add features to the games for their cards, or are they throwing developers cash to keep features exclusive to their cards? Unless you work in one of their back rooms, you have no idea.
Do you know anything about the presumption of innocence? When there is no proof of guilt, the natural reaction is to presume innocence. That's what an honest, unbiased person thinks. And that's what any legal system is based on.

You have as much proof of that happening as I do of it not happening. Have you any proof, except that it runs better on one than the other? So you are right and I'm not, because...? Enlighten me.

I know Nvidia is playing fair, because that's what the developers say. When dozens of developers say that Nvidia is approaching them to help them develop, optimize and test their games, my natural reaction is to believe them, because I don't presume that everybody lies. I don't presume that all of them take money under the table. But most importantly, I know that at least one person, out of the 100 that form a game development team, out of the 100 developers that have worked under TWIMTBP, would have said something already if something shady was happening.

I've been part of a small developer that did small Java games, and I know how much it costs to optimize and make even a simple game bug-free, so it doesn't surprise me a bit that a game optimized on one card runs better than on one it wasn't optimized for*. That's the first thing that makes people jump on TWIMTBP, and since that's what they think, it doesn't surprise me that they've been claiming bad behavior because of it.

*It also happened that one game ran flawlessly on one cellphone and crashed on others, while both should have been able to run the same Java code.
 
Joined
Feb 23, 2008
Messages
511 (0.14/day)
Likes
95
Location
Montreal
System Name Sairikiki / Tesseract
Processor i7 920@3.56/ i5 4690k@4.2
Motherboard GB EX58-UDP4 / GB Z97MX-G5
Cooling H60 / LQ-310
Memory Corsair Something 12 / Corsair 16
Video Card(s) TriX 290 / Devil RX480
Storage Way too many...
Display(s) QNIX 1440p 96Hz / Sony w800b
Case AzzA 1000 / Carbide 240
Audio Device(s) Auzen Forte / board + Yamaha RX-V475 + Pioneer AJ
Power Supply Corsair HX750 / Dark Power PRO10
Software w10 64 / w10 64
Benchmark Scores I don't play benchmarks...
#97
...They're saying that it can run C/C++ and Fortran code directly. That means there's no need to deal with CUDA, OpenCL or DX11 compute shaders, because you could just program something in Visual Studio and the code would just work. I don't know if that's true, but it would be amazing.
As amazingly awesome as that is, save for some very rare cases I question the utility of this. Besides raising the transistor count (almost wrote trannies there...), what use is that to the gaming population? And I doubt even pros would have much reason to jump on it.

L1 and L2 caches sound like fun, though...
 
Joined
Feb 21, 2008
Messages
4,980 (1.39/day)
Likes
788
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor 6700K Skylake
Motherboard ASRock Z170 OC Formula
Cooling Corsair H100i, Panaflo's on case
Memory G.SKILL TridentZ 2x8GB DDR4 3400
Video Card(s) 2x EVGA ACX 2.0 Superclocked GTX 980 ti SLI'd
Storage 2TB Samsung 850 Pro , 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) Realtek® ALC1150 with logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G500 Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win7 x64 Professional
#98
I'm not bitching, you're bitching. :p (You are also walking on thin ice with that comment. I am free to express my views on this forum like anyone else.) I'm stating how I see it. Tell me, almighty one, do you work for Nvidia? Do you know if Nvidia is approaching the developers or if it's the other way around? Is Nvidia paying the developers to add features to the games for their cards, or are they throwing developers cash to keep features exclusive to their cards? Unless you work in one of their back rooms, you have no idea.

*Ugh, I'm sounding like a conspiracy theorist. I hate conspiracy theorists. I'll shut up and just not play this game. :toast:
Erocker, I think his "bitching" response was directed at a portion of the community as a whole. I know you feel a little burned by the trouble you had with PhysX configurations. It really did take extra time to program selective AA into the game; I think we all agree on that, right? With that being said, we know that Nvidia helped fund the creation of the game from the start. So if the developer says, "Nvidia, that money helped so much in meeting our production budget that we will engineer a special selective AA for your hardware line," is that morally wrong? Do you say Nvidia cannot pay for extra engineering to improve the experience of their own end users? If Nvidia said they would send ham sandwiches to everybody who bought an Nvidia card in the last year, would you say that's not fair unless they send them to ATi users too?

What it comes down to is that it was a game without selective AA. Nvidia helps develop the game and gets a special feature for their hardware line when running the game. Where is the moral dilemma?
 
Joined
Sep 11, 2009
Messages
2,680 (0.89/day)
Likes
693
Location
Reaching your left retina.
#99
As amazingly awesome as that is, save for some very rare cases I question the utility of this. Besides raising the transistor count (almost wrote trannies there...), what use is that to the gaming population? And I doubt even pros would have much reason to jump on it.

L1 and L2 caches sound like fun, though...
Who says that in 2009 a GPU is a gaming-only device? It never has been, anyway. Nvidia has the GeForce, Tesla and Quadro brands, and all of them are based on the same chip. As long as the graphics card is competitive, do you care about anything else? You shouldn't. And apart from this, the ability to run C++ code can help in all kinds of applications. Have you never encoded a video? Wouldn't you like to be able to do it 20x faster?

Pros, on the other hand, have 1000s of reasons to jump on something like this.
 

Tatty_One

Super Moderator
Staff member
Joined
Jan 18, 2006
Messages
19,765 (4.54/day)
Likes
6,036
Location
Worcestershire, UK
Processor Skylake Core i7 6700k @ 4.6gig
Motherboard MSI Z170A Tomahawk
Cooling Cooler Master Seidon 240V AIO/Viper140's
Memory 16GB Corsair Vengeance LPX 3000mhz CL14
Video Card(s) Sapphire 4gb R9 290X VaporX @1150mhz
Storage SkHynix SL308 120GB/CrucialM4/1TB WD Black
Display(s) LG 29inch 2560x1080 Curved Ultrawide IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Xifi Elite Pro 7.1/VideoLogic ZXR550's
Power Supply XFX Pro Black Edition 750W Gold modular
Keyboard CM Storm Octane Combo
Software Win 10 Home x64
Shame that bus isn't a full 512 bits wide. They've increased the bandwidth with GDDR5, yet taken some back with a narrower bus. Also, 384 bits has the consequence that the amount of RAM is that odd size like on the 8800 GTX, when it would really be best at a power of two.
GDDR5 effectively doubles the bandwidth in any case (unlike GDDR3); there is no card on the planet that will be remotely hampered by an effective 768-bit throughput. Really, with GDDR5 there is absolutely no need to go to the additional expense.
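The trade-off is easy to put numbers on. A rough sketch of the peak-bandwidth arithmetic (the clock figures are illustrative assumptions, since final GT300 memory speeds weren't public): a 384-bit GDDR5 bus at a modest effective rate still clears a 512-bit GDDR3 setup.

```python
def bandwidth_gb_s(bus_bits, effective_mt_s):
    """Peak memory bandwidth: bus width in bytes x effective transfer rate.

    effective_mt_s is the effective data rate in MT/s (GDDR5 moves four
    data words per command clock, GDDR3 two, so the same base clock
    yields roughly double the effective rate on GDDR5).
    """
    return bus_bits / 8 * effective_mt_s * 1e6 / 1e9

print(bandwidth_gb_s(384, 4000))  # 384-bit GDDR5 @ 4000 MT/s -> 192.0 GB/s
print(bandwidth_gb_s(512, 2000))  # 512-bit GDDR3 @ 2000 MT/s -> 128.0 GB/s
```

So even with the narrower bus, the GDDR5 card comes out well ahead, which is Tatty_One's point about the wider bus being an unnecessary expense.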