
NVIDIA GeForce 4XX Series Discussion


Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.23/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
but will they develop for the next xbox, . . . . I am very sure they won't.

This direction is a bit off from my intention. Who gets M$'s next console doesn't matter much to people on TPU or to the GTX300 series discussion, but it matters which company makes the best GPU to support the API M$ uses in its next console.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.62/day)
but it matters which company makes the best GPU to support the API M$ uses in its next console.

Yes, well that's the whole thing...DX11 will be six months old when Fermi finally launches, meaning that developers have had their hands on DX11 code for much, much longer (as well as ATi's DX11 hardware, which went out as early as April of last year), yet we know ATi does well with DX11...and nV's apparent lack of hardware tessellation is the nail in the coffin for their acceptance by M$ for console GPUs.

You've got to remember that before DX10 came out, nV was saying:

"Well, let's get something straight. Microsoft makes APIs (Application Programming Interfaces- Ed) not hardware. WGF is a specification for an API specification - it's software, not hardware.

"For them, implementing Unified Shaders means a unified programming model. Since they don't build hardware, they're not saying anything about hardware.


"Debating unified against separate shader architecture is not really the important question. The strategy is simply to make the vertex and pixel pipelines go fast. The tactic is how you build an architecture to execute that strategy. We're just trying to work out what is the most efficient way.

"It's far harder to design a unified processor - it has to do, by design, twice as much. Another word for 'unified' is 'shared', and another word for 'shared' is 'competing'. It's a challenge to create a chip that does load balancing and performance prediction. It's extremely important, especially in a console architecture, for the performance to be predicable. With all that balancing, it's difficult to make the performance predictable. I've even heard that some developers dislike the unified pipe, and will be handling vertex pipeline calculations on the Xbox 360's triple-core CPU."

- David Kirk

http://www.bit-tech.net/bits/2005/07/11/nvidia_rsx_interview/4

NOTE: WGF was DX10 before it was called DX10, for those unaware.
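To make the unified-versus-separate trade-off Kirk describes a bit more concrete, here is a toy back-of-the-envelope model (the unit counts and workloads are made up purely for illustration): a fixed split leaves one pool idle whenever a frame's vertex/pixel mix doesn't match the hardware split, while a unified pool soaks up whatever mix arrives, at the cost of the load-balancing headache he mentions.

```python
# Toy model of fixed (separate) vs. unified shader pools.
# All numbers are invented for illustration; work is in arbitrary units.

def frame_time_split(vertex_work, pixel_work, vertex_units=8, pixel_units=24):
    # Separate pipelines: each pool only chews through its own work type,
    # so the slower of the two pools sets the frame time.
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

def frame_time_unified(vertex_work, pixel_work, total_units=32):
    # Unified pool: every unit can run either work type ("shared" = "competing"),
    # so ideal load balancing just divides the total work by the total units.
    return (vertex_work + pixel_work) / total_units

frames = [(8, 96), (64, 32), (4, 160)]   # pixel-heavy, vertex-heavy, very pixel-heavy
for v, p in frames:
    print(f"vertex={v:>3} pixel={p:>3}  "
          f"split={frame_time_split(v, p):6.2f}  unified={frame_time_unified(v, p):6.2f}")
```

The unified numbers are the ideal case; Kirk's point is that real hardware only gets there if the load balancing actually works and stays predictable, which is exactly the part he calls hard.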



But, also in the same interview, from 4.5 years ago:
"Our commonality with Sony has led to a number of product areas that go beyond PlayStation 3. The business deal is structured so that both companies benefit. It's a really good realtionship."

So I put it to David that there might be more to come from Sony and NVIDIA: "The deal goes beyond PS3. The future is looking good."


I do not think nV is a contender at all for M$. DX10 being a failure has EVERYTHING to do with nV snubbing it during its inception, but that was a childish tactic to get revenge for M$ choosing ATi over nV for the current Xbox. You can literally see the bitterness in David's words. Anyway, if nV is gonna be in the next Sony box, it definitely WON'T be in M$'s.
 
Joined
Feb 21, 2008
Messages
4,985 (0.84/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
nvidia might just develop the chip for the next gen psp you never know

but will they develop for the next xbox, . . . . I am very sure they won't.

it definitely isn't going to be a G300 or a HD6000 because by the time those consoles come out those cards will be old

Nvidia would be a better fit for the next PSP, judging by the mobile-phone research and development they have been doing. When it comes to the PS4 and the next Xbox, I don't know.

I would assume it will be a split, with ATi developing for one and Nvidia for the other, since the volume of console chips needed would be too much for a single company to handle given how many of each console will be sold.
 

troyrae360

New Member
Joined
Feb 2, 2009
Messages
1,129 (0.20/day)
Location
Christchurch New Zealand
System Name My Computer!
Processor AMD 6400+ Black @ 3.5
Motherboard Gigabyte AM2+ GA-MA790X DS4
Cooling Gigabyte G-Power 2 pro
Memory 2x2 gig Adata 800
Video Card(s) HD3870x2 @ 900gpu and 999mem
Storage 2x wd raid edition 120gig + 1 samsung 320 + samsung 250
Display(s) Samsung 40inch series6 full HD 1080p
Case NZXT Lexa
Audio Device(s) ALC889A HD audio with Enables a Superior Audio Experience (on board)
Power Supply Vantec ION2+ 550w
Software Vista Home pream 64
Still no Fermi?
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.23/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000
I'm not going to ignore that ATI has a lead and is innovating on new APIs, but to keep the API discussion on topic with the GT300, I'm forced to point out that what we're really looking to learn is how well the GPU will do in the next most popular API. This is not an objective question.

It might be boring not to wave a flag around for one company or the other, but for the sake of keeping it on topic you can easily choose to agree or add something to the discussion of the importance of API standards. Anything is better than focusing on the already staggering lead ATI has in compliance with the new API.

This might seem like a stand-off, but if you posed an independent political thought, devoid of mainstream party ideals, to a group of people, would you want to be the guy who jumps in with information on "the party's" new policy? The fact is that an external influence will drive gaming and force change in the hardware to meet a demand set by that external influence. DX11 may not even take off, but it might. Food for thought: one side is a contender in the latest field and the other side is catching up. Does it make sense that whichever side performs better in the API most accepted by the console industry will end up the leader in PC gaming graphics? Do you have anything to say about the chance of DX11 taking hold of the console market? What are developers looking for as tools to succeed in the market? NV has obviously accepted, in action if not in press, that they are unable to resist changing with the standards set by M$. The GT300 is going to follow DX11 standards. Its performance, cost, and release date are all speculation for us. Is this significant? Will Fermi-based enthusiast GPUs theoretically offer superior support for games written for the most recent API?

It's a bit hard for me to come off as unbiased when I obviously have questions about the GT300, but there are underlying thoughts, most of which are best left to further speculation. Being closed-minded is useless when talking about the future; it's about as accurate as Tarot cards in the hands of a stoner at the beach.
 

Bo_Fox

New Member
Joined
May 29, 2009
Messages
480 (0.09/day)
Location
Barack Hussein Obama-Biden's Nation
System Name Flame Vortec Fatal1ty (rig1), UV Tourmaline Confexia (rig2)
Processor 2 x Core i7's 4+Gigahertzzies
Motherboard BL00DR4G3 and DFI UT-X58 T3eH8
Cooling Thermalright IFX-14 (better than TRUE) 2x push-push, Customized TT Big Typhoon
Memory 6GB OCZ DDR3-1600 CAS7-7-7-1T, 6GB for 2nd rig
Video Card(s) 8800GTX for "free" S3D (mtbs3d.com), 4870 1GB, HDTV Wonder (DRM-free)
Storage WD RE3 1TB, Caviar Black 1TB 7.2k, 500GB 7.2k, Raptor X 10k
Display(s) Sony GDM-FW900 24" CRT oc'ed to 2560x1600@68Hz, Dell 2405FPW 24" PVA (HDCP-free)
Case custom gutted-out painted black case, silver UV case, lots of aesthetics-souped stuff
Audio Device(s) Sonar X-Fi MB, Bernstein audio riser.. what??
Power Supply OCZ Fatal1ty 700W, Iceberg 680W, Fortron Booster X3 300W for GPU
Software 2 partitions WinXP-32 on 2 drives per rig, 2 of Vista64 on 2 drives per rig
Benchmark Scores 5.9 Vista Experience Index... yay!!! What??? :)
I suspect that if NV does develop a GT300-like architecture for the PS4, it will be on a 28-32nm process. This time around, Sony will be much more budget-minded and pressuring on NV to make a great, power-efficient chip that is not too costly to manufacture.

The chip that NV made for the PS3 was not any better than the one ATI made a year earlier for the Xbox 360. ATI's chip was roughly equivalent to an X1900 (not XT or XTX, just a plain X1900, like the All-in-Wonder version) plus tessellation and a bit of unified-shader support like the R600. NV's chip was like a vanilla 7900GT. Of course, the consoles had less memory overall, and the lightning-fast embedded RAM was on the Xbox 360's GPU, not the PS3's.

2011 is only a year from now, and we probably will not see the PS4 until 2012, as Sony works hard to ensure the console does not cost an arm and a leg like the PS3 did. The PS3 still has not broken even--not even with the recently released Slim version. 2012 leaves enough time for NV to design a budget GT400 chip (one generation after Fermi) for the PS4, developed on a slightly mature 28nm process. Sony is also working hard on improving the user-interaction experience on the PS4 after watching the runaway success of the Wii, which was so cheap to make, so do not expect Sony to rush it at all.

I expect the graphics race to continue between the next Xbox and the PS4, because more people will have HDTVs and everybody will want to game at 1080p with great graphics (most Xbox 360 and PS3 games are not true 1080p and do not even have antialiasing). Since the PS3 had a hard time competing with the Xbox 360, even though it was supposed to have a one-year advantage in technology (the way the PS2 buried Sega's Dreamcast, which had launched a year earlier), Sony will definitely not let NV have it as easy this time around when bargaining for the PS4.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.62/day)

:rockout:

Much respect to you, sir. That's some astute thinking right there.

My point, really, in all that, was that nV has already snubbed the API by going without hardware tessellation, as well as that nV's sights are set on Sony, not M$. So there's only so much speculation to be had...as you said, there are a lot of factors to be considered in ALL the markets that Fermi will be launched in.

But in the console market, things aren't so tied to APIs or anything like that when it comes to Sony....I seem to recall a "We didn't want programming the PS3 to be easy", and given that those comments from years ago about unified graphics still seem to ring true from nV ("Making GPUs is DAMN hard"), I do feel there is far more info out there about what Fermi and its derivatives will bring than most admit, but companies' policies of "we do not comment on unreleased products" get in the way of the real hard facts coming to light. APIs don't matter to nV, as quoted in that article...they just want to bring the best solution and the best performance they can...and if they cannot do it within a specific API, then they won't. But that's not gonna stop them from releasing products...and thankfully they have the established market, as well as the seeded staff at development houses, to ensure they do well.

I mean, you can read the whitepaper from the nV site if you want specifics. I have, and it sounds good to me.

I may post about ATi stuff primarily, but that doesn't mean I don't have nV products in my house...I most certainly do use nV products, but don't spend as much time with them, so choose not to discuss them too much.

This gen though..."S3D is for me". I'm buying Fermi-based products...and my current interest is merely to find out how much I'm gonna need to set aside for them...and how many I'm gonna need to get the performance I desire.


nV, as a business, is tough to knock. I must give :respect: to Jen Hsun, because he does a fantastic job at bringing success to himself and those around him. With that in mind, I have no doubt Fermi will be a success...and I'll help contribute to that success...I just want to know how much it's gonna cost, because I'm in whether it beats ATI or not. I don't just want performance...I want a great gaming experience.
 
Joined
Oct 6, 2009
Messages
2,820 (0.53/day)
Location
Midwest USA
System Name My Gaming System
Processor Intel i7 4770k @ 4.4 Ghz
Motherboard Asus Maximus VI Impact (ITX)
Cooling Custom Full System Water cooling Loop
Memory G.Skill 1866 Mhz Sniper 8 Gb
Video Card(s) EVGA GTX 780 ti SC
Storage Samsung SSD EVO 120GB - Samsung SSD EVO 500GB
Display(s) ASUS W246H - 24" - widescreen TFT active matrix LCD
Case Bitfenix Prodigy
Power Supply Corsair AX760 Modular PSU
Software Windows 8.1 Home Primeum
This post has been a long time formulating, and I welcome any criticisms.

How many of us have gotten at least 3 different claims about the performance or release of this card? Cynically, I've decided that I'm not going to bat my eyelashes at any claims that come out of CES. There's bound to be a little more truth circling the bowl, but most people will excuse me if I assume the cycle of bullsh!t has yet to flush. I'm not sure if the majority of posters/readers will excuse my overall indifference, because that isn't very exciting. Likewise, it's not hard to speculate that NV may have a true performer to take a crown in 2010, but ATI has a firm place in this generation's line-up, which could mean good or bad things in the future. With the downturn of the global economy there is enough of a depressant force on a number of software companies to make them recycle old engines or adopt some sort of broad design utility. The mainstream GPUs will see more action than chopsticks during Chinese New Year. I think it's wise to assume that we're getting dangerously close to a point where GPUs must offer stellar performance in a new API, because Microsoft not only authors the DX runtimes, they are also a console competitor. Realistically (and correct me if I'm wrong) they're going to merge development of their runtimes with console development. The paradigm shift will come when enough of the software industry is willing to move.

If you accept any of these ideas then I offer a summary of my thoughts.

-The GT300/GT100 series cards are going to take a crown in performance, but this generation will offer little more than a spitting contest between ATI/NV.
-3D environment software development will become further compartmentalized, and game developers will buy into a smart, economical standard before leaning head-on into a new API that is not yet mature/affordable in hardware support.
-Microsoft (3v!L3) will most likely decide which generation of GPU holds the standard, for a lifespan determined by their next console.

I'm a bit off topic, and a little on topic. It's pretty obvious, but I figured this is a nice mix of topics, all rooted in the importance of the GT300.

I would agree with most of your post, especially the part about there not being much difference between the performance of ATI and Nvidia this time around. I also agree that the global economy will play a role in the kind of cards and chips we see released in the near future.
That last part I will comment on a little myself. Regardless of whether Nvidia takes the performance crown back this round, I think control of the market will go to whichever card gives the best performance for the price (again, because of the global economy). So if that is the case and Nvidia follows its usual habit of releasing uber-expensive cards, then I believe ATI will give Nvidia a whooping this round.
I am sure that Fermi is a great and really powerful card. Let's also pretend for a second that it really can beat a 5870 by 48%. That would be some amazing technology, but it would also be some very expensive technology. With the amount of time and money Nvidia has spent on Fermi, I seriously doubt it is going to be an affordable card that can compete with the 5870 on price per performance. Sure, there are a lot of enthusiasts out there who want a card like the one I described, but in today's world how many of those same enthusiasts can afford it? I would bet a lot fewer than there used to be.
So unless Nvidia brings out a very cheap, great-performing card...... I think Nvidia might have some problems.
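To put the price-per-performance point in rough numbers: if the rumored 48% lead over the 5870 were real, there is a simple break-even price above which the 5870 still wins on pure value. A minimal sketch, where the 5870 price is just an assumed round figure for illustration, not a quote:

```python
# Break-even price for equal performance-per-dollar.
# Both numbers below are assumptions: the 48% lead is the rumor discussed
# above, and the HD 5870 price is an illustrative round figure.
hd5870_price = 400.0      # assumed price in USD
fermi_speedup = 1.48      # rumored performance advantage over the HD 5870

breakeven = hd5870_price * fermi_speedup
print(f"Equal perf/$ only if Fermi sells at or below ~${breakeven:.0f}")
```

Anything above that break-even figure and the 5870 wins on raw value, which is basically the argument being made here.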

The rest of your comment resounds loudly in my head...but knowing that ATi hardware was used for development of DX9, DX10, and now DX11, I find it hard to believe that nVidia has any chance in the console market, as they've snubbed M$ too many times.

Now, this comment above was made by someone else...but I agree with him more than I agree with your views. ATI has had a lock on the console market for quite some time, and I really don't see Nvidia being able to take that away from them.
Perhaps that is why they are going so hard after GPU compute. That, in my eyes, is still a very open market that Nvidia has a wonderful chance of taking.

Well, these are just some of my ideas; take them for what you will.


Yes, well that's the whole thing...DX11 will be six months old when Fermi finally launches, meaning that developers have had their hands on DX11 code for much, much longer (as well as ATi's DX11 hardware, which went out as early as April of last year), yet we know ATi does well with DX11...and nV's apparent lack of hardware tessellation is the nail in the coffin for their acceptance by M$ for console GPUs.

You've got to remember that before DX10 came out, nV was saying:

That's what I thought. Nvidia doesn't have anything for hardware tessellation at the moment, do they? :)
 

Binge

Overclocking Surrealism
Joined
Sep 15, 2008
Messages
6,979 (1.23/day)
Location
PA, USA
System Name Molly
Processor i5 3570K
Motherboard Z77 ASRock
Cooling CooliT Eco
Memory 2x4GB Mushkin Redline Ridgebacks
Video Card(s) Gigabyte GTX 680
Case Coolermaster CM690 II Advanced
Power Supply Corsair HX-1000

;) The teams are happy to have you as a consumer. Thanks for understanding, and especially for not taking it the wrong way.

P.S. I do love my 5XXX series cards! RED POWER :laugh:

You're particularly observant when it comes to the long-running trend of hardware folks trying to get ahead of the software guys. Adhering to hardware logic isn't as profitable as making things easy on the software side. Devs today need to be catered to as much as we end users do. It's just that complex.
 
Joined
Jan 26, 2009
Messages
488 (0.09/day)
I am also worried about the price of Fermi...
I hope a single GF100 model will compete with the 5850 and 5770,
and a single GF104 model (Q2 2010) will compete with the 5870 and 5970.
 

bobzilla2009

New Member
Joined
Oct 7, 2009
Messages
455 (0.09/day)
System Name Bobzilla the second
Processor AMD Phenom II 940
Motherboard Asus M3A76-CM
Cooling 3*120mm case fans
Memory 4GB 1066GHz DDR2 Kingston HyperX
Video Card(s) Sapphire Radeon HD5870 1GB
Storage Seagate 7200RPM 500GB
Display(s) samsung T220HD
Case Guardian 921
Power Supply OCZ MODXSTREAM Pro 700w (2*25A 12v rail)
Software Windows 7 Beta
Benchmark Scores 19753 3dmark06 15826 3dmark vantage 38.4Fps crysis benchmarking tool (1680x1050, 4xAA)
I would bet that Fermi will cost an arm and a leg. I would imagine a £400 price tag MINIMUM for the GTX380, probably more like £450-£500, and around £300-£350 for the GTX360. I also don't think the performance gain will be worth the premium, but it's always the same with Nvidia for the first few months :) I would hope for a lower price to drive down HD5xxx prices, but given the huge delay on Fermi, I can't see nV feasibly being able to do so. However, if these two cards do compete with the 5970 and 5870 respectively on performance, it could be very interesting indeed.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.62/day)
Personally, I think the addition of PhysX (even though I hate that it's a closed API, and that they won't license it to ATI) is justification enough for higher prices over the competition.

I mean sure, ATI supports Havok GPU physics, but nearly no one uses it in their apps. PhysX has pretty good market saturation, and it does bring added features that ATi has no claim to.

So, slightly more performance, and more features = higher prices. Doesn't bother me one bit.

And while I may knock nV for not supporting DX10, I completely understand why they took so long; but I am also very much aware that the present install base for DX11 includes all Vista and Win7 machines, whereas DX10 had just the few Vista boxes that sold.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
and that they won't license it to ATI)

This has been said to death, but Nvidia not only wanted to license PhysX to AMD, they offered it for free!! It was AMD who said no to PhysX in the first place, probably because they were (and still are, tbh) significantly behind in GPGPU. Adopting PhysX would have opened the door to PhysX benchmarks, in which Nvidia would stomp ATi, just like they do in F@H, for example. On the other hand, ATi cards were more than capable of handling the PhysX present in games, because those effects can easily run on an 8400 GS, so if this had really been about giving the best to customers, they would have just accepted. Shady business practices go both ways, sadly...

On topic

Regarding the Fermi numbers posted above, once again, they are more than believable and match the numbers I have predicted based on how past Nvidia cards have scaled according to their specs.

Seriously, I don't know why people are so reluctant to believe that Fermi is going to be twice as fast as a GTX285, when its specs point to a card that is 2.5 times faster. The only reason people give is "what happened in the past" with GT200 vs. RV770... Well, RV770 was the best ATi card in years, while GT200 was a semi-failure that didn't meet expectations at all: they had to decrease the SP count early in the development of the chip, missed target clocks by at least 10%, used memory that was half the speed of the competition's, and built it on 65nm because they were too cautious and scared (Huang's own words, that last one). Those things made GT200 far slower than it should have been in Nvidia's mind. Remember, one of their guidelines is "twice the performance every 12-18 months". Fermi has none of those problems: they went with the new process, they went for the high SP count they truly wanted (2.15 times a GTX285, to be precise), and according to the various specs posted from different sources, they met the clocks (650 core/1700 SPs). Those things make the chip "big" (smaller than GT200, though, take that into account too) and difficult to produce on the ailing 40nm process, but that's the only drawback of their decision. This time they took the risk of making the chip they wanted and didn't cut anything, something they did do with GT200, and they are truly paying for that decision in the form of a delay. But the rest is not going to fail just because some people want it to fail. Common sense and history tell us that Fermi will be around twice as fast as a GTX285, because its specs say so and there's not a single thing that points to the contrary.
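For what it's worth, that "2.5x on paper" falls straight out of the raw shader math. A minimal sketch, using the GTX 285's stock shader specs and the rumored Fermi figures quoted in this thread (512 SPs at roughly 1700 MHz, which are assumptions, not confirmed specs):

```python
# Rough ALU-throughput comparison: shader count x shader clock.
# GTX 285 stock specs: 240 SPs at a 1476 MHz shader clock.
# Fermi figures are the rumored ones from this thread: 512 SPs at ~1700 MHz.

def alu_throughput(sp_count, shader_clock_mhz):
    # Relative measure only; ignores memory bandwidth, ROPs, drivers, etc.
    return sp_count * shader_clock_mhz

gtx285 = alu_throughput(240, 1476)
fermi = alu_throughput(512, 1700)

print(f"Rumored Fermi vs. GTX 285, raw ALU throughput: {fermi / gtx285:.2f}x")
# Prints about 2.46x, which is where the ~2.5x-on-paper figure comes from.
```

Of course raw ALU throughput says nothing about bandwidth, ROPs or drivers, which is why the expectation in the post is "around twice" rather than the full 2.5x.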
 
Joined
Feb 21, 2008
Messages
4,985 (0.84/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
What are you doing Benetanegia! This thread is for Nvidia bashing and not rational thought. How dare you turn it intellectual again. :laugh:

BTW, thank you to those who are still throwing ideas and info out there. Many people (myself included) are looking for good analysis from those who know what they are talking about. That's hard to find on the internet these days. Thank you.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.62/day)
they offered it for free!!
You don't know the full details of how that deal was to be made, and what stipulations would have been placed on AMD. I don't either, exactly, but I definitely would like to know. Seems to me that the terms of such a deal were something AMD would not accept, and they were left waiting for Havok, which is developed by Intel, also their competitor. Fact of the matter is that if AMD/ATI could take the exclusivity of PhysX away from nV, it would hurt nV's sales.

Case in point: as soon as nV took over Ageia, PhysX drivers disabled PhysX with ATI cards as the primary display device...even with the add-in Ageia cards. And now that these updated libraries ship with games, and install with the game, nV has completely broken PhysX for everyone but themselves.

If they truly wanted to offer it for free, why'd they break it with the add-in Ageia cards? There's not even any stability testing or qualification needed...that was already done...but I cannot play PhysX in GRAW 1 & 2 any more, and I used to be able to...

Why'd they cut ATI from anti-aliasing in Batman:AA?

It's business, and giving your stuff away for free isn't good business. And nV is an OVERLY good business.

On topic

Regarding the Fermi numbers posted above, once again, they are more than believable and match the numbers I have predicted based on how past Nvidia cards have scaled according to their specs.

Seriously, I don't know why people are so reluctant to believe that Fermi is going to be twice as fast as a GTX285, when its specs point to a card that is 2.5 times faster. The only reason people give is "what happened in the past" with GT200 vs. RV770... Well, RV770 was the best ATi card in years, while GT200 was a semi-failure that didn't meet expectations at all: they had to decrease the SP count early in the development of the chip, missed target clocks by at least 10%, used memory that was half the speed of the competition's, and built it on 65nm because they were too cautious and scared (Huang's own words, that last one). Those things made GT200 far slower than it should have been in Nvidia's mind. Remember, one of their guidelines is "twice the performance every 12-18 months". Fermi has none of those problems: they went with the new process, they went for the high SP count they truly wanted (2.15 times a GTX285, to be precise), and according to the various specs posted from different sources, they met the clocks (650 core/1700 SPs). Those things make the chip "big" (smaller than GT200, though, take that into account too) and difficult to produce on the ailing 40nm process, but that's the only drawback of their decision. This time they took the risk of making the chip they wanted and didn't cut anything, something they did do with GT200, and they are truly paying for that decision in the form of a delay. But the rest is not going to fail just because some people want it to fail. Common sense and history tell us that Fermi will be around twice as fast as a GTX285, because its specs say so and there's not a single thing that points to the contrary.

Yield issues, and delayed launches (two now: first with Win7, and secondly, November). If it wasn't for the yield issues ATI is having, even with a lower transistor density, there'd be no questions about Fermi. Jen Hsun holding up a fake card didn't help either.

None of the delays are truly 100% on nV, but their design choices have affected yields far more than ever before. ATi probably would have been just as complex, but had prior experience with the process, which dictated how their current gen was designed. None of what ATI has done really reflects on what nV is doing; however, the differences in design say a lot about what's affecting nV currently. nV has already admitted they need 0% leakage to hit their targets, which I linked to earlier in the thread.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Fact of the matter is that if AMD/ATI could take the exclusivity of PhysX away from nV, it would hurt nV's sales.

Case in point: as soon as nV took over Ageia, PhysX drivers disabled PhysX with ATI cards as the primary display device...even with the add-in Ageia cards. And now that these updated libraries ship with games, and install with the game, nV has completely broken PhysX for everyone but themselves.

If they truly wanted to offer it for free, why'd they break it with the add-in Ageia cards? There's not even any stability testing or qualification needed...that was already done...but I cannot play PhysX in GRAW 1 & 2 any more, and I used to be able to...

They disabled it LONG after the acquisition, a year and a half after the acquisition, and the reason they disabled it was lack of support and QA from AMD's end. I'm not going to discuss what has been discussed like 1000 times already. Lack of support for QA is more than enough reason not to implement something. GPU and PhysX drivers have to work together. What was Nvidia supposed to do? Reverse-engineer every single new AMD driver? Put spies in AMD HQ months in advance, so that they could start working on PhysX compatibility for new AMD cards (i.e. RV870) with enough time to make it work properly? They have a lab with 500 PCs to test stability and QA on Nvidia cards; should they pay for another 500 so that things work well with AMD, or should they let an incomplete solution be widely available and take the blame when it doesn't work? Did they really have to pay so that it worked on the competitor's hardware when the competitor didn't care at all from the beginning? Was AMD willing at least to pay for that QA and give some info on new releases so that everything could go smoothly on AMD's end? The answer is NO, NO, NO and NO to everything. Hence the only thing they could do was disable PhysX except on Nvidia cards, or keep paying to do AMD a service, plain and simple.

Why'd they cut ATI from anti-aliasing in Batman:AA?

Same as above. Batman uses Unreal Engine 3, and UE3 has no built-in AA. Nvidia paid for and QA'd a special AA path for Nvidia cards. The developer contacted AMD to do the same for AMD hardware, but AMD refused to work with the developer from the beginning (and not only regarding that feature) because it was a TWIMTBP game. End of story: you don't help, you don't get. And that's something Nvidia had nothing to do with.

It's business, and giving your stuff away for free isn't good business.

It's absolutely good business when you know you are faster at that specific thing AND when that thing has the possibility of making you sell a second card per computer. Not to mention setting a precedent for the power of GPGPU, when your next release is heavily based on GPGPU.

Yield issues, and delayed launches (two now: first with Win7, and secondly, November). If it wasn't for the yield issues ATI is having, even with a lower transistor density, there'd be no questions about Fermi. Jen Hsun holding up a fake card didn't help either.

None of the delays are truly 100% on nV, but their design choices have affected yields far more than ever before. ATi probably would have been just as complex, but had prior experience with the process, which dictated how their current gen was designed. None of what ATI has done really reflects on what nV is doing; however, the differences in design say a lot about what's affecting nV currently. nV has already admitted they need 0% leakage to hit their targets, which I linked to earlier in the thread.

Bad 40nm is a major issue that has been heavily underestimated. Sure, the nature of the chip has its influence, but 90% of the blame is on TSMC. You need an absolute minimum of 2 years to design a new chip. If you are told that you will have a working process (which implies good or decent yields) by date X, you develop for that process. If you design a sports car that is supposed to be used on highways, and when you arrive there's no asphalt and your car is shit on dirt... well, I'm sure there are plenty of interpretations of that, but I know what to think. Fermi was designed for a mature (1-1.5 year old, 80-90% yields expected) 40nm process and instead found a 30-40% yield process; who's really to blame? You can design your chip allowing for a shortfall that puts yields at 60-70%, but 30-40%, come on... Fact of the matter is that AMD has serious problems with the process too, to the point of selling a lame 300,000 HD58xx cards in 3 months when the chip has been in production since June (6 months of production = 300,000 chips = shameful on TSMC's part*).

*Just to show how low a number of cards that is: in comparison, when the 8800 GT and HD3xxx cards were released in Q4 2007 (same quarter, holidays, important), the number of discrete cards sold jumped from a typical 20-25 million units to a whopping 31 million, which suggests lots and lots (millions) of those new cards were sold, and remember that a shortage was claimed for those too. It clearly was a different kind of shortage, though...

Source: http://www.xbitlabs.com/news/video/...rds_Continue_to_Drop_Jon_Peddie_Research.html
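The die-size side of that argument can be made concrete with the textbook Poisson yield model, Y = exp(-A x D0): at the same defect density, the bigger die always takes a disproportionately larger yield hit. A rough sketch, where the die areas and defect densities are illustrative assumptions rather than TSMC data:

```python
import math

def poisson_yield(die_area_mm2, defects_per_cm2):
    # Classic Poisson yield model: Y = exp(-A * D0), with A in cm^2.
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

# Assumed die areas for illustration: Cypress ~334 mm^2, a Fermi-sized die ~530 mm^2.
for d0 in (0.1, 0.3, 0.6):   # defects per cm^2: healthy process -> troubled process
    cypress = poisson_yield(334, d0)
    fermi = poisson_yield(530, d0)
    print(f"D0={d0:.1f}/cm^2  Cypress ~{cypress:.0%}  Fermi-sized ~{fermi:.0%}")
```

The exact percentages don't matter; the point is that when the process underdelivers on defect density, the 500+ mm^2 chip falls off a cliff well before the ~330 mm^2 one does, which is why the same sick process hurts Fermi more than Cypress.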
 
Joined
Jul 2, 2008
Messages
3,638 (0.63/day)
Location
California
Nvidia helping to bring PhysX to ATi Cards
http://www.bit-tech.net/news/2008/07/09/nvidia-helping-to-bring-physx-to-ati-cards/1

AMD comments on NVIDIA dropping PhysX support when ATI hardware is present
http://tech.icrontic.com/news/amd-c...ng-physx-support-when-ati-hardware-is-present

AMD would rather live with Intel's Havok than nVIDIA's PhysX :laugh:.

NVIDIA is one big bad boy with a healthy relationship with game developers. There are more TWIMTBP games than AMD games. That's one of the reasons drivers are usually not the problem with NVIDIA cards.

Fermi won't launch anytime soon, and there aren't any DX11 games that I want right now. Battlefield: Bad Company 2 is coming out in March; hope Fermi comes out around the same time :laugh:.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.62/day)

Firstly, please do not think I'm bashing nV here...that's not my intent.

Secondly, when your foundry has NEVER produced a chip with zero leakage, expecting them to be able to is foolish at best and outright stupid at worst.


the reason they disabled it was lack of support and QA from AMD's end

Explain GRAW 1 & 2 then. Explain 3DMark Vantage. There was no QA to be done. Ageia had already done all of that, and I personally know when and how it transpired, as I bought a PhysX card very shortly after release. It worked fine until the new libraries were released, and the very FIRST nV PhysX driver broke it. Sure, it took some time before this happened, but we got nV guys saying "First, we make sure that what we release doesn't break anything", and that couldn't be further from the truth when talking about titles released before nV took over. Now, NOTHING works. Breaking that function by disabling it required no cash, but they could just as easily have written code to check whether it was an Ageia card and pop up a disclaimer that functionality could not be guaranteed, rather than allowing the driver for the Ageia card to install, look like it's working, and then have it do NOTHING.

However, an intrepid team of software developers over at NGOHQ.com have been busy porting Nvidia's CUDA based PhysX API to work on AMD Radeon graphics cards, and have now received official support from Nvidia - who is no doubt delighted to see its API working on a competitor's hardware (as well as seriously threatening Intel's Havok physics system.)

This cost nV money (the "support")...they actually paid the NGO guys off to stop development and threatened lawsuits upon release. Here's your cost-less solution for ATI cards, and nV squashed it like a bug. So much for offering it for free...it cost money to ensure that AMD did NOT get it for free.

If you go back to then (the first nV driver for PhysX, August 2008) and check my posts on XS, you'll find both myself and others complaining about the issue. I just find it comical that it took this long for the media to pick it up...nV breaking PhysX with ATI cards isn't something new...far from it. They have just recently completely disabled it with ATI and nV cards together, but they broke the Ageia cards almost 1.5 years ago...they bought Ageia in February 2008, and by August had released the driver that broke the add-in Ageia cards; a short 6 months, not 1.5 years. And the only way to fix the function, after installing the nV driver, was a full OS re-install...I spent many months on this issue back then, then finally gave up and sold my Ageia card.




When it comes to their own cards, paired with ATI cards, and GPU PhysX, I can find no fault, but disabling configs that had already been tested and were working is inexcusable.


As to the yields...if Fermi wasn't so complex, we'd have had a release of the lower-bin models by now, but even that hasn't happened, so yes, this issue is VERY underestimated. But shortly after Jen Hsun visited TSMC in person (first half of October), we had TSMC's CEO publicly state that they had yield issues and that they were fixed. We are now almost 3 months later, and given the supply issues ATI is having, clearly the problem was only fixed a couple of weeks ago, if at all.
 

Attachments: DSC03217a.jpg (84.8 KB)

Bo_Fox

New Member
Joined
May 29, 2009
Messages
480 (0.09/day)
Location
Barack Hussein Obama-Biden's Nation
System Name Flame Vortec Fatal1ty (rig1), UV Tourmaline Confexia (rig2)
Processor 2 x Core i7's 4+Gigahertzzies
Motherboard BL00DR4G3 and DFI UT-X58 T3eH8
Cooling Thermalright IFX-14 (better than TRUE) 2x push-push, Customized TT Big Typhoon
Memory 6GB OCZ DDR3-1600 CAS7-7-7-1T, 6GB for 2nd rig
Video Card(s) 8800GTX for "free" S3D (mtbs3d.com), 4870 1GB, HDTV Wonder (DRM-free)
Storage WD RE3 1TB, Caviar Black 1TB 7.2k, 500GB 7.2k, Raptor X 10k
Display(s) Sony GDM-FW900 24" CRT oc'ed to 2560x1600@68Hz, Dell 2405FPW 24" PVA (HDCP-free)
Case custom gutted-out painted black case, silver UV case, lots of aesthetics-souped stuff
Audio Device(s) Sonar X-Fi MB, Bernstein audio riser.. what??
Power Supply OCZ Fatal1ty 700W, Iceberg 680W, Fortron Booster X3 300W for GPU
Software 2 partitions WinXP-32 on 2 drives per rig, 2 of Vista64 on 2 drives per rig
Benchmark Scores 5.9 Vista Experience Index... yay!!! What??? :)
They disabled it LONG after the acquisition, a year and a half after the acquisition, and the reason they disabled it was lack of support and QA from AMD's end. I'm not going to discuss what has been discussed like 1000 times already. Lack of support for QA is more than enough reason not to implement something. GPU and PhysX drivers have to work together. What was Nvidia supposed to do? Reverse-engineer every single new AMD driver? Put spies in AMD HQ months in advance, so that they could start working on PhysX compatibility for new AMD cards (i.e. RV870) with enough time to make it work properly? They have a lab with 500 PCs to test stability and QA on Nvidia cards; should they pay for another 500 so that things work well with AMD, or should they let an incomplete solution be widely available and take the blame when it doesn't work? Did they really have to pay so that it worked on the competitor's hardware when the competitor didn't care at all from the beginning? Was AMD willing at least to pay for that QA and give some info on new releases so that everything could go smoothly on AMD's end? The answer is NO, NO, NO and NO to everything. Hence the only thing they could do was disable PhysX except on Nvidia cards, or keep paying to do AMD a service, plain and simple.

QFT.. most companies do not want to keep on supporting legacy hardware for compatibility reasons. Even ATI has dropped support for their X1900XTX with newer drivers, while NV continues to support cards older than that (6800 Ultra), IIRC.. if I'm wrong, then please correct me.

Same as above. Batman uses Unreal Engine 3, and UE3 has no built-in AA. Nvidia paid for and QA'd a special AA path for Nvidia cards. The developer contacted AMD to do the same for AMD hardware, but AMD refused to work with the developer from the beginning (and not only regarding that feature) because it was a TWIMTBP game. End of story: you don't help, you don't get. And that's something Nvidia had nothing to do with.

ATI did add AA support for the UE3 engine a month or two after the game was released. It was causing quite a stir in the community, so ATI eventually did it, back in the HD2900XT days. Now, there are several reports that many UE3-engine games do not have AA with ATI cards, so I do not know for sure. I haven't tried a UE3 game (other than Batman, which obviously isn't supported for AA) with my 4870 and recent drivers--should I check to make sure for you guys?

Batman: AA is a much, much blander game without PhysX--even more so than Mirror's Edge without PhysX.


It's absolutely good business when you know you are faster at that specific thing AND when that thing has the possibility of making you sell a second card per computer. Not to mention setting a precedent for the power of GPGPU, when your next release is heavily based on GPGPU.

Good point there... but since Nvidia basically owned PhysX after buying Ageia, ATI probably did not want to fall into the trap of being charged royalty fees later on (there are probably millions of lines of fine print in the business stipulations that you have not read). NV cannot be expected to unconditionally give it away for free for a period of, say, 10 years.

Bad 40nm is a major issue that has been heavily underestimated. Sure, the nature of the chip has its influence, but 90% of the blame is on TSMC. You need an absolute minimum of 2 years to design a new chip. If you are told that you will have a working process (which implies good or decent yields) by date X, you develop for that process. If you design a sports car that is supposed to be used on highways, and when you arrive there's no asphalt and your car is shit on dirt... well, I'm sure there are plenty of interpretations of that, but I know what to think. Fermi was designed for a mature (1-1.5 year old, 80-90% yields expected) 40nm process and instead found a 30-40% yield process; who's really to blame? You can design your chip allowing for a shortfall that puts yields at 60-70%, but 30-40%, come on... Fact of the matter is that AMD has serious problems with the process too, to the point of selling a lame 300,000 HD58xx cards in 3 months when the chip has been in production since June (6 months of production = 300,000 chips = shameful on TSMC's part*).

*Just to show how low a number of cards that is: in comparison, when the 8800 GT and HD3xxx cards were released in Q4 2007 (same quarter, holidays, important), the number of discrete cards sold jumped from a typical 20-25 million units to a whopping 31 million, which suggests lots and lots (millions) of those new cards were sold, and remember that a shortage was claimed for those too. It clearly was a different kind of shortage, though...

Source: http://www.xbitlabs.com/news/video/...rds_Continue_to_Drop_Jon_Peddie_Research.html

+1 on that. I do not expect the 28nm process to be any easier than 40nm. I still hold to my hypothesis that a 22nm (or 18-ish nm) process is probably going to be the last process we'll see this decade (yes, the new decade). It will need materials other than Si-Ge, different fab tech (extreme UV lithography), and there might be ways to make up for the "choke", like when clock speeds stopped increasing so quickly after 2004 and the first dual-core CPUs appeared--but in this case stackable dies or "distributed" silicon to fix the super-dense problem of hot spots in one bundle.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Firstly, please do not think I'm bashing nV here...that's not my intent.

Secondly, when your foundry has NEVER produced a chip with zero leakage, expecting them to be able to is foolish at best and outright stupid at worst.




Explain GRAW 1 & 2 then. Explain 3DMark Vantage. There was no QA to be done. Ageia had already done all of that, and I personally know when and how it transpired, as I bought a PhysX card very shortly after release. It worked fine until the new libraries were released, and the very FIRST nV PhysX driver broke it. Sure, it took some time before this happened, but we got nV guys saying "First, we make sure that what we release doesn't break anything", and that couldn't be further from the truth when talking about titles released before nV took over. Now, NOTHING works. Breaking that function by disabling it required no cash, but they could just as easily have written code to check whether it was an Ageia card and pop up a disclaimer that functionality could not be guaranteed, rather than allowing the driver for the Ageia card to install, look like it's working, and then have it do NOTHING.

This cost nV money (the "support")...they actually paid the NGO guys off to stop development and threatened lawsuits upon release. Here's your cost-less solution for ATI cards, and nV squashed it like a bug. So much for offering it for free...it cost money to ensure that AMD did NOT get it for free.

If you go back to then (the first nV driver for PhysX, August 2008) and check my posts on XS, you'll find both myself and others complaining about the issue. I just find it comical that it took this long for the media to pick it up...nV breaking PhysX with ATI cards isn't something new...far from it. They have just recently completely disabled it with ATI and nV cards together, but they broke the Ageia cards almost 1.5 years ago...they bought Ageia in February 2008, and by August had released the driver that broke the add-in Ageia cards; a short 6 months, not 1.5 years. And the only way to fix the function, after installing the nV driver, was a full OS re-install...I spent many months on this issue back then, then finally gave up and sold my Ageia card.

http://forums.techpowerup.com/attachment.php?attachmentid=31890&stc=1&d=1262819541


When it comes to their own cards, paired with ATI cards, and GPU PhysX, I can find no fault, but disabling configs that had already been tested and were working is inexcusable.


As to the yields...if Fermi wasn't so complex, we'd have had a release of the lower-bin models by now, but even that hasn't happened, so yes, this issue is VERY underestimated. But shortly after Jen Hsun visited TSMC in person (first half of October), we had TSMC's CEO publicly state that they had yield issues and that they were fixed. We are now almost 3 months later, and given the supply issues ATI is having, clearly the problem was only fixed a couple of weeks ago, if at all.

The Ageia card was broken EVEN if you had an Nvidia card; it had nothing to do with having an ATi card. They said they would not support the card from the beginning. Face it, not even Ageia had been supporting the card for a long time, and they were bankrupt. You got 6 months of support instead of nothing; you should be more than happy. Not that it really matters much in the big scheme of things: fewer than 100,000 Ageia cards were sold. Sorry for those who actually bought one, but 100,000 people, when you have 500 million to look after, is not financially viable for anyone.

This cost nV money (the "support")...they actually paid the NGO guys off to stop development and threatened lawsuits upon release. Here's your cost-less solution for ATI cards, and nV squashed it like a bug. So much for offering it for free...it cost money to ensure that AMD did NOT get it for free.

BS. Or do you have any proof? Actually, Nvidia wanted it implemented and AMD didn't, to the point of not even sending an HD4xxx card to test on. A card!! FFS!! AMD scared them with lawsuits. I know Nvidia didn't, because PhysX was actually free for everyone, and many, many developers were using it and can vouch for that. Nvidia would lose any lawsuit based on that. And they were paid? Yeah, as much as TWIMTBP games are paid to break features on AMD. BS after BS. Seriously, if anything like that had ever happened, we would have a lawsuit against Nvidia already, or one developer (a person, not a company) who would have said something. Thousands of developers working under TWIMTBP and no one said anything? People who can be fired, and who by statistics HAVE been fired, and no one takes their revenge with something so simple? There's nothing there, except lies from those who have something to gain from a bad image of Nvidia, yep, ATi/AMD. BS, BS, BS...
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.62/day)
The Ageia card was broken EVEN if you had an Nvidia card; it had nothing to do with having an ATi card.
Worked fine for me with nV cards.:confused:

And I thought it took 1.5 years? Now the story changes?

:laugh:

J/K. Just bugging ya, man, don't take it personally. I only know what's in the public domain plus my own experience, and I purposely try to keep my personal feelings out of it. Don't shoot the messenger...although I wonder why you are so quick to defend nV here...fact is, nV didn't have to make it so the driver, even after removal, still prevented Ageia cards from working. Now this bit of code is packaged in games, and hence we get AMD crying about it, as there's no way for the end user to know that installing a new game might break his previously working config.



BS. Or do you have any proof?

Nah, just restating what others have stated. The link is above. Apparently the NGO guys said nV told them to stop, paid some of them, and threatened a lawsuit. You got any proof?

:rolleyes:

:D

Again, don't shoot the messenger; 'tis not MY info, just what's out there, which anyone can find on other sites.
 

bobzilla2009

New Member
Joined
Oct 7, 2009
Messages
455 (0.09/day)
System Name Bobzilla the second
Processor AMD Phenom II 940
Motherboard Asus M3A76-CM
Cooling 3*120mm case fans
Memory 4GB 1066GHz DDR2 Kingston HyperX
Video Card(s) Sapphire Radeon HD5870 1GB
Storage Seagate 7200RPM 500GB
Display(s) samsung T220HD
Case Guardian 921
Power Supply OCZ MODXSTREAM Pro 700w (2*25A 12v rail)
Software Windows 7 Beta
Benchmark Scores 19753 3dmark06 15826 3dmark vantage 38.4Fps crysis benchmarking tool (1680x1050, 4xAA)
Thousands of developers? I do feel you are going off your rocker slightly. Have you any proof AMD didn't send an HD4xxx card? And of course TWIMTBP games don't break features on ATi cards; they just perform better on Nvidia cards due to the drivers, as a result of better relationships with the devs. Although I have seen a couple of AMD games lately, since the introduction of DX11. 'Tis strange indeed and makes me wonder how this generation will pan out, since AMD is now the DX11 devs' best friend; even BFBC2 is going to be an AMD game.

Still, I wouldn't base my purchase on PhysX, because I think it will either become totally open source or flop in the next few years due to open-source GPU physics, which everyone will have access to. Proprietary solutions rarely work out.

EDIT: Out of interest I checked out the Batman: AA PhysX changes, and I can't say they're making me want to sell my HD5870 and go for an Nvidia card atm. Still, if Fermi turns out to be the best thing since sliced bread, it'll be a nice addition. Although I still believe that open-source GPU physics will win out, and when it does, we will see some amazing effects in games. Well, little additions, since most games are just delayed console ports.
 

TheMailMan78

Big Member
Joined
Jun 3, 2007
Messages
22,599 (3.67/day)
Location
'Merica. The Great SOUTH!
System Name TheMailbox 5.0 / The Mailbox 4.5
Processor RYZEN 1700X / Intel i7 2600k @ 4.2GHz
Motherboard Fatal1ty X370 Gaming K4 / Gigabyte Z77X-UP5 TH Intel LGA 1155
Cooling MasterLiquid PRO 280 / Scythe Katana 4
Memory ADATA RGB 16GB DDR4 2666 16-16-16-39 / G.SKILL Sniper Series 16GB DDR3 1866: 9-9-9-24
Video Card(s) MSI 1080 "Duke" with 8Gb of RAM. Boost Clock 1847 MHz / ASUS 780ti
Storage 256Gb M4 SSD / 128Gb Agelity 4 SSD , 500Gb WD (7200)
Display(s) LG 29" Class 21:9 UltraWide® IPS LED Monitor 2560 x 1080 / Dell 27"
Case Cooler Master MASTERBOX 5t / Cooler Master 922 HAF
Audio Device(s) Realtek ALC1220 Audio Codec / SupremeFX X-Fi with Bose Companion 2 speakers.
Power Supply Seasonic FOCUS Plus Series SSR-750PX 750W Platinum / SeaSonic X Series X650 Gold
Mouse SteelSeries Sensei (RAW) / Logitech G5
Keyboard Razer BlackWidow / Logitech (Unknown)
Software Windows 10 Pro (64-bit)
Benchmark Scores Benching is for bitches.
I have an Ageia card, and Nvidia stopped supporting it a long time ago, which I always felt was BS. I mean, it still has processing power; why not use it? Because they want you to buy their GPUs.

Anyway, how do you use an Nvidia card with an ATI card in Windows 7? I see a lot of people doing it, but I thought it was only possible in XP.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Worked fine for me with nV cards.:confused:

And I thought it took 1.5 years? Now the story changes?

:laugh:

J/K. Just bugging ya, man, don't take it personally. I only know what's in the public domain plus my own experience, and I purposely try to keep my personal feelings out of it. Don't shoot the messenger...although I wonder why you are so quick to defend nV here...fact is, nV didn't have to make it so the driver, even after removal, still prevented Ageia cards from working. Now this bit of code is packaged in games, and hence we get AMD crying about it, as there's no way for the end user to know that installing a new game might break his previously working config.


Nah, just restating what others have stated. The link is above. Apparently the NGO guys said nV told them to stop, paid some of them, and threatened a lawsuit. You got any proof?

:rolleyes:

:D

Again, don't shoot the messenger; 'tis not MY info, just what's out there, which anyone can find on other sites.

Erm. It took 1.5 years to disable it; before that it was not supported, or simply broken. Nvidia's video-out feature has been broken plenty of times, with plenty of drivers, and for long periods of time. Did Nvidia break it on purpose? Let me think about it?? NO?

For your information, I am so quick to defend Nvidia against the BS simply because I hate BS and institutionalized ignorance. FYI, up until the 1950s black people were not treated as equals (many people had many arguments to support that idea, and many people were just repeating what others said). I wasn't there, but if I had been, I would have defended them with the same passion with which I'm fighting BS here. I defend gays against similar BS with the same passion, and I fight aggression against women and animals too. For the record, I'm not black, I'm not gay, I can guarantee you I'm not a woman (I would have noticed), and I'm definitely not an animal (apart from the fact that I am a human being)... I don't know if you get me... ;)
 

TheMailMan78

Big Member
Joined
Jun 3, 2007
Messages
22,599 (3.67/day)
Location
'Merica. The Great SOUTH!
System Name TheMailbox 5.0 / The Mailbox 4.5
Processor RYZEN 1700X / Intel i7 2600k @ 4.2GHz
Motherboard Fatal1ty X370 Gaming K4 / Gigabyte Z77X-UP5 TH Intel LGA 1155
Cooling MasterLiquid PRO 280 / Scythe Katana 4
Memory ADATA RGB 16GB DDR4 2666 16-16-16-39 / G.SKILL Sniper Series 16GB DDR3 1866: 9-9-9-24
Video Card(s) MSI 1080 "Duke" with 8Gb of RAM. Boost Clock 1847 MHz / ASUS 780ti
Storage 256Gb M4 SSD / 128Gb Agelity 4 SSD , 500Gb WD (7200)
Display(s) LG 29" Class 21:9 UltraWide® IPS LED Monitor 2560 x 1080 / Dell 27"
Case Cooler Master MASTERBOX 5t / Cooler Master 922 HAF
Audio Device(s) Realtek ALC1220 Audio Codec / SupremeFX X-Fi with Bose Companion 2 speakers.
Power Supply Seasonic FOCUS Plus Series SSR-750PX 750W Platinum / SeaSonic X Series X650 Gold
Mouse SteelSeries Sensei (RAW) / Logitech G5
Keyboard Razer BlackWidow / Logitech (Unknown)
Software Windows 10 Pro (64-bit)
Benchmark Scores Benching is for bitches.
Erm. It took 1.5 years to disable it; before that it was not supported, or simply broken. Nvidia's video-out feature has been broken plenty of times, with plenty of drivers, and for long periods of time. Did Nvidia break it on purpose? Let me think about it?? NO?

For your information, I am so quick to defend Nvidia against the BS simply because I hate BS and institutionalized ignorance. FYI, up until the 1950s black people were not treated as equals (many people had many arguments to support that idea, and many people were just repeating what others said). I wasn't there, but if I had been, I would have defended them with the same passion with which I'm fighting BS here. I defend gays against similar BS with the same passion, and I fight aggression against women and animals too. For the record, I'm not black, I'm not gay, I can guarantee you I'm not a woman (I would have noticed), and I'm definitely not an animal (apart from the fact that I am a human being)... I don't know if you get me... ;)

Yeah, yeah, you're a real fucking saint. Anyway, how do you get PhysX to work with ATI cards in 64-bit Win7?
 