
Drivers Holding Back GTX TITAN-Z Launch

Proof that Nvidia drivers suck as much as AMD's. A $3k price for a card that isn't much faster than a 780 Ti is stupidity.

I have cards from both manufacturers and Nvidia's drivers have been more stable overall for me.

As for blaming the drivers for the delay of the Titan-Z? I don't buy that. The card is just two GK110s on one board, and my GK110s in SLI have been rock steady for over a year now. Yes, you need to create new fan profiles and throttle algorithms when you're dealing with dual-GPU cards, but Nvidia announced this card months ago; they've had plenty of time to do that.

Also, I fully agree that going for dual 780Ti (or 290Xs for that matter) makes a lot more sense :)
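For anyone wondering what "new fan profiles" for a dual-GPU card actually involves, here's a rough sketch. Every temperature, breakpoint, and duty cycle below is invented for illustration; real cards implement this in firmware, not driver-side Python:

```python
def fan_duty(temp_c: float) -> int:
    """Map a GPU temperature to a fan duty cycle (percent) via a simple curve."""
    points = [(30, 20), (60, 35), (75, 60), (85, 100)]  # (temp C, duty %) breakpoints
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if temp_c <= t1:
            # linear interpolation between adjacent breakpoints
            return round(d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0))
    return points[-1][1]  # past the last breakpoint: full speed

def dual_gpu_duty(temp_a: float, temp_b: float) -> int:
    """A single shared fan has to track the hotter of the two GPUs."""
    return max(fan_duty(temp_a), fan_duty(temp_b))
```

The dual-GPU wrinkle is that last function: one fan serving two dies must always chase the hotter one, which is part of why single-fan dual-GPU coolers run out of headroom so fast.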
 
This is just getting better and better......
 
nVIDIA, nVIDIA, nVIDIA. Give up. You lost this round of the dual-GPUs-on-a-stick wars, but it doesn't matter because your 2-year-old architecture is still holding up pretty well against your competitor's 6-month-old one in all other respects. So do us all a favour, admit defeat gracefully, and get back to work delivering Maxwell.
 
nVIDIA, nVIDIA, nVIDIA. Give up. You lost this round of the dual-GPUs-on-a-stick wars, but it doesn't matter because your 2-year-old architecture is still holding up pretty well against your competitor's 6-month-old one in all other respects. So do us all a favour, admit defeat gracefully, and get back to work delivering Maxwell.

It's a point well made, a couple of custom 780 Ti's are already cheaper than AMD's power hungry water heater. The market for these cards is tiny and they make far more money on lesser spec'ed pro cards anyway.

Still, new drivers are always welcome :P
 
nVIDIA, nVIDIA, nVIDIA. Give up. You lost this round of the dual-GPUs-on-a-stick wars, but it doesn't matter because your 2-year-old architecture is still holding up pretty well against your competitor's 6-month-old one in all other respects. So do us all a favour, admit defeat gracefully, and get back to work delivering Maxwell.
GCN was released before Nvidia's 6xx series cards, no? Hawaii is still GCN, just with a few new features, but it is still the old GCN.

What Nvidia is saying is that they are trying to find a stable driver-level overclock for the card that will push the boost clocks high enough to justify the price. Problem is that with a single axial fan it's not going to get too far as-is. They are not going to justify the price this way even if they get the boost clock to 1100 MHz (and I'd call that a best-case scenario).
 
This is just getting better and better......

Where?

To me (and not only me) the world is becoming ever worse, and it will get worse still.

I am so disappointed with so many things; my vision for the world is completely different. Those assholes who lead the world are so stupid. They think there is value in things that are neither good nor bad, something in the middle....

nvidia is a shitty company and the only way to stop them from being ever more shitty, is to stop giving them any money.
 
Where?

To me (and not only me) the world is becoming ever worse, and it will get worse still.

I am so disappointed with so many things; my vision for the world is completely different. Those assholes who lead the world are so stupid. They think there is value in things that are neither good nor bad, something in the middle....

nvidia is a shitty company and the only way to stop them from being ever more shitty, is to stop giving them any money.

okay! o_O

 
nVIDIA, nVIDIA, nVIDIA. Give up. You lost this round of the dual-GPUs-on-a-stick wars, but it doesn't matter because your 2-year-old architecture is still holding up pretty well against your competitor's 6-month-old one in all other respects. So do us all a favour, admit defeat gracefully, and get back to work delivering Maxwell.

What do you mean 6 month-old architecture? GCN came out back in January 2012. GCN is actually a few months older than Nvidia's kepler in that regard.
Oh, and the word architecture denotes the instruction set and register layout of a given finite state machine, or processor in this case. Changing the arrangement of components or making them wider/more complex doesn't equate to changing the architecture; it equates to changing the microarchitecture of a given architecture implementation.
 
Where?

To me (and not only me) the world is becoming ever worse, and it will get worse still.

I am so disappointed with so many things; my vision for the world is completely different. Those assholes who lead the world are so stupid. They think there is value in things that are neither good nor bad, something in the middle....

nvidia is a shitty company and the only way to stop them from being ever more shitty, is to stop giving them any money.

Here's a hint, it's called "TECH" Power Up! "World vision" has no place in this discussion ANYWHERE...
 
Nvidia won't cut the price or modify anything.
I have a feeling the secret will be in the drivers....can't wait to see what they will do....
 
I think, reading between the lines, they are saying: we can't make this abortion of a product perform anywhere near its price point no matter what we do with the drivers.
 
This is what happens when your marketing department and finance department get trigger-happy and delirious over profit margins while the engineers are frantically gluing chips together from an old architecture and trying to cool them with a single 90mm fan.

Get the abortion over with and learn your damn lesson. No one wants your "engineering feat" crap any more. This driver nonsense is BS, 295X2 takes the cake, but nobody will buy that either because the market is such a small niche.
 
I reckon it has more to do with NVIDIA's Gujarati VP of Product Marketing. Gujaratis are like India's Ferengi.

(that was a compliment)
He likes his ears rubbed and kidnaps alien women?

 
If they do somehow manage to boost drivers, does that mean these performance enhancements will trickle down to other cards with same architecture? Perhaps SLI scaling and stability will be greatly improved too? I mean, I can only really see two things they can do; make drivers much more efficient for the entire architecture (Kepler) and/or improve SLI performance. Both those should benefit all users that meet the criteria.
 
Nvidia won't cut the price or modify anything.
I have a feeling the secret will be in the drivers....can't wait to see what they will do....
So, how much does Nvidia pay you to post this crap? 1 + 1 = 2, and that is if scaling is perfect, and it never is, so more like 1.7-1.9x at best. So driver schmivers, it's still going to be slower with a hot-air blower.
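The scaling arithmetic above, as a toy formula. The 60 fps baseline and the 0.85 scaling factor are placeholder numbers, not measurements:

```python
def sli_fps(single_fps: float, scaling: float) -> float:
    """Effective frame rate of two GPUs at a given scaling efficiency.

    scaling = 1.0 would be perfect (a full 2x); real-world dual-GPU
    scaling is commonly quoted in the 0.7-0.9 range.
    """
    return single_fps * (1 + scaling)

# A single card at 60 fps with 85% scaling lands around 111 fps,
# well short of the "perfect" 120 the 1 + 1 = 2 pitch implies.
```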


I am so tempted to find old AMD/ATI threads to post links to where they turned into napalm exhibits from the rabid fanboys, I must say this whole forum is getting clean, almost too clean. :D Even TMM is calm.
 
Why delay a compute GPU to optimize its gaming performance? After all, enabling FP64 disables boost.

/sarcasm
 
Here's a hint, it's called "TECH" Power Up! "World vision" has no place in this discussion ANYWHERE...

except the general nonsense forums

also, that card better come with at least 5 AAA games.
 
except the general nonsense forums

also, that card better come with at least 5 AAA games.

I'd prefer licenses for CUDA renderers TBH.
 
Yeah drivers, of course. If you put your logic aside, drivers can save you.
But if you think with logic, then you understand that $3k for a gaming GPU is plain extravagant.
Even the 295x2 is overpriced.
 
Most people don't understand that the only reason the 295X2 does so well vs SLI Titan Blacks is the way the chip thermally regulates its clock speeds.
On water, where the cards run 1100+ MHz boost, they beat the 295X2 90% of the time. I would not be shocked to see them change the drivers so the temps don't cause such a huge clock speed loss as they do now.

The 295X2 does so well because it's a 500W TDP card. A 375W card isn't going to catch it, and nVidia isn't going to dissipate 500W with a single 90mm fan sitting on top of some cooling fins.

I think AMD kept the TDP of the 295X2 a well guarded secret. Even if there were ES versions in the wild that nVidia got reports on, or even working samples, all AMD had to do was have them set to a ~375W TDP. With the dual 8 pin connectors who would have guessed the card, when released, was going to be a 500W card?

The performance target nVidia would have assumed for the Titan-Z would have been way off.

It's just a marketing ploy, Nvidia thinks if they keep it delayed people will wait for it instead of buying the AMD unit now.

It wouldn't be the first time nVidia made that ploy work

How is that the fault of the drivers? They downclocked the shit out of it.

It's not the drivers. It's the power/thermal limit.

They are probably trying to tweak the turbo boost speeds such that it can somehow beat the 295X2 handily while staying within TDP and keeping fan noise down.

The probable problem is that after a few minutes the thing throttles back and performance tanks.

Will they tune the boost settings to boost just long enough to complete a benchmark?

AMD was wise to go for liquid cooling.

The other GK110 models already do that, I don't think they are going to be able to pull this off this time.
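A toy model of the throttling behavior described above: the card opens at a high boost clock, heat-soaks over a few minutes, then settles lower to hold the power/thermal limit. Every number here is invented for illustration; real boost algorithms are far more involved:

```python
def boost_clock(t_seconds: float, cold_mhz: int = 1006, hot_mhz: int = 875,
                soak_time: float = 180.0) -> float:
    """Clock after t seconds of sustained load.

    Linear decay from the cold boost clock to the heat-soaked clock
    over soak_time seconds, then flat. All values are placeholders.
    """
    if t_seconds >= soak_time:
        return hot_mhz
    return cold_mhz - (cold_mhz - hot_mhz) * (t_seconds / soak_time)
```

This is exactly why a 3-minute benchmark finishes near the cold clock while a long gaming session settles at the heat-soaked clock, and why short runs flatter air-cooled dual-GPU cards.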

It's a point well made, a couple of custom 780 Ti's are already cheaper than AMD's power hungry water heater. The market for these cards is tiny and they make far more money on lesser spec'ed pro cards anyway.

Still, new drivers are always welcome :p

Read THIS review. Play real games, not 3 minute benchmarks, so the cards can warm up properly and settle into their stable boost clocks and the 295X2 kills 780 ti SLI.
 
That power consumption :P
 
^ Yeah, we've reached the point of no return with 28nm :roll:
 