
NVIDIA Introduces NVIDIA Quadro CX - The Accelerator For Adobe Creative Suite 4

Joined
Sep 15, 2007
Messages
3,944 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
First of all, the only way to get it to accelerate would be to somehow illegally get your hands on the software that adds CUDA acceleration to Premiere. The only legal way to get and use that software is to buy this Quadro card (sadly). Without it, a GTX 260 renamed to a Quadro will not accelerate anything. However, once you have the software, any CUDA-enabled GPU will accelerate, so renaming the GTX 260 is really pointless at that point.
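For what it's worth, the acceleration software doesn't care what the board calls itself - it just looks for a CUDA device. Here's a rough C++ sketch (the CUDA runtime calls are standard, everything else is purely illustrative) of the kind of device check such a plugin would do:

// Minimal sketch: enumerate CUDA-capable devices the way an
// accelerated plugin might. Needs the CUDA toolkit to build.
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable GPU found - falling back to CPU.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // A GTX 260 and a Quadro CX both report a GT200-class device here;
        // the name string is about the only giveaway.
        std::printf("Device %d: %s, compute %d.%d, %zu MB VRAM\n",
                    i, prop.name, prop.major, prop.minor,
                    prop.totalGlobalMem / (1024 * 1024));
    }
    return 0;
}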

:roll:

I never knew installing the Quadro driver was so complicated.

In fact, since I have CS4, maybe I should post a nice screenie of the acceleration at work.
 

aiaN

New Member
Joined
Oct 21, 2008
Messages
2 (0.00/day)
Sorry to tell you, but you're all wrong. Softmodding isn't possible the way it used to be, but there are still some geniuses out there who have worked around it.

Now let me set you straight, because I only registered here to tell you how this works.

So! Earlier on, people unlocked shaders and other stuff by physically modifying the cards. Later it was possible to unlock things with a simple BIOS tweak. Now it isn't that easy!

And when it comes to Quadro cards, the biggest difference is usually the RAM, yes, but what you are really paying for is a better guarantee that the card won't crash on you, plus the huge driver package that comes with it for professional CAD work.

So, with GeForce cards there is an even simpler way to get one working as a Quadro card! You can do this with a newer Quadro-softmod script, which actually works together with RivaTuner http://www.guru3d.com/index.php?page=rivatuner !!

Earlier this didn't work well at all, and it still doesn't raise a GeForce card's power to Quadro level. But it does increase it!

Wish you good luck!

BTW, the script is a plugin for RivaTuner.
 

TheMailMan78

Big Member
Joined
Jun 3, 2007
Messages
22,599 (3.66/day)
Location
'Merica. The Great SOUTH!
System Name TheMailbox 5.0 / The Mailbox 4.5
Processor RYZEN 1700X / Intel i7 2600k @ 4.2GHz
Motherboard Fatal1ty X370 Gaming K4 / Gigabyte Z77X-UP5 TH Intel LGA 1155
Cooling MasterLiquid PRO 280 / Scythe Katana 4
Memory ADATA RGB 16GB DDR4 2666 16-16-16-39 / G.SKILL Sniper Series 16GB DDR3 1866: 9-9-9-24
Video Card(s) MSI 1080 "Duke" with 8Gb of RAM. Boost Clock 1847 MHz / ASUS 780ti
Storage 256Gb M4 SSD / 128Gb Agelity 4 SSD , 500Gb WD (7200)
Display(s) LG 29" Class 21:9 UltraWide® IPS LED Monitor 2560 x 1080 / Dell 27"
Case Cooler Master MASTERBOX 5t / Cooler Master 922 HAF
Audio Device(s) Realtek ALC1220 Audio Codec / SupremeFX X-Fi with Bose Companion 2 speakers.
Power Supply Seasonic FOCUS Plus Series SSR-750PX 750W Platinum / SeaSonic X Series X650 Gold
Mouse SteelSeries Sensei (RAW) / Logitech G5
Keyboard Razer BlackWidow / Logitech (Unknown)
Software Windows 10 Pro (64-bit)
Benchmark Scores Benching is for bitches.
Joined
Sep 12, 2008
Messages
33 (0.01/day)
Location
Novi Sad, Serbia, Europe, Earth, Milky Way, PM 380
System Name marulambrei7
Processor i7 4770k
Motherboard GA Z87
Cooling CM hyper 212 evo
Memory 16gb ddr3
Video Card(s) gtx 660 2gb
Storage Samsung 840 evo SSD 250, 2x 1tb...
Display(s) Hannspree 42" 1080p SJ42DMBB, benq 24" 1200p G2400WAD
Case CM 690
Power Supply cooler master GM 550w
Software win 7 64, max 2014,15, Adobe PS/AE cs6/cc, starcraft2 HotS
When it comes to Quadro cards, the biggest difference is usually the RAM

Here’s a simple comparison of the 8800 GTX and its variants:

The Quadro FX 4600 is a renamed and underclocked 8800 GTX that costs $1,100.

The Quadro FX 5600 is a renamed and overclocked 8800 GTX with twice the memory: 1.5 GB as opposed to the "low" 768 MB. This one is $2,200.

The 8800 GTX uses exactly the same chip, although with slightly different core/shader/memory speeds, and while it's actually faster than its underclocked Quadro variant, it now costs under $200, if you can find one. The QFX 5600 is a bit faster than the 8800 GTX, but the 8800 Ultra is faster than both, since it's even more overclocked. The 8800 Ultra is also sub-$200 today... if you can find one.

Now let's look at the performance difference between the highest-end Quadro cards and a two-generations-old GeForce:

Which one is faster for 3ds Max, AutoCAD or Maya?

Here's the kicker: the 8800!

In all tests and in every possible way, the 8800 is faster in real work. How? It's actually physically faster than the Quadro variants, and the added RAM on the 5600 model does nothing, since 768 MB is already far more than any of those users/tests need to begin with. 1.5 GB is nothing more than a marketing ploy for the clueless, or nice if you have a CPU from the future (are we there yet, Doc?) that can handle the billion-polygon scene you'd need to use up 1.5 GB of RAM.
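To put some rough numbers behind the 768 MB claim, here's a back-of-the-envelope C++ sketch; the per-vertex byte counts and the triangle count are my assumptions, not anything from the benchmarks, and real viewport memory use varies a lot with the application:

// Back-of-the-envelope: VRAM needed for the raw geometry of a heavy scene.
// Assumes float3 position + float3 normal + float2 UV per vertex (32 bytes)
// and 32-bit indices; roughly half as many unique vertices as triangles.
#include <cstdio>

int main()
{
    const double bytes_per_vertex = (3 + 3 + 2) * 4.0;
    const double bytes_per_index  = 4.0;

    const double triangles = 5e6;              // a "heavy" benchmark-style scene
    const double vertices  = triangles / 2.0;  // rough ratio for closed meshes

    const double total_mb = (vertices * bytes_per_vertex +
                             triangles * 3.0 * bytes_per_index) / (1024 * 1024);

    std::printf("~%.0f MB of geometry for a %.0fM-triangle scene\n",
                total_mb, triangles / 1e6);
    return 0;
}

Even a 5-million-triangle scene lands around 130-140 MB of raw geometry, nowhere near 768 MB, let alone 1.5 GB.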

But this surely can't be, just look at the SpecCheatTests!
(I realize I'm quoting myself on an as-yet-unstated sentence.)

It's true that the SpecCheatTests won't reflect what I've just said, but those tests are synthetic and OpenGL-based. OpenGL is not used by any of the mentioned professional programs for ANY serious work today.

The 8800, aside from being crippled in the drivers for windowed OpenGL work, is also treated badly by the SpecCheatTests; those conditions are next to impossible to recreate in actual work with the many newer professional programs that have long since ditched the outdated and backward OpenGL methodology. OpenGL, aside from being old/slow/ugly, is additionally crippled on all non-workstation cards, as if it weren't slow and ugly enough to begin with.

To see what I'm talking about with OpenGL vs D3D performance, just test Max/Maya/AutoCAD with a workstation card in both OpenGL and D3D. You'll notice the speed is 10x slower at best, and 100x slower on average, when using OGL. It might be a bit harder for a novice to spot, but light and shadow effects/quality are sub-par or nonexistent under OGL as well.

And that's all on "professional" cards that don't have crippled OpenGL drivers. Add the traditional driver crippling that takes away another 100x of performance on the same-class GeForce chip, and you get to 1,000-10,000x slower performance than D3D, while looking worse too.

Who uses OpenGL for anything anymore? Poor, obsolete CAD users and people who are forced to use it for lack of choice on their OS (Mac, Linux...).
 

aiaN

New Member
Joined
Oct 21, 2008
Messages
2 (0.00/day)
Yeah, too bad the people who run those tests don't sit and work with high-poly models through the roof while also working with huge textures, physics and animation at the same time... In computer games all of that is pre-calculated, but in Maya and 3ds Max these are things the GPU needs to calculate for a high-quality render. And when you do a Mental Ray render with faces over the million mark, you can start swallowing RAM chips as if they were H2O atoms... and you know you need a lot of them, or it (and you) will choke...

So, no need for loads of RAM? Well... don't come here and say 4 GB of RAM on the mobo and 1 GB of RAM on the GPU is enough. I know for sure, from working with programs like Maya and Max in a professional manner, that it ain't enough...

But apart from that, I won't say you're wrong!
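To put rough numbers on the texture side of that, here's a quick C++ sketch; the sizes, the uncompressed 8-bit RGBA assumption and the full-mip-chain factor are mine, and real usage depends on compression and the renderer:

// Back-of-the-envelope: memory taken by single uncompressed RGBA textures.
#include <cstdio>

int main()
{
    const int sizes[] = { 2048, 4096, 8192 };
    for (int s : sizes) {
        const double base_mb      = (double)s * s * 4 / (1024.0 * 1024.0);
        const double with_mips_mb = base_mb * 4.0 / 3.0;   // full mip chain ~1.33x
        std::printf("%dx%d RGBA: %.0f MB (with mips: ~%.0f MB)\n",
                    s, s, base_mb, with_mips_mb);
    }
    return 0;
}

A handful of 4k or 8k maps plus the geometry and you are well past 1 GB before the render even starts.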
 
Joined
Sep 12, 2008
Messages
33 (0.01/day)
Location
Novi Sad, Serbia, Europe, Earth, Milky Way, PM 380
System Name marulambrei7
Processor i7 4770k
Motherboard GA Z87
Cooling CM hyper 212 evo
Memory 16gb ddr3
Video Card(s) gtx 660 2gb
Storage Samsung 840 evo SSD 250, 2x 1tb...
Display(s) Hannspree 42" 1080p SJ42DMBB, benq 24" 1200p G2400WAD
Case CM 690
Power Supply cooler master GM 550w
Software win 7 64, max 2014,15, Adobe PS/AE cs6/cc, starcraft2 HotS
Mental Ray will probably be done using GPU power one day. That day isn't today, and by the looks of it, it isn't that close yet; then again, it isn't very far away either.

Everything the scene requires at render time is handled by the CPU and system RAM. You can have integrated graphics without a single MB of dedicated video RAM, or the Quadro FX 5600 with 1.5 GB; it won't make even a 0.01% difference.

Scene previewing is another matter altogether, and all the x8xx cards that have come out over the last 2 years can handle billion-plus-polygon scenes with 100k textures. Modern CPUs can't. They might, if the viewports were optimized to use more than one core, but sadly that is not yet the case.

For over 2 years now, CPUs have been the bottleneck in high-end 3D programs, not the GPUs. Still, interestingly enough, GPUs keep getting ridiculously better while CPUs see only minute improvements.
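To make that split concrete, here's a purely illustrative C++ sketch of the division of labour; none of this is actual Mental Ray or 3ds Max code, just the general shape of a bucket renderer on the CPU:

// Sketch: a final-frame renderer farms buckets out to CPU threads and keeps
// the whole scene in system RAM - the video card is not involved at all.
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

static void render_bucket(int bucket)
{
    (void)bucket;  // placeholder for the actual ray tracing work on this tile
}

int main()
{
    const int total_buckets = 256;   // e.g. a frame split into 16x16 tiles
    std::atomic<int> next{0};

    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 4;

    std::vector<std::thread> pool;
    for (unsigned t = 0; t < workers; ++t) {
        pool.emplace_back([&] {
            for (int b = next++; b < total_buckets; b = next++)
                render_bucket(b);    // scales with CPU cores
        });
    }
    for (auto& th : pool) th.join();

    // Viewport redraws, by contrast, are typically issued from a single
    // thread, which is why the CPU ends up as the interactive bottleneck.
    std::printf("Rendered %d buckets on %u CPU threads\n", total_buckets, workers);
    return 0;
}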
 
Joined
Sep 15, 2007
Messages
3,944 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
Sounds like those douche coders need to learn a lesson from gaming. OpenGL rules.
 

panchoman

Sold my stars!
Joined
Jul 16, 2007
Messages
9,595 (1.57/day)
Processor Amd Athlon X2 4600+ Windsor(90nm) EE(65W) @2.9-3.0 @1.45
Motherboard Biostar Tforce [Nvidia] 550
Cooling Thermaltake Blue Orb-- bunch of other fans here and there....
Memory 2 gigs (2x1gb) of patriot ddr2 800 @ 4-4-4-12-2t
Video Card(s) Sapphire X1950pro Pci-E x16 @stock@stock on stock
Storage Seagate 7200.11 250gb Drive, WD raptors (30/40) in Raid 0
Display(s) ANCIENT 15" sony lcd, bought it when it was like 500 bucks
Case Apevia X-plorer blue/black
Audio Device(s) Onboard- Why get an sound card when you can hum??
Power Supply Antec NeoHe 550-manufactured by seasonic -replacement to the discontinued smart power series
Software Windows XP pro SP2 -- vista is still crap
Can't CUDA cards also accelerate? So why would we need this, unless we're ridiculous game designers?
 

norway7

New Member
Joined
Oct 27, 2008
Messages
1 (0.00/day)
Softmod of GeForce GTX 260/280 to Quadro CX

I think most Premiere CS4 users wonder whether it's possible to decrease H.264 rendering time with a softmodded GTX 260/280. Maybe not as much as with a Quadro CX, but GPU rendering will no doubt be faster than ordinary CPU rendering. Has anyone actually tested this?
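No numbers from me either, but just to show what "faster" would mean in wall-clock terms, here's a purely hypothetical C++ sketch; the throughput figures are assumptions, not measurements of the Quadro CX or of a softmodded GTX 260/280:

// Hypothetical export-time estimate for a 10-minute 30 fps timeline.
#include <cstdio>

int main()
{
    const double frames = 10.0 * 60.0 * 30.0;   // 18,000 frames

    const double cpu_encode_fps = 15.0;   // assumed CPU-only H.264 throughput
    const double gpu_encode_fps = 60.0;   // assumed CUDA-assisted throughput

    std::printf("CPU-only export:     %.0f min\n", frames / cpu_encode_fps / 60.0);
    std::printf("GPU-assisted export: %.0f min\n", frames / gpu_encode_fps / 60.0);
    return 0;
}

If someone with a softmodded card posts real encode times, we could plug them in and see whether the mod actually buys anything.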
 