News Posts matching #NVIDIA


NVIDIA Folding@Home GPU client: where is it?

On February 16th of this year, NVIDIA announced its new GPU computing platform, CUDA. Over seven weeks later, we have yet to see so much as a beta of CUDA. Much like the G80 Vista driver fiasco, NVIDIA has yet to deliver a product it promised. The part NVIDIA users probably hate the most is the lack of a GPU-based Folding@Home client. These days, it seems everything except an NVIDIA system (even the PS3) can run a Folding@Home GPU client. This stands in stark contrast to ATI's GPU computing initiative, called "Stream", which has a long list of clients that use the GPU to accelerate programs.

G100 supports CUDA 2

G100, an upcoming graphics core from NVIDIA, supposedly supports CUDA 2. CUDA, an acronym for "Compute Unified Device Architecture", allows programmers to offload complex mathematical functions onto the GPU for processing, taking advantage of the unified shaders for these calculations. This is a new way to run demanding scientific computations, as a graphics chip can be much faster than a CPU at this kind of highly parallel work.

The differences between CUDA and CUDA 2 remain to be seen. The G100 core is scheduled for Q1 2008 and might be built on a process even smaller than 65 nanometres.

G84 and G86 NVIDIA chips late

NVIDIA's mainstream chips were expected to launch on April 17th; we already reported on the G86 and G84 here. Now The Inquirer says NVIDIA has run into problems with the latest chip revision and needs a respin to fix the bugs. That will most likely mean their launch alongside the 8800 Ultra in mid-April will be postponed. But I am quite sure NVIDIA will still be first with its direct R600 rival; AMD/ATI is once again later than it admits.

NVIDIA Launches nForce 680i LT SLI Edition

NVIDIA has updated its line of motherboard chipsets with the release of the nForce 680i LT SLI Edition chipset. The 680i LT is a cut-down version of the nForce 680i flagship. The main difference between the two chipsets is the third PCI-E slot (intended for a physics card), which is absent from the 680i LT model. Other specs remain the same. Motherboards equipped with the 680i LT chipset should cost under $200. EVGA was the first NVIDIA partner to release a motherboard following NVIDIA's reference nForce 680i LT design. Reviews of the EVGA motherboard have also surfaced all over the big hardware sites.

Hot Hardware | [H]ard|OCP | Legit Reviews | Guru3D | nvnews.net | bit-tech | iXBT | The Tech Report

NVIDIA's 8950GX2 gets a block

Even before the 8950GX2 card hits the stores, blocks are already starting to appear. The Inquirer managed to snap a few shots of Alphacool's block, ready to cool the beast:

The block covers everything: GPU, RAM, NVIO1 chip, PWM...

The block looks very thin, which raises concerns over its flow restriction and performance; Alphacool has produced some highly restrictive blocks in the past.

NVIDIA Terminates ULi Chipset Supplies

Many of you may know that ULi was acquired by NVIDIA in February of last year. Now, in order to push its own chipset series into the lower segment of the mainboard market, NVIDIA has decided to completely terminate supplies of ULi chipsets to third parties. At the CeBIT 2007 show NVIDIA demonstrated solutions such as the MCP68 and MCP73, so the ULi parts had to be cleared out of the way. Engineers who were involved in ULi chipset development will now join the NVIDIA team working in the same field. So it looks like ULi-branded products are history now.

PNY Releases NVIDIA Quadro FX 4600 and Quadro FX 5600 Graphics

PNY Technologies, the supplier and marketer of NVIDIA Quadro by PNY professional graphics boards announced today the immediate availability of next generation NVIDIA Quadro by PNY solutions based on new NVIDIA Quadro FX 4600 and FX 5600 graphics, and NVIDIA GSync genlock/frame lock and HD SDI options. These new ultra high-end professional graphics solutions meet the challenges of the most complex 3D design, DCC, visualization, scientific, and broadcast applications with a new unified architecture, Shader Model 4.0 technology, large frame buffers, and GPU computing for visualization technology. The new NVIDIA Quadro by PNY solutions include:

NVIDIA's New Software Development Kit Supports Shader Model 5.0

Even though not all game developers have adopted shader model (SM) 3.0, introduced three years ago, and some claim that transitioning to DirectX 10's shader model 4.0 right now hardly makes sense, NVIDIA's new software development kit (SDK) already features profiles for shader model 5.0, which is believed to be an improved version of SM4.0.

NVIDIA's new SDK 10, which was released just last week, apparently contains macro invocations that define the supported Cg profiles, including Fragment50, Vertex50 and Geometry50, meaning that the current SDK supports architecture micro-code profiles for pixel shaders 5.0, vertex shaders 5.0 and geometry shaders 5.0.

While hardly anybody knows what shader model 5.0 actually is or how it differs from shader model 4.0, the inclusion of the architecture micro-code in a compiler indicates that NVIDIA foresees the arrival of shader model 5.0-capable hardware soon enough that game developers can start compiling their titles for it.

NVIDIA GeForce 8500GT Pictured


VR-Zone has revealed some information about the GeForce 8500GT video cards to be released on April 17. Built on TSMC's 80nm process with a 128-bit memory interface, the GeForce 8500GT cards will be clocked at 450MHz core with 256MB of DDR2 memory at 400MHz (800MHz DDR). The cards are expected to outperform any current GeForce 7600GS, scoring around 22xx in 3DMark06 (1280x1024) and 42xx in 3DMark05 (1024x768). DX10 performance is unknown at this stage. The NVIDIA GeForce 8500GT series will be priced between $79 and $99 USD.

AGP solutions from NVIDIA: 7900GS and 7950GT

As it turns out, AGP is not quite as dead as many people (obviously PCI-E users) claim. If you check our news since the 1st of February, we have reported five times on the never-ending AGP story. Three of those reports concerned new AGP cards from ATI released to the market. Now it's NVIDIA's turn, and they hit the undying AGP crowd with the successor to the 7800GS and another, even faster card.
The Germans at 3DCenter.de found out that the latest NVIDIA ForceWare 101.41 beta driver not only causes problems when trying to run Doom 3 and Battlefield 2142 under Vista, but also discloses the upcoming 7900GS and 7950GT GeForce AGP cards.
But the best part of the story is that at least one 7950GT card, from XFX, is already listed at a price-comparison site for a mere 226 Euros. Of course it's not in stock yet, but it is surely not far from actually being available.

NVIDIA 8600GT and 8600GTS Pictured

OCWorkbench has found some pictures and specs of NVIDIA's mainstream DirectX 10 cards, the 8600GT and the 8600GTS. The GT (pictured below on the left) is the replacement for the 7600GT and will feature the G84-300 GPU running at 540MHz with either 128MB or 256MB of 128-bit GDDR3 RAM at 1400MHz, priced between $150 and $180. The 8600GTS (below on the right) is a step up from the GT and will replace the 7900GS, using the G84-400 GPU running at 675MHz with either 256MB or 512MB of 128-bit GDDR3 RAM at 2000MHz, priced between $200 and $250. As reported before on techPowerUp!, these cards should be released (along with the 8500GT) on April 17th.

NVIDIA releases G80 Quadro cards

NVIDIA Corporation, the worldwide leader in programmable graphics processor technologies, yesterday unveiled a new line of professional graphics solutions: NVIDIA Quadro FX 4600, Quadro FX 5600, and NVIDIA Quadro Plex VCS Model IV. Armed with the largest increase in GPU power and functionality to date, these solutions are designed to help solve the world's most complex professional graphics challenges.

Tackling the extreme visualization challenges of the automotive styling and design, oil and gas exploration, medical imaging, visual simulation and training, scientific research, and advanced visual effects industries, these new Quadro solutions offer:
  • Next-Generation Vertex and Pixel Programmability: Shader Model 4.0 enables a higher level of performance and ultra-realistic effects for OpenGL and DirectX 10 professional applications
  • Largest Frame Buffers: Up to 1.5 GB frame buffers deliver the throughput needed for interactive visualization and real-time processing of large textures and frames, enabling superior quality and resolution for full-scene antialiasing (FSAA)
  • New Unified Architecture: Industry-first unified architecture capable of dynamically allocating compute, geometry, shading and pixel processing power for optimized GPU performance
  • GPU Computing for Visualization: Featuring NVIDIA CUDA technology, developers are, for the first time, able to tap into the high-performance computing power of Quadro to solve complex visualization problems

NVIDIA isn't the only graphics company short of Vista drivers

While NVIDIA nearly got sued over their lack of Vista-ready drivers for their G80, ATI isn't exactly innocent. The Inquirer did a quick experiment to see if it was possible to configure a Vista workstation with an ATI FireGL graphics card. To their surprise, it wasn't. This is because ATI does not have any FireGL drivers compatible with Windows Vista. And unlike NVIDIA, they do not even have beta drivers out. While most of the gaming community is more likely to use an NVIDIA G80 than an ATI FireGL, this is still a major problem for anyone relying on a FireGL based workstation.

NVIDIA to Launch GeForce 8600 Series on April 17th

NVIDIA is set to launch the mainstream 8600GTS (G84-400) and 8600GT (G84-300), as well as the 8500GT (G86-300), on the 17th of April. The GeForce 8600GTS and 8600GT will each have 256MB of GDDR3 memory onboard, both sporting a 128-bit memory interface but no HDMI yet. The GeForce 8600GTS is meant to replace the 7950GT and 7900GS, while the 8600GT will replace the 7600GT. The 8500GT aims to replace the 7600GS.

The 8600GTS will be clocked at 700MHz core / 2GHz memory and comes with dual DVI, HDTV out and HDCP, but requires external power. Price is estimated between US$199 and US$249. Another mainstream model, the 8600GT, will be clocked at 600MHz core / 1.4GHz memory and comes in two variants: one with HDCP (G84-300) and one without (G84-305). This model doesn't require any external power. It will be priced between US$149 and US$169.

The last model, meant for the budget segment, is a downgraded derivative of the G84 core, trimmed to meet the value segment's pricing structure. The 8500GT will be clocked at 450MHz core / 800MHz memory (256MB DDR2) and comes in two variants: one with HDCP (G86-300) and one without (G86-305). The 8500GT should see a retail price between US$79 and US$99. The 8300GS, which will be released towards the end of April, is expected to replace the current 7300 series.

NVIDIA's 80nm G84 and G86 line-up will meet ATI's 65nm DX10 offerings head-on: the mainstream RV630 is slated to arrive in May, and the value RV610 even earlier, in April.

nForce 680i LT SLI for Hardcore Gamers on March 12

As CeBIT approaches, more and more new products are being finalized for launch. Along with the MCP68 chipset, NVIDIA is going to launch the nForce 680i LT SLI chipset at the event, primarily targeted at hardcore gamers, whereas the current 680i SLI chipset is targeted at hardcore enthusiasts. nForce 680i LT SLI boards will be some US$50 cheaper than 680i SLI boards, with an MSRP of around US$199 compared to US$249+.

The main changes: nForce 680i LT SLI reference boards will come with active cooling instead of the heat-pipe design currently used on the 680i SLI reference board, a green PCB instead of a black one, support for DDR2-800 instead of DDR2-1200 SLI memory, 8 USB 2.0 ports instead of 10, one Gigabit Ethernet port instead of two, two PCIe x16 slots instead of three, and none of the neat extras like LED POST codes, power/reset buttons and an onboard speaker.

NVIDIA mentioned that overclocking on the 680i LT SLI won't be as good as on the 680i SLI, but there are strong reasons to believe the chipset is basically the same, unless the company has done some sort of sorting/binning. Will this be the budget OC king?

Zotac - New Brand of Graphics Cards


Zotac is one of the latest graphics card makers to enter the video market. It will appear at this year's CeBIT in Germany, at Booth B27 in Hall 20. Zotac is in fact a subdivision of PC Partner and will produce only NVIDIA GPU-based graphics cards. Readers interested in the new Zotac brand can click here to read the first Zotac GeForce 8800GTS 320MB review on the net.

No G8x AGP chip in the end?

Sad news for all AGP motherboard owners (including myself), if it turns out to be true. VR-Zone is reporting that, based on its findings, NVIDIA won't develop a new AGP chip. The reason is that the G80 'simply' can't support it (though a bridge chip should provide the required compatibility, if you ask me). That leaves us all hoping for an R600-based solution, it seems.

G90 will be a 65nm G80 with 512-bit GDDR4

Or at least, that's the current rumor. While we debate the current R600 rumors, The Inquirer claims that its "senior industry sources" have let loose the first G90 details. The G90 will undertake the monumental task of putting the G80 through a die shrink. If all goes well, this will allow for very high clocks, much lower power consumption and a lower production cost. NVIDIA also hopes to get hold of some GDDR4 for the G90, and will put it on a 512-bit bus.

New NVIDIA compiler lets developers offload math functions to GPU

NVIDIA has announced the release of beta versions of the SDK and C compiler for their Compute Unified Device Architecture (CUDA) technology. The C compiler includes a set of C language extensions that will enable developers to write C code that targets NVIDIA's GPUs directly. These extensions are supported by software libraries and a special CUDA driver that exposes the GPU to the OS and applications as a math coprocessor.

This approach differs from the one taken by AMD/ATI with their "Close to Metal" (CTM) initiative. With CTM, AMD/ATI has opened up the low-level ISA so that their graphics products can be programmed directly in assembly language, relying on developers to create libraries and higher-level tools for in-game use.

NVIDIA CUDA technology is a fundamentally new computing architecture that enables the GPU to solve complex computational problems in consumer, business, and technical applications. CUDA (Compute Unified Device Architecture) technology gives computationally intensive applications access to the tremendous processing power of NVIDIA graphics processing units (GPUs) through a revolutionary new programming interface. Providing orders of magnitude more performance and simplifying software development by using the standard C language, CUDA technology enables developers to create innovative solutions for data-intensive problems. For advanced research and language development, CUDA includes a low level assembly language layer and driver interface.
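To give a feel for the C extensions the announcement describes, here is a minimal, illustrative CUDA sketch of the classic SAXPY operation (y = a*x + y). The kernel and launch syntax follow the CUDA C model; the sizes and values are arbitrary examples, not anything from NVIDIA's SDK.

```cuda
#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// __global__ is one of CUDA's C extensions: it marks a function that
// runs on the GPU but is launched from host (CPU) code. Each GPU
// thread computes one element of y = a*x + y.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                      // guard threads past the end of the array
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    const int n = 1 << 20;          // arbitrary example size: 1M elements
    size_t bytes = n * sizeof(float);

    float *x = (float *)malloc(bytes);
    float *y = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // Allocate GPU memory and copy the inputs over.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, x, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y, bytes, cudaMemcpyHostToDevice);

    // The <<<blocks, threads>>> launch syntax is another CUDA extension:
    // enough 256-thread blocks are launched to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    // Copy the result back and inspect one element (2*1 + 2 = 4).
    cudaMemcpy(y, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", y[0]);

    cudaFree(dx); cudaFree(dy); free(x); free(y);
    return 0;
}
```

The point of the model is visible here: the data-parallel loop disappears from the host code and becomes the per-thread body of the kernel, which is exactly the kind of math offload the announcement talks about.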

NVIDIA Releases WHQL Drivers for Vista

NVIDIA has released its first WHQL certified drivers for Windows Vista, both 32-bit and 64-bit versions. The release number is 100.65, and these drivers work with all GeForce series 6, 7 and 8 cards, offering DirectX 10 support to those with 8800 cards. DirectX 10 with SLI is still unsupported, but it's good to get beyond the beta stage of drivers. You can download them here.

NVIDIA to shrink G72 to make G78

The G78 will be a 65nm version of the G72. Instead of being aimed at basic video or gaming, it is positioned as a low-power, Vista-ready solution: the G78 will only be expected to run Aero Glass, the occasional DVD and Flash-based games. It will be produced alongside the G84 series, have a 64-bit memory interface, and will cost $60 or less.

NVIDIA Vice President on Vista Drivers

Although Vista is still a new OS, many users are complaining about driver support, and one of the biggest complaints concerns NVIDIA's GeForce drivers - or the lack of them. NVIDIA's Vice President of Software Engineering, Dwight Diercks, has been explaining the problems in an interview with Real World Benchmarks. To see the whole interview I suggest you visit the source, but the major contributing factor to the delay is that the company needs to write six new drivers - one for each of DirectX 9, DirectX 9 SLI, DirectX 10, DirectX 10 SLI, OpenGL and OpenGL SLI. One of NVIDIA's drivers for Vista has over 20 million lines of code, which is comparable to all of Windows NT. He goes on to comment that the certified 8800 series driver should be released by the end of the month, and the SLI driver for the 7x00 series should be available in March, along with support for Blu-ray and HD DVD.