Friday, March 26th 2010

AMD First with OpenGL 4.0 Compliant Graphics Driver

Shortly after the Khronos Group announced OpenGL 4.0, the newest version of the multi-platform graphics API, AMD is out with a preview graphics driver for its ATI Radeon, FireGL, FirePro, and Mobility Radeon graphics accelerators, which includes the OpenGL 4.0 ICD (installable client driver). The driver is available for Windows 7, Windows Vista, Windows XP, and Linux. OpenGL 4.0 is comparable in capability to Microsoft's DirectX 11 API: it makes use of hardware features such as tessellation on the GPU, per-sample fragment shaders, programmable fragment shader input positions, and 64-bit double-precision floating-point shader operations, and unlike DirectX 11 it has no restrictions on which version of Windows it can run on. With OpenGL 4.0, for example, one can expect 3D graphics of complexity comparable to DirectX 11 on Windows XP.

DOWNLOAD: ATI Catalyst OpenGL 4.0 Preview Driver for Windows 7/Vista, Windows XP, and Linux.

51 Comments on AMD First with OpenGL 4.0 Compliant Graphics Driver

#1
Zubasa
Sihastru said:
All this noise is nice and all, also it's a good thing we move forward, but what applications and/or games do you know that are using OpenGL 4.0? Is ATI again first to support something that nobody is using to code anything at the moment?
If nobody gives hardware support in the first place, who on earth will code anything?
OpenGL is used in quite a few games and that includes pretty much all the games that can run on Macs, WoW included.
OpenGL 4.0 adds new extensions, which is never a bad thing.
Posted on Reply
#2
Thrackan
Nice! Gotta try that stuff soon.
Posted on Reply
#3
wiak
Roph said:
No word of which cards it's supported on? I assume all ATI's DX 10.1+ cards.
Radeon HD Series gives you OpenGL 3.3
Radeon HD 5000 Series and above gives you OpenGL 4.0
Posted on Reply
#4
Sasqui
OpenGL USED to be the standard for video games, and practically every 3D desktop application, until a little company called Microsoft came along and introduced DirectX. They promised a robust API, more functionality, and closer integration with the OS layer as an enticement, which most game developers bought into, albeit slowly. The other part of it was the ease of code porting to other MS devices/operating systems.

OpenGL was/is open source, so the developers of it weren't working for the profit part of it like M$. OpenGL is somewhat extensible... adding new shader models is possible. With DirectX, you are totally at the whim of M$.

If you want to see a quick, simple comparison of OpenGL to DirectX, install Google Earth. There is a program menu option to start it in OpenGL or DirectX. Ironically, I have to use OpenGL to run Google Earth, as it will often crash using DirectX.
Posted on Reply
#5
Sihastru
I was talking about OpenGL 4.0, not the past versions. I know a few things older versions of OpenGL are used for (Adobe CS4, for example), but I don't know of anything that uses OpenGL 4.0... And what exactly does this version bring to the table as opposed to the older versions?
Posted on Reply
#6
WarEagleAU
Bird of Prey
thanks, even though it is 3.3 for my card, i will be using this instead of the new cat 10.3s
Posted on Reply
#7
ov2rey
does the HD 4870 chip itself support OpenGL 4.0?
Posted on Reply
#8
sweeper
I had the 10.3 drivers but without the OpenGL update. Cinebench has an OpenGL benchmark. I was only scoring mid-19 FPS @ 2.8GHz. I installed the OpenGL 4.0 update (beta) and ran the benchmark with my system running all stock (2.6GHz), and my FPS jumped to 36 FPS. So at least in that benchmark it proved to be a performance gain. The Heaven benchmark on OpenGL had problems, though: some scenes were just black. I believe scenes 10 and 12 were solid black, with a few others having problems with solid black areas.
Posted on Reply
#9
Mussels
Moderprator
since sasqui brought it up, i'll cover it in one post so we don't need to go off topic:

openGL was king for graphics. it seemed to be the one for hardware acceleration, back when directX was still used for software rendering.

Things changed however, because directX stopped being all about direct draw (2D) and became a 'bundle' - you got 2D (directdraw), 3D (direct3D), audio (directsound), input (directinput) and so on. Coding for openGL may have given you broader support and potentially better results, but sticking with directX (and shafting linux/mac) made it easier, since you could get everything you needed in the one package - not just the video component.

Lately, people are just making directX games and swapping direct3D for OpenGL - which is why there aren't many ports for other platforms. (they have to go to effort for input, sound, etc)
Posted on Reply
#10
Wile E
Power User
Sihastru said:
All this noise is nice and all, also it's a good thing we move forward, but what applications and/or games do you know that are using OpenGL 4.0? Is ATI again first to support something that nobody is using to code anything at the moment?
That was going to be my question. A new API is great and all, but what's out there using OGL4 that means I need to worry about the drivers right this moment?
Posted on Reply
#11
btarunr
Editor & Senior Moderator
Sasqui said:
OpenGL was/is open source, so the developers of it weren't working for the profit part of it like M$.
Nope, OpenGL isn't open-source, and never was. The "Open" in OpenGL refers to open standards, where each GL vendor (such as NVIDIA, ATI, etc.) is free to make its own additions (GL extensions).
Posted on Reply
#12
Unregistered
Mussels said:
DX11 graphics on XP? noice!
Too bad there are no good games that are using OpenGL. The old stuff from id doesn't count since they use either OpenGL or DX9...

btarunr said:
Nope, OpenGL isn't open-source, never was. The "Open" in OpenGL refers to open-standards, where each GL vendor (such as NVIDIA, ATI, etc.,) is free to make his own additions (GL extensions).
OpenGL (Open Graphics Library) is a standard specification defining a cross-language, cross-platform API for writing applications that produce 2D and 3D computer graphics. The interface consists of over 250 different function calls which can be used to draw complex three-dimensional scenes from simple primitives. OpenGL was developed by Silicon Graphics Inc. (SGI) in 1992[2] and is widely used in CAD, virtual reality, scientific visualization, information visualization, and flight simulation. It is also used in video games, where it competes with Direct3D on Microsoft Windows platforms. OpenGL is managed by a non-profit technology consortium, the Khronos Group.

The OpenGL standard allows individual vendors to provide additional functionality through extensions as new technology is created. Extensions may introduce new functions and new constants, and may relax or remove restrictions on existing OpenGL functions. Each vendor has an alphabetic abbreviation that is used in naming their new functions and constants. For example, NVIDIA's abbreviation (NV) is used in defining their proprietary function glCombinerParameterfvNV() and their constant GL_NORMAL_MAP_NV.
#13
shevanel
wasn't counterstrike 1.6 and half life 1 openGL?
Posted on Reply
#14
Mussels
Moderprator
shevanel said:
wasn't counterstrike 1.6 and half life 1 openGL?
more like "all of the above" they had software, D3D, and OGL iirc.
Posted on Reply
#15
Champ
what's the most recent game to use OpenGL? I'm thinking it would be nice to use two 4890s, since they are getting dirt cheap, for DX11-like graphics.
Posted on Reply
#16
Mussels
Moderprator
Champ said:
what's the most recent game to use OpenGL? I'm thinking it would be nice to use two 4890s, since they are getting dirt cheap, for DX11-like graphics.
for "DX11 like" do you mean... DX10?
Posted on Reply
#17
Champ
I thought I read OGL produced DX11-like graphics?
Posted on Reply
#18
Mussels
Moderprator
Champ said:
I thought I read OGL produced DX11-like graphics?
with DX11 hardware.
Posted on Reply
#19
Champ
I was about to say I just read it was only with DX11 hardware. Well, what's the point? You still have to buy a DX11 card and you might as well use MS's stuff, so you get no issues, right?
Posted on Reply
#21
Mussels
Moderprator
PCpraiser100 said:
haha Nvidia is slowpoke!
Posted on Reply
#22
devguy
Champ said:
I was about to say I just read it was only with DX11 hardware. Well, what's the point? You still have to buy a DX11 card and you might as well use MS's stuff, so you get no issues, right?
Well, the HD 5000 series are the first AMD GPUs with official tessellation support, but a tessellator has been built into AMD GPUs since the R600 (in fact, it even went into the X360's Xenos GPU). From what I've read (it obviously will never be verified), the original DX10.0 spec was to include tessellation (hence why AMD included it). However, nVidia didn't add such a component to its "DirectX 10.0 certified" G80 architecture, pressuring Microsoft to relax the API and push tessellation (and other features) off to DirectX 10.1 and 11.

Using the Linux Unigine OpenGL demo (not sure about the Windows OpenGL renderer), one can use the tessellation option with a pre-RV870 GPU (not an option with DirectX 11). Granted, the framerate takes a nose dive, but it is possible. It is unclear whether the Unigine demo is using the built-in tessellator of a pre-RV870 GPU (RV770 in my case) or just doing it in software. Because the Unigine demo (and Ubuntu Karmic and newer) requires an R600 (or more recent GPU), one cannot see if tessellation would even work with an R500 or older GPU. It'd be cool if someone with a pre-Fermi nVidia GPU tried to run the Unigine Linux demo with tessellation enabled and reported their results.

Finally, one more thing to keep in mind is that many stubborn people (;)) chose to remain on Windows XP, and thus they lose out on the DirectX 10/11 features (regardless of whether their card is capable of them). Should a game support OpenGL 4.0, it will run in XP with graphics comparable to the same game being rendered on Vista/7 under DirectX 11.

As an aside, I would like to do something I don't typically do, and that would be to commend nVidia on their Fermi tessellator. From benchmarks I've seen, Fermi's first-generation tessellator eats the RV870's tessellator for breakfast, yet AMD has had tessellation experience since the Xenos GPU (as I mentioned before). :respect:
Posted on Reply
#24
Mussels
Moderprator
TAViX said:
@devguy

Indeed!! Unreal 2, Unreal Tournament 2, Counter Strike, Half Life, Quake 3 Arena, Serious Sam, and some other games are using the tessellation option when they detect an ATI card. But the feature was actually implemented or coded in game not in the DX driver...I think...

Anyway that feature wasn't called Tessellation but, ATI TruForm

http://ixbtlabs.com/articles/atitruform/

http://www.anandtech.com/showdoc.aspx?i=1476&p=4
yes! i knew it had another name, but had forgotten it.

Truform has been around for a VERY long time, but since NV never used it, it died off. It was mostly used in openGL games, where it was easier to slot in 'per hardware' features.
Posted on Reply
#25
devguy
TAViX said:
@devguy

Indeed!! Unreal 2, Unreal Tournament 2, Counter Strike, Half Life, Quake 3 Arena, Serious Sam, and some other games are using the tessellation option when they detect an ATI card. But the feature was actually implemented or coded in game not in the DX driver...I think...

Anyway that feature wasn't called Tessellation but, ATI TruForm

http://ixbtlabs.com/articles/atitruform/

http://www.anandtech.com/showdoc.aspx?i=1476&p=4
Interesting. From Wikipedia:
This unit is reminiscent of ATI's earlier "TruForm" technology, used initially in the Radeon 8500, which performed a similar function in hardware.[5] While this tessellation hardware is not part of the current OpenGL or Direct3D requirements, and competitors such as the GeForce 8 series lack similar hardware, Microsoft has included Tessellation as part of their D3D10.1 future plans.[6] The "TruForm" technology from the past received little attention from software developers and was only utilized in a few game titles (such as Madden NFL 2004, Serious Sam, Unreal Tournament 2003 and 2004, and unofficially Morrowind), because it was not a feature shared with NVIDIA GPUs which had a competing Tessellation solution using Quintic-RT patches which met with even less support from developers.[7] Since the Xenos contains similar hardware, and Microsoft sees hardware surface tessellation as a major GPU feature with proposed implementation of hardware tessellation support in future DirectX releases (presumably DirectX 11),[4][6] dedicated hardware tessellation units may receive increased developer awareness in future titles. It remains to be seen whether ATI's implementation will be compatible with the eventual DirectX standard.
And about the R600 tessellator.
Posted on Reply