Tuesday, February 26th 2013
AMD TressFX Technology Detailed
AMD unveiled the new TressFX technology it teased us with earlier this week. The technology, as predicted, works to create realistic hair rendering and physics; but we imagine it could be applied to foliage and, hopefully, furry donuts as well. It will first be implemented in the 2013 reboot of the Tomb Raider franchise, in which Lara Croft finally parts with her braid. TressFX helps accurately render Croft's hair, drawing finer locks than the pre-rendered hair textures plastered onto larger hair polygons that look unnatural. The free and fluid nature of these locks can then be used to accurately draw the effects of wind and water on the hair. Below are a few before-and-after instances of TressFX.
Technically, TressFX is a toolset co-developed by AMD and Crystal Dynamics that taps into DirectCompute to unlock the number-crunching prowess of the GPU (specifically Graphics Core Next ones) to render individual strands of hair. It is built on the foundation laid by AMD's work on Order-Independent Transparency (OIT), and uses Per-Pixel Linked-List (PPLL) data structures to manage rendering complexity and memory usage. DirectCompute is also used to process the physics of these strands, which are affected by the character's motion and by elements such as wind and water/rain. TressFX will be implemented at least in the PC version of the upcoming Tomb Raider.
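To make the PPLL idea concrete, here is a minimal CPU-side sketch of how a per-pixel linked list accumulates transparent hair fragments and resolves them in depth order. This is illustrative only, not AMD's actual shader code: the names (`store_fragment`, `resolve_pixel`), the 2x2 framebuffer, and the blend model are our own assumptions, and on the GPU the equivalent work is done in pixel/compute shaders with atomic counters and a shared node buffer.

```python
# Hypothetical CPU sketch of a Per-Pixel Linked List (PPLL) for
# order-independent transparency, the structure TressFX reportedly
# uses for hair. A plain Python list stands in for the GPU node buffer.

WIDTH, HEIGHT = 2, 2
head = [[-1] * WIDTH for _ in range(HEIGHT)]  # per-pixel head pointers (-1 = empty)
nodes = []  # shared node buffer: tuples of (depth, color, alpha, next_index)

def store_fragment(x, y, depth, color, alpha):
    """Append a fragment node and link it at the head of pixel (x, y).
    On a GPU this append would use an atomic counter and an
    InterlockedExchange on the head-pointer buffer."""
    nodes.append((depth, color, alpha, head[y][x]))
    head[y][x] = len(nodes) - 1

def resolve_pixel(x, y, background=(0.0, 0.0, 0.0)):
    """Walk the pixel's list, sort fragments far-to-near, alpha-blend."""
    frags = []
    i = head[y][x]
    while i != -1:
        depth, color, alpha, nxt = nodes[i]
        frags.append((depth, color, alpha))
        i = nxt
    out = background
    for depth, color, alpha in sorted(frags, reverse=True):  # back to front
        out = tuple(alpha * c + (1.0 - alpha) * o
                    for c, o in zip(color, out))
    return out

# Two overlapping hair-strand fragments over pixel (0, 0), submitted
# out of depth order -- the whole point of OIT is that this is fine.
store_fragment(0, 0, depth=0.2, color=(0.8, 0.6, 0.2), alpha=0.5)  # nearer
store_fragment(0, 0, depth=0.7, color=(0.4, 0.3, 0.1), alpha=0.5)  # farther
print(resolve_pixel(0, 0))
```

The key property shown here is that submission order doesn't matter: each pixel collects its fragments into a list and sorts only its own (typically short) list at resolve time, which is what keeps the memory and sorting cost of thousands of thin, overlapping strands manageable.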
99 Comments on AMD TressFX Technology Detailed
GameCube = ATi Flipper / PS2 = GS / X-Box = Nvidia NV2A
Wii = ATi Hollywood / PS3 = Nvidia SCEI RSX / X-Box 360 = ATi Xenos
Wii U / PS4 / X-Box 720 = AMD Radeon GPUs
Just in numbers alone it opens up, and you're not hindering yourself as a developer in any way if you choose to go PC, unlike a proprietary API, which is dependent on the end user having a certain hardware type.
This might just be in the PC version as of now, but the potential for getting adopted and used is much greater than before, and it has a higher chance given what we currently know of what's going to be in the next-gen consoles.
Anyways... didn't Alice: Madness Returns have similar hair effects?
At least this is a step in the right direction. Of course, more physics all around would be better. Isn't that the developers' fault for not doing it, not AMD's or nVIDIA's?
nVIDIA will most likely catch up with its future generation of hardware, just like AMD did with tessellation. After all, they are all part of DX11, so using them in games means using the features that this API has, regardless of the capabilities of each side (hopefully, as efficiently as possible). nVIDIA made a bet and perhaps it lost. The future will tell. At this point, if nVIDIA had made its 6xx-series parts better at compute, this would not be such a big fuss. But then again, perhaps the 680 and the rest would not be as fast as the 7970 and their direct contestants.
As far as I'm aware, the Dirt Showdown developers put that DirectCompute usage into their game long before everyone knew nVIDIA was weak in this department.
PhysX uses both GPU and CPU... and pretty much always has done. If it didn't (for example), why would the 3.0 SDK include multithreaded CPU support?
As for the whole she-done-me-wrong wailing, some of us remember the posturing from both sides:
June 2006: ATI start posturing against Ageia
September 2007: Intel cuts AMD off at the knees by acquiring Havok
November 2007: AMD look into acquiring Ageia
Three months later Nvidia acquires Ageia
June 2008: AMD bigs up Havok FX (number of Havok FX games produced: 0)
Around the same time, Nvidia offered a PhysX licence to AMD. AMD wasn't interested and immediately began a public thrashdown of PhysX via messrs Huddy, Cheng and Robison while promoting OpenCL Havok FX. Nvidia replied by locking AMD/ATi out of PhysX... AMD replied by telling world+dog that PhysX is irrelevant.
Personally, I'd say that, having watched the whole thing unfold, it's a case of handbags at ten paces. AMD dithered twice on entering the GPU physics market and lost out... twice, then came on strong with the "we didn't want it anyway" gambit; Nvidia played hardball in response to trash talk. Neither of the companies covered itself in glory, but strangely enough, a whole bunch of people forget that the bitch-slapping involved both camps. From the timelines, this must have been the case. There is no way, short of an educated guess, that AMD would have expected Nvidia to pare down GK104 with respect to compute, especially given Nvidia's previous GPUs since G80. Having said that, I doubt that AMD wouldn't capitalize on the fact now that they are aware of it. From this PoV, Titan's entry into the market makes a convenient backstop (for Nvidia) should AMD look to code endless compute loops into future titles. That might have crippled the performance of pre-GK110 Nvidia (and VLIW4/5 AMD) hardware to the benefit of GCN, but Titan's appearance would probably convince AMD that massive compute additions might not be the PR boon they could have been.
:nutkick::laugh::clap:
With nude mods portraying her realistic and glistening pubic hair
Although I would expect Lara would have to be mighty hairy down there to get any sort of hair movement going.
It may be a good idea for Square Enix to make some DLC hairstyles for Lara, or add them to new DLC, to promote TressFX if they want more money.
It was more of a "I got the shiny ball and I'll make it work for my hardware, and I don't care as much if it works with yours, so buy our products."
Nvidia was its own worst enemy, and it was well documented in articles. It took Nvidia three years to recognize its selfish mistake, but by that time it was too late.
Not to mention they are doing it again with Tegra: Tegra-optimized games sold at the Tegra Zone.
If you were to believe Nvidia's statements, it was tossing PhysX at the developers and they were rejecting it. Not to mention that PhysX seemed to be a PS3 priority and PCs were taking a back seat, so they were content to only update PhysX for the GPU and leave the CPU behind.
After all, why update it if developers aren't asking for it and you can use it to your advantage to sell your GPUs as an added feature? It makes the buying of Ageia pointless if you're not.
That's why there are only how many titles that support PhysX in five years? :confused:
The number of games using Havok doesn't look like 0 :ohwell:
Havok FX was cancelled for the very same reason PhysX has been failing: it's hardware-dependent and limits the developers' potential customer base. Intel probably had the sense to decide that not every PC will have the same GPU, or an Nvidia/AMD GPU, but most will have a CPU, and that would be the best way going forward.
Despite it being PSX, and that did tax a lot... :D