Tuesday, February 26th 2013

AMD TressFX Technology Detailed

AMD unveiled the new TressFX technology it teased us with earlier this week. The technology, as predicted, works to create realistic hair rendering and physics, but we imagine it could be applied to foliage and, hopefully, furry donuts as well. It will first be implemented in the 2013 reboot of the Tomb Raider franchise, in which Lara Croft finally parted with her braid. TressFX helps accurately render Croft's hair, drawing finer locks of hair instead of the pre-rendered hair textures plastered onto larger hair polygons, which look unnatural. The free and fluid nature of these locks can then be used to accurately render the effects of wind and water on the hair. Below are a few before-and-after examples of TressFX.

Technically, TressFX is a toolset co-developed by AMD and Crystal Dynamics, which taps into DirectCompute to unlock the number-crunching prowess of the GPU (specifically Graphics Core Next ones) to render individual strands of hair. It is built on the foundation laid by AMD's work on Order Independent Transparency (OIT), and uses Per-Pixel Linked-List (PPLL) data structures to manage rendering complexity and memory usage. DirectCompute is additionally used to process the physics of these strands of hair, which are affected by the character's motion and elements such as wind and water/rain. TressFX will be implemented at least in the PC version of the upcoming Tomb Raider.
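To make the Per-Pixel Linked-List idea a little more concrete, here is a minimal CPU-side sketch of the data structure, assuming nothing beyond a per-pixel head-pointer array and a shared node pool. It is only an illustration of the concept, not AMD's code: on the GPU the append step would run in a pixel shader writing to UAV buffers with atomic operations, followed by a full-screen resolve pass that sorts and blends each pixel's list.

```cpp
// Minimal CPU-side sketch of a Per-Pixel Linked-List (PPLL), the structure
// TressFX-style order-independent transparency builds on. Illustrative only.
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

struct FragmentNode {
    float    depth;  // fragment depth, used to sort at resolve time
    uint32_t color;  // packed RGBA of the hair fragment
    int32_t  next;   // index of the next node in this pixel's list (-1 = end)
};

struct PPLL {
    int width, height;
    std::vector<int32_t>      head;   // per-pixel head pointers ("head texture")
    std::vector<FragmentNode> nodes;  // shared node pool ("fragment buffer")

    PPLL(int w, int h) : width(w), height(h), head(size_t(w) * h, -1) {}

    // Append pass: push a transparent hair fragment onto its pixel's list.
    void append(int x, int y, float depth, uint32_t color) {
        int32_t node = int32_t(nodes.size());
        nodes.push_back({depth, color, head[size_t(y) * width + x]});
        head[size_t(y) * width + x] = node;  // an atomic exchange on the GPU
    }

    // Resolve pass: walk one pixel's list, sort back-to-front, then blend.
    // Blending is stubbed out; we just report how many layers overlap here.
    int resolve(int x, int y) const {
        std::vector<FragmentNode> layers;
        for (int32_t i = head[size_t(y) * width + x]; i != -1; i = nodes[i].next)
            layers.push_back(nodes[i]);
        std::sort(layers.begin(), layers.end(),
                  [](const FragmentNode& a, const FragmentNode& b) { return a.depth > b.depth; });
        return int(layers.size());  // a real resolve would alpha-blend 'layers'
    }
};

int main() {
    PPLL ppll(4, 4);
    ppll.append(1, 1, 0.50f, 0xFFFFFFFFu);  // two hair strands covering
    ppll.append(1, 1, 0.25f, 0x80808080u);  // the same pixel
    std::printf("pixel (1,1) has %d transparent layers\n", ppll.resolve(1, 1));
    return 0;
}
```

The appeal of the structure is that hair fragments can be accumulated in any order as they are rasterized and only sorted per pixel at resolve time, which is what lets thousands of overlapping semi-transparent strands be drawn without a global sort.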

99 Comments on AMD TressFX Technology Detailed

#51
Fluffmeister
XzibitIt's part of DirectX. There is this company called Microsoft. Makes OSes for PCs, phones, tablets, and has a gaming console. It doesn't just help AMD but Microsoft too. It can cross-promote games with features that are further enhanced on Games for Windows, even if it's not embraced by the PS4 or Wii U.
Equally, I seriously doubt Sony gives a toss about DirectX and who embraces it. Again, there is a wide range of technologies licensed on both sides which developers are free to use; they don't have to be mutually exclusive.
Posted on Reply
#52
Xzibit
FluffmeisterEqually, I seriously doubt Sony gives a toss about DirectX and who embraces it. Again, there is a wide range of technologies licensed on both sides which developers are free to use; they don't have to be mutually exclusive.
Sony might not, but developers will have an instantly bigger pool of potential buyers for their products, and they will want to one-up each other.

GameCube = ATi Flipper / PS2 = GS / X-Box = Nvidia NV2A

Wii = ATi Hollywood / PS3 = Nvidia SCEI RSX / X-Box 360 = ATi Xenos

Wii U / PS4 / X-Box 720 = AMD Radeon GPUs

Just in numbers alone it opens things up, and you're not hindering yourself as a developer in any way if you choose to go PC, unlike with a proprietary API, which is dependent on the end user having a certain hardware type.

This might just be in the PC version as of now, but the potential for it getting adopted and used is much greater than before, and it has a higher chance given what we currently know of what's going to be in the next-gen consoles.
Posted on Reply
#53
Bjorn_Of_Iceland
Dj-ElectriCIf this includes realistic boob movement I will sell my two 680s and get two 7970s
Nice. One for each boob I reckon?

Anyway... didn't Alice: Madness Returns have similar hair effects?
Posted on Reply
#54
seronx
Bjorn_Of_IcelandAnyway... didn't Alice: Madness Returns have similar hair effects?
Alice: Madness Returns' hair is basically the same thing, but it isn't GPU-accelerated or as accurate as Tomb Raider's.
Posted on Reply
#55
Nordic
RejZoRSo they stuck some hair on a baldy who was designed to be bald to begin with. What's next, making Lara hairy where she wasn't intended to be? Pathetic, like they are competing with NVIDIA, who will make more retarded post-processing physics effects and lock them down to one platform.
F**k that.

It looks pathetic as well. This guy looks like those eggs where you stuff them with cotton and plant some wheat seeds in them so they grow hair. An egg with a wheat comb-over. Nice...
The guy looks better with a bald head. Period.

If anyone bothered to work with some art, ANYONE could make some rather realistic hair using the CPU alone. You wouldn't have 3 billion hair strands, but if anyone would even bother to make clusters of hair that move based on head movement, freakin CPU Havok could do that. But instead, no one even bothered to do that. Instead pretty much all games used 100% static hair.
So why 100% static or 100% super-duper HW accelerated? Like no one knows how to fuckin make something in the middle. They always have to do it on one or the other extreme...
I am pretty sure that hair on the bald guy was not done by AMD's TressFX.

At least this is a step in the right direction. Of course more physics all around would be better. Isn't that the developers' fault for not doing it, not AMD's or NVIDIA's?
Posted on Reply
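The claim in the quoted post, that cluster-level hair motion driven by head movement is within easy reach of the CPU, can be illustrated with a short sketch. The code below simulates a single guide chain with Verlet integration and a distance constraint; it is a hypothetical toy example, not Havok or TressFX, and the names and constants are made up for illustration.

```cpp
// Toy CPU sketch of "cluster" hair motion: one guide chain of points,
// Verlet-integrated under gravity, pinned to a moving head position,
// with a distance constraint keeping segment lengths fixed.
// Not Havok or TressFX code; purely illustrative.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

struct HairCluster {
    std::vector<Vec2> pos, prev;   // current and previous positions (Verlet state)
    float segmentLength;

    HairCluster(Vec2 root, int points, float segLen) : segmentLength(segLen) {
        for (int i = 0; i < points; ++i) {
            pos.push_back({root.x, root.y + segLen * i});
            prev.push_back(pos.back());
        }
    }

    // One simulation step: pin the root to the head, integrate the rest,
    // then relax distance constraints so each segment keeps its length.
    void step(Vec2 headPos, float dt) {
        const Vec2 gravity = {0.0f, 9.8f};
        pos[0] = prev[0] = headPos;                       // root follows the head
        for (size_t i = 1; i < pos.size(); ++i) {         // Verlet integration
            Vec2 velocity = {pos[i].x - prev[i].x, pos[i].y - prev[i].y};
            prev[i] = pos[i];
            pos[i].x += velocity.x + gravity.x * dt * dt;
            pos[i].y += velocity.y + gravity.y * dt * dt;
        }
        for (int iter = 0; iter < 4; ++iter) {            // constraint relaxation
            for (size_t i = 1; i < pos.size(); ++i) {
                float dx = pos[i].x - pos[i - 1].x;
                float dy = pos[i].y - pos[i - 1].y;
                float len = std::sqrt(dx * dx + dy * dy);
                if (len < 1e-6f) continue;
                float correction = (len - segmentLength) / len;
                pos[i].x -= dx * correction;               // only the child moves;
                pos[i].y -= dy * correction;               // the parent stays put
            }
        }
    }
};

int main() {
    HairCluster cluster({0.0f, 0.0f}, 8, 0.05f);
    for (int frame = 0; frame < 60; ++frame) {
        Vec2 head = {0.2f * std::sin(frame * 0.1f), 0.0f};  // head sways side to side
        cluster.step(head, 1.0f / 60.0f);
    }
    std::printf("tip of the cluster ends up at (%.3f, %.3f)\n",
                cluster.pos.back().x, cluster.pos.back().y);
    return 0;
}
```

A real engine would typically simulate a handful of guide strands like this and skin or interpolate the rest of the hair to them; the per-frame arithmetic itself is modest.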
#56
TheGuruStud
RejZoRSo they stuck some hair on a baldy who was designed to be bald to begin with. What's next, making Lara hairy where she wasn't intended to be? Pathetic, like they are competing with NVIDIA, who will make more retarded post-processing physics effects and lock them down to one platform.
F**k that.

It looks pathetic as well. This guy looks like those eggs where you stuff them with cotton and plant some wheat seeds in them so they grow hair. An egg with a wheat comb-over. Nice...
The guy looks better with a bald head. Period.

If anyone bothered to work with some art, ANYONE could make some rather realistic hair using the CPU alone. You wouldn't have 3 billion hair strands, but if anyone would even bother to make clusters of hair that move based on head movement, freakin CPU Havok could do that. But instead, no one even bothered to do that. Instead pretty much all games used 100% static hair.
So why 100% static or 100% super-duper HW accelerated? Like no one knows how to fuckin make something in the middle. They always have to do it on one or the other extreme...
How dare you allege that physics can be done on the CPU, thereby defying NVIDIA's foolproof (retarded) plan of forcing their standard to run only on their cards, even though it works perfectly on the CPU!
Posted on Reply
#57
TRWOV
FluffmeisterThe improvements when enabled are really rather stunning:
Are they going to sell the bottled version too? I could use some.
Posted on Reply
#58
Calin Banc
Crap DaddyOf course it will. But NV cards will take a performance hit. There are a few games out there which use the direct compute ability of GCN not because it's the only way to make the game look good but because they are Gaming Evolved titles.
I'm not a programmer, so I don't really know if it is or isn't the only way, or the most efficient way, of accomplishing the goal, but the results are there. It's not an overly tessellated object with 10k polys instead of a few hundred or a thousand, it's not some invisible ocean underneath the map, and it is most definitely not AMD-locked.

nVIDIA will most likely catch up with its future generation of hardware, just like AMD did with tessellation. After all, these features are all part of DX11, so using them in games means using what the API offers regardless of the capabilities of each side (hopefully as efficiently as possible). nVIDIA made a bet and perhaps lost it. The future will tell. At this point, if nVIDIA had made better compute parts out of the 6xx series, this would not be such a big fuss. But then again, perhaps the 680 and the rest would not have been as fast against the 7970 and their direct competitors.

As far as I'm aware, the Dirt Showdown developers put that DC usage into their game long before everyone knew nVIDIA was weak in this department.
Posted on Reply
#59
HumanSmoke
TheGuruStudHow dare you allege that physics can be done on the CPU, thereby defying NVIDIA's foolproof (retarded) plan of forcing their standard to run only on their cards, even though it works perfectly on the CPU!
Cool story.
PhysX uses both GPU and CPU... and pretty much always has done. If it didn't (as an example), why would the 3.0 SDK include multithreaded CPU support?

As for the whole she-done-me-wrong wailing, some of us remember the posturing from both sides:
June 2006: ATI start posturing against Ageia
September 2007: Intel cuts AMD off at the knees by acquiring Havok
November 2007: AMD look into acquiring Ageia
Three months later Nvidia acquires Ageia
June 2008: AMD bigs up Havok FX (number of Havok FX games produced: 0)
Around the same time, Nvidia offered a PhysX licence to AMD. AMD not interested, and immediately begin a public thrashdown of PhysX via Messrs Huddy, Cheng and Robison while promoting OpenCL Havok FX. Nvidia reply by locking AMD/ATi out of PhysX... AMD reply by telling world+dog that PhysX is irrelevant.

Personally, having watched the whole thing unfold, I'd say it's a case of handbags at ten paces. AMD dithering twice on entering the GPU physics market and losing out... twice, then coming on strong with the "we didn't want it anyway" gambit; Nvidia playing hardball in response to trash talk. Neither company covers itself in glory, but strangely enough, a whole bunch of people forget that the bitch-slapping involved both camps.
Calin BancAs far as I'm aware, the Dirt Showdown developers put that DC usage into their game long before everyone knew nVIDIA was weak in this department.
From the timelines, this must have been the case. There is no way, short of an educated guess, that AMD would have expected Nvidia to pare down GK104 with respect to compute, especially given Nvidia's previous GPUs since G80. Having said that, I doubt that AMD wouldn't capitalize on the fact now that they are aware of it. From this PoV, Titan's entry into the market makes a convenient backstop (for Nvidia) should AMD look to code endless compute loops into future titles. Doing so might have crippled the performance of pre-GK110 Nvidia (and VLIW4/5 AMD) cards to the advantage of GCN, but Titan's appearance would probably convince AMD that massive compute additions might not be the PR boon they could have been.
Posted on Reply
#60
Prima.Vera
LionheartLmfao you made me spit out my organic apple juice:roll::roll:
What's organic apple juice? Can there be inorganic apple juice?? :eek::eek:
:nutkick::laugh::clap:
Posted on Reply
#61
GLD
Guys, when referring to Lara Croft, they are breasts, not boobs.
Posted on Reply
#62
Mussels
Freshwater Moderator
HumanSmokeCool story.
PhysX uses both GPU and CPU...and pretty much always has done....if it didn't (as example), why would the 3.0 SDK include multithreaded CPU support?
PhysX has always disabled most of its 'features' if run on a CPU. It ran on the CPU for the consoles, but on PC it would always turn off or drastically reduce effects for CPU use (yes, even when sufficient CPU power was provided). It was deliberately sabotaged to make it look 'better' when GPU accelerated.
Posted on Reply
#63
Prima.Vera
SaltyFishThere is OpenCL for multi-platform... but devs are too lazy to use it.
There. Fixed. ;)
Posted on Reply
#64
Depth
I hope this realistic hair trend doesn't spiral into a hairy Lara Croft

With nude mods portraying her realistic and glistening pubic hair
Posted on Reply
#65
Mussels
Freshwater Moderator
DepthI hope this realistic hair trend doesn't spiral into a hairy Lara Croft

With nude mods portraying her realistic and glistening pubic hair
Larry Croft and the chest hair of doom
Posted on Reply
#66
H82LUZ73
Imagine: the nude patch for Skyrim will be downloaded one trillion times just to see the full bushy ladies.
Posted on Reply
#68
xvi
HumanSmoke
TheGuruStudHow dare you allege that physics can be done on the CPU, thereby defying NVIDIA's foolproof (retarded) plan of forcing their standard to run only on their cards, even though it works perfectly on the CPU!
Cool story.
PhysX uses both GPU and CPU... and pretty much always has done. If it didn't (as an example), why would the 3.0 SDK include multithreaded CPU support?

As for the whole she-done-me-wrong wailing, some of us remember the posturing from both sides:
June 2006: ATI start posturing against Ageia
September 2007: Intel cuts AMD off at the knees by acquiring Havok
November 2007: AMD look into acquiring Ageia
Three months later Nvidia acquires Ageia
June 2008: AMD bigs up Havok FX (number of Havok FX games produced: 0)
Around the same time, Nvidia offered a PhysX licence to AMD. AMD not interested, and immediately begin a public thrashdown of PhysX via Messrs Huddy, Cheng and Robison while promoting OpenCL Havok FX. Nvidia reply by locking AMD/ATi out of PhysX... AMD reply by telling world+dog that PhysX is irrelevant.

Personally, having watched the whole thing unfold, I'd say it's a case of handbags at ten paces. AMD dithering twice on entering the GPU physics market and losing out... twice, then coming on strong with the "we didn't want it anyway" gambit; Nvidia playing hardball in response to trash talk. Neither company covers itself in glory, but strangely enough, a whole bunch of people forget that the bitch-slapping involved both camps.

From the timelines, this must have been the case. There is no way, short of an educated guess, that AMD would have expected Nvidia to pare down GK104 with respect to compute, especially given Nvidia's previous GPUs since G80. Having said that, I doubt that AMD wouldn't capitalize on the fact now that they are aware of it. From this PoV, Titan's entry into the market makes a convenient backstop (for Nvidia) should AMD look to code endless compute loops into future titles. Doing so might have crippled the performance of pre-GK110 Nvidia (and VLIW4/5 AMD) cards to the advantage of GCN, but Titan's appearance would probably convince AMD that massive compute additions might not be the PR boon they could have been.
I think you missed the
RejZoRSo they stuck some hair on a baldy who was designed to be bald to begin with. What's next, making Lara hairy where she wasn't intended to be? Pathetic, like they are competing with NVIDIA, who will make more retarded post-processing physics effects and lock them down to one platform.
F**k that.
Posted on Reply
#69
Widjaja
DepthI hope this realistic hair trend doesn't spiral into a hairy Lara Croft

With nude mods portraying her realistic and glistening pubic hair
There will indeed be a demand.
Although I would expect Lara would have to be mighty hairy down there to get any sort of hair movement going.

It may be a good idea for Square Enix to make some DLC hairstyles for Lara, or add them to new DLC to promote TressFX, if they want more money.
Posted on Reply
#70
xvi
WidjajaThere will indeed be a demand.
Although I would expect Lara would have to be mighty hairy down there to get any sort of hair movement going.

It may be a good idea for Square Enix to make some DLC hairstyles for Lara, or add them to new DLC to promote TressFX, if they want more money.
Leisure Suit Laura?
Posted on Reply
#71
XNine
CaseLabs Rep
Ferrum MasterBtw it made me think that old Lara had BIGGER boobs - damn it...

A PATCH, we demand a game PATCH :roll:
But they were triangular, which was pretty disgusting.
Posted on Reply
#72
Xzibit
HumanSmokeCool story.
PhysX uses both GPU and CPU... and pretty much always has done. If it didn't (as an example), why would the 3.0 SDK include multithreaded CPU support?

As for the whole she-done-me-wrong wailing, some of us remember the posturing from both sides:
June 2006: ATI start posturing against Ageia
September 2007: Intel cuts AMD off at the knees by acquiring Havok
November 2007: AMD look into acquiring Ageia
Three months later Nvidia acquires Ageia
June 2008: AMD bigs up Havok FX (number of Havok FX games produced: 0)
Around the same time, Nvidia offered a PhysX licence to AMD. AMD not interested, and immediately begin a public thrashdown of PhysX via Messrs Huddy, Cheng and Robison while promoting OpenCL Havok FX. Nvidia reply by locking AMD/ATi out of PhysX... AMD reply by telling world+dog that PhysX is irrelevant.
You kind of left the most important part out. Ageia developed it for CPUs (remember the short-lived PPU craze), and when Nvidia bought them they pretty much only developed it for their GPUs. It wasn't until after pressure that the "crippled" CPU instruction code was addressed, something Nvidia didn't care about because it wouldn't distinguish them from Havok.
It was more of an "I've got the shiny ball and I'll make it work for my hardware, and I don't care as much if it works with yours, so buy our products" attitude.

Nvidia was its own worst enemy, and it was well documented in articles. It took Nvidia 3 years to recognize their selfish mistake, but by that time it was too late.
Not to mention they are doing it again with Tegra: Tegra-optimized games sold at the Tegra Zone.

If you were to believe Nvidia's statements, it was tossing PhysX at the developers and they were rejecting it. Not to mention that PhysX seemed to be a PS3 priority and PCs were taking a back seat, so they were content to only update PhysX for the GPU and leave the CPU behind.
After all, why update it if developers aren't asking for it and you can use it to your advantage to sell your GPUs as an added feature? Buying Ageia would be pointless if you're not.

That's why there are only how many titles that support PhysX in 5 years? :confused:

The number of games using Havok doesn't look like 0 :ohwell:
Havok FX was cancelled for the very same reason PhysX has been failing: it's hardware dependent and limits developers' potential customer base. Intel probably had the sense to decide that not every PC will have the same GPU, or an Nvidia/AMD GPU at all, but most will have a CPU, and that would be the best way forward.
Posted on Reply
#73
Ferrum Master
XNineBut they were triangular, which was pretty disgusting.
Nope, they weren't; they allowed more precious polygons to be spent on her, for a saintly cause of course :D

Even though it was the PSX, and that did tax a lot... :D
Posted on Reply
#74
Prima.Vera
XzibitYou kind of left the most important part out. Ageia developed it for CPUs (remember the short-lived PPU craze), and when Nvidia bought them they pretty much only developed it for their GPUs.
Ageia had 2 PCI-ex add-on cards with big PPUs dedicated only to physics. Only after Nvidia bought them and developed it for CUDA could PhysX be run in software mode... :eek:
Posted on Reply
#75
Xzibit
Prima.VeraAgeia had 2 PCI-ex add-on cards with big PPUs dedicated only to physics. Only after Nvidia bought them and developed it for CUDA could PhysX be run in software mode... :eek:
Nope. Ageia ran NovodeX (PhysX) in software mode as far back as 2005 on all three consoles, just not accelerated through GPUs.
Posted on Reply