Sunday, August 16th 2009

ATI "Evergreen" Promises You Won't Believe Your Eyes

All bets are off: AMD's DirectX 11-compliant GPUs are on course to complement the commercial launch of Microsoft Windows 7, with enough of a head start to let buyers have DirectX 11 hardware by the time they have the new OS. Codenamed "Evergreen", AMD's new family of graphics processors is slated for September 10, just 25 days from now.

The company also demoed its upcoming hardware in private to sections of the media, at its suite in the hotel hosting QuakeCon 2009. Behind the covered side panel of the Lian Li case is a working sample, which AMD refused to let be photographed. Legit Reviews nevertheless sneaked around the case to take a shot of its panel.

AMD also ran more than six new technology demos, including Parallax Occlusion Mapping, Detailed Tessellation, and High-Definition Ambient Occlusion, all of which showcase key ingredients of DirectX 11, and in all of which AMD's hardware churned out high frame rates at 2560x1600.
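For context on what "DirectX 11 hardware" means to software: Direct3D 11 exposes feature levels, so an application can ask the runtime what class of GPU it is running on. Below is a minimal sketch, assuming a DirectX SDK with the D3D11 headers; the fallback order is illustrative, not anything AMD demonstrated.

// Minimal sketch: ask the runtime for a hardware device at feature
// level 11_0, falling back to the DX10-class levels on older GPUs.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,   // Evergreen-class hardware
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
    };
    ID3D11Device*        device  = NULL;
    ID3D11DeviceContext* context = NULL;
    D3D_FEATURE_LEVEL    got;

    HRESULT hr = D3D11CreateDevice(
        NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
        wanted, 3, D3D11_SDK_VERSION,
        &device, &got, &context);

    if (SUCCEEDED(hr))
    {
        printf("Highest feature level: 0x%x\n", got);
        context->Release();
        device->Release();
    }
    return 0;
}

This down-level design is also why DX11 titles are expected to run on DX10-class cards, minus tessellation and the other new toys.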
Sources: Legit Reviews, Techpulse360

153 Comments on ATI "Evergreen" Promises You Won't Believe Your Eyes

#101
Mussels
Freshwater Moderator
enzoltim: lmao-ing at you. 8800GTX was just 8800GTX. The 8800GT was changed to 9800GT and the 9800GTX/+ was changed to GTS 250. :rolleyes:
^ What he said. The 8800GTX was a G80 chip, and it was never renamed.

The renaming started with the G92-based 8800GT.
#102
to6ko91
enzoltim: lmao-ing at you. 8800GTX was just 8800GTX. The 8800GT was changed to 9800GT and the 9800GTX/+ was changed to GTS 250. :rolleyes:
Oooops :o:p I must say I was quite sleepy at the time.
#103
Steevo
What a thing to brag about. Glad they only renamed a few models, unlike ATI, who just made new models... silly ATI.


Does it taste good?
#104
a_ump
PCpraiser100: Nvidia is lazy, ATI is committing. Simple as that. When the 9800GT came around to replace the 8800GT for DX10.1 compatibility, I couldn't stop laughing.
But the 9800GT isn't DX10.1 compatible... none of Nvidia's desktop GPUs are DX10.1 compatible.

Actually, Newegg lists this 9600GT by Apollo as DX10.1, but I'd bet that is a typo, as no other 9600GT or, shit, as I said, any desktop GPU by Nvidia is DX10.1.
#105
Mussels
Freshwater Moderator
Wow, the egg screwed up there.

Only some newer laptop cards (and really slow ones at that) support DX10.1 in the Nvidia camp.
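If you want to check a card yourself rather than trust the spec sheet, here's a minimal sketch using the D3D10.1 entry point from the DirectX SDK; the all-or-nothing 10.1 request is just one way a game could gate its 10.1 render path.

// Sketch: try to create a hardware device strictly at feature level
// 10.1; failure means a 10.0-only (or older) part.
#include <d3d10_1.h>
#include <cstdio>
#pragma comment(lib, "d3d10_1.lib")

int main()
{
    ID3D10Device1* dev = NULL;
    HRESULT hr = D3D10CreateDevice1(
        NULL,                         // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,
        NULL, 0,
        D3D10_FEATURE_LEVEL_10_1,     // demand 10.1, no fallback
        D3D10_1_SDK_VERSION,
        &dev);

    printf(SUCCEEDED(hr) ? "DX10.1 capable\n" : "DX10.0 only (or older)\n");
    if (dev) dev->Release();
    return 0;
}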
#106
Steevo
Many seem to forget that ATI is holding hands with MS to develop DX; Nvidia just stands by and waits, then releases a behemoth and throws money at it to make it go. How long can they continue to do that in a volatile market? But there are those drinking the green Kool-Aid.
#107
Mussels
Freshwater Moderator
Nvidia has been on the outside ever since MS broke up with them and started dating ATI.


The thing is, ever since the GeForce 3 Nvidia has always lagged behind: shader model 1.3 when ATI had 1.4, shader model 1.4 when ATI had 2.0, shader model 2.0 when ATI had shader model 2.0 a gazillion times faster :P ATI's 24-bit rendering when NV only had 16 and 32 bit (and that didn't go well with the FX series).

SM3.0 was somewhat even, in my memory. I was busy in that era and didn't pay as much attention to high-end hardware.

Skip ahead: NV won out with DX10. They came out first and had better DX10 support for some time.

DX10.1: NV didn't even bother. Boo :(

DX11: ATI seems to be taking the lead again. They want to be out first again, and they want the 'features lead' they've had with DX10.1 to extend to DX11.
#109
Mussels
Freshwater Moderator
ShogoXT: Skipping this generation.
I want to wait for the die-shrink version :D
#110
Steevo
Mussels: Nvidia has been on the outside ever since MS broke up with them and started dating ATI.


The thing is, ever since the GeForce 3 Nvidia has always lagged behind: shader model 1.3 when ATI had 1.4, shader model 1.4 when ATI had 2.0, shader model 2.0 when ATI had shader model 2.0 a gazillion times faster :P ATI's 24-bit rendering when NV only had 16 and 32 bit (and that didn't go well with the FX series).

SM3.0 was somewhat even, in my memory. I was busy in that era and didn't pay as much attention to high-end hardware.

Skip ahead: NV won out with DX10. They came out first and had better DX10 support for some time.

DX10.1: NV didn't even bother. Boo :(

DX11: ATI seems to be taking the lead again. They want to be out first again, and they want the 'features lead' they've had with DX10.1 to extend to DX11.
Yeah, Nvidia didn't make a product that no one really embraced; good business model, while burning money making sure to take the lead in marginal performance. :p :rockout:


ATI had better have some money ready to throw down when the NDA is lifted and these start hitting the street, or else Nvidia is going to spin some BS and put a kink in their launch.


I bet W1zz already has four or more, and if not, they aren't doing it right.
#111
PP Mguire
Right now DX11 doesn't offer any real extra eye candy. What it does offer, though, is a lot of performance increase and new rendering options.

Wolfenstein is the first DX11 title. We got to play it at QuakeCon in their suite.

DX10.1 is in fact real. It uses a new AA technique and a few tiny extra things that Nvidia wanted no part of, because it's extra eye candy that drops performance a lot. HAWX is a perfect example of this.

AMD was being stupid and boisterous in their suite, though, saying they are way ahead of Nvidia in the game, etc. etc. But when I talked to an Nvidia guy myself, he gave me a few pointers, and it looks like GT300 will be a margin ahead of ATI, yet again.
#112
Steevo
PP Mguire: Right now DX11 doesn't offer any real extra eye candy. What it does offer, though, is a lot of performance increase and new rendering options.

Wolfenstein is the first DX11 title. We got to play it at QuakeCon in their suite.

DX10.1 is in fact real. It uses a new AA technique and a few tiny extra things that Nvidia wanted no part of, because it's extra eye candy that drops performance a lot. HAWX is a perfect example of this.

AMD was being stupid and boisterous in their suite, though, saying they are way ahead of Nvidia in the game, etc. etc. But when I talked to an Nvidia guy myself, he gave me a few pointers, and it looks like GT300 will be a margin ahead of ATI, yet again.
Nvidia is going to make another huge die that is hard to cool; it will win on performance and cost a small fortune, but what is going to use that performance besides synthetic benchmarks?


I built with GTA4 in mind, and play it well at 1920x1200 at high settings. When these cards hit, or the 2GB variant of the 4890 comes down in price, I will upgrade, and still not have a game that can really stress my system.
#113
Mussels
Freshwater Moderator
PP Mguire: Right now DX11 doesn't offer any real extra eye candy. What it does offer, though, is a lot of performance increase and new rendering options.

Wolfenstein is the first DX11 title. We got to play it at QuakeCon in their suite.

DX10.1 is in fact real. It uses a new AA technique and a few tiny extra things that Nvidia wanted no part of, because it's extra eye candy that drops performance a lot. HAWX is a perfect example of this.

AMD was being stupid and boisterous in their suite, though, saying they are way ahead of Nvidia in the game, etc. etc. But when I talked to an Nvidia guy myself, he gave me a few pointers, and it looks like GT300 will be a margin ahead of ATI, yet again.
You got 10.1 backwards.

10.1 was the part of 10.0 that Nvidia couldn't do, so it was dropped. 10.1's features are designed to ENHANCE performance. You ever wonder why DX10 has poor AA support, and why in many games AA+HDR just don't work at the same time? Nvidia is your answer: they couldn't do it, so they forced it out of the standards.
#114
PP Mguire
That's not what AMD said. 10.1 drops performance like crazy when AA is enabled.
#115
trt740
Stupid question: is Vista 64 going to be DX11 compatible?
#116
Frick
Fishfaced Nincompoop
trt740: Stupid question: is Vista 64 going to be DX11 compatible?
Everything else would be stupid, imo.
#117
CyberDruid
I love watching this...GFX wars are only fun for the spectators...nothing sucks more than buying the "bleeding edge" only to find out that something ten times better is coming out next quarter :D
#118
erocker
*
trt740: Stupid question: is Vista 64 going to be DX11 compatible?
Yes.
#119
Mussels
Freshwater Moderator
PP Mguire: That's not what AMD said. 10.1 drops performance like crazy when AA is enabled.
You definitely read that wrong.

Go google DX10.1. One of its key features was improved antialiasing speeds over DX10.


techreport.com/discussions.x/14707
We have been following a brewing controversy over the PC version of Assassin's Creed and its support for AMD Radeon graphics cards with DirectX 10.1 for some time now. The folks at Rage3D first broke this story by noting some major performance gains in the game on a Radeon HD 3870 X2 with antialiasing enabled after Vista Service Pack 1 is installed—gains of up to 20%.
TR: What other image quality and/or performance enhancements does the DX10.1 code path in the game offer?

Beauchemin: There is no visual difference for the gamer. Only the performance is affected.
So we have confirmation that the performance gains on Radeons in DirectX 10.1 are indeed legitimate. The removal of the rendering pass is made possible by DX10.1's antialiasing improvements and should not affect image quality.
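To make that concrete: under DX10.0 a multisampled depth buffer can't be bound as a shader resource, so an engine has to render depth in a separate pass; DX10.1 allows it, and a Shader Model 4.1 pixel shader can Load() each depth sample directly. The snippet below is my own sketch of that idea, not the game's actual shader; it just compiles the HLSL to show the ps_4_1 target.

// Sketch: read the individual samples of an MSAA depth buffer, the
// kind of work DX10.1 lets you do without the extra depth pass.
#include <d3dcompiler.h>
#include <cstdio>
#pragma comment(lib, "d3dcompiler.lib")

static const char kPS[] =
    "Texture2DMS<float, 4> DepthMS : register(t0);       \n"
    "float4 main(float4 pos : SV_Position) : SV_Target   \n"
    "{                                                   \n"
    "    float d = 0.0;                                  \n"
    "    [unroll]                                        \n"
    "    for (int i = 0; i < 4; ++i)                     \n"
    "        d += DepthMS.Load(int2(pos.xy), i);         \n"
    "    return (d * 0.25).xxxx;  // averaged depth      \n"
    "}                                                   \n";

int main()
{
    ID3DBlob* code = NULL;
    ID3DBlob* errors = NULL;
    // ps_4_1 is the Shader Model 4.1 (DX10.1-class) profile
    HRESULT hr = D3DCompile(kPS, sizeof(kPS) - 1, NULL, NULL, NULL,
                            "main", "ps_4_1", 0, 0, &code, &errors);
    printf(SUCCEEDED(hr) ? "compiled for ps_4_1\n" : "compile failed\n");
    if (code)   code->Release();
    if (errors) errors->Release();
    return 0;
}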
#120
PP Mguire
Mussels: You definitely read that wrong.

Go google DX10.1. One of its key features was improved antialiasing speeds over DX10.


techreport.com/discussions.x/14707
There was no reading to it. AMD said it right in front of me. Go grab a 10.1 card and play HAWX with AA enabled. = lag
#121
MomentoMoir
Curious to see where this discussion leads, so I'm subscribing.
#122
erocker
*
MomentoMoir: Curious to see where this discussion leads, so I'm subscribing.
You like watching your boyfriend get into arguments about theoretical lag using image quality enhancements? Interesting...

Subscribed! :D
#123
PP Mguire
PP Mguire: There was no reading to it. AMD said it right in front of me. Go grab a 10.1 card and play HAWX with AA enabled. = lag
Oh, and just read the quotes. That's Assassin's Creed, which, if I'm not mistaken, had the "10.1" support taken out almost immediately. Which makes me believe that it was "theoretical" 10.1 support.
#124
entropy13
PP Mguire: There was no reading to it. AMD said it right in front of me. Go grab a 10.1 card and play HAWX with AA enabled. = lag
I must have finished the whole campaign with lots of lag then. :roll:
#125
MomentoMoir
erocker: You like watching your boyfriend get into arguments about theoretical lag using image quality enhancements? Interesting...

Subscribed! :D
Ya, I do. Any arguments I somewhat understand amuse me.