Thursday, June 4th 2009

AMD Radeon RV840 Graphics Card Caught on Camera

Yesterday, AMD unveiled its surprise for this year's Computex with a demonstration of a DirectX 11 3D scene. Driving the demo behind the scenes was what AMD claimed to be the "world’s first true DirectX 11 graphics processor". The hardware itself wasn't publicly displayed, although a memento of AMD's partnership with TSMC, a wafer of 40 nm DirectX 11 GPUs, was shown to the public. VR-Zone, however, sneaked backstage and photographed the machine that ran the demo (which, ironically, was built into a case with a side window).

The graphics card, a portion of which is hidden behind the "wing" of the AMD Dragon logo graphic, appears to be about 8.5 inches long, spans two slots, and sports a cooler with the signature ATI-red shroud. It draws power from a single 6-pin PCI-Express connector. The photographers note that this could be the RV840-based desktop accelerator, which forms the performance-mainstream product in the company's upcoming DirectX 11-compliant GPU lineup, codenamed "Evergreen". The first product from this series is expected to be released in September, weeks ahead of the launch of Microsoft Windows 7.

Source: VR-Zone

46 Comments on AMD Radeon RV840 Graphics Card Caught on Camera

#1
my_name_is_earl
The cooler looks quite different. It looks like a third-party design.

Techpowerup, what's up with the advertising floating across the article? It can't be closed. It's getting annoying and kept me from reading the article. Maybe if your site were dynamic this wouldn't happen. My screen is 1920x1200, and when I maximize the website, the little square ad from the right floats in the middle. Hope you guys recheck that.
#2
W1zzard
my_name_is_earl said:
Techpowerup, what's up with the advertising floating across the article? It can't be closed. It's getting annoying and kept me from reading the article. Maybe if your site were dynamic this wouldn't happen. My screen is 1920x1200, and when I maximize the website, the little square ad from the right floats in the middle. Hope you guys recheck that.
wtf are you talking about? screenshot?
#3
scope54
alentor said:
oh really? i thought physics was going to be unique and supported only by nvidia boards... guess i was wrong. but is physics useful? i think the CPU is powerful enough to handle the physics by itself; the only engines that use that much CPU power are Crysis and maybe the Unreal Engine/F.E.A.R. engine... and what's CUDA xD and will ATI cards support CUDA?
CUDA is Nvidia's API for interfacing with the GPU and using it to accelerate whatever you want... PhysX calculates SO much more than what is usually done in games that don't have it, like Crysis and F.E.A.R. (Unreal Engine 3 uses PhysX). ATI will never make/support a PhysX driver because they would have to pay Nvidia to use it, and they want to make GPU physics an open 'thing' (for lack of a better word) with OpenCL. With OpenCL, ANY developer can code physics to be calculated on the GPU and CPU without any licensing fees.
ATI also uses CAL/Brook+ as its API for interfacing with the GPU.

(if I got anything wrong here, please correct me... lol)
#4
alentor
scope54 said:
CUDA is Nvidia's API for interfacing with the GPU and using it to accelerate whatever you want... PhysX calculates SO much more than what is usually done in games that don't have it, like Crysis and F.E.A.R. (Unreal Engine 3 uses PhysX). ATI will never make/support a CUDA driver because they would have to pay Nvidia to use it, and they want to make GPU physics an open 'thing' (for lack of a better word) with OpenCL. With OpenCL, ANY developer can code physics to be calculated on the GPU and CPU without any licensing fees.
ATI also uses CAL/Brook+ as its API for interfacing with the GPU.

(if I got anything wrong here, please correct me... lol)
ohh, i get it now, heh, thank you ;o
#5
Disruptor4
scope54 said:
CUDA is Nvidia's API for interfacing with the GPU and using it to accelerate whatever you want... PhysX calculates SO much more than what is usually done in games that don't have it, like Crysis and F.E.A.R. (Unreal Engine 3 uses PhysX). ATI will never make/support a CUDA driver because they would have to pay Nvidia to use it, and they want to make GPU physics an open 'thing' (for lack of a better word) with OpenCL. With OpenCL, ANY developer can code physics to be calculated on the GPU and CPU without any licensing fees.
ATI also uses CAL/Brook+ as its API for interfacing with the GPU.

(if I got anything wrong here, please correct me... lol)
I was under the impression that CUDA was actually free and that ATi was offered the chance to support it. I didn't hear any details about pricing, though; I only knew that ATi declined the offer.
#6
btarunr
Editor & Senior Moderator
Disruptor4 said:
I was under the impression that CUDA was actually free and that ATi was offered the chance to support it.
Your impression is correct.
#7
Valdez
alentor said:
oh really? i thought physics was going to be unique and supported only by nvidia boards... guess i was wrong. but is physics useful? i think the CPU is powerful enough to handle the physics by itself; the only engines that use that much CPU power are Crysis and maybe the Unreal Engine/F.E.A.R. engine... and what's CUDA xD and will ATI cards support CUDA?
physx != physics
#8
alentor
Valdez said:
physx != physics
lol yeah... that's what i meant ;p
#9
my_name_is_earl
W1zzard said:
wtf are you talking about? screenshot?
Typical Firefox f*-up. Maybe Firefox doesn't like you guys. IE scales without a problem.
#10
Valdez
Disruptor4 said:
I was under the impression that CUDA was actually free and that ATi was offered the chance to support it. I didn't hear any details about pricing, though; I only knew that ATi declined the offer.
PhysX was offered to AMD, not CUDA.
AMD rejected it because if AMD had accepted, PhysX would have become a "standard", but it's Nvidia's technology and it isn't open. The source code is in Nvidia's hands, and it is optimized for Nvidia GPUs.
I'm sure ATI GPUs would be slower than Nvidia cards in every PhysX title. AMD doesn't want that.
#14
shiny_red_cobra
Everything is aligned fine for me...using Firefox 3 @ 1920x1200 resolution.
#15
Geofrancis
i heard there is some kind of physics built into DirectX 11; not the PhysX that Nvidia owns, but some kind of Microsoft version, built into DirectX 11, that will run on both vendors' cards.
#16
jamesrt2004
MilkyWay said:
is that a 3-slot cooler? i can't see shit. even if we could, it would just look like every other ATI card: red, with the crap stock cooler
It's dual-slot... easy to tell; it's just the angle it's taken at that's silly.


ShadowFold said:
Keep on pushing, AMD :rockout: Looks like DX10 cards will be able to run DX11 games, just in a legacy-compatible mode or something..
every card can support DX11... it will just have the DX11 features disabled, but it will still RUN on DX11, if that makes sense? :)

Geofrancis said:
i heard there is some kind of physics built into DirectX 11; not the PhysX that Nvidia owns, but some kind of Microsoft version, built into DirectX 11, that will run on both vendors' cards.
yeah, you're right, it's the OPEN standard, OpenCL (Nvidia will charge for CUDA), which is why Nvidia is pissed about it, as OpenCL is what game techs will code in instead of CUDA, so they wasted their money!!
#17
sapetto
Huh!? Nice name for a VGA - Evergreen...

#18
ov2rey
W1zzard said:
does anyone else see this? exactly which browser version is that?
using Opera 10 beta, no problem
#19
alentor
W1zzard said:
does anyone else see this? exactly which browser version is that?
using Firefox 3.5b4, no problem..
#21
Steevo
I once named my 52X drive "mf" after its twin blew up a disc in the drive.