
ATI R680 PCB Design and Cooling Early Pics

As much as I would like to see AMD put up a fight against Intel and Nvidia, they are not doing much of one. AMD messed up when they bought ATi: not only were they reeling at the time, but they are bringing down a good company. I used to be an ATi fanboy, but now that I have an Nvidia product I like it, and it has more upside. AMD is going to go bankrupt, the Phenom is not gonna be as great as they expect, and Intel is destroying them in sales... You could say the Core 2 Duo architecture was the best thing to happen to Intel.

AMD is gonna go down and prices are gonna skyrocket in the processor war... But once Intel brings in their GPUs it will be a good fight between them and Nvidia.

I don't really think that the fall of AMD is in the best interest of Intel. The better scenario for Intel is a weak but still alive AMD, and never a monopoly IMHO. I will explain.
In this kind of business it's very easy to enter a market if there isn't any competition there. Just by entering you could take ~20% of market share, provided the product is good enough (not better than the competition, just good). Samsung has followed this strategy many times, and I don't think I have to say they have succeeded.
On the other hand, trying to enter a market in which there is already competition is very difficult. Can you remember XGI? Their cards were good, more or less on par with Nvidia or ATi (they had some driver issues, but which new card doesn't nowadays?), but they were new to the game. There were already alternatives to the better yet expensive cards in each segment, so they didn't get any market share. You could buy the better Radeons or the worse but cheaper Nvidias; in that game there wasn't a place for XGI.
I know it's not the best example, since the Volari had severe rendering issues in some games, but they could have had some market share for non-gaming PCs, for example.
If AMD goes down, someone will buy it; they are just not going to let it totally disappear. The buyer could be IBM or Samsung, for example. If either of those buys AMD it could mean big trouble for Intel, since what AMD lacks, both of those have in excess: money.
 
They were so prepared that they couldn't even get proper GPU virtualisation to work; they barely got anything working without crying to Microsoft to get DX10 nerfed so they were back in the game. ATI's architecture is waaay ahead of NV's; driver support may be another issue, resulting in a knee-capped card.

NV two steps ahead? I think not...

Ain't that the truth! How soon we forget the flame wars over at Nvidia's forums back in November, December 2006 over proper DX10 drivers :rolleyes:
 
That's exactly what I thought: the problems with the ATi/AMD merger are hurting both companies' products. Maybe when they overcome those problems, possibly 2008-2010, we will see AMD/ATi go back to what they used to be.

I think if AMD can get out a really powerful CPU that clocks like hell (I mean clocks we've never seen, like 6 GHz and up) then it will beat Intel, because Intel just go for crappy tech but clock the hell out of it to compensate. They have quad cores that aren't even proper quad cores, but they clock past 4 GHz and run faster than any of the AMD quad cores, which have proper tech in them but are still slow.

I think that 3dfx had great cards, but what happened at the end of its life with those multi-GPU concept cards and no support? ATi just brought out cards with great drivers, and Nvidia was coming out with better products than 3dfx.

I had a Voodoo 3 card; it was good at the time.
 
Actually, what I personally believe is wrong with the general performance of the R600 (and to a lesser extent, the RV670) is the superscalar architecture. Check out Beyond3D and probably a few other sites that I can't name; Anand probably has it too. The Radeon cards have the potential to use all 320 stream processors if their compiler software and the game software interface perfectly and they all agree and so on. However, at worst, the Radeon cards only use 64 stream processors, or roughly half of what's on an 8800 GTX. So I think most games fall in the range of 128 to 256 stream processors for ATI. However, because of a few other differences in the cards, the Nvidia cards run faster in most games. That's why in a few certain games you see the ATI cards running close to a GTX. I think if ATI managed to get something like TWIMTBP off the ground it would help ATI get better drivers out faster AND get optimizations in more games.
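To put rough numbers on that: here is a minimal sketch, assuming the commonly cited description of R600/RV670 as 64 VLIW units of 5 ALUs each (figures from public specs, not from this thread), of how the effective stream-processor count depends on how many lanes the shader compiler can pack per clock:

```python
# Back-of-the-envelope sketch (assumed figures, not from this thread):
# R600/RV670 expose 320 "stream processors" organised as 64 VLIW units
# of 5 ALUs each.  Effective width depends on how many of the 5 lanes
# the shader compiler manages to fill with independent work per clock.
VLIW_UNITS = 64
LANES_PER_UNIT = 5

def effective_sps(avg_lanes_filled):
    """ALUs doing useful work per clock at a given average packing ratio."""
    return VLIW_UNITS * avg_lanes_filled

for lanes in range(1, LANES_PER_UNIT + 1):
    print(f"{lanes}/5 lanes packed -> {effective_sps(lanes):.0f} effective SPs")
# 1/5 gives 64 (the worst case mentioned above), 5/5 gives 320 (the ideal case);
# 2-4 lanes covers the 128-256 range most games were said to land in.
```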
 
The long of the short is that AMD believes that Nvidia is locking it out of the market with its TWIMTBP programme—something I’m sure Nvidia would disagree with—and that developers working with Nvidia often make it difficult for AMD to get access to code early enough to develop CrossFire drivers in time for a game’s launch. Whatever the case may be (I try not to get involved with the politics of this industry), I’d really like to see more CrossFire support out of the gate in as many of the big titles as possible - but I don’t think it’s just an issue for the developers to tackle. AMD’s driver and developer relations teams need to pull the strings on AMD’s side of the fence too.

One interesting tidbit I did learn was that AMD is looking at ways to make multi-GPU as transparent as possible, because it no longer sees a future in making increasingly large GPUs. I’m speculating here, but I can see AMD using something like a HyperTransport bus to pass data between the two (or more) GPUs and a PCI-Express controller, which may also have the render back-ends incorporated that talk directly to the on-board memory. It sounds crazy I know, but I really believe that if multi-GPU is going to be the future, it needs to be as transparent for the user as humanly possible.
Source

The other thing that still irks me a little is the chip’s architectural efficiency – I can’t help but feel this card should (and would) crucify Nvidia’s GeForce 8800 GT if code was written in such a way to take advantage of the VLIW architecture or if AMD had opted for a more versatile architecture that doesn’t suffer from some of the constraints that we’re used to seeing in GPUs of past years, before the unified shaders came to be.
Source

This really sums it up!
 
I think if ATI managed to get something like TWIMTBP off the ground it would help ATI get better drivers out faster AND get optimizations in more games.

I agree with this - but marketing further into the performance gaming market right now would be very hard for ATI, as nVidia support is massive, and even n00bs are sucked over to their side rather quickly.

Their best bet would be to advertise their HD capabilities as being superior to nVidia's. I had the thought that they should start including a small advertisement or logo with all these CGI movies coming out - most have admitted to using ATI's hardware, and I'm sure they wouldn't mind a 15s ATI logo brandished at the beginning of a movie. People will go see the film, see the logo (and most will recognize it from the hardware industry), and say that if it's good enough to create a movie like that, then it must be superior in IQ. People looking for the best equipment for their HD capabilities at home would also take note . . . and we all know how quickly HD broadcasts and movies are moving in.

ATI needs a campaign like TWIMTBP, but they need to target a completely different market right now.
 
The 7950 GX2 did not kill Nvidia, so why would this card bury AMD/ATI?

To me this seems much more elegant than Nvidia's Frankenstein (7950 GX2), with two cards strapped together as one with some MacGyver tape.

The difference is that the 7950 GX2 wasn't brought out as a last-ditch effort to grab back market share. When the GX2 appeared the G70 series was already a huge success; GX2 was just a marketing/PR stunt. (Granted, also the fastest DX9 video card in the world - and I should know, I have 2 of em :D.)

ATI's architecture is waaay ahead of NV's; driver support may be another issue, resulting in a knee-capped card.

That's not a "maybe", it's a fact. The raw power is there in the silicon, it's a crying shame that the drivers just can't make effective use of it.
 
imperialreign said:
I had the thought that they should start including a small advertisement or logo with all these CGI movies coming out - most have admitted to using ATI's hardware, and I'm sure they wouldn't mind a 15s ATI logo brandished at the beginning of a movie. People will go see the film, see the logo (and most will recognize it from the hardware industry), and say that if it's good enough to create a movie like that, then it must be superior...

If only ay? That's a really good idea, maybe you should pitch it to ATi, linking this thread. ;P
 
Do you know what the sad thing is? If AMD goes down the toilet bowl they will take ATI with them, and then we are fucked.
 
ATI should have purchased AMD, then we would see some goodness ;)
 
If only ay? That's a really good idea, maybe you should pitch it to ATi, linking this thread. ;P

I'd love to, but I don't really think they'd take me seriously - not unless there was a strong showing of support for something like that from the fan forums.
 
Well I'm always up for trying to make a difference!
 

Does that tell me what is at fault, other than the supposed "lack of ROPs" like some people are claiming? It did tell me why the AA settings suck on the ATI cards, and a couple of other things. But the thing is like 30 pages long... and it still doesn't really answer my "question". Also, it said some stuff about virtualization, that the stream processors could be used to help out Windows somehow, I dunno.

http://www.techpowerup.com/reviews/Zotac/GeForce_8800_GTS_512_MB/ Here it says the ROPs are 16x2... so that's 32 ROPs (more than any other card)? The only thing I see as "better" on the Nvidia cards is the shader clock. So is it the shader clock that's giving the Nvidia cards the big advantage? Because the numbers never add up: ATI should be smashing Nvidia just considering numbers. ATI has 320 stream processors and all this other nonsense that should be making it great, it just seems like drivers and software support might be to blame.
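For what it's worth, here is a minimal sketch of that paper comparison, using approximate published clocks and ALU counts (my figures, not from this thread): roughly 320 ALUs at 775 MHz for the HD 3870, and 112 ALUs at a 1500 MHz shader clock for the 8800 GT, counting one MADD as 2 FLOPs per ALU per clock:

```python
# Rough theoretical shader throughput; the clocks and ALU counts below are
# approximate published specs (assumed, not taken from this thread).
def peak_gflops(alus, shader_clock_ghz, flops_per_alu=2):
    """Peak GFLOPS assuming every ALU issues a MADD (2 FLOPs) each clock."""
    return alus * flops_per_alu * shader_clock_ghz

hd3870 = peak_gflops(320, 0.775)  # ~496 GFLOPS on paper
gt8800 = peak_gflops(112, 1.5)    # ~336 GFLOPS (ignoring the extra MUL)

print(f"HD 3870 paper peak : {hd3870:.0f} GFLOPS")
print(f"8800 GT paper peak : {gt8800:.0f} GFLOPS")
# On paper the ATI part wins, which is exactly why the numbers "never add up":
# the real-world gap is about how much of that peak the compiler, drivers and
# game code can actually keep busy, not about the raw totals.
```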
 
it just seems like drivers and software support might be to blame.

Pretty much sums it up: if you've got great hardware but no drivers/software to use it, you're only ever going to be as powerful as the software in use dictates.
 
[Attached images: v56k560.jpg, v5_6000_1.jpg, quantummercury.jpg, voodoo68000.jpg]

A little collection...:D
 
Good ol' 3dfx . . . sadly missed, and still well respected . . .


they were out for blood with those setups :laugh: and would've gotten it too if they could've lasted another year
 
You'll need liquid cooling or a four-slot cooler to keep four RV670s within operating conditions.

Your logic is amazing. 4 chips, 4 slot cooler. :roll:
 
They were so prepared that they couldn't even get proper GPU virtualisation to work; they barely got anything working without crying to Microsoft to get DX10 nerfed so they were back in the game. ATI's architecture is waaay ahead of NV's; driver support may be another issue, resulting in a knee-capped card.

NV two steps ahead? I think not...


I believe I remember that; didn't it also mean that there was NO REASON whatsoever that DX10 could not be on XP?... NV wanted it, and NV got it changed.
 
I believe I remember that; didn't it also mean that there was NO REASON whatsoever that DX10 could not be on XP?... NV wanted it, and NV got it changed.

Microsoft got that changed, not NV. They just wanted to push Vista, and decided DX10 plus the new audio scheme would work better bundled together.
 
Strange how things turn up lately... I look forward to some benchies.
Let's just hope AMD won't sell 'em by the inches ;)
 
Your logic is amazing. 4 chips, 4 slot cooler. :roll:

Sure is. Try placing four RV670s on one PCB: you can't. So it has to be two PCBs with two RV670s each, à la the 7950 GX2. You can't simply have one PCB right atop another, so there has to be a one-slot gap for a leaf-blower. That works out to four slots in all.
 
Sure is. Try placing four RV670s on one PCB: you can't. So it has to be two PCBs with two RV670s each, à la the 7950 GX2. You can't simply have one PCB right atop another, so there has to be a one-slot gap for a leaf-blower. That works out to four slots in all.

He makes a good point here: the GX2 was pretty much two single-slot cards stuck together, sharing a single PCI-E slot.
 
I'd love to, but I don't really think they'd take me seriously - not unless there was a strong showing of support for something like that from the fan forums.

Not possible, because the CG companies that make these movies use ATI FireGL hardware that they would have bought just like any other customer. So a producer wouldn't agree to spending 15 seconds (15 x 24 = 360 frames at film frame rates) showing an ATI logo unless ATI pays for it. On the other hand, several games like Unreal Tournament 2004, 2003, FEAR, etc. show either a 2D image or a 3D animation of the Nvidia "The way it's meant to be played" logo, for the reason that Nvidia gives them their newest cutting-edge hardware, hardware that would not have been released to the market at the point when the game was being made. A game developer wouldn't mind putting up a logo or even making their games work better with Nvidia hardware; Nvidia leases them the hardware for peanuts.
 
If y'all are that desperate for a card to beat the 8800 GTX... sad... especially considering this is a dual-GPU card and the 8800 GTX has been out over a year and has a single GPU...

ATi is screwed...

I don't find anything sad about ATI releasing a single card that'll be the best in the market upon release :wtf: I support the underdog anyway cos monopolies SUCK and we can't all afford an 8800gtx :slap:
 
I don't find anything sad about ATI releasing a single card that'll be the best in the market upon release :wtf: I support the underdog anyway cos monopolies SUCK and we can't all afford an 8800gtx :slap:

But we CAN all afford an 8800 GT 256 :toast:
 