• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

ATI Radeon HD 4870 and Radeon HD 4850 Price Cuts This Week

Looking at the graphs posted by ramsay, you can see that at 1920x1200, the HD 4870 only differs by at most 4 FPS in what I'm assuming are average framerate figures.

Does that justify the price difference between the cards now?
 
Sorry Wolf, have to disagree with you there. Lots of headroom on the RAM? Ah, but if you want to talk about future proofing, why buy an NVIDIA card when it doesn't fully support DX10.1?
See?

There is only so much you can do; eventually, one way or another, this industry will require you to upgrade if you want silky high-resolution graphics. That's how it has always been. They have just replaced the 'Ultra' tag and copied ATI's XTX naming conventions. :p

512MB works fine at 1920x1080 with 4xAA in RA3; same goes for TF2, Burnout and CoD: WaW. It's only Crysis: Warhead that gives me absurdly low framerates (x64 Enthusiast settings with Edge AA and VSync at 60, at 1280x1024, averaging 21~25 FPS in busy outdoor locations), and that isn't a GPU memory issue.

I don't think we will see games that use over 512MB until way into next year (Crysis 2: Massive Memory Leak :p ).
GTA 4 already uses more than 1GB at max settings. And PT Boats, which has been out for quite a long time already. Hell, even COD4 uses more than 512MB at 1920x1200. I logged up to 700MB of video memory usage in Rivatuner on my 8800GT 1GB while playing COD4.

I think the time of 512MB is already dead. Much like the time of 256MB was dead as soon as Quake 4 released.
 
I think you will still keep some value in Oz... from what I understand 1337 hardware is crazy expensive/rare.

That's quite true! The only more successful PC retailers are the e-tailers; there aren't many enthusiast system builders around Sydney, and I can confidently say I have the best computer in my suburb lol.

Not many people would know what a GTX 295 or a 4870 X2 is, let alone a Core i7! Most buy bundled systems from Dell, which I find depressing because they make you pay more for the services than for the rig itself, which is quite a rip-off.
 
Quoting the post above: "Sorry Wolf, have to disagree with you there. [...] why buy a Nvidia card when it doesn't fully support Dx10.1? [...] I don't think we will see games that use over 512mb until way into next year."

Your point makes no sense to me. I'm talking about future proofing; I agree that piss all takes advantage of 1GB at the moment.

Your point about DX10.1 is completely null and void (it has no relation to framebuffer size...); practically nobody works with 10.1, and both companies are moving to DX11.

ATI did DX10.1 in much the same fashion as NVIDIA released PhysX: it's proprietary and only usable on supported GPUs.

I don't see how this has any bearing on the amount of framebuffer, as any company, card, or application can make use of 512MB+.
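For a rough sense of scale (a back-of-envelope sketch, not measured numbers for any card or game): the render targets themselves are a small slice of VRAM even with heavy MSAA; it's textures and geometry that push a game past 512MB. Assuming 32-bit color, 32-bit depth/stencil, a double-buffered resolved color surface, and one stored color+depth sample per MSAA subsample:

```python
# Back-of-envelope estimate of render-target memory at a given
# resolution and MSAA level. Assumes 32-bit (4-byte) color and
# 32-bit depth/stencil, double-buffered resolved color, and one
# stored color + depth sample per MSAA subsample. Real drivers
# and games vary; this is illustrative only.

def render_target_mb(width, height, msaa=1, color_bytes=4, depth_bytes=4):
    pixels = width * height
    # MSAA color + depth surfaces hold one sample per subsample.
    msaa_surface = pixels * msaa * (color_bytes + depth_bytes)
    # Double-buffered resolved color (front + back buffer).
    resolved = 2 * pixels * color_bytes
    return (msaa_surface + resolved) / (1024 * 1024)

for (w, h), aa in [((1280, 1024), 1), ((1920, 1200), 4), ((2560, 1600), 8)]:
    mb = render_target_mb(w, h, msaa=aa)
    print(f"{w}x{h} {aa}xAA: ~{mb:.0f} MB of render targets")
```

Even 8xAA at 2560x1600 keeps the render targets well under 512MB in this sketch, which is why it's texture detail and draw distance (as in GTA4), not resolution alone, that actually blow a 512MB budget.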

Quoting Wile E: "GTA 4 already uses more than 1GB at max settings. [...] I think the time of 512MB is already dead."

I agree wholeheartedly. Not too many things will use the extra memory, but does that mean we shouldn't have it? I'll use the timeless line: some say why, I say why not. Especially if the price premium is negligible, which unfortunately it now isn't for the 4870, sitting at $50+ USD.
 
+1. It's a marketing gimmick, as DX11 will be the next standard. The only thing that isn't gimmicky about PhysX is that it uses a programming standard, CUDA, to do the calculations. Standards are good things.
 
DX10.1 is backwards compatible with DX11
 

That is true, but it only means that current DX10.1 cards will be able to use *some* of the DX11 capabilities; ATI/AMD doesn't expect to release a full DX11 card until probably Q4 2009.
 
Yea, I gotta admit, by the time DX11 is out, the HD 4800s probably won't run DX11 titles all that well.
 

It's by no means a needed upgrade though; even DX10 hasn't completely outpaced DX9 yet, same as Vista vs. XP. I don't think the 48xx series will be officially outdated until late 2010-2012.

DX10.1 is closer to a gimmick, I must agree.
 
This is really going to give ATI more momentum! Now we can all play games on high settings for cheap. I remember in summer when the 9800 GTX was $349 here in Canada.
 
@wolf: "Your point makes no sense to me. I'm talking about future proofing; I agree that piss all takes advantage of 1GB at the moment."

So am I... you can't really future-proof; all you can do is buy the best card for your money. 512MB isn't obsolete yet. If you want to run high resolutions and worry that you'll run out of memory, maybe you should worry about running out of horsepower too and buy an X2-type card?

But the majority of gamers and buyers will probably take the lowest-priced card in the high-end market.
 
Well... I own netcore.24 in Montreal, with 25 gaming stations, all with 4850s (512MB) on 22-inch LCDs at 1680x1050, and I can tell you all games play perfectly on max settings except for Crysis and GTA 4, and even those still play at their native res. 512MB cards are still good for any game unless you have a 30-inch LCD.
 

Want another tech/employee with ATI experience in Montreal? :p
 
My butt's itchin' for some 4870 Crossfire action. When are these prices gonna drop? :toast:
 
@ Wile E
Cod4?
Come again? :P
http://www.techpowerup.com/reviews/Palit/GeForce_GTS_250_2_GB/6.html

hehe
I don't see the performance boost there...

Hehe? I fail to see what you are looking at. There's nothing to compare it to. It's the only 250 in that test. How exactly does that chart show anything at all pertaining to this debate?

And besides, the point was not how much performance increase 1GB saw, the point was that I logged a fair bit over 512MB of video memory usage in COD4. And that's an easy game to run.

Try to put GTA4 on high detail and crank the draw distance up and see how well a 512MB card does. ;) Here's a hint, my 8800GT 1GB outperforms my 4850 512MB in GTA4 at higher settings, and the 8800GT is a slower card. If GTA4 is already that bad, it's only going to get worse from here.

And to take a look back in history, when Quake4 released, we already had the 1900 series card out (or was it 1950?). The X1800XT 512MB outperformed the 19x0XT 256MB at Ultra settings. Sure, it was only one game, but all it took was that single game to push other developers to optimize for more than 256MB.

Crysis and GTA4 are this generation's Quake 4, mark my words.
 
The fact that at 1920x1200 the 4870 pulls ahead of the 9800GTX+/GTS 250. If COD4 used over 512MB, surely the FPS would have dropped to a level where it would be outdone by the 2GB version of the 9800 (yes, that 250 has a G92b core, and it can still keep up in some games: http://www.techpowerup.com/reviews/Palit/GeForce_GTS_250_2_GB/8.html )?
 
Wile E, here's the GTX+ spec: http://www.overclockersclub.com/reviews/xfx_9800gtx_/4.htm
A difference of 5MHz on the core compared to the Palit, unless they changed the shader clock.

"Crysis and GTA4 is this generation's Quake 4, mark my words."
It helps if you don't pick games optimised for NVIDIA... maybe that's why you have an advantage over the ATI card?
 
This also increases the chances that we will see a 9800 GTX+ price drop.
 
Quoting the post above: "Wile E gtx+ spec [...] Helps if you don't pick games optimised for Nvidia... maybe that's why you have an advantage over the ATI card?"
There was no GTX+ in the review you linked. I understand what you're trying to get at, but you picked a poor review to make your point. There is no point of comparison for the 250 in that article.

Are you seriously pulling the "optimized for nVidia" card? lol. Look at other reviews: at the same settings, the 4850 512MB beats the 8800GT 512MB in both Crysis and GTA4. I already mentioned that the 8800GT 1GB only beats the 4850 at higher settings in GTA4. If the settings are kept within the 512MB limits of my 4850, my 4850 is faster. But guess what: when you increase the settings to take up 1GB, the 4850 512MB falls on its face, and the 8800GT 1GB walks away from it. Considering the 4850 wins otherwise, I'd say it has nothing to do with "optimized for nVidia". ;)


Did you resist the transition to 512MB this hard when 256MB was the norm?
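The "falls on its face" behaviour has a simple explanation: once a frame's working set exceeds local VRAM, the overflow has to be streamed over the much slower PCIe bus every frame. A toy model (the bandwidth figures are illustrative round numbers, not measurements of either card) shows how sharp that cliff is:

```python
# Toy model of the performance cliff when a frame's working set
# exceeds local VRAM. Bandwidth figures are illustrative round
# numbers, not measurements of any specific card.

def frame_time_ms(working_set_mb, vram_mb, vram_gbps=60.0, pcie_gbps=4.0):
    resident = min(working_set_mb, vram_mb)       # served from local VRAM
    spilled = max(0.0, working_set_mb - vram_mb)  # streamed over PCIe
    # Time to touch every byte once per frame, converted to milliseconds.
    return (resident / 1024 / vram_gbps + spilled / 1024 / pcie_gbps) * 1000

for ws in (400, 500, 600, 800):
    t = frame_time_ms(ws, vram_mb=512)
    print(f"{ws} MB working set: {t:.1f} ms/frame (~{1000 / t:.0f} FPS)")
```

In this sketch, going even slightly past the 512MB budget more than triples the frame time, which matches the pattern above: the 512MB card wins right up until the settings cross its limit, then collapses.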
 
Quoting the reply above: "So am I... you can't future proof, all you can do is buy the best card for your money. [...]"

You are skewing my words.

Plenty takes advantage of the area between 512MB and 1GB; what I mean is that piss all will use 1GB+ of VRAM.

512 is very last year, and unless you want to cheap out, or are satisfied with lower resolutions and less and less AA over time, sure, buy a 512MB card.

An X2-type card, while having more power, also packs 1GB per GPU. Think about why.

You CAN future-proof, albeit not for very long. People who bought a 1GB 8800GT/9800GT probably won't have to upgrade yet; the power of G92 is still ample for most of today's games. Stuck at 512MB, however, you're already looking at a wall. God forbid someone bought the 256MB model...
 
From the review: "The XFX 9800GTX+ features a die shrink to 55nm and increases in both the Shader and GPU core clock speeds to push the performance to the level of the competition. Not much else has changed from the initial release of the 9800GTX."

Hmm? Not a GTX+, you say? Did you read it, or only look at the link name?

"Did you resist the transition to 512MB this hard when 256MB was the norm?"
Yes, I got by with my laptop's Go 5200M 32MB for a bit :P, a massive step down from my old 6800 GT.
 
Quoting the post above: "hmm? not a gtx+ you say? Did you read it, or only look at the link name?"
Are you looking at the same link that you provided?

You gave me this link: http://www.techpowerup.com/reviews/Palit/GeForce_GTS_250_2_GB/6.html

There is no GTX+ in there. We were talking about the benefits of additional frame buffer, so we have a 2GB 250; what are we supposed to be comparing it to? There's no other 250 in that review, and no GTX+. The only thing that's close is the GTX, but that has different clocks than the GTX+/250.
 