Monday, March 2nd 2009

ATI Radeon HD 4870 and Radeon HD 4850 Price Cuts This Week

With the release of ATI's next-generation Radeon HD 4890 just a few weeks away, the company is going to cut the prices of its Radeon HD 4870 512 MB GDDR5 and Radeon HD 4850 512 MB video cards and thus rebalance its graphics card line-up. The ATI Radeon HD 4870 512 MB will drop $50, from $199 down to $149, and compete with NVIDIA's rebranded GeForce GTS 250 1 GB. The ATI Radeon HD 4850 512 MB will drop to $129 and become the main competitor to NVIDIA's GTS 250 512 MB version. Resellers and distributors are expected to start selling at the new prices this week. Source: DailyTech

69 Comments on ATI Radeon HD 4870 and Radeon HD 4850 Price Cuts This Week

#26
Wile E
Power User
Gzero, post: 1240174"
Sorry Wolf, have to disagree with you there. Lots of headroom on the RAM? Ah, but if you want to talk about future proofing, why buy an NVIDIA card when it doesn't fully support DX10.1?
See?

There is only so much you can do, but eventually, one way or another, this industry will require you to upgrade if you want silky high-resolution graphics. That's how it has always been; they have just replaced the 'Ultra' tag and copied ATI's XTX naming conventions. :p

512MB works fine at 1920x1080 with 4xAA in RA3, and the same goes for TF2, Burnout and CoD:WaW. It's only Crysis: Warhead that gives absurdly low framerates for me (x64 Enthusiast settings with edge AA and vsync 60 at 1280x1024, averaging 21~25fps in the busy outdoor locations), and it isn't a GPU memory issue.

I don't think we will see games that use over 512MB until well into next year (Crysis 2: Massive Memory Leak :p ).
GTA 4 already uses more than 1GB at max settings. And so does PT Boats, which has been out for quite a long time already. Hell, even COD4 uses more than 512MB at 1920x1200. I logged up to 700MB of video memory usage in RivaTuner on my 8800GT 1GB while playing COD4.

I think the time of 512MB is already dead. Much like the time of 256MB was dead as soon as Quake 4 released.
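The kind of peak-usage logging described here (RivaTuner can write hardware-monitoring readings to a log file over a play session) boils down to scanning the log for the highest reading. Here is a minimal sketch; note the two-column "timestamp,vram_mb" CSV layout is a simplified, hypothetical format for illustration, not RivaTuner's actual log format.

```python
# Find peak VRAM usage from a hardware-monitor log.
# NOTE: the "timestamp,vram_mb" CSV layout is a simplified,
# hypothetical format, not RivaTuner's real log format.

def peak_vram_mb(log_lines):
    """Return the highest VRAM reading (in MB) found in the log."""
    peak = 0
    for line in log_lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comment/header lines
        _timestamp, vram = line.split(",")
        peak = max(peak, int(vram))
    return peak

sample_log = [
    "# time,vram_mb",
    "120.0,480",
    "121.0,700",  # busy outdoor scene
    "122.0,650",
]

peak = peak_vram_mb(sample_log)
print(peak)        # 700
print(peak > 512)  # True: a 512MB card would be spilling over the bus
```

A reading above 512MB on a 1GB card, as in the COD4 example above, is exactly the case where a 512MB card would start paging textures over the bus instead of keeping them in local memory.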
#27
random
TRIPTEX_MTL, post: 1239604"
I think you will still keep some value in Oz... from what I understand 1337 hardware is crazy expensive/rare.
That's quite true! The only more successful PC retailers are the e-tailers; there aren't many enthusiast system builders around Sydney, and I can confidently say I have the best computer in my suburb lol.

Not many people would know what a GTX 295 or a 4870X2 is, let alone a Core i7! Most buy bundled systems from Dell, which I find depressing because they make you pay more for the services than for the rig itself, which is quite a rip-off.
#28
wolf
Performance Enthusiast
Gzero, post: 1240174"
Sorry Wolf, have to disagree with you there. Lots of headroom on the RAM? Ah, but if you want to talk about future proofing, why buy an NVIDIA card when it doesn't fully support DX10.1?
See?

There is only so much you can do, but eventually, one way or another, this industry will require you to upgrade if you want silky high-resolution graphics. That's how it has always been; they have just replaced the 'Ultra' tag and copied ATI's XTX naming conventions. :p

512MB works fine at 1920x1080 with 4xAA in RA3, and the same goes for TF2, Burnout and CoD:WaW. It's only Crysis: Warhead that gives absurdly low framerates for me (x64 Enthusiast settings with edge AA and vsync 60 at 1280x1024, averaging 21~25fps in the busy outdoor locations), and it isn't a GPU memory issue.

I don't think we will see games that use over 512MB until well into next year (Crysis 2: Massive Memory Leak :p ).
Your point makes no sense to me; I'm talking about future proofing. I agree that piss all takes advantage of 1GB at the moment.

Your point about DX10.1 is completely null and void (it has no bearing on framebuffer size...); practically nobody works with 10.1, and both companies are moving to DX11.

ATI did DX10.1 in much the same fashion as NVIDIA released PhysX: proprietary and for use only with supported GPUs.

I don't see how this has any comparison to or bearing on the amount of framebuffer, as ANY company, card, or application can fit or make use of 512MB+.

Wile E, post: 1240206"
GTA 4 already uses more than 1GB at max settings. And so does PT Boats, which has been out for quite a long time already. Hell, even COD4 uses more than 512MB at 1920x1200. I logged up to 700MB of video memory usage in RivaTuner on my 8800GT 1GB while playing COD4.

I think the time of 512MB is already dead. Much like the time of 256MB was dead as soon as Quake 4 released.
I agree wholeheartedly. Not many things will use the extra memory, but does that mean we shouldn't have it? I'll use the timeless line:

some say why, I say why not. Especially if the price is negligible, which it now unfortunately isn't for the 4870, sitting at $50+ USD.
#29
Binge
Overclocking Surrealism
+1. It's a marketing gimmick, as DX11 will be the next standard. The only thing that isn't gimmicky about PhysX is that it uses a programming standard, CUDA, to do the calculations. Standards are good things.
#30
ShadowFold
DX10.1 is backwards compatible with DX11
#31
random
ShadowFold, post: 1240342"
DX10.1 is backwards compatible with DX11
That is true, but it only means that current DX10.1-compatible cards will be able to use *some* of the DX11 capabilities; AMD isn't expected to release a full DX11 card until probably Q4 2009.
#32
ShadowFold
Yea I gotta admit, by the time DX11 is out the HD 4800s probably won't run DX11 games all that well.
#33
random
ShadowFold, post: 1240347"
Yea I gotta admit, by the time DX11 is out the HD 4800s probably won't run DX11 games all that well.
It's by no means a needed upgrade though; even DX10 hasn't completely outpaced DX9 yet, same as Vista > XP. I don't think the 48xx series will officially be outdated until late 2010-2012.

DX10.1 is closer to a gimmick, I must agree.
#34
TooFast
This is really going to give ATI more power! Now we can all play games on high settings for cheap. I remember in summer when the 9800 GTX was $349 here in Canada.
#35
Gzero
@wolf: "Your point makes no sense to me; I'm talking about future proofing. I agree that piss all takes advantage of 1GB at the moment."

So am I... you can't future proof; all you can do is buy the best card for your money. 512MB isn't obsolete yet. If you want to run high resolutions and worry that you'll run out of memory, maybe you should worry about running out of horsepower too and buy an X2-type card?

But the majority of gamers and buyers will probably take the lowest-priced of the high-end market.
#37
TooFast
Well... I own Netcore.24 in Montreal, with 25 gaming stations, all with 4850s (512MB) on 22-inch LCDs at 1680x1050, and I can tell you all games play perfectly on max settings except for Crysis and GTA 4, but even those still play at native res. 512MB cards are still good for any game unless you have a 30-inch LCD.
#38
TRIPTEX_CAN
TooFast, post: 1241386"
Well... I own Netcore.24 in Montreal, with 25 gaming stations, all with 4850s (512MB) on 22-inch LCDs at 1680x1050, and I can tell you all games play perfectly on max settings except for Crysis and GTA 4, but even those still play at native res. 512MB cards are still good for any game unless you have a 30-inch LCD.
Want another tech/employee with ATI experience in Montreal? :p
#39
ShadowFold
My butt's itchin for some 4870 Crossfire action, when's these prices gonna drop :toast:
#40
Wile E
Power User
Gzero, post: 1241353"
@ Wile E
Cod4?
Come again? :P
http://www.techpowerup.com/reviews/Palit/GeForce_GTS_250_2_GB/6.html

hehe
I don't see the performance boost there...
Hehe? I fail to see what you are looking at. There's nothing to compare it to. It's the only 250 in that test. How exactly does that chart show anything at all pertaining to this debate?

And besides, the point was not how much performance increase 1GB saw, the point was that I logged a fair bit over 512MB of video memory usage in COD4. And that's an easy game to run.

Try to put GTA4 on high detail and crank the draw distance up and see how well a 512MB card does. ;) Here's a hint, my 8800GT 1GB outperforms my 4850 512MB in GTA4 at higher settings, and the 8800GT is a slower card. If GTA4 is already that bad, it's only going to get worse from here.

And to take a look back in history, when Quake4 released, we already had the 1900 series card out (or was it 1950?). The X1800XT 512MB outperformed the 19x0XT 256MB at Ultra settings. Sure, it was only one game, but all it took was that single game to push other developers to optimize for more than 256MB.

Crysis and GTA4 are this generation's Quake 4, mark my words.
#42
Wile E
Power User
Gzero, post: 1241811"
The fact that at 1920x1200 the 4870 pulls ahead of the 9800GTX+/GTS 250. If COD4 used over 512MB, surely the fps would have dropped to a level where it would be outdone by the 2GB version of the 9800 (yes, that 250 has a G92b core, and it can still keep up in some games http://www.techpowerup.com/reviews/Palit/GeForce_GTS_250_2_GB/8.html )?
It's not a direct comparison. The 250 is clocked differently. And read my edit.
#44
Gzero
Wile E, GTX+ spec: http://www.overclockersclub.com/reviews/xfx_9800gtx_/4.htm
A difference of 5MHz on the core compared to the Palit, unless they changed the shader clock.

"Crysis and GTA4 is this generation's Quake 4, mark my words."
Helps if you don't pick games optimised for Nvidia... maybe that's why you have an advantage over the ATI card?
#45
Ravenas
This also increases the chances that we will see a 9800 GTX+ price drop.
#46
Wile E
Power User
Gzero, post: 1242463"
Wile E, GTX+ spec: http://www.overclockersclub.com/reviews/xfx_9800gtx_/4.htm
A difference of 5MHz on the core compared to the Palit, unless they changed the shader clock.

"Crysis and GTA4 is this generation's Quake 4, mark my words."
Helps if you don't pick games optimised for Nvidia... maybe that's why you have an advantage over the ATI card?
There was no GTX+ in the review you linked. I understand what you're trying to get at, but you picked a poor review to make your point. There is no point of comparison for the 250 in that article.

Are you seriously pulling the "optimized for nVidia" card? lol. Look at other reviews: at the same settings, the 4850 512MB beats the 8800GT 512MB in both Crysis and GTA4. I already mentioned that the 8800GT 1GB only beats the 4850 at higher settings in GTA4. If the settings are kept within the 512MB limits of my 4850, my 4850 is faster. But guess what, when you increase the settings to take up 1GB, the 4850 512MB falls on its face, and the 8800GT 1GB walks away from it. Considering the 4850 wins otherwise, I'd say it has nothing to do with "optimized for nVidia". ;)


Did you resist the transition to 512MB this hard when 256MB was the norm?
#47
wolf
Performance Enthusiast
Gzero, post: 1240809"
@wolf: "Your point makes no sense to me; I'm talking about future proofing. I agree that piss all takes advantage of 1GB at the moment."

So am I... you can't future proof; all you can do is buy the best card for your money. 512MB isn't obsolete yet. If you want to run high resolutions and worry that you'll run out of memory, maybe you should worry about running out of horsepower too and buy an X2-type card?

But the majority of gamers and buyers will probably take the lowest-priced of the high-end market.
You are skewing my words.

Plenty takes advantage of the area between 512MB and 1GB; what I mean is piss all will use 1+ GB of VRAM.

512MB is very last year, and unless you want to cheap out, or are satisfied with lower resolutions and less and less AA over time, sure, buy a 512MB card.

An X2-type card, while having more power, also packs 1GB per GPU; think about why.

You CAN future proof, albeit not for very long. People that bought a 1GB 8800GT/9800GT probably won't have to upgrade yet; the power of G92 is still ample for most of today's games. However, if stuck at 512MB, you're already looking at a wall there. God forbid someone bought the 256MB model...
#48
Gzero
From the review: "The XFX 9800GTX+ features a die shrink to 55nm and increases in both the Shader and GPU core clock speeds to push the performance to the level of the competition. Not much else has changed from the initial release of the 9800GTX."

Hmm? Not a GTX+, you say? Did you read it, or only look at the link name?

"Did you resist the transition to 512MB this hard when 256MB was the norm?"
Yes, I got by with my laptop's Go 5200 32MB for a bit :P, a massive step down from my old 6800 GT.
#49
Wile E
Power User
Gzero, post: 1244233"
From the review: "The XFX 9800GTX+ features a die shrink to 55nm and increases in both the Shader and GPU core clock speeds to push the performance to the level of the competition. Not much else has changed from the initial release of the 9800GTX."

Hmm? Not a GTX+, you say? Did you read it, or only look at the link name?

"Did you resist the transition to 512MB this hard when 256MB was the norm?"
Yes, I got by with my laptop's Go 5200 32MB for a bit :P, a massive step down from my old 6800 GT.
Are you looking at the same link that you provided?

You gave me this link: http://www.techpowerup.com/reviews/Palit/GeForce_GTS_250_2_GB/6.html

There is no GTX+ in there. We were talking about the benefits of additional frame buffer, so we have a 2GB 250, what are we supposed to be comparing it to? There's no other 250 in that review, and no GTX+. The only thing that's close is the GTX, but that has different clocks than the GTX+/250.
#50
Gzero
But I posted another review in the post you quoted lol. It was good for showing the performance difference between the four cards (4870/4850 and GTX/GTX+).

Anyway, to clarify my final point: when you spend £140 / $150, you are not paying enough to be future proofing for the highest resolutions. You guys already know this; look how much both of you spent on your own cards. You can't argue to those who spend less that they shouldn't buy a 512MB card, when just over a year and a half ago (let's assume that's when they made their bargain purchase) they probably got a cut-down 256MB card, an X1900XT 256MB for instance.

I think a 4870 512MB at that price is still a good card to consider when upgrading or buying in that budget for a new PC today.