Sunday, January 11th 2009

AMD's Response to G200b Slated for March

NVIDIA snatched the performance crown from ATI with the introduction of the GeForce GTX 295 accelerator, and its launch itinerary for CES 2009 includes the GeForce GTX 285, NVIDIA's second-fastest graphics accelerator. NVIDIA's campaign to regain the performance crown was spearheaded by the G200b graphics processor, which, while not offering anything new, helped cut manufacturing costs and reduced the thermal envelope of the GPU, making conditions favourable for a dual-GPU accelerator, the GeForce GTX 295.

AMD, on the other hand, has announced price cuts in response to the GeForce GTX 295, lowering the price of its Radeon HD 4870 X2 accelerator. The G200b is likely to get a competitor from AMD by March, when the company is looking to release the industry's first GPU built on the 40nm manufacturing process, the RV740. There seems to be something larger on the cards, however, according to the various sources Hardware-Infos has been in touch with: AMD is planning the RV790 graphics processor, a current-generation GPU built on the next-generation 40nm manufacturing process. There is a lot of speculation surrounding the RV790's specifications, with some of the more plausible rumours hinting at two additional SIMD clusters (for a total of 960 stream processors) and 48 texture mapping units (TMUs). Both the RV740 and RV790 are slated for March, and there is some indication that AMD will use the occasion of CeBIT for its announcements and product launches.
Source: Hardware-Infos

36 Comments on AMD's Response to G200b Slated for March

#26
Tatty_Two
Gone Fishing
I think one of the major issues we now face is that in many cases it seems to take longer to develop a game than it does to develop some GPUs. As was said earlier, we used to be in a situation where the GPU was always playing catch-up with games and their architecture; it now seems more and more that it's the reverse (possibly with that foolishly coded Crysis as an exception). So we are starting to find ourselves in a position where many games in the charts cannot make full use of the GPUs many people have in their systems. Take COD: World at War, for example: I play at 19xx resolution with max detail and AA/AF, and one of my single HD 4850 1 GB cards (albeit overclocked) can play it smoothly. That of course means that all those 4870s, 4870 X2s, 4850 X2s, GTX 260s, GTX 280s, 285s and 295s are not being used to their full potential. So I think we will see, more and more, ATI and NVIDIA struggling to shift cards to many gamers, as those gamers see no need to upgrade with each new generation (or maybe even alternate generations)... a vicious circle, methinks, for NVIDIA and ATI in the longer term. Of course you will always have "bench junkies" who will invest in every new release, but we are only talking about 3-5% of PC users there.
#27
Rebo&Zooty
Tatty, you hit the nail on the head.

Games today are struggling to keep up, but they're also forced to try to support tech from years ago, because most of the market still has that CRAP (I know people still using 9550 256 MB cards, for god's sake).

Maybe card makers should hold off on putting out new standalone video cards and work on add-in/booster cards for a while. Hell, if I could get a "PhysX"-style card that could run CUDA or Stream (or better, both) OpenCL-based stuff, I WOULD GET ONE, as would a lot of companies. If they could run medical imaging or other GPGPU-based stuff on an add-in card, or add them to systems they already have to boost performance, I can see them doing that.

Servers could make use of them as well; there are a lot of uses for a card that can be readily programmed and used for more than just one primary thing.

Done properly, I could see this boosting the game market as well as encoding, folding, video playback... Honestly, if AMD and NVIDIA could just work together on OpenCL-based apps (effectively combining CUDA and Stream) and on support for said apps, that would go a long way to boosting the computing experience for EVERYBODY! (See the sketch at the end of this post.)

Imagine if you could add a PCI-E x1 card and boost the performance of most of your apps and take the load off the CPU; hell, so many things COULD be run through this kind of card. The more I think about it, the more it reminds me of that Toshiba Cell processor card that was talked about some time back, but this would be FAR more easily supported, IMHO...

The main use I would love it for is encoding, though...
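
Purely as an illustration of the kind of vendor-neutral code being described here, a minimal OpenCL device-enumeration sketch; it assumes the pyopencl Python bindings and is illustrative only, not something from the original post:

# Minimal sketch: enumerate every OpenCL platform and device in the system,
# whatever the vendor, using the same code path.
# Assumes the pyopencl bindings are installed; output is illustrative.
import pyopencl as cl

for platform in cl.get_platforms():            # e.g. an NVIDIA and an AMD platform side by side
    print("Platform:", platform.name)
    for device in platform.get_devices():      # GPUs, CPUs and accelerators alike
        print("  Device:", device.name,
              "| compute units:", device.max_compute_units,
              "| global memory (MB):", device.global_mem_size // (1024 * 1024))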
#28
EastCoasthandle
The major problem we are facing is that the PC gaming market isn't the dominant factor it was in days past. There was a time when game developers could force you to buy video cards as they shifted from one DirectX version to another (for example). Yeah, we moaned, but we did go out and buy video cards that were capable of running those games. Nowadays the console market is much stronger than it used to be, and current video cards are able to play PC games at acceptable frame rates. This, IMO, means that the need for new DirectX versions and their hardware requirements has to decrease.

Another tidbit: according to Valve's survey, only 22.XX% are using DirectX 10-capable PC systems. With DirectX 11 on the horizon (it can be found in 64-bit Win7; not sure about 32-bit Win7 yet), what do you think the numbers will be? How many will find a need to buy DX11 video cards/software?

The revolving door of "upgrading" has, IMO, taken its toll on consumers.

Edit:

Having said that, it is my opinion that in order to draw the attention needed to get people to buy, it has to:
A. Be cheap
B. Be innovative: something completely new to the market
C. Offer something more compelling than FRAPS
D. Bring PC gaming developers back
#29
Rebo&Zooty
I got a feeling 11 won't be anything drastically new, just an evolution of 10, since 10 currently isn't being used much at all anymore.
#30
DarkMatter
Rebo&Zooty said: "I got a feeling 11 won't be anything drastically new, just an evolution of 10, since 10 currently isn't being used much at all anymore."
Hehe, you have a feeling. I think it's already been confirmed. Well, sorta.
#31
Bundy
EastCoasthandle said: [post #28, quoted in full above]
I agree. I'd hope this issue plays on the minds at Redmond. Their software costs us $ in more ways than one.
#32
Darkrealms
GPUs are not only for gaming ;P
I've gotten to 16th in TPU's F@H team by running my GTX 260 almost 24/7. That's only one card in one computer. Granted, I'm 18th now because I've been using it for other things.
That's just one example of apps that crunch numbers/video/audio. I almost bought a second one when the 216 version came out, because mine went down and I'd be able to run my apps that much faster.

Yes, I do game, and yes, I played Crysis Warhead at max settings with no problems on my system. BTW, F@H kills your frame rates, lol.
#33
DarkMatter
Rebo&Zooty said: [post #27, quoted in full above]
And what makes a PCI-E x1 add-in card better than a GPU with CUDA or Brook+? What I mean is that they are already doing what you want with GPUs; there's no need for add-in cards that would be much slower than a GPU. From the little I know, heavy GPGPU applications need much more bandwidth and memory than games do, so PCI-E x1 would cripple performance a lot.
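
For a rough sense of the bandwidth gap being described, here is a back-of-envelope sketch using the commonly published per-lane PCI Express rates; the figures (and the GTX 260 comparison) are approximate reference numbers, not measurements from this thread:

# Back-of-envelope comparison of theoretical PCI Express bandwidth per direction.
# Per-lane figures are the commonly quoted ~250 MB/s (PCIe 1.x) and ~500 MB/s (PCIe 2.0).
per_lane_gb_s = {"PCIe 1.x": 0.25, "PCIe 2.0": 0.5}  # GB/s per lane, per direction

for gen, rate in per_lane_gb_s.items():
    for lanes in (1, 16):
        print(f"{gen} x{lanes:<2}: ~{rate * lanes:5.2f} GB/s per direction")

# For comparison, a GTX 260's local memory bandwidth is on the order of 110 GB/s,
# so even a full x16 slot is a bottleneck for transfer-heavy GPGPU work;
# an x1 link is sixteen times narrower still.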
#34
kysg
Well, hell, what good is tech if it's underutilized as hell? I'm not even going to bring Crysis into the equation, because Crysis has already come and gone. PCI-E x1 cards are only for those of us who can't use NVIDIA cards and don't have Windows 7 yet; me personally, it will be a while before I get 7, and I'd probably end up bootlegging, man, and I don't want to do that. Also, as far as games are concerned, they just aren't as good as we would love to say they are. And yes, GPUs are used for more than gaming, but hell, ask anyone the primary reason they're buying a GPU: the obvious answer, or at least 75% of them, will tell you gaming. And like I said, I hate to sound like a total jerk, but new tech is nothing to write home about if it's underutilized; it's like Tatty said, but I also argue that a second chance is useless when you saturate your market with crap anyway. I swear I hate to say it, but man, nothing makes sense nowadays; devs are just tossing out shitty games like they're candy. I haven't been able to rally behind Square since FF8. The only things Sony has remotely done that made my eye blink were Shadow of the Colossus, Gran Turismo, and God of War. EA just keeps acting like it's the Hollywood of the gaming industry and can't tell what its right hand is doing from its left. I hate to sound like a whiney little bastard, but damn... I wonder if devs even give a damn anymore.
#35
eidairaman1
The Exiled Airman
newtekie1 said: "Yep, exactly, the cycle continues. I just wish it would slow the hell down. I'm not a fanboy of either side, and I'm doing the exact opposite. I want a rest. I want a product that actually lasts at the top for a while. I liked the days where I could spend $300 on a graphics card and not even think about upgrading to a new card for a year."
Ya, no kidding. This six-month release cycle makes it not worth keeping one card anymore; no wonder I've stuck with this card for so long. If they continue to make stuff faster and faster instead of waiting a year, they risk the same deal as the auto industry. Also, the current economy doesn't help them make better money.
#36
Rebo&Zooty
They should put out a new card a month :P