
AMD's Response to G200b Slated for March

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066 MHz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700 MHz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi Platinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
kysg, newtelie and rebo, it's obvious that we disagree, so it's better to agree to disagree, but think about these things:

- We are able to get a better card for the same money now than when only a card or two a year were released, because competition brings prices down. Maybe it's not going to be the best after a month, but it certainly is comparatively better. If only a few cards were released, competition wouldn't exist; think about the G80 days. So what do we want: better cards so we can play better games, or the best card for a long period of time (even if it's not the best that could exist), which serves nothing but being able to say you are above others?

- Forget about one major release per year, a release that makes a card 2x as fast; that's a thing of the past. As complexity has increased, the development cycle has grown longer, just as with CPUs, and is probably now somewhere around 18 months or more. Companies just can't stay the loser for that long; again, remember the G80 days. In that time, new processes can appear, yields can improve and so on, and those things make it possible to release cards that are up to around 50% faster. It'd be stupid not to use those improvements, especially if you are the one behind.

- Related to the above: this industry is a chicken-and-egg thing. Without a card that could run them, games would never improve; developers just can't take the risk. On the other hand, GPU manufacturers can't take the risk of releasing a card that would be overkill either. But in order to improve, someone has to take the risk. Well, Crytek took the risk, and we know how that ended up. Yet they knew better cards were coming out soon; imagine if they had had to wait 18 months. That simply wouldn't be profitable, and all developers would just make their releases coincide with card releases. Even then it wouldn't be profitable: only the best games would sell, and developers never know if their game will be the best one. They can hope, they can put in as much energy as they can, but they never know. And that's unsustainable; no one works for 3 years just to get nothing in return.

That also applies to the technologies behind the GPUs: fab process, RAM, PCBs, everything. If their advancements were not going to be used until 18 months later, those companies wouldn't put much effort into them. Who would want to put money into something so uncertain, on a cycle that only comes around every 2 years, without knowing whether you'd get a second chance to have your tech used in a later product? Because if you develop something every 18-24 months and you happen to lose to another company, or you end up better but late, you'd have to wait another 18 months, and by then your product wouldn't be the best one anyway.

The industry advances so fast because the wheel keeps rolling for every link in the chain, and the ones further up the chain use the best at hand to make the best they can at every moment. Break one link and everything falls apart.

- Sorry for the rant, but there's one more thing to take into account: the market today is not what it was in the past; it's already saturated. In its infancy, every market is easier. When only 10% of the target population has your product or one of your competitors' products, you fight to convince buyers to buy your product. You don't care about competition. But when the market is saturated, you have to convince them to upgrade over what they have, so being the loser, even if only by a bit, is unacceptable. People will upgrade to whatever is better. Price wars don't help there; the competitor with the best product can always fight you on price too: Intel vs. AMD.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,801 (3.87/day)
Location
Worcestershire, UK
Processor Rocket Lake Core i5 11600K @ 5 GHz with PL tweaks
Motherboard MSI MAG Z490 TOMAHAWK
Cooling Thermalright Peerless Assassin 120SE + 4 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel 4133 MHz DDR4 @ 3600 MHz CL14 @ 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Mouse Coolermaster Storm Octane wired
Keyboard Element Gaming Carbon Mk2 Tournament Mech
Software Win 10 Home x64
I think one of the major issues we now face is that in many cases it seems to take longer to develop a game than it does to develop some GPUs. As was said earlier, as a rule we used to be in a situation where the GPU was always trying to play catch-up with games and their architecture; it now seems more and more that it's the reverse (possibly with that foolishly coded Crysis as an exception). So we are starting to find ourselves in a position whereby we have many games in the game charts that cannot make full use of the GPUs many have in their system. For example, take COD World at War: I play at 19xx resolution with max detail and AA/AF, and one of my single HD4850 1GB cards (albeit overclocked) can play it smoothly, which of course means that all those 4870s, 4870X2s, 4850X2s, GTX260s, GTX280s, 285s and 295s are not being used to their full potential. So I think we will perhaps see, more and more, that ATi and NVidia struggle to shift cards to many gamers, as those gamers see no need to upgrade with each new generation (or maybe even alternate generations)... a vicious circle, methinks, for NVidia and ATi in the longer term. Of course you will always have "bench junkies" who will invest in every new release, but we are only talking around 3-5% of PC users there.
 

Rebo&Zooty

New Member
Joined
May 17, 2008
Messages
490 (0.08/day)
Tatty, you've hit the nail on the head.

Games today are struggling to keep up, but they are also forced to try and support tech from years ago, because most of the market still has that crap (I know people still using 9550 256MB cards, for god's sake).

Maybe card makers should hold off on putting out new standalone video cards and work on add-in/booster cards for a while. Hell, if I could get a "PhysX"-style card that could run CUDA or Stream (or better, both) and OpenCL-based stuff, well, I would get one, and so would a lot of companies. If they could run medical imaging or other GPGPU-based stuff on an add-in card, or add such cards to systems they already have to boost performance, I can see them doing that.

Servers could make use of them as well; there are a lot of uses for a card that can be readily programmed and used for more than just one primary thing.

Done properly, I could see this boosting the game market as well as encoding, folding and video playback... Honestly, if AMD and NVIDIA could just work together on OpenCL-based apps (combining CUDA and Stream, in effect) and on support for said apps, that would go a long way toward boosting the computing experience for everybody!
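
To make the cross-vendor idea concrete, here is a minimal sketch of what vendor-neutral OpenCL host code looks like, assuming nothing beyond the standard OpenCL C API: both AMD's and NVIDIA's runtimes register themselves as platforms, so one binary can enumerate GPUs from either vendor.

```cpp
// Sketch: enumerate every installed OpenCL platform (AMD Stream and
// NVIDIA CUDA runtimes each register one) and the GPU devices each
// platform exposes. Link with -lOpenCL.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main() {
    cl_uint num_platforms = 0;
    if (clGetPlatformIDs(0, nullptr, &num_platforms) != CL_SUCCESS || num_platforms == 0) {
        std::printf("No OpenCL platforms found.\n");
        return 1;
    }
    std::vector<cl_platform_id> platforms(num_platforms);
    clGetPlatformIDs(num_platforms, platforms.data(), nullptr);

    for (cl_platform_id p : platforms) {
        char name[256] = {0};
        clGetPlatformInfo(p, CL_PLATFORM_NAME, sizeof(name), name, nullptr);
        std::printf("Platform: %s\n", name);

        cl_uint num_devices = 0;
        if (clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 0, nullptr, &num_devices) != CL_SUCCESS)
            continue;  // this platform exposes no GPU devices
        std::vector<cl_device_id> devices(num_devices);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, num_devices, devices.data(), nullptr);

        for (cl_device_id d : devices) {
            char dev_name[256] = {0};
            clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(dev_name), dev_name, nullptr);
            std::printf("  GPU: %s\n", dev_name);
        }
    }
    return 0;
}
```

The point of the sketch is that the same enumeration works whether the runtime underneath is CUDA-backed or Stream-backed; that shared layer is exactly what the post is asking both vendors to support.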

Imagine if you could add a PCIe x1 card and boost the performance of most of your apps and take the load off the CPU; hell, so many things could be run through this kind of card. The more I think about it, the more it reminds me of that Toshiba Cell processor card that was talked about some time back, but this would be far more easily supported, IMHO...

The main use I would love it for is encoding, though...
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (0.99/day)
System Name MY PC
Processor E8400 @ 3.80 GHz > Q9650 3.60 GHz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-Fi Platinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
The major problem we are facing is that the PC gaming market isn't the dominant factor it was in days past. There was a time when game developers could force you to buy video cards as they shifted from one DirectX version to another (for example). Yeah, we moaned, but we did go out and buy video cards that were capable of running those games. Nowadays the console market is much stronger than in days past, and current video cards are able to play PC games at acceptable frame rates. This, IMO, means that the need for new DirectX versions and their hardware requirements has to decrease.

Another tidbit: according to Valve's survey, only 22.XX% are using DirectX 10-capable PC systems. With DirectX 11 on the horizon (it can be found in 64-bit Win7; not sure about 32-bit Win7 yet), what do you think the numbers will be? How many will find a need to buy DX11 video cards/software?
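
For a concrete sense of what "DX10/DX11 capable" means in code, here is a minimal sketch, assuming the Direct3D 11 API that ships alongside Win7/DX11, of how an app can query the highest feature level the installed GPU supports; this is the standard way to tell DX9-, DX10- and DX11-class hardware apart.

```cpp
// Sketch: ask the D3D11 runtime for the highest feature level the
// installed GPU supports. Link with d3d11.lib on Windows.
#include <d3d11.h>
#include <cstdio>

int main() {
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,  // DX11-class hardware
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,  // DX10-class hardware
        D3D_FEATURE_LEVEL_9_3,   // DX9-class hardware
    };
    D3D_FEATURE_LEVEL got = {};
    // Passing null device/context pointers performs only the capability check.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
        nullptr, &got, nullptr);
    if (FAILED(hr)) {
        std::printf("No D3D11 runtime or suitable GPU found.\n");
        return 1;
    }
    std::printf("Highest supported feature level: 0x%04x\n", (unsigned)got);
    return 0;
}
```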

The revolving door of "upgrading" has, IMO, taken its toll on consumers.

Edit:

Having said that, it is my opinion that in order to draw the attention needed to get people to buy, it has to be:
A. Cheap
B. Innovative: it has to be something completely new to the market
C. Offer something more compelling than FRAPS
D. Backed by the return of PC game developers
 

Rebo&Zooty

New Member
Joined
May 17, 2008
Messages
490 (0.08/day)
I've got a feeling 11 won't be anything drastically new, just an evolution of 10, since 10 currently isn't being used much at all.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066 MHz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700 MHz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi Platinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
I've got a feeling 11 won't be anything drastically new, just an evolution of 10, since 10 currently isn't being used much at all.

Hehe, you have a feeling? I think it's already been confirmed. Well, sorta.
 
Joined
Oct 27, 2007
Messages
1,132 (0.19/day)
System Name Grandpa
Processor i5 4690K
Motherboard Gigabyte Z97X-UD5H-BK
Cooling water
Memory 8GB Corsair Vengeance 2400MHz
Video Card(s) Gigabyte 5850 x2
Storage Samsung SM951
Display(s) Catleap 27"
Case coolermaster stacker
Power Supply corsair AX860i
Mouse logitech g5 original
Keyboard Ducky
Software Windows 8.1
The major problem we are facing is that the PC gaming market isn't the dominant factor it was in days past. There was a time when game developers could force you to buy video cards as they shifted from one DirectX version to another (for example). Yeah, we moaned, but we did go out and buy video cards that were capable of running those games. Nowadays the console market is much stronger than in days past, and current video cards are able to play PC games at acceptable frame rates. This, IMO, means that the need for new DirectX versions and their hardware requirements has to decrease.

Another tidbit: according to Valve's survey, only 22.XX% are using DirectX 10-capable PC systems. With DirectX 11 on the horizon (it can be found in 64-bit Win7; not sure about 32-bit Win7 yet), what do you think the numbers will be? How many will find a need to buy DX11 video cards/software?

The revolving door of "upgrading" has, IMO, taken its toll on consumers.

Edit:

Having said that, it is my opinion that in order to draw the attention needed to get people to buy, it has to be:
A. Cheap
B. Innovative: it has to be something completely new to the market
C. Offer something more compelling than FRAPS
D. Backed by the return of PC game developers

I agree. I'd hope this issue plays on the minds at Redmond. Their software costs us $ in more ways than one.
 
Joined
Feb 26, 2007
Messages
850 (0.14/day)
Location
USA
GPUs are not only for gaming ;P
I've gotten to 16th in TPU's F@H team by running my GTX260 almost 24/7. That's only one card in one computer. Granted, I'm 18th now because I've been using it for other things.
That's just one example of apps that crunch numbers/video/audio. I almost bought a second one when the 216 version came out, because mine went down and I'd have been able to run my apps that much faster.

Yes, I do game, and yes, I played Crysis Warhead at max settings with no problems on my system. BTW, F@H kills your frame rates, lol.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066 MHz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700 MHz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi Platinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Tatty, you've hit the nail on the head.

Games today are struggling to keep up, but they are also forced to try and support tech from years ago, because most of the market still has that crap (I know people still using 9550 256MB cards, for god's sake).

Maybe card makers should hold off on putting out new standalone video cards and work on add-in/booster cards for a while. Hell, if I could get a "PhysX"-style card that could run CUDA or Stream (or better, both) and OpenCL-based stuff, well, I would get one, and so would a lot of companies. If they could run medical imaging or other GPGPU-based stuff on an add-in card, or add such cards to systems they already have to boost performance, I can see them doing that.

Servers could make use of them as well; there are a lot of uses for a card that can be readily programmed and used for more than just one primary thing.

Done properly, I could see this boosting the game market as well as encoding, folding and video playback... Honestly, if AMD and NVIDIA could just work together on OpenCL-based apps (combining CUDA and Stream, in effect) and on support for said apps, that would go a long way toward boosting the computing experience for everybody!

Imagine if you could add a PCIe x1 card and boost the performance of most of your apps and take the load off the CPU; hell, so many things could be run through this kind of card. The more I think about it, the more it reminds me of that Toshiba Cell processor card that was talked about some time back, but this would be far more easily supported, IMHO...

The main use I would love it for is encoding, though...

And what makes a PCIe x1 add-in card better than a GPU with CUDA or Brook+? What I mean is that they are already doing what you want with GPUs; there's no need for add-in cards that would be much slower than a GPU. From the little I know, heavy GPGPU applications need much more bandwidth and memory than games do, so PCIe x1 would cripple performance a lot.
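
Some rough numbers behind that point, as a sketch: PCIe 1.x moves roughly 250 MB/s per lane per direction, so an x1 link offers about 1/16th the bandwidth of the x16 slot a GPU normally sits in. The 512 MB working set below is an assumed example, not a figure from this thread.

```cpp
// Back-of-the-envelope: time to move a GPGPU working set over PCIe 1.x
// at different link widths (~250 MB/s per lane, per direction).
#include <cstdio>
#include <initializer_list>

int main() {
    const double mb_per_s_per_lane = 250.0;  // PCIe 1.x, per direction
    const double working_set_mb    = 512.0;  // assumed example dataset

    for (int lanes : {1, 4, 16}) {
        double bandwidth_mb_s = mb_per_s_per_lane * lanes;
        double seconds        = working_set_mb / bandwidth_mb_s;
        std::printf("x%-2d link: %5.0f MB/s -> %.2f s to transfer %g MB\n",
                    lanes, bandwidth_mb_s, seconds, working_set_mb);
    }
    // x1 takes ~2 s for what x16 moves in ~0.13 s. On a workload that
    // streams data constantly, the narrow link, not the chip, sets the pace.
    return 0;
}
```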
 

kysg

New Member
Joined
Aug 20, 2008
Messages
1,255 (0.22/day)
Location
Pacoima, CA
System Name Workhorse lappy
Processor AMD A6 3420
Memory 8GB DDR3 1066
Video Card(s) ATI radeon 6520G
Storage OCZ Vertex4 128GB SSD SATAIII
Display(s) 15inch LCD
Software Windows 7 64bit
Well, hell, what good is tech if it's underutilized as hell? I'm not even going to bring Crysis into the equation, because Crysis has already come and gone. PCIe x1 cards are only for those of us who can't use NVIDIA cards and don't have Windows 7 yet; me personally, it will be a while before I get 7, and I'd probably end up bootlegging it, which I don't want to do. Also, as far as games are concerned, they just aren't as good as we would love to say they are. And yes, GPUs are used for more than gaming, but hell, ask anyone what the primary reason for buying a GPU is; at least 75% will give you the obvious answer: gaming. Like I said, I hate to sound like a total jerk, but new tech is nothing to write home about if it's underutilized, as Tatty said, and I'd also argue that a second chance is useless when you saturate your market with crap anyway.

I swear, I hate to say it, but man, nothing makes sense nowadays. Devs are just tossing out shitty games like they're candy. I haven't been able to rally behind Square since FF8. The only things Sony has remotely done that made my eye blink were Shadow of the Colossus, Gran Turismo and God of War. EA just keeps acting like it's the Hollywood of the gaming industry and doesn't know what its right hand is doing from its left. I hate to sound like a whiny little bastard, but damn... I wonder if devs even give a damn anymore.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.59/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Yep, exactly, the cycle continues. I just wish it would slow the hell down.



I'm not a fanboy of either side, and I'm doing the exact opposite. I want a rest. I want a product that actually lasts at the top for a while. I liked the days when I could spend $300 on a graphics card and not even think about upgrading to a new card for a year.

Yeah, no kidding. This six-month release cycle makes it not worth keeping one card anymore; no wonder I've stuck with this card for so long. If they continue to make stuff faster and faster instead of waiting a year, they risk the same deal as the auto industry. Also, our current economy doesn't help them make better money.
 