The difference between 30 and 34 fps is not massive. It is there, but it is relatively subtle. Now between 30 and 45 fps, that's another matter. I consider anyone who will pay $500 to go from 30 to 34 fps in most of his games an utter fool, or someone with just lots and lots of money to waste.
Personally I don't spend that much, far from it, but even in my price range I'd pay x% more for x% more performance. IMO your case is a fallacy, like I said, because there's no card that will give you 30-34 fps in all games. In some games a particular card might be enough, in others it won't, and let's not even start talking about settings. So like I said, a card that's faster is faster and always will be. It's not my business or your business to decide if those extra $100 are worth it for the people who are willing to pay $400++ for a card.
When you say "anyone who will pay $500 to go from 30 to 34 fps in most of his games an utter fool" you are calling ALL enthusiasts fools, because that's what you get. A 15% increment, no matter if it's at 30 fps or at 200, is a substantial difference and worth paying for, for some people. At 30 fps it's going to be even more important than at 200, plain and simple.
According to your logic overclocking is useless too, because you'll never get much more than 15% extra actual performance, and the best you'd do is gain those same extra 3-4 fps.
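A quick sanity check on the fps math being argued here (a minimal sketch with illustrative numbers; the frame-time framing is my own addition, not from the posts above):

```python
def percent_gain(old_fps, new_fps):
    """Relative performance gain from old_fps to new_fps, in percent."""
    return (new_fps - old_fps) / old_fps * 100

def frame_time_ms(fps):
    """Time spent on one frame, in milliseconds."""
    return 1000.0 / fps

# 30 -> 34 fps is roughly a 13% gain:
print(round(percent_gain(30, 34), 1))  # -> 13.3

# The same relative gain shaves far more off each frame at low fps
# than at high fps, which is why it's felt more at 30 than at 200:
low = frame_time_ms(30) - frame_time_ms(34)      # ~3.9 ms saved per frame
high = frame_time_ms(200) - frame_time_ms(226)   # ~0.6 ms saved per frame
print(round(low, 1), round(high, 1))  # -> 3.9 0.6
```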
My thoughts exactly. I could have upgraded to like an HD6950 or a 560Ti, but it never seemed worth it coming from an HD5850. I don't really care for upgrades unless they give me upwards of a 20% performance gain, and if I'm gonna spend $400-500, it better be like a 50% gain.