Discussion in 'techPowerUp! Club Forum' started by TheMailMan78, Feb 2, 2011.
Well...I guess I'm out then...
lol same here, it's just marketing garbage though.
Still, it makes me wonder if two GTX 580s will be needed for playable framerates at 1920x1080 and above.
I don't know what performance requirements were like for the alpha compared to the retail release, but if they're at all similar, you definitely won't need two 580s to max it out.
That would be insane; I would demand an immediate patch to optimize the engine.
Good to hear.
I know, me too. But look at it this way: it would drive PC gaming forward in terms of graphics quality and hardware requirements.
That's a good point. On one hand I want to be able to run the game without a hitch, but on the other hand I want it to really utilize everything my system has to offer.
It's tough to have games swing both ways, I guess, but I hope they do.
You do realize that was a tiny map with only 32 players, right? Granted, it was an alpha, but I'm willing to bet that once you see 64 players on maps BIGGER than BF2's, a 580 will be needed.
If a 580 is needed they just alienated a ton of gamers.
A 580 may be needed for a 64-player map at 2560x1600 w/ all details and aftereffects cranked, but I don't think it's going to be needed to play at 1080p with medium to high details in any aspect of the game. Besides, the Frostbite engine has always been and still is CPU-intensive. I think the average gamer is going to max out their CPU before their GPU.
With my GTX 570 and i5 2500K both at stock, I experienced no hiccups in the alpha. Battlefield has always been nicely optimized too.
Don't read into the event, guys. Did you not notice "the way it's meant to be played," one of nVidia's slogans? It's obviously being funded in part by them to not only showcase the game, but do some marketing for nVidia.
Yeah, needing a 580 to MAX out a game should be a good thing. It's not a minimum, after all.
You people bitch if it's a port and cry if you need top-tier hardware. Stop being coochies.
Embrace the high end.....EMBRACE IT!
It was also running low-res textures.
If everybody had money to spend on high-end hardware, I'd say the same, but since a large chunk of us don't, we want a game that's optimized and can run decently maxed out at 1080p without having to drop $500 on a GPU.
Most of us are "budget enthusiasts," not "I can afford three GTX 580s" enthusiasts.
I think a 580 to max out 1080p and an SLI/CrossFire setup for higher resolutions is where they should aim. (And I would love to see "quad-core processor at 4.0 GHz or higher" somewhere on the spec sheet.)
Then you should buy a console.
I think a 580 in SLI at normal resolution is where they should aim. I want graphics that make my eyes bleed!
I disagree; I think only 2560x1600 and above should require a GTX 580 or better. 1080p is a very standard resolution for gamers, so such a requirement there wouldn't be reasonable.
At 1080p we should be able to max it out with at least a 6950 for it to be reasonable.
But that's going by my own scale of what's reasonable.
I think that's pushing it a bit far, mostly because I'm afraid of what the CPU-heavy Frostbite engine would require to keep it from bottlenecking.
The market needs something to push it. I want Frostbite to drive people (and manufacturers) to want/need new hardware.
Bingo. The linked article is on NVidia.com. Also, the GDC Fault Line demos ran on a single 580, and looked much better than the alpha.
That's not pushing the envelope far enough. That's hardly above a crappy port. I want higher than Crysis.
Well, I do own a console, but that's irrelevant. I don't understand why you're so adamant about making people drop $500 and above on a GPU to max games out. Yes, we want PC games to stress our hardware, but within reason. You're overgeneralizing computer enthusiasts by assuming everybody is on the same level hardware-wise, so developers don't have to take into account those who have decent machines but not the best.
Didn't say it couldn't scale like Crysis. But I don't want it to be maxable on an abacus either. You wanna max it out? I wanna have to drop 500 bucks.......and have it be worth it.
Exactly. If software manufacturers can't be bothered to make $500 hardware worth it, then hardware manufacturers will stop making it; then they get to charge $500 for lesser hardware in order to increase profit margins.
We're not saying lesser rigs won't be able to play it at medium to high settings; just not MAX settings.
If the engine is capable of scaling across a variety of systems with decent settings, then that's more than reasonable. I agree that if BF3 can be maxed out on something like an 8800GT, that's an issue. But we don't want another Crysis. Crysis was so heavy that I remember people with 8800 Ultras complaining it didn't run well; it scaled so badly that it took a few years for hardware to catch up to it instead of the other way around. That's what I'm mainly scared of: a game that either needs high-end hardware to look pretty, or you might as well just buy the console version.
That's reasonable. I was actually conflating the two terms "Max" and "High" settings, and thinking you wanted those to be inaccessible to anyone without a $500 GPU.
I'll go with that too, as that is what card I'll probably get next unless they drop the 6970 to the $300 range by then. I'm probably gonna see how it does at 1080p with the 6870 first and adjust from there.