Discussion in 'News' started by btarunr, Jan 21, 2013.
I guess you must never go to theaters and watch regular TV?
Have you ever played these games vsynced to 60Hz or 120Hz? I suspect you haven't. If you haven't, then that 30Hz cap might well seem ok to you.
In addition, LightBoost makes a very big difference by eliminating motion blur. The effect is nothing short of awesome. You basically get to have your cake and eat it.
So I take it that you find Blu-ray video "laggy and jittery" as well, because it's capped at either 24p, 25p, 23.976p, or 29.97i and will never exceed 30 FPS (except 29.97i, and even that might stutter because you have to deinterlace the video).
I think that what people describe as "jittery and laggy" is actually the render time from frame to frame. Just because you're running at 30 FPS doesn't mean that every frame is rendered in the same amount of time. So if you render 30 frames in 1 second but the first 25 are rendered in the first three quarters of a second and the last 5 take the final quarter, you have changes in frame rate that cause the jitteriness you describe, and you have a reduced frame rate for that quarter of a second that introduces lag (and if it recurs often, it introduces more jitter).
So all in all, the frame-rate argument is dumb. The only reason 60 FPS and 120 FPS feel "smoother" is that the difference in render time from frame to frame is that much smaller. So I'm willing to bet that a 30 FPS game with equal render times from frame to frame, versus a 60 FPS game without them, has the potential to feel just as smooth if not smoother, because the instantaneous frame rate would be consistent.
The most important thing to take away from this is:
Average frame rate is not the same thing as instantaneous frame rate and variation between frame render times. Also keep in mind that it will be hard to make it perfect; as scenes change, the render time can change as well.
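The distinction above is easy to demonstrate with a few lines of Python. This is just an illustrative sketch with made-up numbers: both sequences average exactly 30 FPS over one second, but one paces its frames evenly while the other renders 25 frames in the first 0.75 s and the last 5 in the final 0.25 s, as described in the post.

```python
# Hypothetical illustration: two 30-frame sequences, each rendered in 1 second.
# "even" paces every frame identically; "uneven" front-loads 25 frames into
# the first 0.75 s and crams the last 5 into the final 0.25 s.

even_times = [1.0 / 30] * 30                      # 33.3 ms per frame
uneven_times = [0.75 / 25] * 25 + [0.25 / 5] * 5  # 30 ms frames, then 50 ms frames

for name, times in [("even", even_times), ("uneven", uneven_times)]:
    avg_fps = len(times) / sum(times)    # identical for both sequences
    worst_ms = max(times) * 1000         # the spikes you actually feel
    print(f"{name}: avg {avg_fps:.1f} fps, worst frame {worst_ms:.1f} ms")
```

Both report an average of 30 fps, but the worst-case frame time differs (33.3 ms vs 50 ms), which is exactly the variation an FPS counter hides.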
Wrong. Wrong. Wrong. Seriously, people, what's so difficult to understand? I suspect there's a certain amount of denial going on here.
I might just write that article on framerate sooner rather than later.
Then do it, because it seems like most people disagree with you. I'll read it if you write it. I'm not saying that 60 and 120 FPS aren't smoother. I'm just saying that the jitter people notice at 30 FPS is more likely inconsistent render times rather than the rate itself. That has nothing to do with other rates.
I doubt that "most" people disagree. There's just a few vocal ones that insist on trying to negate what I'm saying, lol.
This isn't rocket science. I've seen all this stuff for myself and the effects are all very obvious, so I know I'm right. There's no way someone can "prove" me wrong with a counter argument, as it's inevitably flawed.
Aside from what other people stated about 85% of a GTX 690, I'd also point out that those are pre-Catalyst 12.11b/13.1 WHQL scores, so the difference might be less than 30%... Even so, I take most of these wide-margin calculations with a shovelful of salt.
Again, don't compare a movie with a game; the frame rates are "packed" waaaay differently, Earth-to-Moon different. Besides, I believe we all want that virtual image, that virtual experience, to be more lifelike, no? That also needs a high frame rate, in the same way a movie is more true to life close to 60 than at 24.
Plus, a higher frame rate gives better control over your avatar.
Higher framerates can certainly lower frame latencies. But as the TechReport reviews have shown, frame times can go from 2 ms all the way to 50+ ms between frames while still maintaining a high average frame rate. There's no special "packed" way movies do it; it's just a smooth, consistent frame rate throughout. That is why 24 fps works in theaters and 30/60 fps works for TVs. Nobody ever complained about that! Your brain will adjust to a consistent frame rate (e.g. you can't see fluorescent lights blinking). Inconsistent frame rates need a low maximum latency for you to perceive them as smooth. I think anything below 16.7 ms (60 Hz) is hard to detect. I can tell a slight difference between 60 and 75 Hz, but not too many people can.
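The TechReport-style analysis mentioned above can be sketched in a few lines of Python. The frame times below are made up for illustration; the point is that a sequence with a healthy-looking 60 fps average can still contain individual 50 ms frames, which show up only when you count frames beyond the 16.7 ms (60 Hz) budget.

```python
# Sketch of a frame-latency analysis: don't trust the average, look at the
# slowest frames. Frame times (in ms) are invented for this example.

frame_times_ms = [8, 9, 8, 10, 9, 52, 8, 9, 48, 9, 8, 10, 9, 8, 50, 9]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
slow = [t for t in frame_times_ms if t > 16.7]        # frames missing 60 Hz
time_spent_beyond_16_7 = sum(t - 16.7 for t in slow)  # "time beyond X" metric

print(f"average: {avg_fps:.1f} fps")                  # looks healthy
print(f"frames beyond 16.7 ms: {len(slow)}")          # these are the stutters
print(f"time beyond 16.7 ms: {time_spent_beyond_16_7:.1f} ms")
```

Here the average works out to roughly 60 fps, yet three frames blow well past the 60 Hz budget, which is what you feel as stutter.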
Write it and stop wasting data and bandwidth posting this crap. You're just being rude and annoying.
I swear I'm going to shoot the next person that tries to compare movie framerates to videogame framerates.
Apparently we can try to inform these people until we're blue in the face but they'll keep skipping over what we're saying (and backing up with facts) and going on comparing movies/TV to games...
whoa, raging much? I agree with you but frame rates are not that serious. Let's just solve world hunger or cure cancer
I think it is time to move on folks. Feel free to continue this conversation about frame rates in a new thread or in PM's.
The only people being rude, annoying and thread crapping are those like yourself who take a pop at me for no good reason.
Anyway, it's the wrong thread to talk about framerates here, like HammerON said.
I'm only going to say it once and be done with it: yes, there is. The frames overlap in such a way that fools the eye. I saw that someplace, but I can't find it any more. Believe or don't believe, your choice.
Yeah, I can settle for 30 FPS locked if I have to, but I don't like it, even if it's a perfect 33.3 ms per frame. I want a lifelike experience, faster response from me and prompt response from my character. That is given by a high FPS of at least 60. That floats my boat.
And why on God's green Earth do people think this card would be overkill? My 2500 @ 4.5 GHz and 7950 (1170/1600 MHz, somewhere equal to a 7970 GHz ed./GTX 680) can't give a solid 60 FPS in all games even at 1680x1050 if I choose the highest in-game settings. There is never too much performance, it's just too expensive to get it.
The issue I have with that 85% of a 690 comment is that there's no unit of measurement.
Is it 85% in game performance? Then that pretty much says right there how the GK110 card will perform.
Is it 85% of the render/shader performance? In this case the actual in-game performance might be even better than the 690, because there aren't any SLI scaling issues to deal with.
I'm expecting this to play out much like 560Ti vs 580, which is really where GK104 probably stands against GK110.
This had better have all 15 SMX units enabled. This card should've launched a year ago; I doubt anybody is going to want or accept anything less than a perfect card, especially at the rumoured price of $900... Nvidia have had a year to get their shit together, so this had better be their full 2880-shader GPU.
Yes, film makers use tricks like showing the same frame 4 times. Have a read here and it will help you get a better understanding of both film and games. For the rest, it's sometimes better to agree to disagree: beauty is in the eye of the beholder, and the same can be said for FPS. We are all different, which is why we feel so strongly about our own views; it's just how we each see things.
Last warning. Any more discussion about frame rates/movies will lead to an infraction.
Worth the wait
I'll certainly get one, I deliberately skipped the 600 series as they weren't much of an upgrade compared to my 3 570's, and only a retard would upgrade EVERY year.
Would you pay $900 for it though? I'd have to be very well off to spend that much on a card.
If it's close to a gtx690 performance, some will. At least you'll get a much better experience and there is no need for SLI profiles anymore.
I don't get this.. You're saying with every new generation of cards, the price should increase by almost twofold?
I remember when the 5870 replaced the 4870... The performance was almost double and the price was actually cheaper!
I don't think it makes sense at this price point. If it's coming out to pre-empt AMD's 8 series cards, it should be priced around what the GTX 680 was when it was released.
I suppose this also depends on what this card is branded as.
Only a retard would use such a sweeping generalisation, but seriously, I probably do upgrade once a year. I have a Sapphire Vapor-X 7950 (it overclocks to 1200/1450 on stock volts).
I buy mainly mid-to-high-end cards (last few: GTX 470, GTX 570, HD 7950), so to some that's a marginal upgrade every generation. But the way I look at it: I sold my 470 for 2/3 the cost of the 570, and the 570 was better at stock, able to far exceed the 470's overclock, used less power, ran cooler, etc. The same was true when I sold my 570 and went with my current 7950: I paid a small upgrade fee after selling my old card and benefited from a newer architecture and far greater performance when taking overclocking into consideration. And since it's the latest-gen hardware, it will hold the same kind of resale value as the previous cards when the new generation comes out, letting me upgrade again for a marginal expense while always having current hardware. It's a no-brainer to me.
Nope erocker, the huge rip-off price for the 7xxx series kept me away from it until the magic driver and bundle came true. What I'm saying, guys, is that no matter how high the price is, there are folks that will pay... at least if the performance is right.
At the same time, if this launches at the same price point as the GTX 680, it would mean price drops for every card underneath and the killing of the GTX 680/690. Sure, that should happen if we are talking about a new series, but for now we know too little. If AMD launches at a better price/performance, then it lowers the prices (they have a margin to play with); if not, not. It's a win-win for them.