Discussion in 'News' started by btarunr, Nov 2, 2012.
This thread is just....
I love that show.
If the next-gen consoles could run Battlefield 3 Ultra-settings-style games @ 1080p at 60 FPS average and a 30 FPS minimum, then I would be a very happy man. That said, BF3 on Ultra settings at 30-40 FPS is pretty unplayable. About the human eye not noticing past 30 FPS... that's really bull... I can see the ghosting difference between a 2 ms and a 5 ms screen. You don't focus on those things all the time, but when you notice them... they annoy the hell out of you.
- Actually, the worst thing about the PS3 and Xbox 360 right now is that when you run games at 720p/1080p, the more graphically intense games have a pretty noticeable stutter/lag.
The Reality Synthesizer is a modified GeForce 7800. It has a core clock of 550 MHz, 24 pixel-shader pipes each capable of 27 FLOPs/cycle, and 8 vertex pipes capable of 10 FLOPs/cycle, for a total FLOPS performance of
550 MHz × (24×27 + 8×10) = 550 × 728 ≈ 400 GFLOPS.
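That back-of-the-envelope math checks out and is easy to reproduce (the per-pipe FLOP counts are the figures quoted in the post above, not official NVIDIA specs):

```python
# Rough peak-FLOPS estimate for the RSX, using the figures from the post
core_clock_hz = 550e6                     # 550 MHz
pixel_pipes, flops_per_pixel_pipe = 24, 27
vertex_pipes, flops_per_vertex_pipe = 8, 10

# Total FLOPs issued per clock cycle across all pipes
flops_per_cycle = (pixel_pipes * flops_per_pixel_pipe
                   + vertex_pipes * flops_per_vertex_pipe)  # 648 + 80 = 728

peak_gflops = core_clock_hz * flops_per_cycle / 1e9

print(f"{flops_per_cycle} FLOPs/cycle -> {peak_gflops:.1f} GFLOPS")
# 728 FLOPs/cycle -> 400.4 GFLOPS
```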
This is the GPU from a Trinity A10-5800K APU.
Compare the pair. With 7 years of optimization, more efficient game engines, and most likely more video memory, RAM, and processing power overall, full 1080p @ 60 fps seems easily achievable to me.
So a PS4 emulator will be very doable
Emulators are made by indie developers, which is why we can only just now emulate PS2 games at full speed. The amount of coding one person or a few people can do, plus the optimizations needed for a platform completely different from a console, makes it pretty hard. No need to be sarcastic.
No, no it will not. The hardware will be LIKE what we have in PCs, but in reality the specialty chips they will make will behave much differently and will be addressed differently in the system.
Good for AMD. If 2 GPUs are going to be powering the games, it should make SLI/CrossFire that much easier to code for when we get the ports. I for one can't wait for the new consoles, even though I won't be buying one. Going to be another year though... pfft
A few years ago, some people made a bootable "lite" version of XP; it made only mild differences in the performance of the games and applications they tried.
The difference here is the lack of other configurations. Optimize for one machine, with one set of known instructions. The difference is at the software level of the application, much like a patch can improve frame rates, and a GPU driver can increase performance.
I don't know about you people, but when I play games I spend 90% of my time on the story and gameplay and couldn't care less about visual quality. Remember the days of the NES and N64, when it was actually fun to play and you didn't just spend your whole time staring at a screen saying, "OH OH, look, I found a rock that's not properly rendered!! Let me go online and thread-crap the internet over it."
The XB360 GPU is much more capable than a 2600XT. A hell of a lot more capable, and so is RSX. The 2600XT was 1/4 of an HD2900, while Xenos was more like 2/3. Your 3DMark numbers are irrelevant and unrelated. The Trinity GPU is not much more capable than the old HD2900, and by extension the Xenos GPU. By this I mean generationally speaking; yes, it's maybe 4x, maybe 6x faster in some regards, but nearly any dedicated GPU is 10x or 20x faster. APUs are also much more limited by memory bandwidth in their PC form (which makes them much, much slower than the HD6570, the chip that can be said to be 1/4 of an HD6970), but that's something I expect them to solve in the PS4.
All in all, it's incredibly irrelevant whether it's 4x-6x or 8x faster; it won't do new games at a steady 1080p@60fps unless it's at least one order of magnitude faster. The same games as the PS3 (old games)? Yes, of course, but what's the point? If a resolution upgrade is the only thing the new consoles are going to offer, I don't think anyone's going to pay so much for it, unless the consoles sell for <$200. I mean, yeah, people are stupid, but if after 7 years of dismissing it they suddenly awaken to the fact that true 1080p is much better and decide they now need to shell out $400 just for that upgrade... I swear I'll visit them one by one and punch them in the face. No kidding.
I remember those days. They sucked bad. I spent entire days knowing how much better my uncle's PC games were compared to the crap I was forced to play because I was a child.
Better graphics don't make games worse. There are games with bad graphics that suck too, etc. One thing does not exclude the other. Better graphics mean more immersion, and in some games that contributes and makes them 100x better. E.g., Metro 2033 with worse graphics would not have had the same atmosphere and wouldn't have been so immersive.
First of all, I'm not trolling. Second, what I said is correct, and it's not something to take lightly. But of course I made the mistake of applying it to gaming, and movies really are another story. (I am a professional video editor, not a gamer, at all.)
Just to polish my argument, take a Blu-ray movie: if it's NTSC, it runs at 30 fps. You don't see any movie losing frames at that framerate, even a computer-animated movie like Toy Story, unless the disc gets dirty or something and the thing begins stuttering. Of course, Toy Story was smoothly rendered prior to its final release, not rendered in real time. This makes a difference in games; talking with my colleagues about it, it may happen that motion blur and other effects don't look good when rendering at 30 fps and more frames are indeed needed.
Also, in a real-time rendered scene, a colleague of mine demoed to me a 3D scene where some elements were rendered at a different pace. Yes, it may happen, depending on the developer of course, that some elements are rendered at faster framerates than others in the same scene. So a video card that reaches 120 frames per second may be able to overcome this issue very easily.
You do know ghosting and FPS have nothing to do with each other, and just because a monitor is rated 2 ms or 5 ms GTG doesn't mean one 2 ms panel will look the same as another 2 ms GTG panel. Ghosting has to do with the monitor's ability to change from one color to another, or from on to off; when it's slower, ghosting occurs, where you get a sort of bleeding of previous frames. FPS in a game is independent of that.
I'd also point out that in 1 second (1000 ms), 30 fps equates to 33.3 ms per frame, and 60 fps results in 16.6 ms per frame; to get down to your 2 ms, it would take a 500 fps monitor. I'd also point out that if you believe you can actually see the difference between 2 and 5 ms, you're delusional; that means you could see hummingbird wings flap flawlessly. The bleeding effect in ghosting varies by monitor: it can last 100 ms+ in duration, and as long as 300 ms+ in early monitors. 2 ms GTG is not a good measure of how the monitor handles a picture overall; it suggests the panel is faster at turning pixels on and off and changing their color, but it's not 100% flawless.
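The frame-time arithmetic in that post is easy to verify (a minimal sketch; note that fps-derived frame time and GTG pixel response are separate quantities, as the post argues):

```python
# Frame time in milliseconds for a given frame rate
def frame_time_ms(fps):
    return 1000.0 / fps

print(f"{frame_time_ms(30):.1f} ms per frame at 30 fps")   # 33.3 ms
print(f"{frame_time_ms(60):.1f} ms per frame at 60 fps")   # 16.7 ms

# Frame rate needed for a 2 ms frame time: 1000 / 2 = 500 fps
print(f"{1000.0 / 2:.0f} fps needed for a 2 ms frame time")
```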
Am I the only one that read the 3rd paragraph of that linked article
Dev kit will change 2 more times.
So how many pages is the thread for the first dev kit at?
Can't wait for the next 2 dev kit threads.
Oh no... the only thing worse than the "human eye can't see more than [x] fps" troll is the "human eye can't see more than [x] fps" person who seriously believes that argument.
It's bogus. It's been debunked a billion times and has no credibility at all. It's not worth going into. Like I said, if you think the human eye can't see more than 25-30fps, go get a 120 Hz monitor, set its refresh rate to 30 Hz, 60 Hz, and 120 Hz respectively, and drag some windows around the desktop. You tell me if you see a difference.
WOULD YOU ALL JUST SHUT THE FUCK UP! ESPECIALLY YOU BIGMACK70, HOLY SHIT, YOU'RE SUCH A CHILD!! I went to do my work for the day, and after a simple post I come back and you've argued over resolution with some kids OVER 9 GOD DAMN PAGES!! LET IT GO! SHIT!!!
bigmack, . . . . . . . that name . . . . . . . sounds so familiar . . . . . . and you act just like that guy did . . . . . . .. :shadedshu
I think you're taking the internet too seriously
Newegg selling "PS4s" for $438:
IN WIN BP655.200BL Black Steel Mini-ITX Desktop Computer Case 200W Power Supply
Western Digital WD Blue WD5000AAKX 500GB 7200 RPM SATA 6.0Gb/s 3.5" Internal Hard Drive -Bare Drive
G.SKILL Ares Series 8GB (2 x 4GB) 240-Pin DDR3 SDRAM DDR3 2133 (PC3 17000) Desktop Memory Model F3-2133C9D-8GAB
ASUS Black Blu-ray Drive SATA Model BC-12B1ST/BLK/B/AS
AMD A10-5800K Trinity 3.8GHz (4.2GHz Turbo) Socket FM2 100W Quad-Core Desktop APU (CPU + GPU) with DirectX 11 Graphic AMD Radeon HD 7660D AD580KWOHJBOX
ASRock FM2A75M-ITX FM2 AMD A75 (Hudson D3) SATA 6Gb/s USB 3.0 HDMI Mini ITX AMD Motherboard
I'm just happy that the console generation, after so many years, has picked up in graphics capability.
The only issue with these new consoles is most of our 3+ year old PC's are superior to these new consoles which will once again be aged at release.
But of course it's a hell of a lot better than still mucking around with the current ones, which have been dragging down gaming quality IMO.
Yes sir, I hope you're happy now.
I can't see the difference yet; I'm browsing TPU from a 4.3-inch HTC Sensation.
Last word: please enjoy any future games ported from consoles.
A lot of haters. No, seriously.
When TPU staff wrote an article about AMD overclocking, Intel fanboys swarmed in spreading flamebait. Someone posted news about Apple, and all the Apple haters arose. Now the best part: even news about an upcoming console gets raided by self-styled experts claiming to know better about hardware capabilities, even though they've never owned nor played any console.
So if I use Intel, don't use Apple, and have never touched a console, does that make me rich and smart?
Everybody chill out and remember we are all here for the same reason. We all have an interest in all things Computer/Gaming.
Can we just agree that this will make gaming better finally for PC and console gamers?
Hot chicks always help!!
Has Sony given up on any sort of motion control device?