
"Fireburst" New racing game using Unreal Engine 3

why are all the cars on fire lol
 
That's silly, you just need to turn it off and play without PhysX. I can't believe you moved because PhysX crashed your games :laugh:.

Batman and Mirror's Edge worked just fine for me, though I only played Mirror's Edge for 20 mins or so.

8800GT vs 4870 CrossFire. It wasn't PhysX that made me move, it was PhysX that made me choose CrossFire over SLI - I got DX10.1 + HDMI audio instead of a buggy, crashy PhysX.
 
8800GT vs 4870 CrossFire. It wasn't PhysX that made me move, it was PhysX that made me choose CrossFire over SLI - I got DX10.1 + HDMI audio instead of a buggy, crashy PhysX.

and that's why I went CF :toast:
 
The Unreal Engine doesn't do AA in DX9, and many Unreal Engine games don't bother adding DX10.

Mass Effect 2 does with nHancer, and a few others :rockout:
 
Mass Effect 2 does with nHancer, and a few others :rockout:

That's not the engine doing it. That's you forcing a brute-force method via the drivers - at a massive performance hit.
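For what it's worth, some UE3 titles do expose an engine-side MSAA toggle in their *Engine.ini - a minimal sketch below, assuming a game whose DX10 path honours it; section and value names vary per title, so treat it as illustrative only:

    [SystemSettings]
    ; engine-side MSAA - typically only honoured on the DX10+ renderer,
    ; which is why DX9-only UE3 games need the driver-forced route instead
    MaxMultiSamples=4

Forcing it through nHancer / driver profiles still works on the DX9 path, but as said above it's brute force and the performance cost is much higher.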
 
Another game using the UE3 engine, blah =( Most games on the engine are buggy and unoptimized.
 
it looks... generic in a way
 
You never had the game crashes or driver crashes? I sure had them in UE3 games and Mirror's Edge - housemates did too (and that's why I moved to ATI in the first place).

I did in the beginning, but there weren't any games really worth using PhysX in back then. The two levels in UT3 weren't really worth it. I lost interest in Mirror's Edge after about an hour of play...

Though I played all the way through Batman: AA w/ PhysX and didn't have a single game issue or driver crash. And it is really the first game I played where PhysX turned a good game into an awesome game.

8800GT vs 4870 CrossFire. It wasn't PhysX that made me move, it was PhysX that made me choose CrossFire over SLI - I got DX10.1 + HDMI audio instead of a buggy, crashy PhysX.

So...basically it came down to DX10.1 vs. PhysX. On one hand you have PhysX, which you think is buggy and crashy but can just be disabled; on the other, DX10.1, which is pretty much useless since just as few games use it as use PhysX, and there is no real visual difference from DX10. The difference is mostly performance improvements...which nVidia didn't need because their cards still outperformed ATi cards in the games that did support DX10.1...

And HDMI audio works fine on my nVidia cards (except the 8800GTS, which doesn't support it, but all the others do).
 
Looks alright...

I'd rather they bring back Super Off Road done in Unreal Engine!

Super-Off-Road.jpg


I'd buy a wheel and pedal for it :)
 
Another game using the UE3 engine, blah =( Most games on the engine are buggy and unoptimized.

Say what? No way is it unoptimized, and it can be tweaked easily for added performance.
 
Say what? No way is it unoptimized, and it can be tweaked easily for added performance.

Indeed, that is why it is so popular. The only real drawback is not having built-in AA, but the time the developer saves by using a pre-built engine is huge compared to the minor time it takes to add AA support.
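To give an idea of the "easy to tweak" point above, these are the kinds of entries most UE3 games ship in their *Engine.ini - exact names, sections and defaults differ per title, so this is only a rough sketch of the usual suspects:

    [Engine.GameEngine]
    ; built-in frame rate smoothing / cap
    bSmoothFrameRate=TRUE
    MaxSmoothedFrameRate=62

    [TextureStreaming]
    ; texture streaming budget in MB - raising it can reduce texture pop-in
    PoolSize=256

    [SystemSettings]
    ; the usual toggles people turn off for extra FPS
    DynamicShadows=True
    MotionBlur=True
    Bloom=True

Dropping a couple of the [SystemSettings] toggles is usually the quickest win on slower cards.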
 
TBH I think when it's used in the correct way it can still look very pretty. Any news on future updates to the engine?
 
I did in the beginning, but there weren't any games really worth using PhysX in back then. The two levels in UT3 weren't really worth it. I lost interest in Mirror's Edge after about an hour of play...

Though I played all the way through Batman: AA w/ PhysX and didn't have a single game issue or driver crash. And it is really the first game I played where PhysX turned a good game into an awesome game.



So...basically it came down to DX10.1 vs. PhysX. On one hand you have PhysX, which you think is buggy and crashy but can just be disabled; on the other, DX10.1, which is pretty much useless since just as few games use it as use PhysX, and there is no real visual difference from DX10. The difference is mostly performance improvements...which nVidia didn't need because their cards still outperformed ATi cards in the games that did support DX10.1...

And HDMI audio works fine on my nVidia cards (except the 8800GTS, which doesn't support it, but all the others do).

I had an 8800GTX, 8800GTS (320) and 8800GT/9800GT, and none did HDMI audio. Some had SPDIF connectors, but they didn't work - it was a broken feature (like the G80 had with H.264 decoding).


And actually, I've got a few 10.1 games... Assassin's Creed had it (pre-patch), some STALKER games use it, HAWX uses it, Bad Company 2 uses it...
 
I had an 8800GTX, 8800GTS (320) and 8800GT/9800GT, and none did HDMI audio. Some had SPDIF connectors, but they didn't work - it was a broken feature (like the G80 had with H.264 decoding).


And actually, I've got a few 10.1 games... Assassin's Creed had it (pre-patch), some STALKER games use it, HAWX uses it, Bad Company 2 uses it...

How is something that is a straight pass-through, AKA connector A is directly connected to connector B, a broken feature? There is literally no way for anything to go wrong... I've used it with my 9800GTXs, my GTX260 (65nm), my GTX285, my 9600GT, and my GTX260 (55nm), and it has worked flawlessly with every card.

You've pretty much just named all the DX10.1 games, and again, DX10.1 did pretty much nothing in any of them...it is impossible to tell the difference in any of those games visually, but that makes sense considering DX10.1 isn't supposed to look much different, it is a performance thing...of course, as I already said, nVidia cards didn't have a problem in those games, they didn't need the extra performance...
 
How is something that is a straight pass-through, AKA connector A is directly connected to connector B, a broken feature? There is literally no way for anything to go wrong... I've used it with my 9800GTXs, my GTX260 (65nm), my GTX285, my 9600GT, and my GTX260 (55nm), and it has worked flawlessly with every card.

You've pretty much just named all the DX10.1 games, and again, DX10.1 did pretty much nothing in any of them...it is impossible to tell the difference in any of those games visually, but that makes sense considering DX10.1 isn't supposed to look much different, it is a performance thing...of course, as I already said, nVidia cards didn't have a problem in those games, they didn't need the extra performance...

It simply didn't work on the early cards. That's what I mean by broken feature. It was never implemented. Late-model 8800GT cards were the first it actually worked on.
 
It simply didn't work on the early cards. That's what I mean by broken feature. It was never implemented. Late-model 8800GT cards were the first it actually worked on.

It wasn't broken, it was non-existent. If the card has an SPDIF connector, it worked.
 
It wasn't broken, it was non-existent. If the card has an SPDIF connector, it worked.

I'm telling you that's not the case. I tested many; they had the adaptor (2-pin SPDIF prong), and it didn't work.

Seriously here... you're telling me you tested and confirmed it worked on LATER-generation cards. I'm telling you it didn't work on the first-generation G92 cards - can you see how that fits together?

My original point was that I tested them, it didn't work on the ones available at the time, so I chose ATI. This is unrelated to the topic at hand.
 
I want a new Carmageddon too :) How about Left 4 Cars :D
 
I'm telling you that's not the case. I tested many; they had the adaptor (2-pin SPDIF prong), and it didn't work.

Seriously here... you're telling me you tested and confirmed it worked on LATER-generation cards. I'm telling you it didn't work on the first-generation G92 cards - can you see how that fits together?

My original point was that I tested them, it didn't work on the ones available at the time, so I chose ATI. This is unrelated to the topic at hand.

I've got one working now lol :roll:
 
I'm telling you that's not the case. I tested many; they had the adaptor (2-pin SPDIF prong), and it didn't work.

Seriously here... you're telling me you tested and confirmed it worked on LATER-generation cards. I'm telling you it didn't work on the first-generation G92 cards - can you see how that fits together?

My original point was that I tested them, it didn't work on the ones available at the time, so I chose ATI. This is unrelated to the topic at hand.

Ummm...maybe it is just me, but the 9800GTX and 8800GT are the same generation...of course, so were the 8800GS and 8800GTS I used this feature on...those two even used the same PCB as the 8800GT. Of course, I had to take the heatsink cover off the 8800GS to get to the two prongs, since they were sticking straight up instead of out the side like they should have been... So while you are telling me that it didn't work on the first-generation cards, I'm telling you I've tested it and it does. If the prongs were there, it worked. The G80 cards didn't support this feature, but I haven't seen a reference G92 card that didn't, and only a handful of non-reference ones that lacked the prongs.

And it doesn't make any sense that it didn't work on the ones available at the time so you went with ATi; your original post talks about HD4870 vs 8800GT, and the GTX200 series was available when the HD4870 was...
 