Friday, August 30th 2013

NVIDIA Teams Up With Warner Bros. on Batman: Arkham Origins

NVIDIA today announced it is working with Warner Bros. Interactive Entertainment and WB Games Montréal to make Batman: Arkham Origins, the next installment in the blockbuster Batman: Arkham videogame franchise, a technically advanced and intensely realistic chapter in the award-winning saga for PC players.

Gamers who purchase a qualifying GPU from a participating partner will receive a free PC edition of Batman: Arkham Origins, which will be released worldwide on Oct. 25, 2013.

Developed by WB Games Montréal, Batman: Arkham Origins features an expanded Gotham City and introduces an original prequel storyline set several years before the events of Batman: Arkham Asylum and Batman: Arkham City. Taking place before the rise of Gotham City's most dangerous criminals, the game showcases a young Batman as he faces a defining moment of his early career and sets his path to becoming the Dark Knight.

Batman has immense power, strength and speed - the same attributes that make a GeForce GTX GPU the ultimate weapon to take on Gotham's dark underworld. The NVIDIA Developer Technology Team has been working closely with WB Games Montréal to incorporate an array of cutting-edge NVIDIA gaming technologies including DirectX tessellation, NVIDIA TXAA antialiasing, soft shadows and various NVIDIA PhysX engine environmental effects, such as cloth, steam and snow. Combined, these technologies bring the intricately detailed worlds of Gotham to life.

"The Batman: Arkham games are visually stunning and it's great that we are able to continue building upon the amazing graphics with Batman: Arkham Origins," said Samantha Ryan, Senior Vice President, Production and Development, Warner Bros. Interactive Entertainment. "With NVIDIA's continued support, we are able to deliver an incredibly immersive gameplay experience."

NVIDIA will be unveiling a sneak peek of Batman: Arkham Origins at PAX Prime in Seattle, during the NVIDIA stage presentation at the Paramount Theater on Monday, Sept. 2 at 10 a.m. PT. Entry is free.

Additionally, any PAX attendees who purchase a qualifying bundle from the special kiosk at the NVIDIA booth on the show floor will receive a free limited-edition Batman lithograph, one of only 1,000 being produced.

For a full list of participating bundle partners, visit: www.geforce.com/free-batman. This offer is good only until Jan. 31, 2014.

27 Comments on NVIDIA Teams Up With Warner Bros. on Batman: Arkham Origins

#1
Xzibit
I just hope it's not as bad as what Nvidia did with Splinter Cell: Blacklist.

Once you disable the Nvidia-only goodies, AMD does okay for itself.

#2
damage
by: Xzibit
I just hope it's not as bad as what Nvidia did with Splinter Cell: Blacklist.

Once you disable the Nvidia-only goodies, AMD does okay for itself.

And why would anyone with capable hardware run with the eye candy turned off? You high?

#4
theoneandonlymrk
by: RCoon
ARE YOU HIGH?
Look, can we all stop knocking it, it's a brilliant place to be :D:toast:

Does that mean this is definitely not a Gaming Evolved or Never Settle game then? They both seem to be working with everyone these days, it's hard to keep up, especially since I came across GRID 2's Intel "evolved" nonsense. Feck, can't they all sort their heads out :ohwell: it's us poor gamers getting the short shrift every time :wtf:.
#6
Fluffmeister
You can always rely on the latest Batman game to get a few knickers in a twist. :laugh:

Looking forward to it regardless, hopefully WB Games Montréal have done the franchise justice.
#7
Roph
Still pushing that proprietary, anti-competitive bullshit I see.
#8
Prima.Vera
FXAA looks horrible in Splinter Cell. I keep wondering why they're not implementing SMAA already?!
#9
Fourstaff
by: Roph
Still pushing that proprietary, anti-competitive bullshit I see.
Still butthurt Nvidia is sending engineers over to improve a game I see.
#10
Xzibit
by: Fourstaff
Still butthurt Nvidia is sending engineers over to improve a game I see.
It would have been better if the actual game engine had been updated rather than a re-hash of the old one, just like SC:B was.

Might as well just release the previous game's content as DLC on UE 2.5 (a 10-year-old game engine) and add the features, if the core engine isn't going to be changed much. Then you get into the proprietary features they are touting.

Doesn't take a genius to figure that out.

by: Cristian_25H
Batman has immense power, strength and speed - the same attributes that make a GeForce GTX GPU the ultimate weapon to take on Gotham's dark underworld. The NVIDIA Developer Technology Team has been working closely with WB Games Montréal to incorporate an array of cutting-edge NVIDIA gaming technologies including DirectX tessellation, NVIDIA TXAA antialiasing, soft shadows and various NVIDIA PhysX engine environmental effects, such as cloth, steam and snow. Combined, these technologies bring the intricately detailed worlds of Gotham to life.
Last time I checked, there is no CUDA acceleration in any of the consoles this game will be released on.


Sending engineers to a studio for marketing, to sell GPUs :rolleyes:
#11
Roph
by: Fourstaff
Still butthurt Nvidia is sending engineers over to improve a game I see.
So you think it's good that one vendor should get away with locking out other vendors? The supreme irony is that PhysX would run so much better on an AMD GPU (since AMD doesn't artificially gimp compute performance), were it not artificially blocked from doing so.

This vendor proprietary stuff needs to stop. I won't support Nvidia as long as they continue to do it.

Would you be happy if an "AMD Evolved" title you were excited to play had various special features and effects that were blocked for you if it's detected that you have an Nvidia GPU?
#12
omnimodis78
What's with all this Nvidia hate? Seriously, AMD has followed suit with Gaming Evolved, TressFX and the whole "all next-gen games will be AMD tuned" push, so please, spare us the 'evil Nvidia' nonsense. Totally agree with the PhysX comment though; Nvidia artificially locking it down is kind of stupid, because nothing would be funnier than a Gaming Evolved game that uses Nvidia PhysX. Then again, I fall back on TressFX in Tomb Raider, which I had to disable on a 770 because it cut my frame rate in half, if not more.
#13
HumanSmoke
by: Roph
So you think it's good that one vendor should get away with locking out other vendors?
Well, it's proprietary tech, so Nvidia can do with it as they please. I'm pretty certain AMD didn't call up Santa Clara offering to share Eyefinity IP, for example.

As for the whole PhysX farrago, it's just as much a case of ATI's dithering as anything else. ATI originally looked into buying Ageia (I suppose they would have bought the IP and then given it away free to everyone else?), then decided to hitch their wagon to Havok FX... Havok got swallowed by Intel and development went into a tailspin. Nvidia bought Ageia (for the not inconsiderable sum of around US$150 million) and offered PhysX to ATI (BTW, via the same Roy Taylor who does the talking-head thing for AMD now)... ATI said no thanks, 'cos the future was HavokFX™... mmm, OK. AMD began a public thrashdown of PhysX, thinking that HavokFX would eventually run riot. A couple of months later the PhysX engine was incorporated into the Nvidia driver, with AMD locked out. Now who didn't see that coming?

If ATI/AMD wanted physics so badly, they'd either stump up for a licence or help develop an engine. They did neither. If they can't be bothered and, by all accounts, the majority of the AMD user base have no time for PhysX... what precisely is the issue?
#14
okidna
by: omnimodis78
What's with all this nvidia hate? Seriously, AMD has followed suit and has done the whole Gaming Evolved, and TressFX and "all next-gen games will be AMD tuned" so please, spare us the 'evil nvidia' nonsense. Totally agree with the PhysX comment though, nvidia artificially locking it down is kind of stupid because nothing would be funnier than having a Gaming Evolved game that uses nvidia PhysX, but then again, I fall back on TressFX in Tomb Raider that I had to disable with a 770 because it cut my frame-rates in half, if not more.
From here: http://www.techpowerup.com/forums/showthread.php?p=2893228


You're right, it's funny :D:D:D
#15
HumanSmoke
by: Xzibit
by: Cristian_25H
Batman has immense power, strength and speed - the same attributes that make a GeForce GTX GPU the ultimate weapon to take on Gotham's dark underworld. The NVIDIA Developer Technology Team has been working closely with WB Games Montréal to incorporate an array of cutting-edge NVIDIA gaming technologies including DirectX tessellation, NVIDIA TXAA antialiasing, soft shadows and various NVIDIA PhysX engine environmental effects, such as cloth, steam and snow. Combined, these technologies bring the intricately detailed worlds of Gotham to life.
Last time I checked, there is no CUDA acceleration in any of the consoles this game will be released on.
Comprehension is key :rolleyes: Who said anything about consoles (and by that I mean the article you quoted)? I can understand how you might be confused, since Christian's article only mentioned PC twice.


by: Xzibit
Doesn't take a genius to figure that out.
Not doing yourself any favours, are you? :slap:
#16
Prima.Vera
by: omnimodis78
...I fall back on TressFX in Tomb Raider that I had to disable with a 770 because it cut my frame-rates in half, if not more.
No, it's not just your nVidia card. TressFX works properly ONLY on the 7xxx-generation cards. On my 5870 CF I had to disable TressFX because in some scenes it would drop from ~70 FPS down to 10-15 FPS, while the average was around 25 FPS. As you can see, that's more than half the FPS being lost. So you are actually lucky with your nVidia card. :D
#17
Recus
by: Roph
So you think it's good that one vendor should get away with locking out other vendors? The supreme irony is that Physx would run so much better on an AMD GPU (since they don't artificially gimp compute performance), were it not artificially blocked from doing so.

This vendor proprietary stuff needs to stop. I won't support Nvidia as long as they continue to do it.

Would you be happy if an "AMD Evolved" title you were excited to play had various special features and effects that were blocked for you if it's detected that you have an Nvidia GPU?
Sony: Exclusive PlayStation game content from the following third party devs/pubs



Roph butthurt. :laugh:
#18
omnimodis78
by: okidna
From here : http://www.techpowerup.com/forums/showthread.php?p=2893228
http://img.techpowerup.org/130428/BioShockInfinitePhysX.png

You're right, it's funny :D:D:D
Damn, I never thought I'd see that - it just looks odd. Although, to be fair, Gaming Evolved is not proprietary tech, it's just more marketing than anything else, just like nvidia's 'The Way It's Meant to be Played' awareness program ("awareness", ha! but that's what they call it).
#19
AsRock
TPU addict
Only thing that bugs me about the whole nVidia thing and PhysX is that they purposely block ATI cards from being the main card.
#20
Fourstaff
by: AsRock
Only thing that bugs me about the whole nVidia thing and PhysX is that they purposely block ATI cards being the main card.
If you don't pay up you don't get it. Welcome to capitalism, I hope you enjoy your stay. See post #14
#21
Roph
by: Recus
Sony: Exclusive PlayStation game content from the following third party devs/pubs

http://img208.imageshack.us/img208/9281/nz6w.jpg

Roph butthurt. :laugh:
How is your comparison supposed to make sense? I'm not a console gamer; I don't care about that platform and don't play on it. Console platforms are fixed/static. We're talking about the PC platform here, and the proprietary / locking-out bullshit happening within the same platform.
#22
Xzibit
by: HumanSmoke
Comprehension is key :rolleyes: Who said anything about consoles (and by that I mean the article you quoted)? I can understand how you might be confused since Christian's article only mentioned PC twice

http://images.vg247.com/current//2013/05/batman-arkham-origins-box-art-pc-etc.jpg

Not doing yourself any favours are you. :slap:
Pathetic as always.

I see you renewed your contract as Nvidia PR puppet again.

Try reading the post and who I was responding to before you make an ass out of yourself like always.

I was responding to Fourstaff's assertion that Nvidia engineers were sent over to make the game better. If that were the case, the game would be improved throughout all platforms, not just on the PC, where only a certain percentage of users happen to have the corresponding hardware to take advantage of the proprietary API.

Incoming excuses in 3, 2, 1...
#23
AsRock
TPU addict
by: Fourstaff
If you don't pay up you don't get it. Welcome to capitalism, I hope you enjoy your stay. See post #14
You do pay when you buy the second card, lol. It would more likely benefit nVidia than damage them. They're just tight asses, just like MS is.

Not that I really care, as the only thing I have seen it be good at is killing FPS.

It's not something that will get most people to buy a nVidia card, although it might encourage people with ATI cards to get a nVidia card as their second.

Blocking shit through drivers is so lame, and all they have to say is that ATI and nVidia configs are not supported but may work.

Pricks..
#24
Fourstaff
by: Roph
How is your comparison supposed to make sense? I'm not a console gamer, I don't care about and don't play that platform. Console platforms are fixed/static. We're talking about the PC platform here, and the proprietary / locking out bullshit happening within the same platform.
PC is a fragmented platform. Mix Linux, Mac and Windows, AMD and Nvidia, AMD and Intel, and you get a melting pot of compromises everywhere. AMD could easily license PhysX but chooses not to, yet Nvidia gets shafted for using proprietary tech.

by: AsRock
You do pay when you buy the second card lol.. It would more likely benefit nVidia than damage them.. They just tight asses Just like MS is.

Not as i really care as the only thing i have seen it good at is killing FPS.

It's not some thing that will get most people buy a nVidia card although it might encourage people with ATI cards to get a nVidia card as their second.

Blocking shit though drivers is so lame. and all they have to say is that ATI and nvidia configs are not supported but may work.

Pricks..
They are being pricks, but that is how you get the money to develop or buy new tech. CUDA is a very good example: it did a lot to kickstart GPGPU adoption, and once the ball got rolling a lot of people jumped on the OpenCL bandwagon. Nvidia's PhysX does not have a proper competitor just yet, so for now we are still stuck with proprietary tech. Simply relying on open source or multi-party agreements is often too slow; see the FMA3/FMA4 instruction sets, the standardisation of SSDs over PCIe, etc.
#25
MxPhenom 216
Corsair Fanboy
by: Prima.Vera
No, it's not just your nVidia card. TressFX works properly ONLY on the 7xxx generation cards. On my 5870CF I had to disable TressFX because from ~70FPS in some scenes it would go down to 10-15FPS, while the average was around 25FPS. As you can see is more than half of th FPS being lost. So you are lucky with your nVidia card actually. :D
TressFX works fine.

What people do not understand is that it is a compute-based physics feature, meaning you basically need a GPU that is good at compute, which is what the GCN architecture is very good at. The 5870s were junk at compute, and Nvidia's Kepler is the same way unless you have a Titan or a Quadro/Tesla card.

Kepler has had the double-precision part of the GPU largely stripped out for the GeForce series, which is why performance with TressFX isn't that great on Nvidia cards, or on other cards that are weak in compute (5870s).

I played and beat Tomb Raider with a GTX 680, and once Nvidia released drivers that worked well with the final game code, the game ran like butter. I was getting a constant 50-60 FPS maxed out (it would drop below that, but not for long enough for me to notice, and my eyes are pretty tuned to dropping frame rates too). I get even more now with the 780. GeForce cards handle TressFX pretty much through brute force, since their compute performance is lackluster.