
NVIDIA Teams Up With Warner Bros. on Batman: Arkham Origins

Discussion in 'News' started by Cristian_25H, Aug 30, 2013.

  1. Cristian_25H

    Cristian_25H News Poster

    Joined:
    Dec 6, 2011
    Messages:
    3,773 (4.34/day)
    Thanks Received:
    1,110
    Location:
    Still on the East Side
    NVIDIA today announced it is working with Warner Bros. Interactive Entertainment and WB Games Montréal to make Batman: Arkham Origins, the next installment in the blockbuster Batman: Arkham videogame franchise, a technically advanced and intensely realistic chapter in the award-winning saga for PC players.

    Gamers who purchase a qualifying GPU from a participating partner will receive a free PC edition of Batman: Arkham Origins, which will be released worldwide on Oct. 25, 2013.


    Developed by WB Games Montréal, Batman: Arkham Origins features an expanded Gotham City and introduces an original prequel storyline set several years before the events of Batman: Arkham Asylum and Batman: Arkham City. Taking place before the rise of Gotham City's most dangerous criminals, the game showcases a young Batman as he faces a defining moment of his early career and sets his path to becoming the Dark Knight.

    Batman has immense power, strength and speed - the same attributes that make a GeForce GTX GPU the ultimate weapon to take on Gotham's dark underworld. The NVIDIA Developer Technology Team has been working closely with WB Games Montréal to incorporate an array of cutting-edge NVIDIA gaming technologies including DirectX tessellation, NVIDIA TXAA antialiasing, soft shadows and various NVIDIA PhysX engine environmental effects, such as cloth, steam and snow. Combined, these technologies bring the intricately detailed worlds of Gotham to life.
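
    (Editor's note: the GPU-accelerated PhysX effects named above -- cloth, steam, snow -- run on the CUDA cores of a GeForce card, so an application can probe for a CUDA-capable device at start-up and fall back to CPU physics otherwise. Below is a minimal sketch of such a probe using the public CUDA runtime API; it is illustrative only and not taken from the game or the PhysX SDK.)

    Code:
    // Hypothetical start-up check: hardware-accelerated PhysX effects need a CUDA-capable GPU.
    // Illustrative sketch only -- not code from Arkham Origins or the PhysX SDK.
    #include <cstdio>
    #include <cuda_runtime.h>

    static bool hasCudaCapableGpu()
    {
        int deviceCount = 0;
        // Fails cleanly on systems without an NVIDIA driver or GPU.
        if (cudaGetDeviceCount(&deviceCount) != cudaSuccess || deviceCount == 0)
            return false;

        for (int i = 0; i < deviceCount; ++i)
        {
            cudaDeviceProp prop;
            // Treat Fermi-class (SM 2.x) or newer as "good enough" -- an assumed threshold.
            if (cudaGetDeviceProperties(&prop, i) == cudaSuccess && prop.major >= 2)
            {
                std::printf("GPU PhysX candidate: %s (SM %d.%d)\n", prop.name, prop.major, prop.minor);
                return true;
            }
        }
        return false;
    }

    int main()
    {
        std::printf("Hardware-accelerated effects: %s\n",
                    hasCudaCapableGpu() ? "available" : "disabled (CPU fallback)");
        return 0;
    }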

    "The Batman: Arkham games are visually stunning and it's great that we are able to continue building upon the amazing graphics with Batman: Arkham Origins," said Samantha Ryan, Senior Vice President, Production and Development, Warner Bros. Interactive Entertainment. "With NVIDIA's continued support, we are able to deliver an incredibly immersive gameplay experience."

    NVIDIA will be unveiling a sneak peek of Batman: Arkham Origins at PAX Prime in Seattle, during the NVIDIA stage presentation at the Paramount Theater on Monday, Sept. 2 at 10 a.m. PT. Entry is free.

    Additionally, any PAX attendees who purchase a qualifying bundle from the special kiosk at the NVIDIA booth on the show floor will receive a free limited-edition Batman lithograph -- one of only 1,000 being produced.

    For a full list of participating bundle partners, visit: www.geforce.com/free-batman. This offer is good only until Jan. 31, 2014.
  2. Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,008 (1.39/day)
    Thanks Received:
    200
    I just hope it's not as bad as what Nvidia did with Splinter Cell: Blacklist.

    Once you disable the Nvidia-only goodies, AMD does okay for itself.

  3. damage New Member

    Joined:
    Aug 30, 2013
    Messages:
    1 (0.00/day)
    Thanks Received:
    0
  4. RCoon

    RCoon Forum Gypsy

    Joined:
    Apr 19, 2012
    Messages:
    5,382 (7.32/day)
    Thanks Received:
    2,095
    Location:
    Gypsyland
    adulaamin, Xzibit and suraswami say thanks.
  5. theoneandonlymrk

    theoneandonlymrk

    Joined:
    Mar 10, 2010
    Messages:
    3,162 (2.10/day)
    Thanks Received:
    489
    Location:
    Manchester uk
    Look, can we all stop knocking a brilliant place to be :D:toast:

    Does that mean this is definitely not a Gaming Evolved or Never Settle game then? They both seem to be working hard with everyone these days, and it's hard to keep up, especially since I came across GRID 2's Intel-evolved nonsense. Can't they all sort their heads out? :ohwell: It's us poor gamers getting the short shrift every time :wtf:
  6. erocker

    erocker Super Moderator Staff Member

    Joined:
    Jul 19, 2006
    Messages:
    39,155 (13.81/day)
    Thanks Received:
    13,607
  7. Fluffmeister

    Joined:
    Dec 22, 2011
    Messages:
    549 (0.64/day)
    Thanks Received:
    154
    You can always rely on the latest Batman game to get a few knickers in a twist. :laugh:

    Looking forward to it regardless, hopefully WB Games Montréal have done the franchise justice.
  8. Roph

    Roph

    Joined:
    Nov 1, 2008
    Messages:
    372 (0.19/day)
    Thanks Received:
    104
    Still pushing that proprietary, anti-competitive bullshit, I see.
    Xzibit and Prima.Vera say thanks.
  9. Prima.Vera

    Prima.Vera

    Joined:
    Sep 15, 2011
    Messages:
    2,021 (2.12/day)
    Thanks Received:
    261
    FXAA looks horrible in Splinter Cell. I keep wondering why they are not implementing SMAA already??!?!
  10. Fourstaff

    Fourstaff Moderator Staff Member

    Joined:
    Nov 29, 2009
    Messages:
    9,101 (5.66/day)
    Thanks Received:
    1,943
    Location:
    Home
    Still butthurt that Nvidia is sending engineers over to improve a game, I see.
    Recus and Fluffmeister say thanks.
  11. Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,008 (1.39/day)
    Thanks Received:
    200
    It would have been better if the actual game engine were updated rather than being a rehash of the old one, just like SC:B was.

    If the core engine isn't going to change much, they might as well just release this as DLC for the previous game on UE 2.5 (a ten-year-old game engine) and add the features there. Then you get into the proprietary features they are touting.

    Doesn't take a genius to figure that out.

    Last time I checked, there is no CUDA acceleration on any of the consoles this game will be released on.

    Sending engineers to a studio as a marketing exercise to sell GPUs :rolleyes:
  12. Roph

    Roph

    Joined:
    Nov 1, 2008
    Messages:
    372 (0.19/day)
    Thanks Received:
    104
    So you think it's good that one vendor should get away with locking out other vendors? The supreme irony is that PhysX would run so much better on an AMD GPU (since AMD doesn't artificially gimp compute performance), were it not artificially blocked from doing so.

    This vendor proprietary stuff needs to stop. I won't support Nvidia as long as they continue to do it.

    Would you be happy if an "AMD Evolved" title you were excited to play had various special features and effects that were blocked for you if it's detected that you have an Nvidia GPU?
    Prima.Vera says thanks.
  13. omnimodis78

    omnimodis78

    Joined:
    Aug 2, 2012
    Messages:
    33 (0.05/day)
    Thanks Received:
    9
    Location:
    Canada
    What's with all this nvidia hate? Seriously, AMD has followed suit with the whole Gaming Evolved program, TressFX and "all next-gen games will be AMD tuned", so please, spare us the 'evil nvidia' nonsense. Totally agree with the PhysX comment though: nvidia artificially locking it down is kind of stupid, because nothing would be funnier than a Gaming Evolved game that uses nvidia PhysX. But then again, I fall back on TressFX in Tomb Raider, which I had to disable on a 770 because it cut my frame rates in half, if not more.
    Recus, okidna and Fluffmeister say thanks.
  14. HumanSmoke

    HumanSmoke

    Joined:
    Sep 7, 2011
    Messages:
    1,085 (1.13/day)
    Thanks Received:
    318
    Well, it's proprietary tech, so Nvidia can do with it as they please. I'm pretty certain AMD didn't call up Santa Clara offering to share Eyefinity IP, for example.

    As for the whole PhysX farrago, it's just as much a case of ATI's dithering as anything else. ATI originally looked into buying Ageia (I suppose they would have bought the IP and then given it away free to everyone else?), then decided they'd hitch their wagon to Havok FX... Havok got swallowed by Intel and development went into a tailspin, and Nvidia bought Ageia (for the not inconsiderable sum of around US$150 million) and offered PhysX to ATI (BTW, via the same Roy Taylor that does the talking-head thing for AMD now)... ATI said no thanks, 'cos the future is HavokFX™... mmm, OK. AMD began a public thrashdown of PhysX thinking that HavokFX would eventually run riot. A couple of months later the PhysX engine was incorporated into the Nvidia driver - AMD locked out. Now who didn't see that coming?

    If ATI/AMD wanted physics so badly, they'd either stump up for a licence or help develop an engine. They did neither. If they can't be bothered, and, by all accounts, the majority of the AMD user base has no time for PhysX... what precisely is the issue?
  15. okidna

    okidna

    Joined:
    Jan 2, 2012
    Messages:
    465 (0.55/day)
    Thanks Received:
    340
    Location:
    Indonesia
    From here: http://www.techpowerup.com/forums/showthread.php?p=2893228

    You're right, it's funny :D:D:D
    HumanSmoke and Fluffmeister say thanks.
    Crunching for Team TPU
  16. HumanSmoke

    HumanSmoke

    Joined:
    Sep 7, 2011
    Messages:
    1,085 (1.13/day)
    Thanks Received:
    318
    Comprehension is key :rolleyes: Who said anything about consoles (and by that I mean the article you quoted)? I can understand how you might be confused, since Cristian's article only mentioned PC twice.

    Not doing yourself any favours, are you? :slap:
  17. Prima.Vera

    Prima.Vera

    Joined:
    Sep 15, 2011
    Messages:
    2,021 (2.12/day)
    Thanks Received:
    261
    No, it's not just your nVidia card. TressFX works properly ONLY on the 7xxx-generation cards. On my 5870 CF I had to disable TressFX because in some scenes it would drop from ~70 FPS down to 10-15 FPS, while the average was around 25 FPS. As you can see, that's more than half of the FPS being lost. So you are actually lucky with your nVidia card. :D
  18. Recus

    Recus

    Joined:
    Jul 10, 2011
    Messages:
    476 (0.47/day)
    Thanks Received:
    161
    Sony: Exclusive PlayStation game content from the following third party devs/pubs


    Roph butthurt. :laugh:
  19. omnimodis78

    omnimodis78

    Joined:
    Aug 2, 2012
    Messages:
    33 (0.05/day)
    Thanks Received:
    9
    Location:
    Canada
  20. AsRock

    AsRock TPU addict

    Joined:
    Jun 23, 2007
    Messages:
    10,458 (4.19/day)
    Thanks Received:
    1,565
    Location:
    US
    Only thing that bugs me about the whole nVidia thing and PhysX is that they purposely block setups where an ATI card is the main card.
  21. Fourstaff

    Fourstaff Moderator Staff Member

    Joined:
    Nov 29, 2009
    Messages:
    9,101 (5.66/day)
    Thanks Received:
    1,943
    Location:
    Home
    If you don't pay up, you don't get it. Welcome to capitalism; I hope you enjoy your stay. See post #14.
    MxPhenom 216 says thanks.
  22. Roph

    Roph

    Joined:
    Nov 1, 2008
    Messages:
    372 (0.19/day)
    Thanks Received:
    104
    How is your comparison supposed to make sense? I'm not a console gamer; I don't care about and don't play that platform. Console platforms are fixed/static. We're talking about the PC platform here, and the proprietary lock-out bullshit happening within the same platform.
  23. Xzibit

    Joined:
    Apr 30, 2012
    Messages:
    1,008 (1.39/day)
    Thanks Received:
    200
    Pathetic as always.

    I see you renewed your contract as Nvidia PR puppet again.

    Try reading the post and who I was responding to before you make an ass out of yourself like always.

    I was responding to Fourstaff's assertion that Nvidia engineers were sent over to make the game better. If that were the case, the game would be improved across all platforms, not just on the PC, where only a certain percentage of users happen to have the corresponding hardware to take advantage of the proprietary API.

    Incoming excuses in 3, 2, 1...
  24. AsRock

    AsRock TPU addict

    Joined:
    Jun 23, 2007
    Messages:
    10,458 (4.19/day)
    Thanks Received:
    1,565
    Location:
    US
    You do pay when you buy the second card, lol. It would more likely benefit nVidia than damage them. They're just tight-asses, just like MS is.

    Not that I really care, as the only thing I have seen it good at is killing FPS.

    It's not something that will get most people to buy an nVidia card, although it might encourage people with ATI cards to get an nVidia card as their second.

    Blocking shit through drivers is so lame, and all they have to say is that ATI and nVidia configs are not supported but may work.

    Pricks..
  25. Fourstaff

    Fourstaff Moderator Staff Member

    Joined:
    Nov 29, 2009
    Messages:
    9,101 (5.66/day)
    Thanks Received:
    1,943
    Location:
    Home
    PC is a fragmented platform. Mix Linux, Mac and Windows, AMD and Nvidia, AMD and Intel, and you get a melting pot of compromises everywhere. AMD can easily license PhysX but chooses not to, so Nvidia gets shafted for using proprietary tech.

    They are being pricks, but that is how you get the money to develop or buy new tech. CUDA is a very good example: it did a lot to kickstart GPGPU adoption, and once the ball got rolling a lot of people jumped on the OpenCL bandwagon. Nvidia's PhysX does not have a proper competitor just yet, so as of now we are still using proprietary tech. Simply relying on open source or multiparty agreements is often too slow; see the FMA3/FMA4 instruction sets, the standardisation of SSDs over PCIe, etc.
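
    (Editor's note: since the thread keeps coming back to CUDA and GPGPU, here is a minimal, generic illustration of the programming model that kick-started it: a plain C-style kernel launched across thousands of threads. This is just the standard vector-add sketch, included for illustration only; it has nothing to do with PhysX or Arkham Origins specifically.)

    Code:
    // Minimal CUDA vector add: the C-like kernel model that drove early GPGPU adoption.
    // Generic illustration only, unrelated to PhysX or this game.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    __global__ void vecAdd(const float* a, const float* b, float* c, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
        if (i < n)
            c[i] = a[i] + b[i];
    }

    int main()
    {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);

        // Host buffers.
        float* ha = (float*)std::malloc(bytes);
        float* hb = (float*)std::malloc(bytes);
        float* hc = (float*)std::malloc(bytes);
        for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // Device buffers and host-to-device copies.
        float *da, *db, *dc;
        cudaMalloc(&da, bytes);
        cudaMalloc(&db, bytes);
        cudaMalloc(&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

        std::printf("hc[0] = %.1f (expected 3.0)\n", hc[0]);

        cudaFree(da); cudaFree(db); cudaFree(dc);
        std::free(ha); std::free(hb); std::free(hc);
        return 0;
    }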
    HumanSmoke and MxPhenom 216 say thanks.
