Tuesday, June 11th 2013

"Thief" Optimized for AMD Technologies

AMD today announced an exclusive collaboration with Square Enix to optimize "THIEF" for the Graphics Core Next architecture in select AMD Radeon graphics processors, as well as for the x86 and graphics architectures featured in AMD A-Series APUs. Developed in conjunction with the AMD Gaming Evolved program, "THIEF" will extensively leverage the advanced capabilities of AMD Radeon graphics processors, including AMD Eyefinity multi-display technology for panoramic gaming, AMD CrossFire multi-GPU technology for supreme performance, and state-of-the-art DirectX 11 rendering for pristine image quality.

"The 'THIEF' franchise has a storied history that we are proud to join in this latest installment," said Matt Skynner, corporate vice president and general manager, Graphics Business Unit, AMD. "We are even more pleased to work so closely with their development team to realize the vision for these games with the incredible gaming performance of a PC powered by AMD Radeon graphics. And as the exclusive hardware partner for 'THIEF,' we continue to demonstrate that the best experience for gamers and developers lives at AMD with the Gaming Evolved program."
Square Enix is present at this year's Electronic Entertainment Expo (E3) with a wide variety of games and demonstrations to be discovered and experienced hands-on in the South Hall at booth #1647, or online with the Square Enix Presents YouTube channel.

"AMD, Square Enix and Eidos-Montréal have a strong and notable relationship," said Stephane D'Astous, general manager, Eidos-Montréal. "It was only logical that we extend the cooperative efforts of our teams to 'THIEF,' imbuing it with the expertise that made 'Deus Ex: Human Revolution' such a technical achievement. Those efforts include a broad range of exclusive performance optimizations for AMD CPUs, APUs and graphics cards, and we are excited about making our game a technology showcase on the PC platform."

AMD also has a large presence at this year's E3, located in the South Hall booth #423. From an array of gadgets powered by AMD APUs and CPUs, to UltraHD gaming driven by the world's most advanced graphics cards, fans can experience AMD's full commitment to gaming on a diverse range of state-of-the-art computing solutions.

46 Comments on "Thief" Optimized for AMD Technologies

#1
diopter
"Optimized for AMD Technologies" - a.k.a. doing nothing except making it run artificially worse on nvidia hardware after passing some cash under the table.
#2
jigar2speed
diopter"Optimized for AMD Technologies" - a.k.a. doing nothing except making it run artificially worse on nvidia hardware after passing some cash under the table.
If I were you, I would wait for the actual benchmarks before making such troll-bait statements.

EDIT: Anyway, I feel like the upcoming games should finally use AMD's 4 modules optimally, which should breathe some life into the FX series.
#3
Jorge
As usual, the clueless trolls are talking through their arses... because they can't deal with the reality that AMD is offering better products and working with vendors to provide a better user experience.

I hope AMD also goes after mainstream programmers who code specifically for Intel CPUs at the expense of AMD CPUs. This would include those who write benchmarks and traditional software. It would be great to see a level playing field for once in the past 40+ years.
#4
Prima.Vera
The only thing I have noticed over the years with this Gaming Evolved program is better support for CrossFire and multiple displays. That's it and nothing else. The frame rates are on par with nVidia's, so no improvement there.
#5
Mindweaver
Moderato®™
diopter"Optimized for AMD Technologies" - a.k.a. doing nothing except making it run artificially worse on nvidia hardware after passing some cash under the table.
No different than Nvidia doing it to AMD for how many years? The biggest reason we are even seeing "Optimized for AMD Technologies" is because consoles are using AMD. Why would a game dev create a game to run better on an Nvidia chip when it's a console port? All new games will be optimized for AMD tech because of console's not PC's, but they are using PC hardware so we pc gamers win in the long run as well. :toast:
#6
BigMack70
Dear AMD,
You cannot keep bragging about crossfire performance until you fix your stuttering issues.
Sincerely,
A crossfire customer who you ran off to Nvidia
#7
SIGSEGV
BigMack70Dear AMD,
You cannot keep bragging about crossfire performance until you fix your stuttering issues.
Sincerely,
A crossfire customer who you ran off to Nvidia
So you're saying that an SLI configuration is stutter-free, is that it?
#8
RCoon
SIGSEGVSo you're saying that an SLI configuration is stutter-free, is that it?
I ran 7950 CrossFire for a year. Runt frames render CrossFire's performance almost pointless in some cases. SLI has far fewer of them and a much lighter effect on frame latency. This isn't a debate about Nvidia versus AMD anyway.
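A quick aside on what "runt frames" means in this context: a runt is a frame that occupies only a handful of scanlines on screen before the next frame replaces it, so it inflates the reported FPS counter without contributing any visible motion. Below is a minimal sketch of how FCAT-style capture data is usually interpreted, assuming per-frame scanline heights and the commonly cited ~21-scanline cutoff (illustrative values, not measured data):

```python
# Sketch: discount "runt" frames from an FPS figure, assuming we already have
# the number of scanlines each frame occupied on screen (FCAT-style capture).
# The 21-scanline cutoff is the commonly cited threshold, not an official spec.

RUNT_SCANLINES = 21

def effective_fps(frame_heights, capture_seconds):
    """Return (raw FPS, FPS counting only frames tall enough to be visible)."""
    raw = len(frame_heights) / capture_seconds
    visible = sum(1 for h in frame_heights if h >= RUNT_SCANLINES) / capture_seconds
    return raw, visible

# Illustrative data: 600 frames over 5 seconds, every third frame a runt.
heights = [8 if i % 3 == 0 else 400 for i in range(600)]
raw, usable = effective_fps(heights, 5.0)
print(f"raw: {raw:.0f} fps, excluding runts: {usable:.0f} fps")  # 120 vs 80
```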
#9
Xenturion
Gotta love how they keep touting CrossFire like there's nothing wrong. If you're sending reviewers an alpha driver shortly after the release of a card, there's probably something amiss.

Whenever I see an ad for the 7990, I always think, "Yeah, I'll admit that the number of attempted frames per second the card achieves is impressive, but the number of completed frames per second reveals something else entirely." It does frustrate me that this far after the card's release there's still no publicly available driver fix, and it just confuses me that AMD continues to promote its CrossFire technology when there is a well-documented flaw in its implementation (at least in the GCN-based cards). Granted, marketing always targets the uninitiated, so there's nothing new going on there.

As someone who has supported AMD for years and always championed the value proposition many of their products offer, I'm disappointed with where they are today. Hopefully these console deals will bolster AMD and bring some competition back both in the CPU and GPU spaces.

Probably not a bad thing for us PC gamers that these games are being written for x86-64 and the GCN architecture, which should mean that the inevitable ports won't be too atrocious.
#10
btarunr
Editor & Senior Moderator
diopter"Optimized for AMD Technologies" - a.k.a. doing nothing except making it run artificially worse on nvidia hardware after passing some cash under the table.
Another way of looking at it is "working correctly (bug-free) on AMD hardware."
#11
BigMack70
SIGSEGVSo you're saying that an SLI configuration is stutter-free, is that it?
It is not 100% stutter free - from time to time you can see it hitch up and stutter for a half second or so. But it's never anywhere near as bad as 7970 crossfire was; it is never distracting, never occurs frequently or for an extended period of time, and does not require either vsync or a framerate limiter to deal with.

The difference is night and day. Crossfire can be made to work fine with framerate limiters in most cases, but SLI works better than even crossfire+framerate limiters without any tweaking at all.
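Since framerate limiters and vsync come up repeatedly in this thread as the workaround for CrossFire pacing, here is a bare-bones sketch of what a limiter does: it delays frame submission so frames go out on an even cadence, which is why it can mask pacing problems. render_frame() is a hypothetical stand-in for the game's real work, not any actual driver or game API:

```python
import time

def run_with_limit(render_frame, fps_cap=60, frames=300):
    """Render `frames` frames, sleeping as needed to hold an even cadence."""
    interval = 1.0 / fps_cap
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        deadline += interval
        slack = deadline - time.perf_counter()
        if slack > 0:
            time.sleep(slack)                  # idle until the next frame slot
        else:
            deadline = time.perf_counter()     # running behind; reset the cadence

# Hypothetical usage with a do-nothing frame:
run_with_limit(lambda: None, fps_cap=60, frames=60)
```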
#12
TheLaughingMan
All three next-gen consoles run AMD hardware, nearly exclusively. Everything will be "optimized for AMD hardware" for a while to come.

They improved the stutter issue with the 7990 release. It's not as good as Nvidia, but much better than it was.
#13
SIGSEGV
RCoonI ran 7950 CrossFire for a year. Runt frames render CrossFire's performance almost pointless in some cases. SLI has far fewer of them and a much lighter effect on frame latency. This isn't a debate about Nvidia versus AMD anyway.
Because I want to know how SLI performs before I do SLI in my rig. So sure, this is not a debate about Nvidia versus AMD. Most people say that CrossFire and SLI configurations cause more headaches than enjoyment. I did CrossFire with an HD 6870 before, but I haven't had any experience with SLI.
#14
douglatins
Don't care if it's a dude, I want him to have Lara's hair
#15
HalfAHertz
I hope this game is good and SE hasn't ruined the franchise. I grew up with the series and loved each and every one of the games. Plus, the community-generated content entertained me for years afterwards.
#16
Prima.Vera
douglatinsDon't care if it's a dude, I want him to have Lara's hair
This FX thing works like crap on older generations...
#17
Xzibit
BigMack70It is not 100% stutter free - from time to time you can see it hitch up and stutter for a half second or so. But it's never anywhere near as bad as 7970 crossfire was; it is never distracting, never occurs frequently or for an extended period of time, and does not require either vsync or a framerate limiter to deal with.

The difference is night and day. Crossfire can be made to work fine with framerate limiters in most cases, but SLI works better than even crossfire+framerate limiters without any tweaking at all.
Nvidia's Tom Petersen didn't want to get too far into the weeds with Scott on their FCAT explanation.

50min mark

I don't think AMD X-Fire is up to snuff, but the more Nvidia discloses about how SLI works and why theirs is better, the less convinced I am that their method is the right one either.

He also says a 2-10 variance is noticeable, and if that's the standard, both AMD and Nvidia fail miserably. Unless the variance is 5 or below, you will still have people noticing it, no matter whether it's X-Fire or SLI.
^By that standard even single cards have stuttering, and Nvidia trails AMD in single-card setups.
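To put a number on "variance" as discussed above (interpreted here as the swing between consecutive frame times in milliseconds, rather than variance in the strict statistical sense), here is a rough sketch with made-up frame times:

```python
# Sketch: frame-to-frame swing in delivery times. The frame times in
# milliseconds are made-up sample data, not measurements from FCAT or a review.

def frame_to_frame_deltas(frame_times_ms):
    return [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

# An average of ~60 fps that alternates between fast and slow frames:
times = [10.0, 23.3] * 30
deltas = frame_to_frame_deltas(times)
avg = sum(times) / len(times)
print(f"average frame time: {avg:.1f} ms (~{1000 / avg:.0f} fps)")
print(f"worst frame-to-frame swing: {max(deltas):.1f} ms")  # ~13 ms reads as stutter
```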
#18
okidna
douglatinsDon't care if it's a dude, I want him to have Lara's hair
Uh, he's wearing a hood.
#19
AsRock
TPU addict
Prima.VeraThis FX thing works like crap on older generations...
Worked perfectly fine on my 6970, unless you mean an even older gen?
#20
erocker
*
diopter"Optimized for AMD Technologies" - a.k.a. doing nothing except making it run artificially worse on nvidia hardware after passing some cash under the table.
These comments amuse me, only because in the past you could replace the word "AMD" with "Nvidia" and it would be equally true. :)

At least, it's good to see these companies learning from each other.
#21
BigMack70
XzibitNvidia's Tom Petersen didn't want to get too far into the weeds with Scott on their FCAT explanation.

50min mark
[...]

I don't think AMD X-Fire is up to snuff, but the more Nvidia discloses about how SLI works and why theirs is better, the less convinced I am that their method is the right one either.

He also says a 2-10 variance is noticeable, and if that's the standard, both AMD and Nvidia fail miserably. Unless the variance is 5 or below, you will still have people noticing it, no matter whether it's X-Fire or SLI.
^By that standard even single cards have stuttering, and Nvidia trails AMD in single-card setups.
All I can say, after a year-plus of 7970 CrossFire and then switching to 780 SLI, is that 780 SLI is almost as smooth as a single-card experience without any tweaking at all. 7970 CF was almost always tweakable to be smooth (the lone exception for me being Crysis 3 single-player), but if you just run CrossFire without vsync and framerate limiters... yeesh, it's bad. I never ran into any driver problems, though, in terms of CF scaling issues or games not being supported.
#22
Doc41
Thieeeeeeeeef 8D, I wonder when the release is.
I loved 1 & 2, and while 3 was still pretty good, I liked the older ones better. I hope they don't ruin this one, as I was looking forward to it.
#23
Ahhzz
jigar2speedIf I were you, I would wait for the actual benchmarks before making such troll-bait statements.

EDIT: Anyway, I feel like the upcoming games should finally use AMD's 4 modules optimally, which should breathe some life into the FX series.
Eh, to be fair, I believe we've all seen articles here where both of the big boys (ATI at the time, and NVIDIA) have done exactly that. Not so much troll bait as cynical, although somewhat likely, statements.

I was a die-hard Nvidia fan (the first card I bought was a Viper V550 16 MB, with that NVIDIA Riva TNT processor), all the way up through my 9800 GT ultimates. However, by the time I got around to my dual 6950s, ATI/AMD was showing a stronger horse in the field. Having had some sweet cards in some sweet setups from both, I can easily say they've both got their limitations and issues. I just hope the game runs well, no matter the video platform, without too many bugs. I would hope that's what any of us want.
#24
Widjaja
MindweaverNo different than Nvidia doing it to AMD for how many years? The biggest reason we are even seeing "Optimized for AMD Technologies" is because the consoles are using AMD. Why would a game dev create a game to run better on an Nvidia chip when it's a console port? All new games will be optimized for AMD tech because of consoles, not PCs, but they are using PC hardware, so we PC gamers win in the long run as well. :toast:
Lol

Yes, it was pretty obvious the AMD fanatic missed the fact that nVidia TWIMTBP games can also be flaky in performance on AMD cards.
"No... AMD cards are just sub-par defective pieces of junkz Trololo."

It's the same thing backwards and forwards when a developer chooses to back a specific manufacturer and use their fancy, performance-heavy or plain-gimmick graphics enhancements, whether it be AMD's DirectCompute power or PhysX.
#25
OneCool
You know they will claim this, and then watch it run better on nVidia :nutkick: :roll: