
Editorial AMD Bets on DirectX 12 for Not Just GPUs, but Also its CPUs

Interesting that it showed a much more substantial improvement with the AMD graphics, almost double the improvement shown by the baseline Nvidia card, which itself saw a 600% improvement.
 
That's an impressive jump in draw calls. If only it also led to the same jump in frames per second. :eek:
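Worth spelling out why it rarely translates one-to-one: the draw-call benchmarks measure CPU submission overhead in isolation, while real frame time also includes GPU render work that the API change doesn't touch. A back-of-the-envelope model (all numbers below are made up for illustration, not taken from any benchmark):

```python
# Rough model: frame time = CPU submission time + GPU render time (assuming no overlap).
# Costs are hypothetical, purely to show why cheaper draw calls don't scale 1:1 into fps.

draw_calls = 10_000
cpu_cost_dx11 = 0.002   # ms of CPU time per draw call under DX11 (assumed)
cpu_cost_dx12 = 0.0003  # ms per draw call under DX12, ~6-7x cheaper (assumed)
gpu_render_ms = 12.0    # GPU-side work per frame, unchanged by the API (assumed)

for api, cpu_cost in (("DX11", cpu_cost_dx11), ("DX12", cpu_cost_dx12)):
    cpu_ms = draw_calls * cpu_cost
    frame_ms = cpu_ms + gpu_render_ms
    print(f"{api}: CPU {cpu_ms:.1f} ms + GPU {gpu_render_ms:.1f} ms "
          f"-> {1000 / frame_ms:.0f} fps")
```

In a synthetic draw-call test the GPU term is close to zero, which is how you get 600%+ headline numbers; in a real game the GPU work caps the gain well below that, even though the CPU side genuinely got several times cheaper.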


Many major hardware news sites are reporting that Samsung is attempting to acquire AMD.

If this happens, it would be the best news in quite a while.

Samsung To Allegedly Acquire AMD To Compete With Intel And Qualcomm Head On

Read more: http://wccftech.com/amd-allegedly-merge-samsung/#ixzz3VVRRd3g5
 
Not gonna happen. Total BS. And in my opinion, if it does happen, we can say goodbye to AMD CPUs/APUs and probably GPUs. I don't think Samsung is interested in those markets; it could be interested in the business markets, and probably in the patents, and nothing more. If they do let AMD continue their x86 and GPU designs for the desktop, we can probably forget the value for money from AMD that we used to know. Samsung is a bigger, brighter sticker than the one from Nvidia, and big, bright, shiny stickers cost a lot.
 
I never said it was the purpose of DX12. What I was saying is that the performance increases will be across the board, meaning ALL CPUs (again, regardless of core/thread count) will benefit greatly from DX12 or Mantle.

Sorry...but I have to laugh now...:laugh: Why?

Because my "casual gaming rig" with an ancient dual core(E8600) and a 280X runs anything you've got at more than playable speeds. And when I run Star Swarm, D3D vs. Mantle, it's the "slideshow" vs. "playable"(if it were) scenario. Soo...

BTW, I've already shifted to Windows 10. I've had it running on said machine for a couple months now.:pimp:

Seems like you were actually implying what you claim you didn't say... Let's go over it again:
"Making super powerful mega multicore CPUs even more of a waste of money (than they are now)." To put you in your place, since you probably have absolutely horrible testing methods of your own: my Q9550 is two of your CPUs in one package. If you run any modern game in windowed mode with Task Manager open in the background, you'll see that your CPU is using a lot more than even mine. Mine sits at 60-70% while running any modern AAA title. I am not going to get 60 fps in BF4 at 1080p with my GTX 760, even at the lowest graphics settings, because the CPU itself is the bottleneck.

On the other hand... perhaps your monitor is just garbage and you simply can't see the clarity in motion compared to better monitors, and that's why you're fine with it. If you had a more modern monitor to compare with yours side by side, you'd see the instant clarity in motion, and you'd also see every other bit of imperfection. Furthermore, 60 fps isn't even quality compared to a consistent 120 fps.

http://www.web-cyb.org/images/lcds/blurbusters-motion-blur-from-persistence.jpeg
This will show you the difference in clarity while in motion
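For anyone who doesn't click through: on a sample-and-hold LCD, the perceived blur trail is roughly the time each frame stays on screen multiplied by how fast the object is moving across it. A quick estimate (assuming full persistence, i.e. no strobing, and an assumed panning speed of 960 px/s):

```python
# Rough persistence-blur estimate for a sample-and-hold display (no strobing).
# Perceived blur trail ~= time each frame is displayed * on-screen motion speed.

motion_speed_px_per_sec = 960  # assumed panning speed, e.g. crossing a 1080p screen in ~2 s

for refresh_hz in (60, 120, 144):
    persistence_ms = 1000 / refresh_hz  # full persistence: frame stays lit the whole refresh
    blur_px = persistence_ms * motion_speed_px_per_sec / 1000
    print(f"{refresh_hz} Hz: ~{persistence_ms:.1f} ms persistence -> ~{blur_px:.0f} px of blur")
```

That is roughly what the linked image illustrates, and why a CRT or a strobed backlight looks sharper in motion: its persistence is only a fraction of the refresh period.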
 
It's funny you mention that, because I actually use a 1600x1200 75Hz CRT monitor for most of my gaming. Not for fps performance or visual acuity; I don't really care what fps my games run at, so long as it's better than 24 fps. That's when things get too slow for my taste. Plus I won't play without v-sync, so you get what you get with that. The only reason I sometimes use 1080p is so I can chill on my couch and game without losing any significant visual clarity, instead of sitting at my "desk".

You fps whores really crack me up. Like it matters THAT much. As if. Just turn the fps monitor off and play your game. You see how much funner that is?

If it makes you feel better to have a more "powerful" CPU than mine, good for you. I'm happy for you. Whatever it does for you, that's great. Have at it. More "power" to you. Pardon the pun.

BTW, you haven't taught me anything I didn't already know. I'm not stupid. I've been doing this for a long time now. This is not my first rodeo. But I also have a life that doesn't revolve entirely around my computer. Frankly, I've got better things to do than play games...of any sort. But I've been addicted to video games since I was ~4 years old. So I doubt I'll ever stop playing them, at least occasionally.
 
You fps whores really crack me up. Like it matters THAT much. As if. Just turn the fps monitor off and play your game. You see how much funner that is?
It does! But it's extremely game-related.
I find BF4 very playable at as low as 35 fps. I laugh at idiots like xfaptor or levelcap or whoever these "pro" players might be, running BF4 at low details just so they can have 144 fps, because "anything less is extremely noticeable and significantly affects gameplay" (pro note: they were saying/doing the same crap when 120Hz was the thing).
At the same time, Skyrim is absolutely unplayable for me when fps dips below 50 (I just get sick of how the image moves or something, I cannot explain it).
 
Different strokes for different folks. It's all good, people.
 
Okay, so who cares about DX12, when the majority of games out there are still going to be DX9-11, and AMD CPUs, as we know, are lackluster in that field.

Which is why we need devs to either support their chips better or adopt an API that makes it easier. Even with DX12, I don't think people are going to run out and buy an FX-8350 when you can get the i5-4460 for around the same price. What I am saying is that those who have the FX chips, or any of the A8/A10 chips, will hopefully get some extended life out of them, giving them a chance to wait for Zen so they have another cheap upgrade option.
 