
Intel Graphics Announces DirectX 11 Performance Uplifts and Frame-time Reductions

btarunr

Editor & Senior Moderator
Intel Graphics today announced the Q3-2023 major update of its Arc GPU graphics drivers, which will be released shortly. The latest driver promises to be a transformative update recommended for all Intel GPU users. The company says that it has re-architected several under-the-hood components of the drivers to make A-series GPUs significantly faster. The company also put in engineering effort to reduce frame-times, and introduced a new way of measuring the GPU's contribution to them, so users can figure out whether they are in a CPU-limited scenario or a GPU-limited one. Lastly, the company updated its PresentMon utility with a new front-end interface.

Intel Arc "Alchemist" is a ground up discrete GPU graphics architecture that was designed mainly for DirectX 12 and Vulkan, but over time, relied on API translation for DirectX 9 games. With its Spring driver updates, the company had released a major update that uplifted DirectX 9 game performance by 43% on average. This was because even though API translation was being used for DirectX 9 games, there was broad scope for per-game optimization, and DirectX 9 remains a relevant API for several current e-sports titles. With today's release, Intel promises a similar round of performance updates, with as much as 19% performance uplifts to be had in DirectX 11 titles at 1080p, measured with an A750 on a Core i5-13400F based machine. These gains are averaged to +12% on the fastest i9-13900K processor. The logic being that the slower processor benefits greater from the changes Intel made to its DirectX 11 driver.



Intel has developed a new performance metric called "GPU Busy." Put simply, this is the time taken by the GPU alone to process an API call from the CPU. Game rendering is a collaborative workload between the CPU and GPU. For each frame in a game, the CPU has to tally the game state with what needs to be displayed on the screen, organize this information into an API call, and send it to the GPU, which then interprets the API call and draws the frame.


Every time the CPU finishes its share of the calculations for a frame, it puts out a "present" call to the GPU driver. While the GPU is rendering the frame, the CPU thread responsible for the frame is essentially idle, until the GPU posts a "present return" state back to the CPU so it can begin work on the next frame. The time difference between two presents is basically frame-time (the time it takes your machine to generate a frame). After the GPU generates a frame, it is pushed to the frame-buffer, and onward to the display controller and the display. Intel figured out a way to break frame-time down further into the GPU's specific contribution toward it, which the company calls GPU Busy.
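
To make those relationships concrete, here is a minimal sketch with hypothetical numbers: frame-time falls out of consecutive present timestamps, and GPU Busy is the slice of each frame the GPU itself was working:

```python
present_ms = [0.0, 16.9, 33.1, 50.4, 66.8]   # hypothetical present-call timestamps
gpu_busy_ms = [10.2, 11.0, 12.5, 10.8]       # hypothetical GPU Busy per frame

# Frame-time is simply the gap between two consecutive presents.
frame_times = [round(b - a, 1) for a, b in zip(present_ms, present_ms[1:])]
print(frame_times)  # [16.9, 16.2, 17.3, 16.4]

# GPU Busy is a subset of frame-time, so it can never exceed it;
# the remainder is time the GPU spent waiting on the CPU.
assert all(g <= f for g, f in zip(gpu_busy_ms, frame_times))
cpu_wait_ms = [round(f - g, 1) for f, g in zip(frame_times, gpu_busy_ms)]
```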


With the new GPU Busy counter, the company is able to show just how much the latest drivers contribute to minimizing frame-time. To do so, the company first showed us a frame-time graph of "Overwatch 2" with the launch driver, showing a wild amount of jitter. It then showed the GPU Busy contribution to it. Since GPU Busy is a subset of frame-time, on a time-scale it is the lower value. Every time there is a large gap between the GPU Busy value and the overall frame-time, you are experiencing a CPU-limited scenario, whereas in regions of the graph with narrower gaps between the two values, you are either GPU-limited or balanced.
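
Read as data, that rule of thumb could look like the toy classifier below; the 20% slack threshold is an arbitrary assumption for the sketch, not a figure from Intel:

```python
def classify_frame(frame_time_ms: float, gpu_busy_ms: float,
                   slack: float = 0.20) -> str:
    """Label one frame based on the gap between frame-time and GPU Busy."""
    gap = frame_time_ms - gpu_busy_ms
    # A wide gap means the GPU finished early and sat waiting on the CPU.
    if gap > slack * frame_time_ms:
        return "CPU-limited"
    # A narrow gap means the GPU was busy for nearly the whole frame.
    return "GPU-limited or balanced"

print(classify_frame(16.9, 10.2))   # CPU-limited (gap is ~40% of the frame)
print(classify_frame(16.9, 16.0))   # GPU-limited or balanced
```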

With that out of the way, the company showed us the frame-time graph of its latest driver, which shows significantly lower frame-times and much less jitter. When the graph from a Core i5-13400F-powered machine with an A750 on the latest driver is overlaid, you notice that not only is the overall frame-time lower and jitter suppressed, but the GPU Busy time is also reduced; there is greater coherence between CPU and GPU performance. Overall, Intel's effort isn't directed only toward improving frame-rates (performance), but also toward "smoothness" (reduced frame-time jitter).
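
One simple way to quantify that "smoothness" claim is to report frame-time jitter as the standard deviation of the frame-time samples; the figures below are invented purely for illustration:

```python
import statistics

# Hypothetical frame-time samples (ms) before and after the driver update.
launch_driver = [16.1, 25.4, 14.8, 29.9, 15.2, 27.3]
new_driver    = [16.3, 17.1, 16.8, 17.4, 16.5, 17.0]

for label, samples in (("launch driver", launch_driver),
                       ("new driver", new_driver)):
    print(f"{label}: mean {statistics.mean(samples):.1f} ms, "
          f"jitter (stdev) {statistics.stdev(samples):.1f} ms")
```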

The GPU Busy metric can be measured using the latest version of PresentMon, which the company is releasing as its own standalone overlay application. You can read all about it here.
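
For anyone who wants to crunch a capture themselves, classic PresentMon logs per-frame data to CSV; the sketch below assumes the long-standing MsBetweenPresents frame-time column (newer PresentMon builds also log a GPU-busy column, and column names vary between versions, so check your capture's header first):

```python
import csv
import statistics

def frame_times_from_csv(path: str, column: str = "MsBetweenPresents"):
    """Read per-frame times (ms) from a PresentMon CSV capture.

    MsBetweenPresents is the frame-time column in classic PresentMon
    logs; if your build names it differently, pass the right header in.
    """
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)]

times = sorted(frame_times_from_csv("capture.csv"))  # hypothetical capture file
print(f"frames: {len(times)}, "
      f"average: {statistics.mean(times):.2f} ms, "
      f"99th percentile: {times[int(0.99 * (len(times) - 1))]:.2f} ms")
```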

The complete slide-deck from Intel follows.


 
@W1zzard I'm afraid the bell is ringing for you (I know you are already planning or preparing an Arc GPU re-test with the improved drivers). We will lazily wait for your work...:cool:
 
It's good to see Intel doing the hard yards. While Arc is much slower than its competition for equivalent die sizes and TDP, this commitment to its improvement will hopefully translate to better successors.
 
It's good to see Intel doing the hard yards. While Arc is much slower than its competition for equivalent die sizes and TDP, this commitment to its improvement will hopefully translate to better successors.
I look forward to Battlemage, specifically high-end Battlemage.
 
If only NVIDIA didn't have a monopoly on Machine Learning software, Intel GPUs would be cool...
 
I know I've shit pretty heavily on Intel for Arc being so underwhelming, but the fact that they're continuing to pump out improvements has gone a long way to letting me hope that Battlemage will hit and be good.
 
Take note: for Apex Legends, they did their tests in the standard DX11 renderer, with a great uplift in performance on the A750.

If you use the new beta DX12 renderer, it can maintain 200+ FPS at 3440x1440 with the same graphical settings (on an A770 that I’ve tested).

EDIT: I tested it on a 5800X/DDR4-3733/B550 system.
 
IMHO, with competitive pricing and (hopefully) continuous improvement like this, someday I will switch to an Intel GPU.
 
IMHO, with competitive pricing and (hopefully) continuous improvement like this, someday I will switch to an Intel GPU.

I'm ready to go Intel right now, if only they would drop the price on the A770 16GB; it's way too expensive for what it is.
 
Well that's good news.

I only have older games, and the BL3 benchmark was kind of a turn-off.

 
It's good to see Intel doing the hard yards. While Arc is much slower than its competition for equivalent die sizes and TDP, this commitment to its improvement will hopefully translate to better successors.

Sadly, these are not new additions.
These uplifts are from the launch; they're old news.
 
I'd like to see a review dedicated to testing these new improvements when they go live
 
Good to see Intel starting to innovate.
The GPU Busy metric seems like a good way to know whether the bottleneck comes from the GPU or from the CPU+RAM.
 
That driver team still has years of work ahead to fix all those bugs and the relatively lower performance. It's up to their next gen of hardware to provide more efficient GPUs compared to camp red or green if Intel even wants to be successful. It's always good to have competition - it will drive sharper pricing for all of us.
 
Unfortunately, about three years late to the party. They were ambitious but rubbish.
 
Unfortunately, about three years late to the party. They were ambitious but rubbish.

One of the more cringey things coming from Top Gear.
 
I know I've shit pretty heavily on Intel for Arc being so underwhelming, but the fact that they're continuing to pump out improvements has gone a long way to letting me hope that Battlemage will hit and be good.
You and every reviewer out there.
Most people don't understand you can't get perfect drivers from a lab.
 
I like the “look busy” utility; it will help determine where a user’s investment should go: CPU or GPU. This is very relevant for older CPUs/platforms. I hope reviewers will be able to use this tool to add further insight into their benchmarks, and I welcome reviewers using older platforms rather than the latest bleeding edge. It will be a heck of a lot more work for the reviewers.

Nonetheless, well done Intel on improving the drivers.
 
I like the “look busy” utility; it will help determine where a user’s investment should go: CPU or GPU. This is very relevant for older CPUs/platforms. I hope reviewers will be able to use this tool to add further insight into their benchmarks, and I welcome reviewers using older platforms rather than the latest bleeding edge. It will be a heck of a lot more work for the reviewers.

Nonetheless, well done Intel on improving the drivers.
Imagine needing a "look busy" utility; how embarrassing to need to know whether it's your GPU being a pile of steaming dog shit or the CPU in this day and age. How embarrassing.
 
Imagine needing a "look busy" utility; how embarrassing to need to know whether it's your GPU being a pile of steaming dog shit or the CPU in this day and age. How embarrassing.
Amen, and while we're at it let's get rid of x-ray and ultrasound machines too. Idiots ought to know if they have colon cancer or appendicitis based on feel alone.
 
Amen, and while we're at it let's get rid of x-ray and ultrasound machines too. Idiots ought to know if they have colon cancer or appendicitis based on feel alone.
This utility should not be needed. Is AMD not embarrassed that they *need* this utility to fix their drivers?

Blind ignorance is bliss I guess
 
I look forward to Battlemage, specifically high-end Battlemage.
At the moment, it looks like there will only be 1 desktop and 1 laptop Battlemage GPU.

From what I understand, they are targeting RTX 4070 performance. That would cover over 90% of the market.

I'm ready to go Intel right now, if only they would drop the price on the A770 16GB; it's way too expensive for what it is.
It is a 16 GB card, performs about as well as every other card in its price range, and has a year's worth of driver updates ahead before the next generation hits the shelves.

In all seriousness, what do you think you can get for $350 USD?
 
In all seriousness, what do you think you can get for $350 USD?
The elephant in the room is the 6700XT/6750XT, which can be had for about $330 (going by Newegg prices) at the moment. If you don't mind a refurb, there's even an RX 6800 for $370.
 
The elephant in the room is the 6700XT/6750XT, which can be had for about $330 (going by Newegg prices) at the moment. If you don't mind a refurb, there's even an RX 6800 for $370.
Yeah, but right now, the drivers are as good as they are going to get (and they have roughly the same performance as an A770, minus the extra memory), whereas we are looking at about a year's worth of driver updates before Battlemage launches.
 