
Intel Arc B580 Battlemage Unboxing & Preview

I mean, it sounds promising at this price point. We really need some better-value cards on the market, as I feel that area gets overlooked in this day and age. I'd like to see a lot more competition in the $500-and-under segment, because that's where we need it most; anything above that starts to get expensive for most people.
 
Anyways, I'm not sure how to look at this one, given all the rumors stating that Intel will abandon the GPU market.

They will not exit the GPU market, simply because integrated graphics is so important.

Now, discrete GPUs? Maybe.

I think reducing costs would be wise, but they could go for a couple of discrete cards in segments where AMD and Ngreedia are weak or absent.
The B580 makes sense, as there will be no new card in that segment for 6+ months.

The B770 or B780? I am not so sure, as the competition is fiercer there (as you said yourself).
 
Love the looks, Intel has without a doubt the best looking reference cards.

Fortnite is on Unreal Engine 5.5 now! I've got a 4090 and it dips below 60 fps on max video settings at 1440p, with no DLSS or equivalents used.
Didn't Fartnight run even on a potato not that long ago?
 
Using rumors plus Intel's B580 results as a baseline, it looks like an epic matchup is coming in Q1 2025: the 5070, 8800 XT, and B770 might all land around 300 W and $500, with performance similar to the 4070 Ti Super.

I think the sub $500 GPU market might be coming back!

Nvidia isn't going to give away 4070 Ti Super performance for free. The 5070 won't be sub-$599 at release.

OT: Looks like a decent card. There's currently nothing priced well that isn't DOA as a replacement for the 2070 in the HTPC at the moment, and that card is a little worse for wear these days. Actually looking forward to a review of Intel's new cards.
 
Love the looks, Intel has without a doubt the best looking reference cards.


Didn't Fartnight run even on a potato not that long ago?
Yeah, they still have the DirectX 11 Performance Mode available for older/weaker computers and consoles, cough cough, Switch.
 
Here's an important slide: (B580 available Dec 13. B570 available Jan 16)

 
As we were discussing in another thread, the mindshare factor is really hard at work.

See how many positive messages there are here for poor ol' Intel, and of course the Ngreedia fanbois have to mention their brand.

But if this were an AMD GPU article, the trashing would be epic, as usual.

Anyways, I'm not sure how to look at this one, given all the rumors stating that Intel will abandon the GPU market.
I see we're already firing the furnaces for Another Major Disappointment.

Is the mindshare department right behind you? Are they whispering sweet nothings into your ears? It couldn't have anything to do with generation after generation of disappointment, could it? Maybe it's just that AMD has made it clear they don't care about budget users, so Intel making advances here is actually exciting, as opposed to another 6600 XT 8GB rebrand.
I mean, it sounds promising at this price point. We really need some better-value cards on the market, as I feel that area gets overlooked in this day and age. I'd like to see a lot more competition in the $500-and-under segment, because that's where we need it most; anything above that starts to get expensive for most people.
If Intel has actually made some significant arch improvements, the B770 is gonna be genuinely interesting. 4070 for $400?
 
Thing is, Intel only looks positive compared to Nvidia, but in this price class your main competition is AMD, not Nvidia.
You can tout that your raytracing is miles better, but frankly at this performance tier it's almost never worth it.
With an MSRP of $250, it is more expensive than a 7600.
By claiming only 10% over a 4060, they're effectively saying they're about matching AMD's value.
Now, let me ask again. Do you want an AMD card? Or an Intel card? Same performance, same price.
I'd pick AMD any day of the week no questions asked.
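
If you want to sanity-check that kind of value claim yourself, here's a minimal perf-per-dollar sketch. Only the B580's $250 price and its "+10% vs. 4060" claim come from this thread; the 4060 and 7600 prices and the 7600's relative performance are placeholder assumptions you'd swap for real street prices and review numbers.

```python
# Rough perf-per-dollar comparison. Performance is relative to an RTX 4060
# baseline (1.00). Only the B580 figures come from this thread; everything
# else is a placeholder assumption, not measured data.
cards = {
    #              relative perf, price (USD)
    "RTX 4060": (1.00, 300),  # assumed street price
    "RX 7600":  (0.95, 240),  # assumed perf and price (cheaper than the B580, per the post above)
    "Arc B580": (1.10, 250),  # Intel's claim + the $250 MSRP from the thread
}

for name, (perf, price) in cards.items():
    print(f"{name:>9}: {perf / price * 1000:.2f} relative perf per $1000")
```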

We can quibble over the VRAM difference, but that is likely to be addressed when AMD (and maybe Nvidia? lol) releases the next gen in a couple of months.
Which is why I mentioned the rushing part.
It's just stupid to compare value against a card whose MSRP is almost two years old and no longer reflects its actual price.

At this price it's an expensive toy which can double as your budget GPU, but it needs to offer a more compelling value.

And don't even get me started on Intel's pricing in the EU compared to AMD's.
I think for a lot of people XeSS trumps FSR in image quality; combined with the better ray tracing performance, that would be the reason to take Intel over AMD.
 
I think for a lot of people XeSS trumps FSR in image quality; combined with the better ray tracing performance, that would be the reason to take Intel over AMD.
I'd take consistent performance any day of the week but I can see your reasoning.
 
I don't care about gaming, but I hope this card packs a punch as an encoder and an NPU while on a power diet.
 
I like the pricing, the VRAM capacity at that pricing, the extra features and the RT and AI performance improvements. If I was still with my RX 580, I would be definitely considering jumping on Intel.
Let's hope that people who have been trained to avoid the AMD brand will give Intel a chance, because if Intel succeeds, the others will at least have to stop this joke of 8 GB of VRAM on cards that cost over $250.
 
If the performance is what I think it will be, it's an excellent price point, power draw is within reason. Really looking forward to the full review, the card looks sleek and clean.
 
Considering their claim that the 140V is 16% faster than the 890M when it turned out to be 10% slower, the B580 is probably going to match the RX 7600 (not the XT) and end up about 10% behind the 4060. For $250...
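
If you want to poke at that extrapolation yourself, here's a back-of-the-envelope sketch that applies the same "claimed vs. measured" haircut from the 140V launch to the B580's claim. The 16%, 10%, and "+10% over the 4060" figures are the ones from this thread; depending on how you apply the haircut you land roughly 10-15% behind a 4060, i.e., around RX 7600 territory.

```python
# Apply the Lunar Lake "claimed vs. measured" gap to the B580's claim.
# Numbers are from the post above; this is back-of-the-envelope, not data.
claimed_140v  = 1.16   # Intel: 140V is 16% faster than the 890M
measured_140v = 0.90   # reviews: about 10% slower instead
haircut = measured_140v / claimed_140v          # ~0.78

claimed_b580 = 1.10    # Intel: B580 is ~10% faster than the RTX 4060
adjusted     = claimed_b580 * haircut           # ~0.85 of a 4060

print(f"haircut factor: {haircut:.2f}")
print(f"adjusted B580 vs RTX 4060: {adjusted:.2f}")
```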
 
So just a single 8-pin for the reference card is reassuring. Seeing those dual 8-pin OEM cards suggested that power consumption was going to be bad. Instead, it's probably just some factory overclocking and RGB madness.
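
For context on why a single 8-pin is reassuring: per the PCIe/PEG specs, one 8-pin connector is rated at 150 W and the x16 slot at 75 W, so a single-8-pin board is effectively capped around 225 W unless it runs out of spec. A quick sketch of that arithmetic:

```python
# Worst-case board power implied by the connector config (PCIe spec ratings).
pcie_slot_w = 75    # x16 slot
eight_pin_w = 150   # per 8-pin PEG connector

reference = pcie_slot_w + 1 * eight_pin_w   # single 8-pin reference card
oem_dual  = pcie_slot_w + 2 * eight_pin_w   # dual 8-pin OEM boards

print(f"reference card ceiling: {reference} W")   # 225 W
print(f"dual 8-pin OEM ceiling: {oem_dual} W")    # 375 W
```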

Resizable BAR is a must? Guess I can't consider one of these without a full platform upgrade. Just hope it helps drag prices down across the board.
 
I think I hurt some feelings... so so so sorry. However, I have zero faith in Intel GPUs and their drivers. Do y'all really think Intel is seriously invested in this space? I don't think so. Looks to me like an attempt at a quick cash grab that backfired.

I guess we didn't learn from last time - specs mean nothing.
 
Looks decent, both design and spec-wise. If the drivers are on point Day 1 (and I do mean Day 1) it might be an interesting option for the most popular market segment. Obviously, performance testing will show what’s what, but the Intel slides (grain of salt and all) look promising.
‘Course, the elephant in the room is still that Intel is potentially a gen behind. The fact of the matter is that BM is built to compete with cards that are months away from EOL. We’ll see how that pans out.
 
Here's an important slide: (B580 available Dec 13. B570 available Jan 16)

I really hope I'm wrong, but at those prices I don't expect the B580 to be better than the A770 at native settings (without FG or upscaling).
 
Are any OEMs like Dell, HP, or Lenovo selling Intel cards as part of their prebuilts? I'd figure the main revenue here would come from bigger bundling deals with those companies.
 
Fortnite is on Unreal Engine 5.5 now! I've got a 4090 and it often dips below 60 fps on max video settings at 1440p. It fluctuates between 40 and 65 fps, with huge dips to 5 fps in areas that haven't loaded before or when there's too much action. (No DLSS, FSR, motion blur, or frame generation used.)
Some might say ... Unreal 5.6 even (somehow)
 

Attachment: UE 5.6.jpg
I like the "50% more hierarchical Z/Z/Stencil cache, Earlier Hierarchical Z culling for small primitives" part = less time spent drawing what will never be seen on-screen (rough idea sketched below).
RTX 4060 performance for a normal price for a change.
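
For anyone curious what hierarchical Z culling actually buys you, here's a toy sketch of the idea: the depth buffer is reduced to coarse tiles that each store the farthest depth already drawn, and any primitive whose nearest point is still behind that value can be rejected before a single pixel is shaded. This is a conceptual illustration only, not how Intel's hardware is actually wired.

```python
# Toy hierarchical-Z (Hi-Z) reject test: each coarse tile stores the farthest
# depth already drawn in it; any primitive whose *nearest* depth is behind
# that value is fully hidden and can be skipped before rasterization.
TILE = 8  # 8x8-pixel tiles in this toy example

def build_hiz(depth, width, height):
    """Reduce a full-resolution depth buffer to per-tile max depth."""
    tiles_x, tiles_y = width // TILE, height // TILE
    hiz = [[0.0] * tiles_x for _ in range(tiles_y)]
    for ty in range(tiles_y):
        for tx in range(tiles_x):
            hiz[ty][tx] = max(
                depth[y][x]
                for y in range(ty * TILE, (ty + 1) * TILE)
                for x in range(tx * TILE, (tx + 1) * TILE)
            )
    return hiz

def hiz_culled(bbox, nearest_z, hiz):
    """True if the primitive's bounding box is hidden in every tile it touches."""
    x0, y0, x1, y1 = bbox  # pixel-space bounds
    for ty in range(y0 // TILE, y1 // TILE + 1):
        for tx in range(x0 // TILE, x1 // TILE + 1):
            if nearest_z < hiz[ty][tx]:   # smaller z = closer: might be visible
                return False
    return True                            # behind everything already drawn

# Example: a 64x64 buffer with a full-screen occluder at depth 0.3 already drawn.
W = H = 64
hiz = build_hiz([[0.3] * W for _ in range(H)], W, H)
print(hiz_culled((10, 10, 20, 20), nearest_z=0.7, hiz=hiz))  # True: skip it
print(hiz_culled((10, 10, 20, 20), nearest_z=0.1, hiz=hiz))  # False: draw it
```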
 
Do these have the precision capabilities to run Folding@Home yet?
 
I'm looking for two things in Intel's newest dGPUs:

- consistent driver/software support
- much lower idle power

If they can settle those two concerns, then I'll consider Battlemage at least a limited success. Value doesn't look especially thrilling, but it also doesn't look bad. The B700 models will be more interesting in that area, IMO. Intel wisely chose to push out their lower end cards first, in time for the Christmas season, and perhaps also in recognition of the fact that next-gen product stacks from Nvidia (and possibly AMD) will be released from the top down, leaving Intel a niche below.

Intel does have advantageous features relative to AMD. XeSS is generally considered superior to FSR (though tbh I find complaints about the latter overblown), and their video encoder is excellent. RT is another area of relative strength, but it's also basically irrelevant at this performance tier. None of this is worth spit if the drivers regularly break, though, or if the card sits idle at 40+ watts.
 
The hardware is OK-ish, but they seriously need to pick up the pace on the software side. XeSS is still miles behind DLSS Quality, both in image quality and FPS.
Also, they should bring frame generation to their products ASAP, otherwise the RTX 4060 will still be the obvious winner and the obvious buy.
 
Do these have the precision capabilities to run Folding@Home yet?
No FP64 mentioned on the Architecture page, so pretty sure not.
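
If someone gets a card in hand, one way to settle it (assuming pyopencl and a working Intel OpenCL runtime, neither of which this thread confirms for Battlemage) is to check whether the device advertises the standard cl_khr_fp64 extension:

```python
# Quick FP64 capability check via OpenCL. cl_khr_fp64 is the standard
# double-precision extension; if it's absent, FP64 kernels won't run natively.
import pyopencl as cl

for platform in cl.get_platforms():
    for device in platform.get_devices(device_type=cl.device_type.GPU):
        has_fp64 = "cl_khr_fp64" in device.extensions
        print(f"{device.name}: FP64 {'supported' if has_fp64 else 'not supported'}")
```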

I think I hurt some feelings... so so so sorry. However, I have zero faith in Intel GPUs and their drivers. Do y'all really think Intel is seriously invested in this space? I don't think so. Looks to me like an attempt at a quick cash grab that backfired.

I guess we didn't learn from last time - specs mean nothing.

Popping in simply to say "5090?" is actively detrimental to constructive conversation.
 