
Intel B580 paper launch analysis - Moore's Law is Dead

Some good points in this video, including one that I suspected: losing money per GPU.

 
Didn't even give him my view.



I'll let him know when it gets here.
 
including one that I suspected: losing money per GPU.
The A770 already used a comparatively big chip on what was then the newest node (6nm), with very weak performance that then had to be sold cheaply, so I'm not surprised this one isn't making them much (or any) money either. The B580 is ~300 mm² on the newest node that is actually used for GPUs, so it's relatively big and expensive, and it still only competes with Nvidia parts half its size. There go your margins, into the bin. AMD also uses a smaller chip on an older node and is competitive with this. So why do nearly all people here hype this GPU? It's one or two generations behind the competing parts and not really amazing aside from the RT performance. In the US it's okayish price-wise, but you lose Nvidia's features, while in Europe it's so expensive it competes with the far better RX 7700 XT. RIP.
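To put a rough number on that margin squeeze, here is a back-of-the-envelope dies-per-wafer and cost-per-die sketch. The die areas, wafer price, and defect density below are illustrative assumptions, not Intel's or Nvidia's actual figures.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic gross dies-per-wafer approximation (ignores scribe lanes and edge exclusion)."""
    radius = wafer_diameter_mm / 2
    return (math.pi * radius**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, wafer_price_usd, defect_density_per_cm2=0.1):
    """Silicon cost per yielding die, using a simple Poisson yield model."""
    yield_fraction = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)
    return wafer_price_usd / (dies_per_wafer(die_area_mm2) * yield_fraction)

# Assumed inputs: a ~300 mm^2 die vs a competitor die roughly half that size,
# on a wafer assumed to cost $17,000 (an illustrative 5nm-class figure, not a quote).
for label, area in [("~300 mm^2 die", 300), ("~160 mm^2 die", 160)]:
    print(f"{label}: ~{dies_per_wafer(area):.0f} gross dies, "
          f"~${cost_per_good_die(area, 17_000):.0f} per good die")
```

Under those assumptions the bigger die works out to roughly double the silicon cost per good unit, before memory, board, and packaging costs, which is exactly the margin problem being described.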
 
The B580 is a small, ugly chip: ultra-wide shape, mounted asymmetrically on the package, and the node isn't bleeding edge.
 
So why do nearly all people here hype this GPU?

People are mainly happy to see a third player in the GPU market. It makes for better competition and in theory should also drive prices down for the consumer. Battlemage is a step forward for Intel, and hopefully they can continue to get better until they can release proper high-end cards.
 
People are mainly happy to see a third player in the GPU market. It makes for better competition and in theory should also drive prices down for the consumer. Battlemage is a step forward for Intel, and hopefully they can continue to get better until they can release proper high-end cards.
Yes, and I wouldn't buy it anywhere in the world right now. In the US it firmly competes with the 4060, which has better features and comparable performance and is efficient, which this isn't. And in Europe it competes with the far better RX 7700 XT due to terrible pricing. It's dead weight, hyped, and two years too late. Soon it gets eclipsed entirely by the newest generation.
 
It is almost as if someone releasing a compelling product at a given price point leads to the product being sold out...

The B580 is a small, ugly chip: ultra-wide shape, mounted asymmetrically on the package, and the node isn't bleeding edge.
How often do you look at the bare die of your GPU? :o
And it is a cheaper GPU on 5nm. High-end GPUs today are manufactured on 4nm, which is a variation of 5nm. The competing AMD part is actually still on 6nm, a full node behind.

People are mainly happy to see a third player in the GPU market. It makes for better competition and in theory should also drive prices down for the consumer. Battlemage is a step forward for Intel, and hopefully they can continue to get better until they can release proper high-end cards.
Adding to that, Intel seems to be targeting the low-to-mid range with aggressive price points: the entire segment that both Nvidia and AMD have effectively abandoned.
 
I can't bring myself to watch the nails-on-a-chalkboard, waste-of-oxygen Moore's Law is Dead, but from what I gather, locally we had reasonably healthy stock (with ETAs for more in the coming days); it just flew off the shelves at such sharp pricing, after gamers in this segment have been desperately waiting for price-to-performance improvements and an above-average VRAM buffer.
 
I'll kick out my 4090 and use the B580 as my daily driver, just because it looks like a fun project and I am excited to use something new.
 
Affect what? The 4060 is a 4050 in disguise and almost useless at this point. Besides, $50 cheaper for 50% more power draw is a bitter tradeoff.

The B580 is a small, ugly chip: ultra-wide shape, mounted asymmetrically on the package, and the node isn't bleeding edge.

The ultra-wide chip does fit more dies on a wafer than a square one, and the off-center mounting is not a concern, but the N3B node would have sweetened the deal for me; it's also used by Arrow Lake and Lunar Lake, which are not selling particularly well. They might as well scrap those and use the capacity for GPUs. Personally I'm not buying anything until N2. Holding off until they stop beating around the bush.
 
400 euros; not sure what they are thinking. For that price you can get a new RX 7700 XT or a new RX 6750 XT, with better performance and fewer issues.

AMD just takes $50 off Nvidia's prices; Intel is even worse.
 
People are mainly happy to see a third player in the GPU market. It makes for better competition and in theory should also drive prices down for the consumer. Battlemage is a step forward for Intel, and hopefully they can continue to get better until they can release proper high-end cards.


The idea that they are selling this card at cost doesn't inspire me to believe they will bring down prices in any meaningful way; in fact, Intel using up node capacity may be inflating the prices we see from the other two.
 
Personally I'm not buying anything until N2.
GPUs are not likely to be on N2 but rather on N2P. It's on the roadmap for 2026, and with mass-production delays, cards using it may slip into late 2026 or 2027.
 
The idea that they are selling this card at cost doesn't inspire me to believe they will bring down prices in any meaningful way; in fact, Intel using up node capacity may be inflating the prices we see from the other two.
Yeah, that won't happen; it didn't happen last time either. It's just something else to buy and that's it: some hype, and then back to 0% market share. Intel would have needed to be actually faster AND have lower prices to compel the other two companies to lower theirs, and that is not what is happening. They offer roughly the same performance for $50 less (only in the US; in Europe it's actually super expensive and competing with faster products), with zero mind share and worse features than Nvidia, so people will simply still buy Nvidia. And why bother? The 4060 is way more efficient on top; it's simply the better product. AMD is also lost in the mid range currently, no difference. AMD's more compelling products are in the upper mid range and semi-high end.
 
@AcE, I understand your point about price in the EU; it makes sense that the price/performance is different there, and if Intel hasn't found a way around VAT price increases while retaining margin (the reason everything is more expensive when shipped to the EU), then they're not going to compete in the EU. That's a valid point and I'm not going to argue with it. However, your other points contradict themselves. The 4060 is not a better product if it doesn't outperform the competition, and in this case it does not. Efficiency is not why ~90%+ of people buy graphics cards. They buy the best-performing card they can afford. There are always a few outliers, but every time there's competition, the only people who argue for efficiency are the ones with the "more efficient" and often lower-performing card. This was true when Nvidia had the power-hungry card and AMD fans said "but AMD is more efficient even though they're not faster", and it was said when Intel was more efficient than AMD. It is said the other way now that AMD CPUs and Nvidia GPUs are both the most efficient and best performing, but what drives actual purchases is the best-performing part of the equation... every time. That's why even the power-hungry Nvidia/Intel parts were bought over their competitors when they performed best.

Finally, you admit that AMD and Nvidia have abandoned this price point and don't care about it, while saying Intel can't compete there? That's a contradiction in itself.

It definitely looks (from the die size) like Intel is not making anywhere near as much margin as AMD/Nvidia, but they have decided they can sell at this price to gain some market share. It may not help their finances much for now, but it is a good forward-looking move and will help everyone in the long run. I buy high-end cards, so this isn't even for me, but if you think about the crowd that hasn't bought anything since the 1650/1660-series cards because of the price increases, here's a card selling for roughly the same amount (in the US) that blows them away. It's a compelling product for lots of people, even if it doesn't work for you.
 
Looking at how the Intel card was put together, I doubt that it cost them more than 100 bucks to make. Not sure if they are actually giving them away..
 
So VAT is what is making Intel products look like lemons in Europe, but that doesn't apply to AMD? That makes no sense at all.
 
I did a quick check on Amazon US and will assume the RTX 4060 is USD 300...

At current pricing (assuming the RTX 4060 at ~USD 300 and the Arc B580 at ~USD 250, with no scalping, supply issues, or price increases), the B580 is faster than, or at least trades blows with, the 4060, is significantly cheaper, and the compromises (efficiency, much milder potential driver issues) are good enough. And say whatever you will about power efficiency: a significant portion of gamers is going to say "efficiency my ass", and in terms of total cost of ownership it does better than a 4060 at USD 300 unless an energy crisis happens. I don't see why the B580 won't sell like hot cakes, at least in the US.
It's super sad that the B580 is not cheap everywhere, though.
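For that total-cost-of-ownership point, a quick break-even sketch; the power-draw gap, electricity price, and street prices below are assumptions for illustration, not measured figures:

```python
# Rough break-even: how many gaming hours until the 4060's lower power draw
# cancels out the B580's assumed $50 lower purchase price.
# All figures below are assumptions, not measurements.
price_gap_usd = 300 - 250        # assumed 4060 vs B580 street prices
power_gap_watts = 190 - 115      # assumed difference in gaming power draw
electricity_usd_per_kwh = 0.15   # assumed electricity price

extra_cost_per_hour = power_gap_watts / 1000 * electricity_usd_per_kwh
break_even_hours = price_gap_usd / extra_cost_per_hour
print(f"Break-even after roughly {break_even_hours:,.0f} hours of gaming")
# ~4,400 hours; at ~20 hours/week that is over four years before the
# efficiency advantage eats the price difference.
```

Under those assumptions the cheaper card stays ahead on total cost unless electricity gets much more expensive or usage is very heavy.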

I would love a "back in the day, USD 250 = actual midrange product" argument, but given the current reality of the market, I don't see what the bitching is all about yet.

The B580 is a genetic freak in the USD 250 segment. Now, what will go down drastically? If 4060 / 7600 XT prices go down, that's still a very net-positive result. I hope Intel's stock price doesn't.
 
I can't bring myself to watch the nails-on-a-chalkboard, waste-of-oxygen Moore's Law is Dead, but from what I gather, locally we had reasonably healthy stock (with ETAs for more in the coming days); it just flew off the shelves at such sharp pricing, after gamers in this segment have been desperately waiting for price-to-performance improvements and an above-average VRAM buffer.
I don't listen to his regular "news" episodes with his brother, which are just glorified rumor-brainstorming sessions, but that is only half of his content. His interview podcasts are actually quite good, with knowledgeable guests. I would recommend those; his content alternates between the "news" episodes and the interview episodes.
 
I don't listen to his regular "news" episodes with his brother, which are just glorified rumor-brainstorming sessions, but that is only half of his content. His interview podcasts are actually quite good, with knowledgeable guests. I would recommend those; his content alternates between the "news" episodes and the interview episodes.
Agreed.

One thing that I find hilarious is that both Intel GPU generations had horrible drivers, yet when anyone mentions bad drivers, everyone automatically says "AMD". :D

That said, January 2025 will be interesting, assuming we do end up with a price war.

The low-to-mid tier really needs an infusion of better GPUs, and we might get that.

Looking at how the Intel card was put together, I doubt that it cost them more than 100 bucks to make. Not sure if they are actually giving them away..
Which begs the question: isn't that price dumping, which is illegal (I think)?

Or how long can they keep that up, given their current situation?

Don't get me wrong: if their "sacrifice" ends up resetting the GPU market's insane pricing, it would be a noble one for our benefit.
 
The 4060 is not a better product if it doesn't outperform the competition, and in this case it does not.
It does. DLSS is way better than XeSS and supported in way more games, the card is also way more efficient, and it has generally better features. Intel is a noob when it comes to GPUs; you pretend as if Intel has feature parity, but they are far from it. Their software is barebones, their drivers are inferior, and everything reeks of a "noob company". TBH, Nvidia is just 10x better than Intel; it's nice for nerds who want to experiment, and that's it. Average people will never buy Intel GPUs; they have never heard of "Arc" and will only buy GeForce, maybe Radeon.
Finally, you admit that AMD and Nvidia have abandoned this price point and don't care about it, while saying Intel can't compete there? That's a contradiction in itself.
Then reread what I wrote; nowhere did I say Nvidia abandoned anything. Jensen is well known for not giving the competition even one inch to move; he's an extremist in this way. Maybe stop interpreting my words and clearly read what I say.
 
It is almost as if someone releasing a compelling product at a given price point leads to the product being sold out
Let's not kid ourselves, everything sells out when the supply is limited.
 
DLSS is way better than XeSS

Way better? Looking at the dozens of XeSS tests TPU has done, it is barely behind DLSS in performance uplift, with generally equal visuals at each quality level, minus the shadow detail in UE5 games.

Intel is a noob when it comes to GPUs

Intel has had at least one dedicated GPU development team since 1996. Their first visual processors came out in the 80s. They have had the Intel Gen team since ~2008. iGPUs are still GPUs.

TBH, Nvidia is just 10x better than Intel

This level of hyperbole makes me think that perhaps you are not arguing with facts. I would expect that if they were "10x better" they would not have such an obvious inadequacy in their product stack that ANY prospective customers would be excited to buy into the competition. Surely if NVIDIA were "10x better" they wouldn't even have people questioning their products at all when a competitor comes along. And according to the comments on every single B580 review and all the Reddit threads, people absolutely are questioning NVIDIA's products in this particular market segment.

Average people will never buy Intel GPUs; they have never heard of "Arc" and will only buy GeForce, maybe Radeon.

This statement was proven false before you even typed it...
 