
B580 tanks performance with low end CPUs

True. Though actually the 4060 was slightly cheaper than the 3060, if memory serves. I mean... less bandwidth... less memory... and more money... for the poorest people? Maybe Nvidia saw sense for a brief moment and realized they just couldn't charge more for that one. But I agree with the rest of what you said, even though I'm not totally understanding the scale of this CPU overhead problem yet.
IIRC, going back to the older TPU reviews, the 3060 12GB was a $330 MSRP card. So the 4060 8GB being $300 one could argue is actually a downgrade from the previous generation. I'm having trouble even finding the MSRP of the 6GB 3060.
You're not wrong, but still, the B580 offers very little to no improvement over the 7600 / 4060,
Technically has a performance advantage, and has 12GB VRAM and not 8GB
costs more
MSRP is $250, below both its competitors, and the majority of 4060s on newegg right now are $350+.
and isn't available.
Popular products usually have availability issues after launch; see also the 9800X3D, RTX 4090, AMD 7900 XTX, etc.
To me, this is a big nothing burger.
To me, Intel brought some much needed competition to the space, and did so while lowering the price, something people are often mad about. Yes, they have driver overhead issues now, but when AMD had the same issue from not having multithreaded drivers, people acted like it was nothing. Intel has the same issue, and suddenly it's the end of Intel's GPU run.
 
I just skimmed a short gameplay review, and my 8GB 3070Ti rocks the 580. That 580 was just sipping power, maybe they need a few more tiles..
 
I just skimmed a short gameplay review, and my 8GB 3070Ti rocks the 580. That 580 was just sipping power, maybe they need a few more tiles..
Here's hoping for the B770. Or better yet, a B790/B970 with 40 cores? One can hope...
 
I just skimmed a short gameplay review, and my 8GB 3070Ti rocks the 580. That 580 was just sipping power, maybe they need a few more tiles..
God I would hope so.
There's a large performance gap between the 3060 and 3070, and between the 4060 and 4070.

The deal is, the Arc B580 is entry level with a part number that would typically designate a higher tier. Compare it to the puny 4060: it's still slower, and I found only one place with the Arc B580 priced at $250, and that's my local Best Buy for a reference card. They are sold out. I have notify set :)

Nice thing is, I have benched a 4060 Ti, so I have plenty of synthetics to compare with :)
 
Yeah I had no idea.. I just surprised myself with some reviews to see what the chatter was about :D
 
Yeah I had no idea.. I just surprised myself with some reviews to see what the chatter was about :D
Hopefully there will be tests covering what I wrote in my previous reply.
 
In HUB's testing the 6-core 7600 is consistently faster than the 8-core 5700X3D, so nah, it's not MT performance related.

Would still be interesting to see how the B580 performs on Intel 12th/13th/14th gen CPUs though.
The 7600 is faster in MT than the 5700X3D though. Slightly, but it is.
 
My take is, this is a low-end GPU designed to make use of the power of a modern system?

Seems like cheating a bit... like how Windows uses an M.2 without DRAM.
 
Here's hoping for the B770. Or better yet, a B790/B970 with 40 cores? One can hope...
B770 definitely has a niche to fill. But more premium cards... not sure. People spending $700+ on a toy usually aren't exactly driven by value alone.
 
I'd like to see this tested with mid-range Intel chips (i5 13500-13600, etc.).
I'd like to see the full spectrum honestly.

I mean, let's say they do play better with LGA1700 CPUs, period; then perhaps the 12400F would be an excellent budget choice, and you could build a quite capable PC fairly cheap. Perhaps put some used parts in there, I don't know. Use DDR4 if you are really tight. Buy the cheapest case that comes with fans, and a B660 or B760 mobo, wherever you can find a deal. Point is, if these two play well together, you could make a very cheap, capable PC.

Then we could go to something more middle-ground if the person is on a budget now but maybe will have some money later and can then upgrade to any number of GPUs.

Then I just want to see the high end... well, kinda for personal reasons; I might buy one if it ever comes down to market price, to 1) help Intel in its time of need, and 2) have a backup GPU and a new toy to play around with.

But yeah, that's all if it goes well; in reality, it really could be either extreme or somewhere in the middle. Maybe it plays nice with all of them. Maybe it has a moderate-to-severe penalty across the board. I mean, I know my 14700KF is nowhere near the 9800X3D in gaming, that's no secret. But I want to know I can use all its power if I buy it. It's still data I want to know. I mean, Intel likely worked on these and tested these primarily on Intel chips.

So yeah, I would really like to know.

Anyway I'm sure we'll find out eventually....


IIRC, going back to the older TPU reviews, the 3060 12GB was a $330 MSRP card. So the 4060 8GB being $300 one could argue is actually a downgrade from the previous generation. I'm having trouble even finding the MSRP of the 6GB 3060.
Huh? There was no discrete 6GB 3060 that I'm aware of. The 3060 only had the 12GB version and the 8GB version, and the MSRP was the same for both. And the 8GB version didn't last long at all. Look up the 3060 anywhere and you'll find mostly the 12GB models.

A downgrade in VRAM, certainly; it's generally more powerful UNLESS you are VRAM limited, in which case you might find the 3060 doing better. FF16 is a game where this can happen. So many people with 8GB cards come to complain, yet people with 3060s generally say it runs well. It's hard to know exactly what settings they are using, but FF16 doesn't flush the buffer, so it does do RAM swaps; no doubt a lack of VRAM is going to slow a card down, and that game uses an insane amount of VRAM. I was playing at 4K high (not ultra) with DLSS on balanced (btw, DLSS doesn't affect VRAM in this game for some reason), and the highest number I saw was 15.5GB.
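For anyone trying to confirm whether their card is actually VRAM limited, you can poll `nvidia-smi` while the game runs. A minimal sketch, assuming an Nvidia card with `nvidia-smi` on the PATH (the function names are mine, not from any library):

```python
import subprocess

def parse_vram(csv_text: str) -> list[tuple[int, int]]:
    """Parse (used, total) MiB pairs from nvidia-smi CSV output."""
    return [
        tuple(int(field) for field in line.split(","))
        for line in csv_text.strip().splitlines()
    ]

def vram_usage_mib() -> list[tuple[int, int]]:
    """Return one (used MiB, total MiB) pair per installed GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_vram(out)
```

If used memory sits pinned at or near the total while frame times spike, the game is likely swapping assets over PCIe, which matches the behavior described above.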
 
My take is, this is a low-end GPU designed to make use of the power of a modern system?

Seems like cheating a bit... like how Windows uses an M.2 without DRAM.
I guess at this rate it will be treated as a failure to launch or a false start by people, unfortunately. Now we will see if the testing actually gets done, and after that maybe driver tweaks. I'm not so sure, with the internal fighting Intel is having, whether they are really even focusing on this matter. I mean, it appears it would be embarrassing to them as a corp; who knows, maybe the B700 series will be an uplift or a sigh of relief. It's a very high mountain to climb, considering that before Arc the last dGPU was the i740, which flopped, and after that they only focused on iGPUs as just a display adapter.
 
Hopefully there will be tests covering what I wrote in my previous reply.
A 14nm potato would underdog any modern GPU. Why go that far back? Intel is on weird sauces lately: the 285K, a similar type of flop, now Core Ultra, hardly even mentioned in this thread, let alone altogether.
 
A 14nm potato would underdog any modern GPU. Why go that far back? Intel is on weird sauces lately: the 285K, a similar type of flop, now Core Ultra, hardly even mentioned in this thread, let alone altogether.
Its like a dirty family secret that no one wants to acknowledge :D
 
Its like a dirty family secret that no one wants to acknowledge :D
Haha, seems like it, funnily enough.
The products aren't slow by any means though. They just don't seem to fit the "gaming" criteria very well anymore. The argument is "good for work" because of the e-cores, and I would consider that questionable at those wattage numbers.
 
I wonder why he didn't test a single Intel CPU @.@
He said he didn't have time as he wanted to sleep, but the Intel ones will be coming.
 
Not yet, maybe the third time will be the charm..

In reality, I would like to see them come out of left field and blindside the other players, shock them into submission type of thing.. maybe later in the decade when they master AI :rolleyes:
 
I don't think this card is worth the hassle. I mean, testing on Intel might reveal it's slightly less affected, but does anyone really expect a miracle?

I think Intel probably tested the card with their CPUs and not with AMD's. I expect that their stack is going to be substantially less affected.
 
I think Intel probably tested the card with their CPUs and not with AMD's. I expect that their stack is going to be substantially less affected.
The pricing makes the B580 look good in general. IF, big IF, you can find it at only $250. This GPU would be a fine entry-level card over buying a 4060/Ti.

That being said, I had no issues running Cyberpunk 2077 on a 14100F driving the 4070 Super. The only difference between that and a 14700K is frequency; the same experience would be had on a 12400F.

We would see the impact on 14nm, because its IPC is similar to AMD's 5000 series processors and lower, where all video cards' performance would tank.
 
I don't think this card is worth the hassle. I mean, testing on intel might reveal it's slightly less affected, but does anyone really expect a miracle ?
I'm not just curious.

He said he didn't have time as he wanted to sleep, but the Intel ones will be coming.
Waiting on it to be done asap.

I think Intel probably tested the card with their CPUs and not with AMD's. I expect that their stack is going to be substantially less affected.
It's why I want a test from 9th gen and the Ryzen equivalent by launch year, up to the latest parts.

Haha, seems like it, funnily enough.
The products aren't slow by any means though. They just don't seem to fit the "gaming" criteria very well anymore. The argument is "good for work" because of the e-cores, and I would consider that questionable at those wattage numbers.
Any way of setting affinity to use P-cores only?
 
I don't think this card is worth the hassle. I mean, testing on Intel might reveal it's slightly less affected, but does anyone really expect a miracle?
Performing as originally promised I don't think is outside the bounds of possibility. Is that what you consider a miracle? I mean, the guys work at Intel. They likely had more chances to test on Intel CPUs than AMD ones, and may have missed something on AMD. The two CPU architectures do have some weird, quirky differences about them...

But I fully admit I do not know so just hoping for the best.
 
Any way of setting affinity to use P-cores only?
Certainly. Turn off the e-cores in the BIOS; then the 285K is 8 P-core threads, not 16 threads. That's why it's trash for gaming. There's absolutely no reason to get Core Ultra for a gaming rig. AMD has way better offerings because their 8 cores have SMT and are all powerful cores. The 9800X3D is a pure gaming monster. For the price, for the wattage, it's just all-around way better. In reality, Intel offers nothing current to compete in the gaming market sector.
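Besides the BIOS route, Windows can also pin a single process to the P-cores with `start /affinity`, which takes a hexadecimal mask of logical CPUs. A sketch of building that mask, assuming the OS enumerates the P-core threads first (which is the case on Alder/Raptor Lake desktop chips; `game.exe` is a placeholder):

```python
def pcore_affinity_mask(num_pcores: int, smt: bool = True) -> int:
    """Bitmask covering the first P-core logical CPUs.

    Assumes Windows numbers the P-core hardware threads before the
    e-cores, so the P-cores occupy the lowest-numbered logical CPUs.
    """
    logical = num_pcores * (2 if smt else 1)
    return (1 << logical) - 1

# 14700K: 8 P-cores with Hyper-Threading -> logical CPUs 0-15
print(f"start /affinity {pcore_affinity_mask(8):X} game.exe")
# -> start /affinity FFFF game.exe

# 285K: 8 P-cores, no SMT -> logical CPUs 0-7
print(f"start /affinity {pcore_affinity_mask(8, smt=False):X} game.exe")
# -> start /affinity FF game.exe
```

The same mask can be applied to a running game from Task Manager's "Set affinity" dialog, which avoids rebooting into the BIOS just to test the difference.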
 
Certainly. Turn off the e-cores in the BIOS; then the 285K is 8 P-core threads, not 16 threads. That's why it's trash for gaming. There's absolutely no reason to get Core Ultra for a gaming rig. AMD has way better offerings because their 8 cores have SMT and are all powerful cores. The 9800X3D is a pure gaming monster. For the price, for the wattage, it's just all-around way better. In reality, Intel offers nothing current to compete in the gaming market sector.
Yet there are users from 10th to 14th gen still running them, so a test for them is a must IMHO.
 
Yet there are users from 10th to 14th gen still running them, so a test for them is a must IMHO.
I can tell you an 11700K and lower is going to have vastly poorer results than anything 12th gen and up. Otherwise I'd have stuck with my 8700K at 5.2 GHz, which is vastly slower for 3D benchmarking than 14th gen and Core Ultra. I would wager to say up to, and possibly over, a 20% loss in performance, based on past benchmarking experiences with 14nm Intel chips.
 