
Intel Arc "Alchemist" PCB Closeup Shows Up on Intel Graphics Discord

Not possible.

The Ethereum DAG size is currently 4.7 GB. It won't reach 6 GB until 2024.
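For rough context, here is a back-of-envelope sketch of that timeline (treating GB loosely as GiB). The constants used — roughly 8 MiB of DAG growth per 30,000-block epoch and a ~13.5 s average block time — are the commonly cited Ethash parameters, assumed here for illustration rather than taken from this thread:

```python
# Back-of-envelope Ethash DAG growth estimate. The constants are assumed
# approximations of the Ethash parameters, used only for illustration.

GiB = 1024 ** 3
MiB = 1024 ** 2

DAG_GROWTH_PER_EPOCH = 8 * MiB   # DAG grows ~8 MiB each epoch
BLOCKS_PER_EPOCH = 30_000        # one epoch every 30,000 blocks
BLOCK_TIME_S = 13.5              # assumed average block time (seconds)

def days_until_dag_reaches(target_bytes: int, current_bytes: int) -> float:
    """Estimate days until the DAG grows from current_bytes to target_bytes."""
    remaining = max(0, target_bytes - current_bytes)
    epochs = -(-remaining // DAG_GROWTH_PER_EPOCH)   # ceiling division
    return epochs * BLOCKS_PER_EPOCH * BLOCK_TIME_S / 86_400

current_dag = int(4.7 * GiB)     # ~4.7 GB DAG cited above
target_dag = 6 * GiB             # the point where a 6 GB card runs out of room

days = days_until_dag_reaches(target_dag, current_dag)
print(f"~{days:.0f} days (~{days / 365:.1f} years) until a 6 GiB DAG")
# prints roughly 780 days, i.e. a bit over two years -> around 2024
```

That lands a little over two years out, which is where the 2024 figure comes from.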

The 6500 XT is already a faster card (for gaming) than the 1060 6GB. And you think it's terrible.

A hypothetical 6GB 6500 XT card would earn around $550, whereas the 4GB version is useless for mining, because 4GB is too small to hold the DAG.

There is absolutely zero chance of a 6GB card for $150-$200.

You can have a hopefully slightly better 4GB card, or you can have something costing $400++. Those are your choices.
To me, it is a bad card as well. Performance-wise, if the system you put it in has a PCI-E 4.0 slot, you will get the full performance of the card. But even if you get 100% of what you paid for, you are comparing it against a GPU that is two generations back, launched 6+ years ago, and likely receiving no driver optimisation by now. The improvement in performance is "meh" to say the least, and the 2GB difference in VRAM will also hurt performance at higher graphical settings. The story is worse when you compare both cards on a system that only has PCI-E 3.0.
Objectively, the performance is decent if certain conditions are met. But this is a card I will only recommend if someone must buy a GPU because their existing GPU is dead or dying. Users of the GTX 1060 6GB or RX 570/580 should just keep using their card until the opportunity presents itself to buy a better GPU at their target price, or until their GPU bites the dust. From an attractiveness standpoint, the RX 6500 series is a fail.
 
Intel needed to release this last year already. Soon Nvidia and AMD will do a refresh, and later this year they will release new cards.
The only thing that might help Intel then is good pricing, and Intel has already indicated it won't be cheap, so this will fail.
 
The 6500 XT is what I call a pandemic card of the mining era. It's kind of terrible, but it's good enough, scares off the miners, and can be produced easily in large volumes. The issue is that it is simply not usable at Ultra settings, which is why it suffers in many reviews: as soon as the 4 GB of VRAM is used up, its performance tanks hard. If you use it within its limits, it's good enough. The bigger issue beyond that is that you cannot use ReLive with it.
 
It serves one market well, and that's people building new Alder Lake PCs who don't want to spend $500+ on a GPU.

It's not at all a solution for people wanting to upgrade ageing cards; it's a card which goes in your new build and lets you play AAA games at acceptable quality levels.

I don't think it's exactly a question of whether it is an objectively good or bad product. Objectively it's a bad product, in that it should be faster, have more PCIe lanes, have video encoding, etc. But OTOH, it's better than the available alternatives, so if you need a GPU for your new PC it's the only choice in many cases (e.g., if you don't want to spend more than $250 and used is not an option for you).
 
I'm incredibly impressed that you're able to determine this from an image.


And that's where the unicorn comes in.

It's not terribly hard to tell the difference between a mechanical mock-up and an ES. It's clearly not a final product with the JTAG on it. It looks like they took off a backplate and then covered up some identifying chips. If it were just a thermal test unit, it would be wired up with thermistors. In its current configuration it has every indication of being an ES meant to test GPU functions, but I guess you have not worked through prototype hardware stages before. :)
 