
Intel Continues to Develop GPUs Beyond Arc Battlemage

Joined
May 10, 2023
Messages
286 (0.50/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Anyone predicting the end of DIY sounds like the he-said/she-said from the Windows 8.x era. :(

Thinking positively, I can't see DIY literally being dead; in fact, there was backlash over Surface RT. Any such negativity reminds me of the early Surface era. But we did get very close to losing DIY before there was a lot of push back against Surface RT and Windows RT.

On top of that, we aren't being forced to get a laptop, despite Microsoft's advertisement showing off laptops on Windows 10 users' screens.

I still have high hopes for Arc! Especially if GeForces are going to be too expensive for what they offer. I'd rather have an Arc A770 than a GeForce RTX 4080 anyway, based on how expensive they are!

I feel like Nvidia is pushing it, even with the price of the RTX 4070. :(
I don't think the DIY market will literally die (sales are even still increasing!), but rather shrink in the bigger picture and become more niche as time goes on.

Desktops alone are already a minority of shipments, and have a much lower growth rate compared to laptops (source).
Given how DIY is an even smaller fraction of the overall desktop market, I think it's fair to say that an off-the-shelf mini-PC that's powerful enough for most users will likely achieve a higher market share compared to DIY setups. Mac minis have already shown that this is doable for many people, maybe an equivalent with Strix Halo can prove the same for the gaming crowd.
Another thing is that a gaming setup has only become more and more expensive as time goes on, making it harder for people to keep up with the hobby.

Of course that's just my opinion and I can be totally wrong. If I were able to predict the future with certainty I'd have won the lottery already :p

Really? I am talking about a possible future, you translate it as if I am talking about what is happening today and you don't see the difference?
OK.....
No, because I don't see how a 5090 would be different for ARM or x86.
I am talking about a full attack on the consumer market, not a few boards for developers.
How would any of that be an "attack"? If anything, having more options in the market would be a good thing.
I am talking about the top discrete GPUs; for example, an RTX 5090 performing better on the ARM platform, getting released 3 months earlier, and being 10% cheaper than the same model for the x86 platform. Players would start building Nvidia-powered, ARM-based systems for gaming instead of x86 ones with Intel or AMD CPUs.
That's the point I'm trying to understand. I don't see any way a "5090 ARM-exclusive edition" would exist, unless you're talking about fully integrated devices, which would fall into my mini-PC point. If you can elaborate on how such a thing would exist in a discrete manner, I guess it'd be better for both of us.
Strix Halo will be a nice example of a device that has no proper equivalent desktop-wise, but it's a SoC, just like consoles.
I can't see mini-PCs taking over, but I could see laptops, consoles, and pre-built systems designed by Nvidia, with minimal upgradability, becoming the preferred options for the majority of gamers. Those pre-built systems could come not just as mini-PCs but also in typical desktop form factors, offering the chance for multiple storage solutions and probably a couple of PCIe slots for upgrades, also targeting semi-professionals.
Fair enough. Technically, such a pre-built system wouldn't be much different from a mini-PC, but the somewhat increased size for some expansion (like SATA, extra M.2, and 1 or 2 PCIe slots) could still make it way smaller than your regular DIY setup. Maybe more similar to those ITX builds (without any of the upgradeability, apart from some peripherals).

I can also totally see Nvidia downscaling their DGX stations for a more "affordable" workstation with proper PCIe slots and whatnot, but that would be hella expensive still and outside of the range of your regular desktop buyer.

Gaming is what teenagers do; compute is what many professionals want. If x86 loses the advantage of being the de facto option for high-end gaming and high-performance software, it will start losing the battle against ARM in the PC market.
Compute has no hard requirements for x86. As I said, Nvidia on ARM is already a thing in the enterprise. Your only point seems to be gaming, for which the market, albeit significant, is a really small portion of the overall PC market share.
What is the biggest failure of the Qualcomm-based Windows on ARM laptops? They are terrible at gaming. That's one of the reasons they are not flying off the shelves.
I totally disagree with this point, and Apple products are a great counter argument to that.
Qualcomm products failed because software integration is still bad, performance was unremarkable against the competition, and pricing made no sense compared to any other option.
People don't buy ultralight laptops for gaming anyway, at least I never even attempted to open a single game in my LG Gram.
 
Joined
Oct 15, 2011
Messages
2,444 (0.51/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sapphire Nitro+ Radeon RX 6750 XT
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
I don't think the DIY market will literally die (sales are even still increasing!), but rather shrink in the bigger picture and become more niche as time goes on.

Desktops alone are already a minority of shipments, and have a much lower growth rate compared to laptops (source).
Given how DIY is an even smaller fraction of the overall desktop market, I think it's fair to say that an off-the-shelf mini-PC that's powerful enough for most users will likely achieve a higher market share compared to DIY setups. Mac minis have already shown that this is doable for many people, maybe an equivalent with Strix Halo can prove the same for the gaming crowd.
As long as laptops aren't always designed on the assumption that we are all computer-illiterate, like the somebody who literally gets their bank account hacked every single year for using a dictionary password in the 2020s!

Laptops that throttle abnormally for no specified reason = scamtops.

Even back in the "CPU malaise era" (2012-2016, mostly), such a thing would have been insane! IIRC, I ranted about that in 2015, after hearing claims about the death of the PC.
 
Joined
Jul 5, 2013
Messages
27,978 (6.70/day)
So what do we define as 'success' in the case of Battlemage I wonder. Some % of market share? Some number of sales? General quality of product being in order, including drivers and legacy gaming on it? Being able to hit the performance of the competition? All of the above?

I'm interested in what people think :) I don't know myself, honestly. I wonder if Intel knows.
For a first go in the GPU market, it's been a good effort and Intel isn't losing money on the project. If Battlemage can raise the performance bar and have competitive pricing, it'll do well.
 
Joined
Sep 1, 2020
Messages
2,363 (1.52/day)
Location
Bulgaria
For the purpose of this discussion it also matters that PCIe works over a cable. External GPUs may become more common in the long term, and they may take the shape of a mini PC, meant to be stacked together with one.
More boxes on the desk? Thanks, but no thanks. Despite the known advantages, there are also some drawbacks. The price also goes up further, because of the extra enclosure, a second power supply, and premium interconnects.
 
Joined
Jun 11, 2019
Messages
629 (0.31/day)
Location
Moscow, Russia
Processor Intel 12600K
Motherboard Gigabyte Z690 Gaming X
Cooling CPU: Noctua NH-D15S; Case: 2xNoctua NF-A14, 1xNF-S12A.
Memory Ballistix Sport LT DDR4 @3600CL16 2*16GB
Video Card(s) Palit RTX 4080
Storage Samsung 970 Pro 512GB + Crucial MX500 500gb + WD Red 6TB
Display(s) Dell S2721qs
Case Phanteks P300A Mesh
Audio Device(s) Behringer UMC204HD
Power Supply Fractal Design Ion+ 560W
Mouse Glorious Model D-
Joined
May 11, 2018
Messages
1,266 (0.53/day)
I think there is a large range between cancelling a project and continuing it with clear vision and goals, or just continuing while spending minimal amounts. Intel could be splurging large amounts of cash on the project, or leave it on life support, update it just a bit and call it a new generation (kind of what AMD is doing right now), and wait for better times to compete in the discrete GPU market.
 
Joined
Oct 3, 2019
Messages
155 (0.08/day)
Processor Ryzen 3600
Motherboard MSI X470 Gaming Plus Max
Cooling stock crap AMD wraith cooler
Memory Corsair Vengeance RGB Pro 16GB DDR4-3200MHz
Video Card(s) Sapphire Nitro RX580 8GBs
Storage Adata Gammix S11 Pro 1TB nvme
Case Corsair Caribide Air 540
The i740, the ill-fated Larrabee, Arc, Battlemage... 5th time's a charm, hopefully?
 
Joined
Sep 6, 2013
Messages
3,360 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
No, because I don't see how a 5090 would be different for ARM or x86.
Look at Intel's Arc line and how ReBAR changes performance. Now imagine, on a future Nvidia platform, having the option to enable something that offers 10-20% extra performance, with that tech being proprietary and available only on Nvidia's platform. Nvidia does this all the time; it's the way they have done business for the last 20+ years.

How would any of that be an "attack"? If anything, having more options in the market would be a good thing.
Really? You don't understand the way I use the word "attack" here?
That's the point I'm trying to understand. I don't see any way how a "5090 ARM-Exclusive edition" would exist, unless you're talking about fully integrated devices, which would fall into my mini-PC point. If you can elaborate how such thing would exist in a discrete manner, I guess it'd be better for both of us.
Strix Halo will be a nice example of a device that has no proper equivalent desktop-wise, but it's a SoC, just like consoles.
I gave you an example from Intel. Nvidia is a company that "invents" proprietary techs all the time as a way to give its products an advantage and, at the same time, create disadvantages for competing products, but you still can't imagine that this could happen. ASUS last year presented a way to deliver power to a graphics card through an extra connector that connects to a special motherboard, avoiding the use of PCIe power cables. Future (10+ years away) high-end graphics cards from Nvidia could have a connector as an extension of the PCIe slot, like what ASUS is doing, that somehow accelerates the graphics card. Or Nvidia could limit its cards to PCIe x8 on x86 platforms and use the extra 8 lanes for something proprietary that promises extra performance on their ARM platform. So the same card model would be compatible with both platforms, but offer higher performance on the Nvidia platform only. Again, think ReBAR.
I can also totally see Nvidia downscaling their DGX stations for a more "affordable" workstation with proper PCIe slots and whatnot, but that would be hella expensive still and outside of the range of your regular desktop buyer.
Macs are outside the range of the regular buyer, but they enjoy a healthy market share. Now imagine the best graphics/compute options being available, or performing better, on an Nvidia platform. I can see a significant market share moving to the Nvidia option.
Compute has no hard requirements for x86. As I said, Nvidia on ARM is already a thing in the enterprise. Your only point seems to be gaming, for which the market, albeit significant, is a really small portion of the overall PC market share.
So, compute could turn to Nvidia more easily. As for gaming, it's not a small part. While gamers are not a majority, gaming is still a major function of any PC, even an office PC, and someone choosing a PC or laptop could have gaming as an important parameter too. AMD's APUs and the efforts Intel is making to improve its iGPUs prove that gaming is an important parameter even for ultraportables, even for laptops that mostly target office applications and internet browsing.
I totally disagree with this point, and Apple products are a great counter argument to that.
Qualcomm products failed because software integration is still bad, performance was insignificant against the competition, and pricing made no sense compared to any other option.
People don't buy ultralight laptops for gaming anyway, at least I never even attempted to open a single game in my LG Gram.
Apple products target a very specific group, offering a different ecosystem and services. Not to mention a shiny logo that screams "premium" and "rich". Qualcomm is trying to compete with PCs at everything PCs do, and while it offers better battery life, it fails mostly at gaming. As I said above, AMD's APUs and the latest Intel CPUs with the new iGPUs prove that even ultralight laptops must be able to game today. You and I are in a different category, because we have gaming PCs to use. Why game on an LG Gram when your system specs mention not one but two 3090s? Others buy laptops, and they, or their kids, expect to also game on them. So a discrete GPU, or a strong enough iGPU, is a necessity.
 
Joined
Jan 24, 2021
Messages
84 (0.06/day)
System Name Gaming Workstation
Processor AMD Ryzen 5900X @ 4.0 GHz
Motherboard AsRock X570 Steel Legend
Cooling DeepCool CASTLE 280EX
Memory G.Skill Trident Z - 32GB [2 x 16GB] @ 3600MHz C16
Video Card(s) Zotac Gaming GeForce RTX 4070Ti Trinity 12GB
Storage Acer GM7000 4TB NVMe [Boot] + MSI M480 1TB NVMe [Games] + 2 x WD 16TB HDD @ Raid1 [Data + Storage]
Display(s) ASUS ROG Swift PG278Q @ 1440p / 120Hz
Case Deepcool MATREXX 70 3F
Audio Device(s) Realtek Onboard Sound with Earphones
Power Supply Cooler Master 850W
Mouse Logitech G402
Keyboard Logitech Internet Keyboard Y-SS60 [it's really old, but my fingers reject any other keyboard]
VR HMD -
Software Windows 10 Pro 64-bit
It would be pretty ironic (for lack of a more appropriate word) if Intel ends up surviving because of its GPUs, while the management resets their whole CPU chain.

Who woulda thunk? :/
 
Joined
Jan 24, 2019
Messages
11 (0.01/day)
System Name The-Ratfink
Processor AMD Ryzen 9 7950X3D
Motherboard MSI B650 Tomahawk Wifi
Cooling Thermalright Phantom Spirit 120SE
Memory 64GB DDR5-6000 CL32
Video Card(s) MSI Nvidia RTX 2070 Duke OC
Storage WD SN580 500GB + WD SN580 2TB
Display(s) HP Z27s x2
Case Anidees AI Crystal Cube (Side Mounted)
Power Supply Corsair RM750e
Mouse Logitech M570
Keyboard Das Professional 4
Most consumers dont want to take that risk, especially if $250 is a big deal for them
Missed this piece. I assure you that in some cases money isn't the issue. The reality is that these people don't value a GPU above their specific price point. For years, even my friends have been confused as to why I won't hesitate to spend $600 on a CPU, but will hesitate to spend more than $500 on a GPU. The answer is simple: I use my CPU for my "job" and a GPU is a "toy". This shows in other parts of my life too, where I don't buy game consoles, fancy handhelds, RGB, or Star Trek figurines (or other collectibles; insert your hobby here), because I personally don't value those things as highly as others do. Some people may not even need more than that 1080p card (or in my case, I don't need a card that can do more than 4K60, but I absolutely do care whether my code takes 30 minutes to compile vs 10).
 
Joined
May 10, 2023
Messages
286 (0.50/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Look at Intel's ARC line and how REBAR changes performance.
But ReBAR is a standard PCIe feature; it's part of the PCIe spec. Heck, miners/HPC folks were using it way before AMD made it a thing in games with SAM.
So far you have only used examples of standard features.
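For what it's worth, on Linux the capability shows up as a standard entry in `lspci -vv` output, labelled "Physical Resizable BAR". A minimal sketch of checking for it by parsing that text (the `sample` string below is illustrative, shaped like real output but not captured from an actual system):

```python
import re

def rebar_sizes(lspci_vv_output: str) -> list[str]:
    """Return BAR sizes listed under the 'Physical Resizable BAR'
    capability in `lspci -vv` text, or an empty list if absent."""
    sizes = []
    in_cap = False
    for line in lspci_vv_output.splitlines():
        if "Physical Resizable BAR" in line:
            in_cap = True
            continue
        if in_cap:
            m = re.search(r"current size:\s*([0-9]+[KMGT]?B)", line)
            if m:
                sizes.append(m.group(1))
            else:
                in_cap = False  # left the capability block
    return sizes

# Illustrative sample shaped like lspci -vv output (not a real capture):
sample = """\
    Capabilities: [200 v1] Physical Resizable BAR
        BAR 0: current size: 16GB, supported: 256MB 512MB 1GB 2GB 4GB 8GB 16GB
"""
print(rebar_sizes(sample))  # ['16GB']
```

A non-empty result means the device advertises the capability; whether a game benefits from it is a separate (driver-side) question.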
Now imagine, on a future Nvidia platform, having the option to enable something that offers 10-20% extra performance, with that tech being proprietary and available only on Nvidia's platform. Nvidia does this all the time; it's the way they have done business for the last 20+ years.
The only thing I could imagine would be them having some native NVLink to increase the throughput of CPU-GPU transfers, akin to what they had in some POWER workstations, but that would be irrelevant for games or the regular user.
Really? You don't understand the way I use the word "attack" here?
No, not really, otherwise I wouldn't be asking.
I gave you an example from Intel.
And that was a bad example, and not something proprietary.
Future (10+ years away) high-end graphics cards from Nvidia could have a connector as an extension of the PCIe slot, like what ASUS is doing, that somehow accelerates the graphics card. Or Nvidia could limit its cards to PCIe x8 on x86 platforms and use the extra 8 lanes for something proprietary that promises extra performance on their ARM platform. So the same card model would be compatible with both platforms, but offer higher performance on the Nvidia platform only. Again, think ReBAR.
Ok, now I can see your point (stop using ReBAR, please lol).
Still, that's not specific to ARM by any means; they could do so on any platform of theirs, and the fact that it's ARM is just a minor detail. They could come up with a partnership with AMD and only offer it on the Ryzen 6969 series with the x999 chipset, and it would be the exact same thing.

Again, I don't see how this has any relation to the whole x86 vs ARM issue you brought up at first.
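As a back-of-the-envelope on the x8 vs x16 idea: raw PCIe throughput follows directly from the per-lane transfer rate, the 128b/130b line encoding, and the lane count, so the cost of giving up 8 lanes is easy to estimate. A rough sketch (ignoring packet/protocol overhead):

```python
# Per-lane transfer rate (GT/s) and line-code efficiency per PCIe generation.
# Gen 3 through 5 all use 128b/130b encoding.
GEN = {3: (8.0, 128 / 130), 4: (16.0, 128 / 130), 5: (32.0, 128 / 130)}

def pcie_gbps(gen: int, lanes: int) -> float:
    """Approximate one-direction link throughput in GB/s."""
    gt_per_s, efficiency = GEN[gen]
    return gt_per_s * efficiency * lanes / 8  # 8 bits per byte

for gen, lanes in [(4, 16), (5, 8), (5, 16)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{pcie_gbps(gen, lanes):.1f} GB/s")
```

Notably, PCIe 5.0 x8 lands at the same ~31.5 GB/s as PCIe 4.0 x16, which is why an x8 card is much less of a sacrifice on newer platforms.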
Imagine now the best graphics/compute options being available, or performing better on an Nvidia platform. I can see a significant market share moving to the Nvidia option.
I mean, won't consumers move to what's best for a given price range anyway? I believe that would be natural if such product came to be.
So, compute could turn to Nvidia easier.
GPGPU compute is already either Nvidia (with a near monopoly) or AMD; just see the whole drama around A100/H100/B200s, with AMD picking up the scraps, serving the customers Nvidia can't sell to due to limited supply.
As for gaming, it's not a small part. While gamers are not a majority, gaming is still a major function of any PC, even an office PC and someone choosing a PC or laptop could also have gaming as an important parameter. AMD's APUs and the efforts Intel is doing to improve it's iGPUs prove that gaming is an important parameter even for ultra portables, even for laptops that are mostly targeting office applications and internet browsing.
In my opinion, you are overestimating the value of gaming. It's not a majority, and it's not a major function. I concede that it's not insignificant by any means either, but I don't think it's worth continuing on this topic, since it comes down to our personal perceptions of how important it is.
Shall we agree to disagree on this one and continue discussing the other points?

Apple products are targeting a very specific target group, offering a different ecosystem and services. Not to mention a shiny logo that screams "premium" and "rich". Qualcomm is trying to compete with PCs in everything PCs are doing and while it offers better battery life, it fails mostly in gaming. As I said above, AMD's APUs and the latest Intel CPUs with the new iGPUs prove that even ultralight laptops must be able to game today. You and me we are in a different category, because we have gaming PCs to use. Why game on LG gram when in your system specs you mention not one but two 3090s? Others buy laptops and they, or their kids expect to also game on such laptops. So a discrete GPU, or a strong enough iGPU is a necessity.
Apple products (at least the entry level ones like Airs and base MBPs) compete directly with premium x86 laptops, think Dell XPS, Lenovo X1 Carbon, HP Spectre, etc etc. All of those sure have the shiny logo, but the idea of this class is a lightweight laptop with really long battery life and that you can get work done with anywhere, period. No dGPUs (apart from the bigger models, but then you forgo both portability and battery life). $1k+ price tag, 0.9~1.6kg, 7h+ battery life, that's the group we are talking about.
Qcom tried to compete in that same space, but failed since they had a higher price tag with less performance. The GPU is a minor detail, since those other laptops are not meant for gaming either. You can run your lightweight indie game, but you can do the same on MacBooks and even Snapdragon devices anyway, so I don't think this is relevant.

Qcom did not try to bring a desktop-class product with expansion, nor a gamer-like GPU.

FWIW, I don't play games; my 2x3090s are for compute (who the hell would be using two to game when SLI is dead? :laugh:). To be 100% honest, the last time I played something on Steam was back in May, and I did play some Mario Kart at a friend's house a couple of months ago, but I don't think a console from a third party is relevant to our discussion.
Most of my friends rarely ever play either. Don't forget this forum is a bit of a bubble focused mostly on gamers; if you go to a random bar, the majority of people there likely won't have gaming as a regular habit.
 
Joined
Jun 1, 2011
Messages
4,632 (0.94/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
So what do we define as 'success' in the case of Battlemage I wonder.
Anything that impacts the market in a positive way for consumers
 
Joined
Jul 13, 2016
Messages
3,300 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Anything that impacts the market in a positive way for consumers

I mean this is nice for consumers but ultimately at the end of the day Intel needs to start seeing financial success in order for them to continue to make GPUs.

Would be nice to see Intel and AMD team up for open-source or vendor agnostic solutions that crack Nvidia's software lock-in for various markets.
 
Joined
May 10, 2023
Messages
286 (0.50/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Would be nice to see Intel and AMD team up for open-source or vendor agnostic solutions that crack Nvidia's software lock-in for various markets.
Do you mean for compute?
AMD has ROCm, Intel has oneAPI, both are open-source, and both make use of underlying pieces that are also open-source and often receive contributions from both (some even from Nvidia, like some Vulkan extensions and SYCL stuff).

CUDA has really big traction mostly due to inertia: it has been a thing for over a decade already and has really focused on developer experience and accessibility for everyone, whereas Intel and AMD are only trying to catch up now.
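That catch-up dynamic shows up in application code too: portable stacks typically probe for whichever vendor backend happens to be installed and fall back gracefully. A hedged sketch of the pattern (the module names are real packages, but whether any of them is installed depends on the machine):

```python
def pick_array_backend():
    """Probe for GPU array libraries in order of preference and fall back
    to NumPy, then to plain Python, if nothing vendor-specific is installed."""
    for name in ("cupy", "torch", "numpy"):
        try:
            return name, __import__(name)
        except ImportError:
            continue
    return "python", None  # last-ditch fallback: no array library at all

backend_name, backend_module = pick_array_backend()
print(f"using backend: {backend_name}")
```

Libraries that do this kind of probing are exactly why a vendor-agnostic layer matters: the application code stays the same regardless of whose silicon is underneath.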
 
Joined
Sep 6, 2013
Messages
3,360 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later, got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
But ReBAR is a standard PCIe feature; it's part of the PCIe spec. Heck, miners/HPC folks were using it way before AMD made it a thing in games with SAM.
So far you have only used examples of standard features.
Look, I am not an Nvidia engineer. If you can't understand what the other person is trying to tell you, then I will just stop here. My time is precious. ReBAR was just an example where performance on Arc GPUs was affected. It was just a simple example. What were you expecting from me? To invent a new tech, present it to you, and tell you "I am Nvidia and I am locking it to my platform"? Well, guess what: Nvidia does it all the time. Just use your imagination, try to understand what the other person is trying to tell you, and stop with the "don't use ReBAR, it is a standard". If it were an Nvidia tech, it would have been proprietary, not a standard. And that's all.

PS I did read the rest of your post.
 
Joined
May 10, 2023
Messages
286 (0.50/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Look, I am not an Nvidia engineer. If you can't understand what the other person is trying to tell you, then I will just stop here. My time is precious. ReBAR was just an example where performance on Arc GPUs was affected. It was just a simple example. What were you expecting from me? To invent a new tech, present it to you, and tell you "I am Nvidia and I am locking it to my platform"? Well, guess what: Nvidia does it all the time. Just use your imagination, try to understand what the other person is trying to tell you, and stop with the "don't use ReBAR, it is a standard". If it were an Nvidia tech, it would have been proprietary, not a standard. And that's all.

PS I did read the rest of your post.
The point is that your idea just doesn't seem realistic at all from a hardware standpoint, so I was trying to understand if there was something I wasn't picking up. If Nvidia were to go as far as doing something different with the PCIe slots like you said, at that point they'd be better off making a product not compatible with ATX standards at all (so, a mini-PC thing with everything soldered together).
Going halfway between fully compliant with a standard and a full-blown integrated design doesn't make sense to me at all, tbh, so I was expecting you to provide an example where this would make some sense. Dreading an unlikely future just because "big corporation bad" sounds baseless, especially when it doesn't seem relevant to the ARM vs x86 idea that brought about the entire discussion to begin with.

Anyhow, you are not obliged to get your point across, you are free to just not reply and leave it at that :)
 