
NVIDIA Surpasses Intel in Market Cap Size

Joined
Mar 8, 2013
Messages
20 (0.00/day)
Oh my. Here we go again.

Well, I checked the date. SDK 3.0 came out in June 2011. I guess programmers also need time to learn it and implement it, so when did games using it come out? Probably when it was clear that hardware PhysX wasn't meant to become a standard.
So first, it was Nvidia crippled/hobbled "their" software to make their GPUs look better, then it was Nvidia didn't "fix" the software they inherited from Ageia, then it's Nvidia didn't fix it fast enough... seriously?
Ageia didn't have the connections, money, or power to enforce that. So even if they wanted to do that, they couldn't. Also, PPUs were not something people were rushing to buy, so developers wouldn't cripple the game for 99% of their customers just to make 1% happy. Nvidia was a totally different beast, and they did try to enforce physics on their GPUs. You are NOT reading, or you just pretend to not read, what I post.
Ageia got multiple developers to do exactly what you're saying they couldn't do. Have you done any research on this?
Again, Nvidia completely supported porting GPU PhysX to Radeons.
It seems that you are a waste of time after all. Something you don't like is not automatically a lie. I could also call you a liar, but I am not 5 years old.
Oh, come on. You keep posting like a 5-year-old. I am just too bored to quote the parts of your posts where you make assumptions about what I think, what I mean, and where I supposedly lie intentionally.
You've made claims like Nvidia designed the CPU portion of PhysX to make the GPU portion look better, which is impossible because the CPU portion was written before GPUs were even part of the equation and was not even "designed" by Nvidia in the first place. When I pointed this out, did you clarify or correct your claim... no you just moved on to more falsities. Are you saying that wasn't intentional?
You do reject reality. As for ignorance, it's your bliss.
So you seemingly not reading the prior posts closely enough to put together what I meant by "you guys" is somehow me... rejecting reality? You're not even making sense anymore.
So you had nothing to say here.
It was already discussed before. Nvidia tried to extend their technology to AMD's products, but AMD said no way, go to hell, we are backing Intel... so Nvidia said no, YOU go to hell, and locked out their products in response. Check the dates: AMD acted in bad faith first by stringing Eran Badit and consumers along and then sinking the whole thing. Nvidia supporting the porting effort directly contradicts the core of your notions.
 
Joined
Jan 8, 2017
Messages
8,933 (3.35/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
What's certain is that PhysX always ran poorly even if you had said Nvidia hardware, and there is no doubt that Nvidia tried to turn it into a disadvantage for their competitor. I'm glad that basically no one is using it these days; good riddance, it was outclassed by many in-house physics engines anyway.
 
Joined
Sep 6, 2013
Messages
2,978 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
So first, it was Nvidia crippled/hobbled "their" software to make their GPUs look better, then it was Nvidia didn't "fix" the software they inherited from Ageia, then it's Nvidia didn't fix it fast enough... seriously?
Oh, come on. I am saying the same thing in all cases. You are just playing with words here. If you don't fix something, or postpone fixing it, it's no different from crippling it, or, if you prefer, letting it stay crippled.
Now, if you need me to talk like a lawyer, with 100% accuracy in everything I say and none of the freedom someone should have in an honest dialogue, rather than in a stupid, brainless effort to just push an opinion, we can continue in Greek. I am fluent in Greek; I am average at best in English.
Ageia got multiple developers to do exactly what you're saying they couldn't do. Have you done any research on this?
Again, Nvidia completely supported porting GPU PhysX to Radeons.

You've made claims like Nvidia designed the CPU portion of PhysX to make the GPU portion look better, which is impossible because the CPU portion was written before GPUs were even part of the equation and was not even "designed" by Nvidia in the first place. When I pointed this out, did you clarify or correct your claim... no you just moved on to more falsities. Are you saying that wasn't intentional?

So you seemingly not reading the prior posts closely enough to put together what I meant by "you guys" is somehow me... rejecting reality? You're not even making sense anymore.

It was already discussed before. Nvidia tried to extend their technology to AMD's products, but AMD said no way, go to hell, we are backing Intel... so Nvidia said no, YOU go to hell, and locked out their products in response. Check the dates: AMD acted in bad faith first by stringing Eran Badit and consumers along and then sinking the whole thing. Nvidia supporting the porting effort directly contradicts the core of your notions.

You can keep posting bullshit (your word) if that makes you feel better. Nvidia locked out PhysX when there were no compatibility problems to justify it. Let me show you something, and that will be my last post here. As I said, you are a waste of time. Wasting time on a weekend is a totally different thing from wasting time on workdays.

Here is that bone you mentioned before: newer Nvidia drivers WITHOUT the lock, with HARDWARE PhysX running while an AMD GPU is PRIMARY.

PS: It's good that hardware physics didn't become the standard. Today the CPU is the part of the system that is underutilized in modern demanding titles, not the GPU. So throwing even more work at the GPU is not exactly the best idea today.

PhysX with AMD primary.jpg
 
Joined
Mar 8, 2013
Messages
20 (0.00/day)
What's certain is that PhysX always ran poorly even if you had said Nvidia hardware, and there is no doubt that Nvidia tried to turn it into a disadvantage for their competitor. I'm glad that basically no one is using it these days; good riddance, it was outclassed by many in-house physics engines anyway.
The important time for it to have been used would have been back "then", in 2008-2009 and the beginning of the 2010s, where broader usage would have steered the direction of GPU designs to be more flexible and suited to the task, something we are only starting to see now after a decade of stagnation. It might have happened if AMD had supported the PhysX porting effort or had actually pursued GPU acceleration for Bullet physics like they promised. In both situations AMD's initial promises were hollow; according to Erwin Coumans from Bullet, "AMD didn't allocate any resources on the project"... which certainly sounds familiar. AMD didn't really need GPU physics to work out like Nvidia did, and I guess from their point of view it was probably better if Nvidia's resources were just wasted, but ultimately it makes the whole "Nvidia gobbled up Ageia and in their greed, cost us all an amazing GPU physics filled future" notion particularly cringe-worthy.
 
Joined
Jan 8, 2017
Messages
8,933 (3.35/day)
something we are only starting to see now after a decade of stagnation.

That's just not true. GPU-accelerated particle effects, which is what PhysX was mostly used for, have been a thing for a long time.
 
Joined
Sep 6, 2013
Messages
2,978 (0.77/day)
That's just not true. GPU-accelerated particle effects, which is what PhysX was mostly used for, have been a thing for a long time.
He doesn't want to understand that. He thinks that there were no physics effects before Ageia, or that physics effects demand a GPU to run on.
 
Joined
Mar 8, 2013
Messages
20 (0.00/day)
That's just not true. GPU-accelerated particle effects, which is what PhysX was mostly used for, have been a thing for a long time.
Those effects map relatively well even to GPUs from yesteryear, but even then it doesn't take much for things to get bogged down. Modern GPUs handle those kinds of effects far better. You can see the difference in relative performance cost between Nvidia's older architectures and Turing for something like the Flex effects in Killing Floor 2.
 
Joined
Mar 21, 2016
Messages
2,197 (0.74/day)
It makes the whole "Nvidia gobbled up Ageia and in their greed, cost us all an amazing GPU physics filled future" notion particularly cringe-worthy.
Might have been inclined to agree with some of your points until that portion... Nvidia: "I bought out Ageia and stifled PhysX, and I would've gotten away with it if it wasn't for that pesky AMD/Intel."

PS: It's good that hardware physics didn't become the standard. Today the CPU is the part of the system that is underutilized in modern demanding titles, not the GPU. So throwing even more work at the GPU is not exactly the best idea today.

View attachment 162180
I wouldn't say good, but at the same time it's not bad the way things are right now on the CPU side. What we really need is something more like Vulkan on the software and hardware side that numerous players could adopt. An open-ended, Vulkan-inspired software/hardware FPGA combination, mixed with re-programmable software profiles and a mixture of variable rate shading, would probably be best for physics. I mean, actual physics in games can get very computational depending on the complexity involved, so varying their rate to dumb them down and fake them more reasonably, relative to the required hardware overhead, seems like the obvious thing to do.
 
Joined
Mar 8, 2013
Messages
20 (0.00/day)
Might have been inclined to agree with some of your points until that portion... Nvidia: "I bought out Ageia and stifled PhysX, and I would've gotten away with it if it wasn't for that pesky AMD/Intel."
What don't you agree with?
 
Joined
Mar 21, 2016
Messages
2,197 (0.74/day)
What don't you agree with?
I guess it depends on the context of what you actually meant. It's definitely not at all AMD's or Intel's fault or problem what happened with Ageia and PhysX after Nvidia bought the IP, because ultimately Nvidia is accountable for their own IP and no one else is. Could it have been better with support from them as well? Yeah, sure, perhaps, but that's probably true of IP held by the other two companies as well.
 
Joined
Feb 9, 2020
Messages
404 (0.26/day)
Location
Panama City Beach, Florida
System Name EventHorizon
Processor Intel® Core™ Processor i9-13900KF 8P/16 + 16E 3.00GHz [Turbo 5.7GHz] 36MB Cache LGA1700
Motherboard ASUS PRIME Z790-P
Cooling CyberpowerPC MasterLiquid Lite 240mm ARGB CPU Liquid Cooler
Memory 32GB (16GBx2) DDR5/6000MHz Dual Channel Memory (KINGSTON FURY BEAST RGB)
Video Card(s) GeForce RTX™ 4080 16GB
Storage 2TB WD BLACK SN850X (PCIe Gen4) NVMe M.2 SSD - Seq R/W: Up to 7300/6600 MB/s, Rnd R/W up to 1200/110
Display(s) LG 34''
Case CyberPowerPC HYTE Y60 Dual Chamber Mid-Tower Gaming Case w/ Panoramic View Tempered Glass + 2x120mm
Audio Device(s) Asus Strix w/Alan Finote mod for Windows 11
Power Supply High Power 1300W 80+ GOLD Full Modular w/ PCIE 12+4Pins Connector for PCIe 5.0 graphics cards
Mouse Steelseries Rival 600 wired
Keyboard Steelseries Apex 7 TKL red Switch
Software Win 11 Pro
Yeah, Intel could have bought Nvidia at that time just like AMD acquired ATI, but unlike AMD, Intel opted not to, and now Intel must be crying out loud in some corner, hehe.

If we compare the revenues of both companies, Intel is performing much better. It had revenue of 71.9 billion USD in 2019, while NVIDIA had 11.72 billion USD.

No crying at Intel, I assure you.
 