
NVIDIA Surpasses Intel in Market Cap Size

Joined
Jan 29, 2016
Messages
128 (0.04/day)
System Name Ryzen 5800X-PC / RyzenITX (2nd system 5800X stock)
Processor AMD Ryzen 7 5800X (atx) / 5800X itx (soon one pc getting 5800X3D upgrade! ;)
Motherboard Gigabyte X570 AORUS MASTER (ATX) / X570 I Aorus Pro WiFi (ITX)
Cooling AMD Wraith Prism Cooler / Alpenföhn Black Ridge (ITX)
Memory OLOY 16GB x 2 (32GB) DDR4 4000 MHz CL18 (22-22-22-42) 1.40V, ATX & ITX PCs (2000 FCLK)
Video Card(s) AMD Radeon RX 6800 XT (ATX) /// AMD Radeon RX 6700 XT 12GB GDDR6 (ITX)
Storage (Sys)Sammy 970EVO 500GB & SabrentRocket 4.0+ 2TB (ATX) | SabrentRocket4.0+ 1TB NVMe (ITX)
Display(s) 30" Ultra-Wide 21:9 200Hz/144Hz AMD FreeSync LED LCD monitor, connected via DisplayPort (x2)
Case Lian Li Lancool II Mesh (ATX) / Velkase Velka 7 (ITX)
Audio Device(s) Realtek HD ALC1220 codec / Onboard HD Audio* (BOTH) w/ EQ settings
Power Supply 850W Antec High Current Gamer HC-850 (80+ Gold certified) (ATX) / 650W Thermaltake SFX (ITX)
Mouse Logitech USB Wireless KB & MOUSE (Both Systems)
Keyboard Logitech USB Wireless KB & MOUSE (Both Systems)
VR HMD Oculus Quest 2 - 128GB - Standalone + Oculus link PC
Software Windows 10 Home 64-bit 2400 (both systems)
Benchmark Scores CPU-Z ATX 5800X (ST: 670, MT: 6836.3); CPU-Z ITX 5800X (ST: 680.2, MT: 7015.2) ... same CPU?
Clever company, but very, very dubious business practices got them to where they are.

In the early days of PC graphics accelerator cards, they used some pretty underhanded tactics against their competitors to ensure they survived and others didn't.

Yeah, like the x86 compiler BS: if Intel's x86 compiler found any CPU besides "GenuineIntel", it would divert the code to a slow, jumbled mess of a path to make Intel's CPUs seem faster than they really were, when AMD was actually faster at the time. And then there was Intel's OEM "rebates" debacle, with Dell and other OEMs being bribed not to sell desktop PCs (and probably laptops too) with AMD, Cyrix, or any other x86 CPU at the time!!! FUK INTEL!!
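For anyone who hasn't seen how that trick worked mechanically: the dispatcher keyed off the CPUID vendor string instead of the actual feature flags. Here is a minimal C sketch of the idea, illustrative only and not Intel's actual compiler output; the compute_* functions are hypothetical stand-ins:

```c
#include <cpuid.h>    /* GCC/Clang helper for the CPUID instruction */
#include <stdio.h>
#include <string.h>

/* Returns 1 only on CPUs whose CPUID vendor string is "GenuineIntel".
   Leaf 0 returns the 12-byte vendor string split across EBX, EDX, ECX. */
static int is_genuine_intel(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};
    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 0;
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    return strcmp(vendor, "GenuineIntel") == 0;
}

/* Hypothetical stand-ins for the vectorized and fallback code paths. */
static void compute_fast_simd(void)   { puts("fast SSE/AVX path"); }
static void compute_slow_scalar(void) { puts("slow scalar path"); }

int main(void)
{
    /* A fair dispatcher would query the SSE/AVX feature bits (CPUID
       leaf 1). Gating on the vendor string instead is the whole
       complaint: an AMD CPU with identical features still lands on
       the slow path. */
    if (is_genuine_intel())
        compute_fast_simd();
    else
        compute_slow_scalar();
    return 0;
}
```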

Intel is still a shady, anti-consumer CPU engineering company!! (And other shit too.)

I'll buy a faster 8C/16T CPU from AMD (even with slower single-core IPC and lower clocks) even at a slightly higher price than Intel's equivalent 8C/16T chips: Intel 8C/16T at $349-379 vs. AMD 8C/16T at $399 (3800XT). Intel probably doesn't have it cheaper, and I could still get a 3700X Zen 2 CPU at my local Microcenter for $269, a very decent 8C/16T PCIe 4.0-capable CPU at 3.6 GHz base and 4.3/4.4 GHz max boost (single-thread/single-core).
 
Joined
Dec 12, 2018
Messages
17 (0.01/day)
Yeah, Intel could have bought Nvidia at that time, just like AMD acquired ATI, but unlike AMD, Intel opted not to, and now Intel must be crying out loud in some corner, hehe.
AMD actually did try to buy Nvidia back in the day; ATI was their second choice. The deal fell through because Jensen Huang wanted to run the company and be on the board.


Man, I'm so old. I remember all the hardware specs and news from decades ago, but I don't remember what I did yesterday. Sigh...
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
AMD uses a far superior production process (7nm) yet it can't compete with Nvidia on a 12nm node. It is not even in the same ball-park.

I find it fascinating that you bring up process technology yet conveniently omit the fact that Nvidia's highest-end offering has a staggering 80% more transistors. Basically, Nvidia has the performance lead right now because they made a chip twice as big. Wow, imagine if it was in the same "ball-park".
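To put rough numbers on that, here's a back-of-the-envelope normalization in C. The transistor counts are the published figures for TU102 and Navi 10; the relative-performance number is an assumed illustrative value, not a benchmark result, so swap in whatever review figures you trust:

```c
#include <stdio.h>

int main(void)
{
    /* Published transistor counts, in billions. */
    double tu102  = 18.6;   /* RTX 2080 Ti (TU102)  */
    double navi10 = 10.3;   /* RX 5700 XT (Navi 10) */

    /* Assumed illustrative value: 2080 Ti ~1.45x a 5700 XT at 4K. */
    double rel_perf = 1.45;

    printf("TU102 has %.0f%% more transistors than Navi 10\n",
           (tu102 / navi10 - 1.0) * 100.0);     /* ~81% */
    printf("2080 Ti perf per transistor vs 5700 XT: %.2fx\n",
           rel_perf / (tu102 / navi10));        /* ~0.80x */
    return 0;
}
```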

Will fans of the green team ever understand how to properly compare hardware? Probably not, but one can only hope.

My good old HD 6850 was inferior to Nvidia's offering in the stability department

Yeah I bet, "rock solid stability" is definitely the first thing that pops into my head from that era, besides stuff like this: https://www.techpowerup.com/review/asus-geforce-gtx-590/26.html

I went to heat up the card and then *boom*, a sound like popcorn cracking, the system turned off and a burnt electronics smell started to fill up the room
 
Joined
Aug 6, 2017
Messages
7,412 (3.03/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
I find it fascinating that you bring up process technology yet conveniently omit the fact that Nvidia's highest-end offering has a staggering 80% more transistors. Basically, Nvidia has the performance lead right now because they made a chip twice as big. Wow, imagine if it was in the same "ball-park".

Will fans of the green team ever understand how to properly compare hardware? Probably not, but one can only hope.
Would AMD even be able to make that 750 mm² chip with Navi, when the 5700 XT is already close to 250 W?
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Something else green team fans don't seem to understand is the relationship between power, die size, frequency, and voltage. Larger processors tend to be far more power efficient; it's the reason a 2080 Ti, which has roughly 2.3 times the shaders of a 2060, doesn't need 2.3 times the power: a wider chip can run the same workload at lower clocks and voltage.



This phenomenon is a complete mystery to them.
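A rough sketch of why that works, using the textbook dynamic-power relation P ≈ n·C·V²·f. The shader counts are the real 2080 Ti and 2060 figures; the clocks, voltages, and per-shader capacitance are made-up illustrative values chosen so both configurations deliver the same raw throughput:

```c
#include <stdio.h>

/* Dynamic power of a block of shaders: P = n * C * V^2 * f,
   with C the switched capacitance per shader (arbitrary units). */
static double power(int shaders, double c, double volts, double ghz)
{
    return shaders * c * volts * volts * ghz;
}

int main(void)
{
    double c = 1.0;  /* same per-shader capacitance for both chips */

    /* Wide-and-slow (4352 shaders, the 2080 Ti count) vs
       narrow-and-fast (1920 shaders, the 2060 count);
       clocks and voltages are illustrative. */
    double wide   = power(4352, c, 0.90, 1.5);
    double narrow = power(1920, c, 1.05, 3.4);

    printf("throughput  wide: %.0f  narrow: %.0f (shader-GHz)\n",
           4352 * 1.5, 1920 * 3.4);             /* both 6528 */
    printf("power       wide: %.0f  narrow: %.0f (arb. units)\n",
           wide, narrow);                       /* ~5288 vs ~7197 */
    /* Because power scales with V^2 * f, the wide chip delivers the
       same throughput for roughly a quarter less power. */
    return 0;
}
```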
 
Joined
Jun 3, 2010
Messages
2,540 (0.50/day)
Something else green team fans don't seem to understand is the relationship between power, die size, frequency, and voltage. Larger processors tend to be far more power efficient; it's the reason a 2080 Ti, which has roughly 2.3 times the shaders of a 2060, doesn't need 2.3 times the power: a wider chip can run the same workload at lower clocks and voltage.
Old times, when the saying went: big GPUs don't need so much of everything. I don't know where to look that sort of thing up since gpureview shut down.
 
Joined
Aug 6, 2017
Messages
7,412 (3.03/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
Something else green team fans don't seem to understand is the relationship between power, die size, frequency, and voltage. Larger processors tend to be far more power efficient; it's the reason a 2080 Ti, which has roughly 2.3 times the shaders of a 2060, doesn't need 2.3 times the power: a wider chip can run the same workload at lower clocks and voltage.


This phenomenon is a complete mystery to them.
Is this the reason the RX 470 beats Vega 64 in power efficiency? :roll:
And why would you not include one with the 5700 XT, I wonder...
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Nvidia was even all for a third party porting GPU PhysX to run on Radeons, but AMD refused to provide proper support.

Nvidia wanted to create two types of gamers: those with an Nvidia GPU, who would enjoy their games at full visual quality, and those who would have to settle for something visually inferior. We also saw how open Nvidia was in the years that followed with their GameWorks libraries: locked and proprietary. As for PhysX, in those first years it ran extremely badly on the CPU, being totally unoptimized. If I remember correctly (after so many years), it ran on a single thread and used ancient x87 instructions. Its software version was meant to make the GPU version look ten times faster. So, why would AMD trust Nvidia and support a proprietary and locked standard? AMD supporting PhysX back then would have been a mistake. Make hardware PhysX a necessity in gaming and hope that a company like Nvidia won't stab you in the back? I think not.

Nvidia's true intentions were totally clear when it chose to lock PhysX even though it was probably totally independent of the primary GPU used. Probably because Ageia developed it that way and Nvidia didn't bother to engineer any deep incompatibility, a simple patch was enough to unlock PhysX and let you play those few games that supported hardware PhysX with an AMD primary GPU, without any problems and with good framerates. I enjoyed Alice with a 4890 as the primary card and a 9600GT as a PhysX card. Super smooth, super fun. There was also a driver from Nvidia that accidentally shipped without the PhysX lock. I think it was 256.xxx something.
Nvidia could have offered PhysX without support in those cases where the primary GPU was not an Nvidia one. They didn't.
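For what it's worth, the lock being described boils down to a driver-side vendor check on the installed display adapters, something like the sketch below. The struct and function are hypothetical; only the PCI vendor IDs are real values:

```c
#include <stdbool.h>
#include <stdio.h>

/* Real PCI vendor IDs. */
#define VENDOR_NVIDIA 0x10DE
#define VENDOR_AMD    0x1002

/* Hypothetical adapter record; a real driver would enumerate
   these from the OS. */
struct adapter { unsigned vendor_id; bool is_primary; };

/* The gist of the lock: GPU PhysX works regardless of which card
   renders, but the driver refuses to enable it when a non-Nvidia
   GPU is primary. The community patches simply bypassed this check. */
static bool physx_allowed(const struct adapter *gpus, int n)
{
    for (int i = 0; i < n; i++)
        if (gpus[i].is_primary && gpus[i].vendor_id != VENDOR_NVIDIA)
            return false;    /* policy decision, not a technical limit */
    return true;
}

int main(void)
{
    struct adapter rig[] = {
        { VENDOR_AMD,    true  },   /* Radeon 4890 as primary card    */
        { VENDOR_NVIDIA, false },   /* 9600GT as dedicated PhysX card */
    };
    printf("GPU PhysX enabled: %s\n",
           physx_allowed(rig, 2) ? "yes" : "no");   /* prints "no" */
    return 0;
}
```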
 
Joined
Aug 20, 2007
Messages
20,763 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
And satan
Lol, I don't know what's funnier: the contents of this paragraph or the fact that it starts with "on topic".

I like the implication that profiting off any of those things is inherently bad. I don't feel it is if it's filling a need (like gaming during a pandemic), and at least for mining, AMD arguably profited more.
 
Joined
Mar 8, 2013
Messages
20 (0.00/day)
As for PhysX, in those first years it ran extremely badly on the CPU, being totally unoptimized. If I remember correctly (after so many years), it ran on a single thread and used ancient x87 instructions. Its software version was meant to make the GPU version look ten times faster.
That was a bullshit myth.

So, why would AMD trust Nvidia and support a proprietary and locked standard?
You're ascribing information from 2010 to AMD's decision-making process in 2008. In 2008, AMD had a lot more reason to distrust Intel than Nvidia, and yet they had no problem supporting Intel's proprietary and locked standard (Havok).

AMD supporting PhysX back then would have been a mistake. Make hardware PhysX a necessity in gaming and hope that a company like Nvidia won't stab you in the back? I think not.

AMD didn't just "not support" PhysX; they also explicitly backed Havok against it. It was a business move through and through, one clearly made to hurt Nvidia and stifle their development, which it did... but it also deprived their customers of a potential feature and stifled the industry's development.

Nvidia's true intentions were totally clear when it chose to lock PhysX even though it was probably totally independent of the primary GPU used.
This came after AMD had already made their intentions clear that they wouldn't play ball.
 
Joined
Jun 21, 2015
Messages
66 (0.02/day)
Location
KAER MUIRE
System Name Alucard
Processor M2 Pro 14"
Motherboard Apple thingy all together
Cooling no Need
Memory 32 Shared Memory
Video Card(s) 30 units
Storage 1 TB
Display(s) Acer 2k 170Hz, Benq 4k HDR
Mouse Logitech M3
Keyboard Logitech M3
Software MacOs / Ubuntu
You want me to go on a wild goose chase? Some Intel executive said that as a joke. I literally cannot find it at this time.
haha I like the expression :)
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Yeah, he explains that they didn't have the time, resources, or knowledge to make it multithreaded and use the best options available. Your point? You have to realize that someone posting a huge explanation in public isn't necessarily telling the whole truth.

You're ascribing information from 2010 to AMD's decision-making process in 2008. In 2008, AMD had a lot more reason to distrust Intel than Nvidia, and yet they had no problem supporting Intel's proprietary and locked standard (Havok).
Intel and AMD have been a duopoly in the x86 business for decades. Think Samsung and Apple: they fight in the courts and do business together at the same time. Nvidia, on the other hand, was a newcomer with a vision where the GPU does the heaviest tasks in a PC while the CPU plays the role of a traffic policeman in the system. Nvidia was, and today clearly is, a common enemy to both of them.

AMD didn't just "not support" PhysX; they also explicitly backed Havok against it. It was a business move through and through, one clearly made to hurt Nvidia and stifle their development, which it did... but it also deprived their customers of a potential feature and stifled the industry's development.
Come on... bullshit myth from Nvidia fans. We had physics effects long before Ageia. Nvidia took something that was free at the time, running on the CPU, and tried to make it a proprietary feature for its GPUs. The result was games with PhysX effects available only to those with an Nvidia GPU. The rest could "enjoy" a totally different game.

This came after AMD had already made their intentions clear that they wouldn't play ball.
Nvidia's lock didn't have anything to do with AMD's decision. If the GPU was not an Nvidia one, PhysX got disabled. I read about a case where someone with an Nvidia GPU had problems enabling PhysX because he had a second USB monitor and the Nvidia driver treated the USB display driver as a non-Nvidia graphics driver. Hilarious. Also, I believe Nvidia removed the lock a few years later; I think drivers after 2014 no longer lock PhysX, but I might be wrong here.
In any case, Nvidia could have left the PhysX feature unlocked and simply shown a pop-up during driver installation stating that customer support and bug reports would be valid only for those using an Nvidia GPU as primary. Anyway, let me repeat something here: with a simple patch, PhysX ran without any problems with an AMD card as primary.
 
Joined
Mar 8, 2013
Messages
20 (0.00/day)
Yeah, he explains that they didn't have the time, resources, or knowledge to make it multithreaded and use the best options available. Your point? You have to realize that someone posting a huge explanation in public isn't necessarily telling the whole truth.

Your claim, "Its software version was meant to make the GPU version look ten times faster", is bullshit. You're saying the codebase from before Nvidia's acquisition was designed to make GPUs look faster, despite not even having been ported to GPUs yet.

Intel and AMD have been a duopoly in the x86 business for decades. Think Samsung and Apple: they fight in the courts and do business together at the same time. Nvidia, on the other hand, was a newcomer with a vision where the GPU does the heaviest tasks in a PC while the CPU plays the role of a traffic policeman in the system. Nvidia was, and today clearly is, a common enemy to both of them.

So AMD is fine with proprietary and locked standards as long as they dick over Nvidia. Gotcha.

Come on... bullshit myth from Nvidia fans. We had physics effects long before Ageia. Nvidia took something that was free at the time, running on the CPU, and tried to make it a proprietary feature for its GPUs. The result was games with PhysX effects available only to those with an Nvidia GPU. The rest could "enjoy" a totally different game.

No, they didn't... What is it with you guys and how quickly you resort to just blatant lying? Nvidia didn't change anything regarding the propriety of Ageia's properties, and it sure as hell wasn't any more "free" before than it was after. The effects that were ported to the GPU were the effects that would only run well on PPUs before. GPU PhysX effects could have been functional on Radeons if AMD had properly supported the porting; it's AMD's fault PhysX wasn't supported on their hardware in Alice and other games.

Nvidia's lock didn't have anything to do with AMD's decision. If the GPU was not an Nvidia one, PhysX got disabled. I read about a case where someone with an Nvidia GPU had problems enabling PhysX because he had a second USB monitor and the Nvidia driver treated the USB display driver as a non-Nvidia graphics driver. Hilarious. Also, I believe Nvidia removed the lock a few years later; I think drivers after 2014 no longer lock PhysX, but I might be wrong here.
In any case, Nvidia could have left the PhysX feature unlocked and simply shown a pop-up during driver installation stating that customer support and bug reports would be valid only for those using an Nvidia GPU as primary. Anyway, let me repeat something here: with a simple patch, PhysX ran without any problems with an AMD card as primary.

Nvidia's lock came after AMD's decision to torpedo their efforts. You're saying Nvidia should have thrown AMD's customers a bone when even AMD wouldn't throw them one themselves and was simultaneously giving Nvidia the finger.
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Your claim, "Its software version was meant to make the GPU version look ten times faster", is bullshit. You're saying the codebase from before Nvidia's acquisition was designed to make GPUs look faster, despite not even having been ported to GPUs yet.
All my posts here are about Nvidia and PhysX. I don't care about the code before Nvidia; that's your conclusion because it suits you. But anyway, that link you posted proves the code was far from optimized, and Nvidia could have optimized it easily and quickly. They are a software company, in case you don't know that. They had a ton of programmers and the experience to fix it, but they didn't.

Now, PhysX wasn't locked when it was meant to run on Ageia cards, before Nvidia took over. I would have objected to Ageia cards too, and so would you, if developers had thrown all physics effects onto the Ageia card and forced people to buy one more piece of hardware when there were already multicore CPUs to do the job.

By the way, saying ALL the time that the other person posts bullshit is a red flag. You are putting a red flag on yourself, marking yourself as a total waste of time. You look like a brainless 8-year-old fanboy who just wants to win an argument when you keep saying the other person posts bullshit. This is the simplest way to explain it to you.

So AMD is fine with proprietary and locked standards as long as they dick over Nvidia. Gotcha.
If this convenient explanation makes you happy, no problem. Why spoil your happiness?

No, they didn't... What is it with you guys and how quickly you resort to just blatant lying? Nvidia didn't change anything regarding the propriety of Ageia's properties, and it sure as hell wasn't any more "free" before than it was after. The effects that were ported to the GPU were the effects that would only run well on PPUs before. GPU PhysX effects could have been functional on Radeons if AMD had properly supported the porting; it's AMD's fault PhysX wasn't supported on their hardware in Alice and other games.
You just don't want to understand. You have an image in your mind of Nvidia as a company run by saints who want to push technology and make people happy. Maybe in another reality. It's funny that GameWorks even hurt performance on older series of Nvidia cards, but hey, Nvidia would have treated AMD cards fairly with its locked and proprietary code. You reject reality and then you ask what is it with us? And who are we? Are we a group? Maybe a group of non-believers?

Nvidia's lock came after AMD's decision to torpedo their efforts. You're saying Nvidia should have thrown AMD's customers a bone when even AMD wouldn't throw them one themselves and was simultaneously giving Nvidia the finger.

Look at my system specs; my answer is there. I keep a simple GT 620 card in my system just so I can enjoy hardware PhysX effects in games like Batman, for example. When I enable software PhysX on a system with a 4th-gen quad-core i5 and an RX 580, the framerate drops to single digits. That simple GT 620 is enough for fluid gaming. And no, I didn't have to install a patch, because as I said, Nvidia decided to remove the lock. Why did they remove it? Did they decide to support AMD cards on their own? Throw a bone to AMD's customers? Maybe they finally came to an agreement with AMD? And by the way, why didn't they announce the lock removal? There was NO press release.
But things changed. The PhysX software became faster on the CPU; it had to, or Havok would have totally killed it, and only a couple of developers were choosing to take Nvidia's money and create a game with hardware PhysX, where gamers using Intel or AMD GPUs would have to settle for only minimal physics effects. No developer could justify going hardware PhysX in a world of 4/6/8-core CPUs. So Nvidia decided to offer a PhysX software engine that was usable. Its dream of making AMD GPUs look inferior through physics had failed.

As long as you reject reality, you will keep believing that we (the non-believers) post bullshit and lies.
 
Joined
Dec 12, 2018
Messages
17 (0.01/day)
It's speculation for those who don't know what Intel began doing in the early 2000s. They drove AMD out of the OEM and server business with billions in bribes; there is no way AMD would have survived that exodus up until today had they not bought ATI and started shipping APUs in consoles. At one point, that was basically their only considerable source of income. Intel probably never predicted that they were going to buy ATI, nor what their intentions with it were.

This is an excellent point. AMD without the ATI purchase doesn't make it through the failure that was Bulldozer; the console contracts were the only thing keeping the lights on. What a scary world: no AMD, and $1k Intel quad-core CPUs.
 
Joined
Mar 8, 2013
Messages
20 (0.00/day)
All my posts here are about Nvidia and PhysX. I don't care about the code before Nvidia; that's your conclusion because it suits you. But anyway, that link you posted proves the code was far from optimized, and Nvidia could have optimized it easily and quickly. They are a software company, in case you don't know that. They had a ton of programmers and the experience to fix it, but they didn't.

Sigh... they did fix it, with PhysX 3. It was a significant overhaul and was completed as quickly as could be expected for something like that. Maybe you need to take another look at the timelines here.

Now, PhysX wasn't locked when it was meant to run on Ageia cards, before Nvidia took over. I would have objected to Ageia cards too, and so would you, if developers had thrown all physics effects onto the Ageia card and forced people to buy one more piece of hardware when there were already multicore CPUs to do the job.

You just described Ageia's pre-Nvidia business model verbatim...

By the way, saying ALL the time that the other person posts bullshit is a red flag. You are putting a red flag on yourself, marking yourself as a total waste of time. You look like a brainless 8-year-old fanboy who just wants to win an argument when you keep saying the other person posts bullshit. This is the simplest way to explain it to you.

You lied, objectively... why shouldn't you be called out?

You just don't want to understand. You have an image in your mind of Nvidia as a company run by saints who want to push technology and make people happy. Maybe in another reality. It's funny that GameWorks even hurt performance on older series of Nvidia cards, but hey, Nvidia would have treated AMD cards fairly with its locked and proprietary code.

WOW, so you're saying you know what's in my mind and you're going to tell me what I think and want... that's certainly not an insidious and disingenuous way to argue </s>

You reject reality and then you ask what is it with us? And who are we? Are we a group? Maybe a group of non-believers?

I guess you just jumped into the discussion without reading the other posts. That makes sense; that kind of lazy ignorance seems to be your speed.

Look at my system specs; my answer is there. I keep a simple GT 620 card in my system just so I can enjoy hardware PhysX effects in games like Batman, for example. When I enable software PhysX on a system with a 4th-gen quad-core i5 and an RX 580, the framerate drops to single digits. That simple GT 620 is enough for fluid gaming. And no, I didn't have to install a patch, because as I said, Nvidia decided to remove the lock. Why did they remove it? Did they decide to support AMD cards on their own? Throw a bone to AMD's customers? Maybe they finally came to an agreement with AMD? And by the way, why didn't they announce the lock removal? There was NO press release.
But things changed. The PhysX software became faster on the CPU; it had to, or Havok would have totally killed it, and only a couple of developers were choosing to take Nvidia's money and create a game with hardware PhysX, where gamers using Intel or AMD GPUs would have to settle for only minimal physics effects. No developer could justify going hardware PhysX in a world of 4/6/8-core CPUs. So Nvidia decided to offer a PhysX software engine that was usable. Its dream of making AMD GPUs look inferior through physics had failed.

As long as you reject reality, you will keep believing that we (the non-believers) post bullshit and lies.

Your notions are directly contradicted by the facts; how is calling that out rejecting reality?
 
Joined
Sep 6, 2013
Messages
2,976 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Sigh... they did fix it, with PhysX 3. It was a significant overhaul and was completed as quickly as could be expected for something like that. Maybe you need to take another look at the timelines here.
Oh my. Here we go again.

Well, I checked the date: SDK 3.0 came out in June 2011. I guess programmers also needed time to learn and implement it, so when did games using it come out? Probably when it was clear hardware PhysX wasn't meant to become a standard.

Try again.
You just described Ageia's pre-Nvidia business model verbatim...
Ageia didn't have the connections, money, or power to enforce that; even if they wanted to, they couldn't. Also, PPUs were not something people were rushing to buy, so developers wouldn't cripple the game for 99% of their customers just to make 1% happy. Nvidia was a totally different beast, and they did try to enforce physics on their GPUs. You are NOT reading, or you just pretend not to read, what I post.
You lied, objectively... why shouldn't you be called out?
It seems that you are a waste of time after all. Something you don't like is not automatically a lie. I could call you a liar too, but I am not 5 years old.
WOW, so you're saying you know what's in my mind and you're going to tell me what I think and want... that's certainly not an insidious and disingenuous way to argue </s>
Oh, come on. You keep posting like a 5-year-old. I am just too bored to quote the parts of your posts where you make assumptions about what I think, what I mean, and where I supposedly intentionally lie.
I guess you just jumped into the discussion without reading the other posts. That makes sense; that kind of lazy ignorance seems to be your speed.
You do reject reality. As for ignorance, it's your bliss.
Your notions are directly contradicted by the facts; how is calling that out rejecting reality?
So you had nothing to say here.

Well, nice wasting my time with you. Have a nice day.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Intel's income is higher than NV's revenue...

Am I missing something about the whole "AI business", or is it rather straightforward number crunching?
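Mostly the latter, at the hardware level: the bulk of training and inference reduces to dense matrix multiplies. A minimal sketch of the kernel in question; the sizes are arbitrary, and real workloads tile, batch, and run this in low precision across thousands of GPU cores:

```c
#include <stdio.h>

#define N 4   /* arbitrary size; real models use thousands */

/* Naive matrix multiply, C = A * B: the core "number crunching"
   that AI accelerators exist to speed up. */
static void matmul(const float a[N][N], const float b[N][N],
                   float c[N][N])
{
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++) {
            float sum = 0.0f;
            for (int k = 0; k < N; k++)
                sum += a[i][k] * b[k][j];
            c[i][j] = sum;
        }
}

int main(void)
{
    float a[N][N] = {{1, 2, 3, 4}, {5, 6, 7, 8},
                     {9, 10, 11, 12}, {13, 14, 15, 16}};
    float b[N][N] = {{1, 0, 0, 0}, {0, 1, 0, 0},
                     {0, 0, 1, 0}, {0, 0, 0, 1}};  /* identity */
    float c[N][N];
    matmul(a, b, c);
    printf("c[1][2] = %.0f (A * I == A, so expect 7)\n", c[1][2]);
    return 0;
}
```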

ATI is now 7 years behind Nvidia
By which brain-damaged metric? Dear God...
 
Joined
Jun 30, 2019
Messages
56 (0.03/day)
Intel is a company that tries every trick in the book. They take their leisure putting their best foot forward until they've thrown the kitchen sink and the bathtub at the problem; that's about when they finally make up their mind on the question at hand.


That is some of the dumbest stuff I have ever read in my life.

Even worse is that they turned down Apple's pitch for the CPU in the first iPhone. Intel did not see the potential of smartphones at the time. ARM is now a major long-term threat to their existence.

Intel has form when it comes to these sorts of missed opportunities.
This was one of the most arrogant, short-sighted decisions in the history of tech. Intel could have owned mobile computing.

AMD is no underdog. Check the dictionary for the meaning of the word:

[attachment: dictionary definition of "underdog"]
Until AMD can make inroads into the server rooms, where they have maybe 10% penetration, and OEMs, where they have even less than that, they are still HUGE underdogs. Retail sales are maybe 1% of the overall CPU market, if that.
 
Joined
Apr 12, 2013
Messages
6,743 (1.68/day)
AMD arguably profited more.
I'd argue the retailers & miners profited a heck of a lot more; if AMD had started a mining farm back then, I bet they'd have made 10x the profit during the (peak) mining booms.
 
Joined
Aug 20, 2007
Messages
20,763 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
I'd argue the retailers & miners profited a heck of a lot more,

No. Just no. AMD's cashflow is much larger than that. Hell, Bitcoin's market cap isn't even really all that huge when you're talking corporate money. Let me put it this way: if Intel wanted to, they could just buy all the bitcoins and end it.

AMD probably could too, though it would likely end them.

The disparity between big money like that and Bitcoin is actually quite large.
 
Joined
Apr 12, 2013
Messages
6,743 (1.68/day)
You mean during the two mining peaks, when the worldwide crypto market cap soared 5x-10x in months? Some currencies saw even greater gains. I don't have the exact numbers from back then, but I do remember AMD's GPU division didn't report anywhere near the kind of profits you saw from crypto, and yes, I realize electricity costs play a major role in it. Of course, the smart(er) money probably invested in crypto rather than opening giant farms.
 