
Intel Sues NVIDIA Over Chipset License, NVIDIA Responds

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.63/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
And the noose grows tighter around NVIDIA's neck...
 

DrPepper

The Doctor is in the house
Joined
Jan 16, 2008
Messages
7,482 (1.26/day)
Location
Scotland (It rains a lot)
System Name Rusky
Processor Intel Core i7 D0 3.8Ghz
Motherboard Asus P6T
Cooling Thermaltake Dark Knight
Memory 12GB Patriot Viper's 1866mhz 9-9-9-24
Video Card(s) GTX470 1280MB
Storage OCZ Summit 60GB + Samsung 1TB + Samsung 2TB
Display(s) Sharp Aquos L32X20E 1920 x 1080
Case Silverstone Raven RV01
Power Supply Corsair 650 Watt
Software Windows 7 x64
Benchmark Scores 3DMark06 - 18064 http://img.techpowerup.org/090720/Capture002.jpg

PCpraiser100

New Member
Joined
Jul 17, 2008
Messages
1,062 (0.18/day)
System Name REBEL R1
Processor Core i7 920
Motherboard ASUS P6T
Cooling Stock
Memory 6GB OCZ GOLD TC LV Kit 1866MHz@1.65V 9-9-9-24
Video Card(s) Two Sapphire HD 5770 Vapor-X Xfire'd and OC'd (920/1330)
Storage Seagate 7200.11 500GB 32MB
Case Antec Three Hundred
Audio Device(s) ASUS Xonar D1 PCI Sound Card
Power Supply OCZ StealthXStream 500W
Software Windows 7 Ultimate 64-bit
Benchmark Scores 16585 Performance Score on 3DMark Vantage
Gee, I miss CPU dominance. Is there anything sacred? :(

It seems that those who sue are serious cowards. Taking shots here and there, I just wish they competed with each other like fair players, like when ATI and Nvidia went apeshit at each other in the DX9 generation. And they still do.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Personally I like this war. Both Intel and Nvidia are evil little bastards, and they deserve to fight each other to the death. Both of them, whenever they had some sort of technological supremacy over the competition, asked very high prices. This war puts some balance back in the sector, some breathing room for AMD to get back on their feet.
It's not nice what Intel does, but it does it to an equally evil enemy.
Some of the things Nvidia does: paid benchmarks all over the web; all kinds of scandals if someone says they're better than them, or even if there's a possibility of someone being better; fights with ATI/AMD; fights with Intel; deceiving and confusing customers (the rebranding scheme); hardware faults in some of their products over the years that never got fixed or admitted (the G80 320MB memory leak was never fixed, and a chipset I don't remember that caused data loss, again never fixed).
So don't feel sorry for Nvidia; let them lose some money.

That's way off reality. Apparently you can't remember AMD's FX line or ATI's X850 XT/XT PE.

And the rest is just :roll:, like paid benchmarks. I won't say they don't do it, to an extent, but don't think for one second they have a monopoly on that, or that they control any significant portion of the media. lol

And *cough* TLB errata *cough*.

Companies are simply that, companies, and they want to make money. AMD and ATI have done those things less (they did do them, though), simply because they have been in the lead fewer times and for shorter stretches. Give them the lead and some time and you'd get another FX or X850. It didn't take them long to release misleading HD 4870 X2 slides either.
 

leonard_222003

New Member
Joined
Jan 29, 2006
Messages
241 (0.04/day)
System Name Home
Processor Q6600 @ 3300
Motherboard Gigabyte p31 ds3l
Cooling TRUE Intel Edition
Memory 4 gb x 800 mhz
Video Card(s) Asus GTX 560
Storage WD 1x250 gb Seagate 2x 1tb
Display(s) samsung T220
Case no name
Audio Device(s) onboard
Power Supply chieftec 550w
Software Windows 7 64
That's way off reality. Apparently you can't remember AMD's FX line or ATI's X850 XT/XT PE.

And the rest is just :roll:, like paid benchmarks. I won't say they don't do it, to an extent, but don't think for one second they have a monopoly on that, or that they control any significant portion of the media. lol

And *cough* TLB errata *cough*.

Companies are simply that, companies, and they want to make money. AMD and ATI have done those things less (they did do them, though), simply because they have been in the lead fewer times and for shorter stretches. Give them the lead and some time and you'd get another FX or X850. It didn't take them long to release misleading HD 4870 X2 slides either.
True that AMD would do it too, but I fail to see the problem with the X850. It was a very fast card at the time and had no hardware bug. If you're picking on the shader problem, and HDR which couldn't be done completely, that's a hardware limitation, not a serious design flaw like the G80 320MB has. They did strongly say they could do HDR, and they paid Crytek to make a tech demo which looked absolutely awesome, and even today they support HDR in the Source engine. So it's more a developer's choice how to do it: partial, which looked just as good and worked even on the X850, or full, which worked only on Shader Model 3.0.
I didn't defend AMD; I just said I like this war and I hope they lose a lot of money. I dislike Nvidia's ways. I always stumble on some news about Nvidia picking on someone or blaming someone; they are always perfect, they are noisy, and they have a spoiled attitude.
It's never their fault sales are bad; the partners are to blame (and look how the partners run from them to ATI, hoping Intel joins the party with some good stuff). They're "a million times faster than a CPU", but wait, they still need it to handle the stuff the GPU alone can't do. The CPU is still vital, but to hell with it: let's bash Intel, who is 5-10 times bigger, just to get some publicity about how great we are, and pull stunts like the one with the MythBusters.
If you think about how much they abused Intel and the CPU business, I'm amazed Intel didn't retaliate much earlier, considering Intel has a reputation for being very ruthless. And I'm amazed how much Nvidia's partners put up with Nvidia, who always blames them and pressures them to cut costs and cut prices at their own expense. No wonder they leave.
This kind of behaviour can be considered very hostile by Intel, and they take steps to eliminate the threat. What else would they do, keep doing business with Nvidia while Nvidia talks down the importance of the CPU, tells people they don't need to buy a fast one, and says the GPU is so great that one day the CPU will be something like a sound card or network card? What Intel does is very understandable: they want to kill Nvidia, but for now they're content with Nvidia not making money from the chipset business.
And all this could have been avoided if Nvidia had just shut up and minded their own problems or schemes: pretend to be friends with Intel and stab them in the back once you have everything you wanted (licenses or whatever). Instead the company is run by people who only want glory (we are the best, we do, we can...) and make noise.
Let's hope I'm not murdered by some Nvidia fanboy now :) I like Nvidia cards, I LOVE THEM :D
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.79/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4

KBD

New Member
Joined
Feb 23, 2007
Messages
2,477 (0.40/day)
Location
The Rotten Big Apple
Processor Intel e8600 @ 4.9 Ghz
Motherboard DFI Lanparty DK X48-T2RSB Plus
Cooling Water
Memory 2GB (2 x 1GB) of Buffalo Firestix DDR2-1066
Video Card(s) MSI Radeon HD 4870 1GB OC (820/950) & tweaking
Storage 2x 74GB Velociraptors in RAID 0; 320 GB Barracuda 7200.10
Display(s) 22" Mitsubishi Diamond Pro 2070SB
Case Silverstone TJ09-BW
Audio Device(s) Creative X-Fi Titanium Fatal1ty Profesional
Power Supply Ultra X3 800W
Software Windows XP Pro w/ SP3
I have to agree that Intel's chipsets are better than NV's for Intel platforms. But that doesn't mean Intel should just cut NV off from making their own. We as consumers benefit from the competition, and NV will strive to make better chipsets as a result. Apparently Intel doesn't see it that way, though.
 
Joined
Jun 16, 2008
Messages
3,175 (0.55/day)
Location
Brockport, NY
System Name Is rly gud
Processor Intel Core i5 11600kf
Motherboard Asus Prime Z590-V ATX
Memory (48GB total) 16GB (2x8GB) Crucial Ballistix Sport 3000MHZ and G. Skill Ripjaws 32GB 3200MHZ (2x16GB)
Video Card(s) GIGABYTE RTX 3060 12GB
Storage 1TB MSI Spatium M370 NVMe M.2 SSD
Display(s) 32" Viewsonic 4k, 34" Samsung 3440x1440, XP Pen Creative Pro 13.3
Power Supply EVGA 600 80+ Gold
VR HMD Meta Quest Pro, Tundra Trackers
Software Windows 10
Nvidia is Dr Scratchnsniff and Intel is his fat ugly date for the night who keeps telling him to "stay on his side". Don't think this relationship is very strong.
 
Joined
Aug 18, 2006
Messages
993 (0.15/day)
Location
Los Angeles...U.S.A
Processor i7 920
Motherboard EVGA X58
Cooling eight 120mm fans, Swiftech GTZ cpu block, 3 120mm radiator, MCP655 pump, primochill tubing
Memory 6 gig G-Skill DDR3 1600
Video Card(s) GTX 285's (SLI)
Storage 500GB Western Digital
Display(s) 3 Asus 23 in inchers
Case Lian Li A77B
Audio Device(s) on board 7.1
Power Supply Corsair 1000
Software win 7 64
Intel is so disgustingly greedy it's pathetic..:shadedshu..Like an ugly, fat, rich kid who has every toy in the world, yet hates to let anyone play with them even though he invited you to his house. I really wish that someone, anyone would give them a nice big:nutkick:
 
Last edited:
Joined
May 31, 2005
Messages
275 (0.04/day)
Intel did this to VIA as well, if anyone else remembers.....

Business is war. Businesses will do whatever they can to maintain their advantages or they might as well just give up. Intel doesn't like other companies eating into the pie they want to own.

Don't start thinking that Intel is the evil one here. Any company will aggressively defend its future if they see a threat and can act on it in any way. And hell, like we know what NVIDIA is up to really. Maybe they pissed off Intel. Consumers don't know jack about the inner workings of these places and siding with them is a bad call.
 
Last edited:

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,362 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Imagine if tomorrow NVIDIA decides "no more SLI for Intel platforms". It will sink Intel's high-end CPU business, as people wanting the fastest graphics (in effect the fastest PC) will have to opt for AMD and nForce 980a. So Intel can't f with NVIDIA beyond a point. NVIDIA doesn't have much to lose, since every X58 motherboard vendor who wants to offer SLI support has to pay a royalty to NVIDIA or use its BR-03 chip. If you think about it, that more or less amounts to all the profit NVIDIA would have ended up with had it kept SLI exclusive to a Core i7-supporting nForce chipset.
 
Joined
May 31, 2005
Messages
275 (0.04/day)
Imagine if tomorrow NVIDIA decides "no more SLI for Intel platforms". It will sink Intel's high-end CPU business, as people wanting the fastest graphics will have to opt for AMD and nForce 980a. So Intel can't f with NVIDIA beyond a point.

Just how many SLI systems do you think sell? I bet it's a really, really tiny volume. It's not where the money is. It's just a sort of show off crown to get attention. NVIDIA and Intel are slowly becoming direct competitors and that means they aren't going to be buddy buddy at all much longer, IMO.

I think if NV was smart they'd get SLI to be chipset agnostic ASAP. ATI has managed it.

I am very curious as to what Larrabee will turn into. Intel may not go after gaming, but they have plans for dealing with the GPGPU threat that's for damn sure.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,362 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Just how many SLI systems do you think sell? I bet it's a really, really tiny volume. It's not where the money is. It's just a sort of show off crown to get attention. NVIDIA and Intel are slowly becoming direct competitors and that means they aren't going to be buddy buddy at all much longer, IMO.

The volume that opts for SLI opts against Intel in that hypothetical situation. NVIDIA's long-term plan is to reduce the prominence of the x86 processor as a vital PC component, so that each PC carries at least $20 worth of NVIDIA technology. If the CPU becomes a relatively insignificant component, that's terrible news for Intel. One might argue, "What?! The CPU an insignificant component? What's a PC without a CPU?" Well, the same applies to components such as PSUs and cases. They are a requirement, though their specifications and features become a significant factor only beyond a price point, where the user seeks to add more components to the PC. Same with the CPU: an application environment where apps depend on GPU power more than CPU power effectively reduces the need for a stronger-than-required CPU, leaving Intel to sell only cheaper, weaker ones.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,362 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Update: Added NVIDIA's PR.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
True that AMD would do it too, but I fail to see the problem with the X850. It was a very fast card at the time and had no hardware bug. If you're picking on the shader problem, and HDR which couldn't be done completely, that's a hardware limitation, not a serious design flaw like the G80 320MB has. They did strongly say they could do HDR, and they paid Crytek to make a tech demo which looked absolutely awesome, and even today they support HDR in the Source engine. So it's more a developer's choice how to do it: partial, which looked just as good and worked even on the X850, or full, which worked only on Shader Model 3.0.
I didn't defend AMD; I just said I like this war and I hope they lose a lot of money. I dislike Nvidia's ways. I always stumble on some news about Nvidia picking on someone or blaming someone; they are always perfect, they are noisy, and they have a spoiled attitude.
It's never their fault sales are bad; the partners are to blame (and look how the partners run from them to ATI, hoping Intel joins the party with some good stuff). They're "a million times faster than a CPU", but wait, they still need it to handle the stuff the GPU alone can't do. The CPU is still vital, but to hell with it: let's bash Intel, who is 5-10 times bigger, just to get some publicity about how great we are, and pull stunts like the one with the MythBusters.
If you think about how much they abused Intel and the CPU business, I'm amazed Intel didn't retaliate much earlier, considering Intel has a reputation for being very ruthless. And I'm amazed how much Nvidia's partners put up with Nvidia, who always blames them and pressures them to cut costs and cut prices at their own expense. No wonder they leave.
This kind of behaviour can be considered very hostile by Intel, and they take steps to eliminate the threat. What else would they do, keep doing business with Nvidia while Nvidia talks down the importance of the CPU, tells people they don't need to buy a fast one, and says the GPU is so great that one day the CPU will be something like a sound card or network card? What Intel does is very understandable: they want to kill Nvidia, but for now they're content with Nvidia not making money from the chipset business.
And all this could have been avoided if Nvidia had just shut up and minded their own problems or schemes: pretend to be friends with Intel and stab them in the back once you have everything you wanted (licenses or whatever). Instead the company is run by people who only want glory (we are the best, we do, we can...) and make noise.
Let's hope I'm not murdered by some Nvidia fanboy now :) I like Nvidia cards, I LOVE THEM :D

I'll go point by point, as it will be easier:

- I said AMD's FX and ATI's X850 because at the time they were the fastest things around and cost an arm and a leg. In fact, comparatively, they cost much more than any Intel CPU or Nvidia card has in recent years, e.g. the 8800 Ultra. It's not correct to blame only Nvidia or Intel for that practice: when you have the best product, you price it accordingly, and it's not their fault they're on top most of the time. lol

- Erm, the "Nvidia blaming partners" thing is a simple matter blown out of proportion by the media. I can't remember where, but I've read the actual comment to the shareholders (where all of that came from), not the second- or third-hand info you get in the more sensationalist news. They ONLY said that, because of the crisis and competition, partners still had a lot of stock, and thus instead of ordering three months of inventory they ordered half that. In the sense of why Nvidia didn't sell as much that quarter, the partners are "to blame" because they didn't buy what they used to; it doesn't mean they're guilty. It was said in the sense that it's not that partners no longer want the chips, or that Nvidia did something wrong: overall sales were lower for everyone, and once sales return to normal, so will partner demand. No partner took those comments badly, because they know what it's all about. It's the media, and specifically some of the media, who made the whole thing "explode" as if it were something important.

- Nvidia has never said that the CPU is not important or that it will disappear. They said that high-end CPUs are becoming more irrelevant with each generation, and the Ion platform confirms that: even a weak CPU like the Atom (albeit with decent I/O capabilities) does well for 95% of people when paired with Ion. Anyway, it was Intel who attacked the GPU first, saying the GPU was not important at all. That was when they were questioned about their crappy IGPs and how Nvidia's were much better; like saying, "they might be better, but unnecessary." We know that's not true. Historically, CPUs kept getting more powerful at number crunching alongside their role as the director of the other parts of the PC. Today the GPU is much better suited for number crunching, and there's no point increasing CPU power for that purpose; that silicon would be better spent improving the CPU's I/O capabilities, so that cheap CPUs don't become the bottleneck they are today.
The trend confirms Nvidia is right. In 2000, PC enthusiasts used to spend far more on the CPU than on the GPU; today we usually spend more on the GPU, because that's where the benefit is. Spend $800 instead of $200 on a CPU and you get what, a 50% improvement? Do the same with GPUs, $100 versus $400, and it's a world apart, not to mention the benefits are much greater. That's Nvidia's point.

- Neither Nvidia, nor AMD, nor anyone else should have to ask Intel's permission for something they have the right to do. On the contrary, it's Intel who always tries to bend the situation toward a near-monopoly, yet stay far enough away that the law can't lay hands on them. This is no different.

- In any case, I fail to see how a fight that severely damages Nvidia is good for anyone except ATI. The best thing for the customer is two potent graphics companies. We need a stronger ATI, not a weaker Nvidia, by any means.
 
Last edited:

leonard_222003

New Member
Joined
Jan 29, 2006
Messages
241 (0.04/day)
System Name Home
Processor Q6600 @ 3300
Motherboard Gigabyte p31 ds3l
Cooling TRUE Intel Edition
Memory 4 gb x 800 mhz
Video Card(s) Asus GTX 560
Storage WD 1x250 gb Seagate 2x 1tb
Display(s) samsung T220
Case no name
Audio Device(s) onboard
Power Supply chieftec 550w
Software Windows 7 64
I'll go point by point, as it will be easier:

- I said AMD's FX and ATI's X850 because at the time they were the fastest things around and cost an arm and a leg. In fact, comparatively, they cost much more than any Intel CPU or Nvidia card has in recent years, e.g. the 8800 Ultra. It's not correct to blame only Nvidia or Intel for that practice: when you have the best product, you price it accordingly, and it's not their fault they're on top most of the time. lol

- Erm, the "Nvidia blaming partners" thing is a simple matter blown out of proportion by the media. I can't remember where, but I've read the actual comment to the shareholders (where all of that came from), not the second- or third-hand info you get in the more sensationalist news. They ONLY said that, because of the crisis and competition, partners still had a lot of stock, and thus instead of ordering three months of inventory they ordered half that. In the sense of why Nvidia didn't sell as much that quarter, the partners are "to blame" because they didn't buy what they used to; it doesn't mean they're guilty. It was said in the sense that it's not that partners no longer want the chips, or that Nvidia did something wrong: overall sales were lower for everyone, and once sales return to normal, so will partner demand. No partner took those comments badly, because they know what it's all about. It's the media, and specifically some of the media, who made the whole thing "explode" as if it were something important.

- Nvidia has never said that the CPU is not important or that it will disappear. They said that high-end CPUs are becoming more irrelevant with each generation, and the Ion platform confirms that: even a weak CPU like the Atom (albeit with decent I/O capabilities) does well for 95% of people when paired with Ion. Anyway, it was Intel who attacked the GPU first, saying the GPU was not important at all. That was when they were questioned about their crappy IGPs and how Nvidia's were much better; like saying, "they might be better, but unnecessary." We know that's not true. Historically, CPUs kept getting more powerful at number crunching alongside their role as the director of the other parts of the PC. Today the GPU is much better suited for number crunching, and there's no point increasing CPU power for that purpose; that silicon would be better spent improving the CPU's I/O capabilities, so that cheap CPUs don't become the bottleneck they are today.
The trend confirms Nvidia is right. In 2000, PC enthusiasts used to spend far more on the CPU than on the GPU; today we usually spend more on the GPU, because that's where the benefit is. Spend $800 instead of $200 on a CPU and you get what, a 50% improvement? Do the same with GPUs, $100 versus $400, and it's a world apart, not to mention the benefits are much greater. That's Nvidia's point.

- Neither Nvidia, nor AMD, nor anyone else should have to ask Intel's permission for something they have the right to do. On the contrary, it's Intel who always tries to bend the situation toward a near-monopoly, yet stay far enough away that the law can't lay hands on them. This is no different.

- In any case, I fail to see how a fight that severely damages Nvidia is good for anyone except ATI. The best thing for the customer is two potent graphics companies. We need a stronger ATI, not a weaker Nvidia, by any means.

You are missing a lot of information from the days when the HD 4800 series launched and Nvidia had trouble competing on price, because their cards were more expensive to make. They told their partners to suck it up: reduce costs eventually, and lower prices at their own expense, not Nvidia's. I agree Nvidia's partners are just as greedy and don't accept too much bullshit from Nvidia (look how many closed up shop or jumped to the enemy), but still, Nvidia should have helped those people in some way, with a better product or by cutting the price of the GPU.
About AMD's and ATI's prices you could be right. I don't know what top-end stuff cost back then; I bought mainstream components the whole time. I know the FX series was very pricey, but I remember Intel's CPUs were very expensive too, and what worked for Intel then was not better hardware but the marketing machine; the bullshit was so big that people swore on their children that Intel's (NetBurst) CPUs were better.
About the Nvidia and Intel war, I don't really know who started the fight, but look at this:
http://www.youtube.com/watch?v=ZrJeYFxpUyQ&annotation_id=annotation_764898&feature=iv
Now you tell me how big the bullshit is here and how staged everything is. If I were Intel, I would bury them in lawsuits until they die.
There is another thing to consider. Say a man called Jimmy invents the CPU, and from that invention he one day becomes very rich. Another guy, Bob, wants a piece of this great money-making thing and buys a license to build CPUs. All nice up to here, but one day Jimmy regrets giving him the license, because all Bob does is copy most of his work and be a pain in the ass while making money off his invention.
This invention of Jimmy's sits on a motherboard, with all the supporting stuff around it, which he invented too. Again, some people wanted to get rich off that and bought a license to build it. He sold them the license, but one day he regrets giving them the means to get rich, because they attacked him and said bad things about him, so now he doesn't want to give them the right to build motherboards for his creations.
This kind of thinking doesn't move the world forward, but that's the world we live in.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
You are missing a lot of information from the days when the HD 4800 series launched and Nvidia had trouble competing on price, because their cards were more expensive to make. They told their partners to suck it up: reduce costs eventually, and lower prices at their own expense, not Nvidia's. I agree Nvidia's partners are just as greedy and don't accept too much bullshit from Nvidia (look how many closed up shop or jumped to the enemy), but still, Nvidia should have helped those people in some way, with a better product or by cutting the price of the GPU.
About AMD's and ATI's prices you could be right. I don't know what top-end stuff cost back then; I bought mainstream components the whole time. I know the FX series was very pricey, but I remember Intel's CPUs were very expensive too, and what worked for Intel then was not better hardware but the marketing machine; the bullshit was so big that people swore on their children that Intel's (NetBurst) CPUs were better.
About the Nvidia and Intel war, I don't really know who started the fight, but look at this:
http://www.youtube.com/watch?v=ZrJeYFxpUyQ&annotation_id=annotation_764898&feature=iv
Now you tell me how big the bullshit is here and how staged everything is. If I were Intel, I would bury them in lawsuits until they die.
There is another thing to consider. Say a man called Jimmy invents the CPU, and from that invention he one day becomes very rich. Another guy, Bob, wants a piece of this great money-making thing and buys a license to build CPUs. All nice up to here, but one day Jimmy regrets giving him the license, because all Bob does is copy most of his work and be a pain in the ass while making money off his invention.
This invention of Jimmy's sits on a motherboard, with all the supporting stuff around it, which he invented too. Again, some people wanted to get rich off that and bought a license to build it. He sold them the license, but one day he regrets giving them the means to get rich, because they attacked him and said bad things about him, so now he doesn't want to give them the right to build motherboards for his creations.
This kind of thinking doesn't move the world forward, but that's the world we live in.

That video just goes to show the truth. A GPU can do ANY parallel work that much faster, comparatively; there's no BS there. If anything, all I can see here is that you swallowed Intel's BS instead. GPUs are now at around 2 TFLOPS of processing power, while CPUs are still in the 50-100 GFLOPS range. People don't know this, and that stage show was an easy way to demonstrate what anyone in the parallel computing world knows for a fact.
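To be clear about why the gap is that big: the work in that kind of demo is "embarrassingly parallel" — every output pixel depends only on its own inputs, so it splits across any number of execution units with no coordination. A toy Python sketch of that workload shape (the `shade_pixel` function is made up for illustration; it's not from the actual demo):

```python
# Each "pixel" is computed independently of every other pixel, so the
# same function can be mapped over the indices serially or in parallel
# and the result is identical -- exactly the shape of work a GPU's
# hundreds of shader units are built for.
from concurrent.futures import ThreadPoolExecutor

def shade_pixel(i):
    # made-up per-pixel computation; any pure function of i works
    return (i * 31 + 7) % 256

def render_serial(n):
    return [shade_pixel(i) for i in range(n)]

def render_parallel(n, workers=4):
    # no locks, no shared state: the scheduler just splits the range
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(shade_pixel, range(n)))

if __name__ == "__main__":
    assert render_serial(10_000) == render_parallel(10_000)
```

A video encode, by contrast, has dependencies between frames and pipeline stages, which is why the CPU still matters there; the demo's point is only that for pixel-independent work, sheer width wins.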

The last paragraph, apart from being a mess, is utterly wrong, because in that imaginary setting you presented, Intel is NOT Jimmy, to begin with...
 
Last edited:

leonard_222003

New Member
Joined
Jan 29, 2006
Messages
241 (0.04/day)
System Name Home
Processor Q6600 @ 3300
Motherboard Gigabyte p31 ds3l
Cooling TRUE Intel Edition
Memory 4 gb x 800 mhz
Video Card(s) Asus GTX 560
Storage WD 1x250 gb Seagate 2x 1tb
Display(s) samsung T220
Case no name
Audio Device(s) onboard
Power Supply chieftec 550w
Software Windows 7 64
That video just goes to show the truth. A GPU can do ANY parallel work that fast comparatively, there's no BS there. If anything, all I can see here is that you smelled Intel's BS in any case. GPUs are now at around 2 TFLOPS of processing power while CPUs are still in the 50-100 GFLOPS area. People don't know this fact, and that stage show was an easy way to demonstrate what anyone in the parallel computing world knows for a fact.

You are ignorant and take your information as it is served; you don't actually analyze it.
The video shows a stupid thing that can be done by the CPU too, just as fast or faster; there are just a few dots there. Can you actually contradict me and say the CPU can't do a picture like the Mona Lisa that fast?
You are missing the fact that all those presentations are driven by a CPU; anything involved there needs a CPU at its base. The GPU is a simple, powerful raw calculator, but it is dumb, and anything complex makes it useless.
You are talking like some fanboy who doesn't know anything beyond what he reads on forums and the internet. The GPU is not a miracle of computing power; they don't even have a fab to build these things in. The CPU business has fabs, and if they wanted they could build a CPU made from 4 billion transistors, organized into 1000-2000 threads, with the usual fine-tuning only a company with fabs can do (lots of MHz). Then it would encode a movie so fast your head would spin. But they can't do that, because they must build a CPU which works well on all kinds of software, and most of it doesn't need this kind of parallelization or take any advantage from it.
I'm disappointed I'm talking with people who defend whatever hype is going around, and a few years later discover how blown out of proportion some things were and how manipulative some companies are about any stupid feature they have and how much it can supposedly help people.
Why do you think it's so hard to make anything work on a GPU (CPU software, I mean)? Because it's so basic in what it can do, and developers hit the limitations so often they always end up doing more on the CPU, or the limitations are so big there is no advantage in using the GPU for a particular piece of software.
As a piece of hardware, the GPU is very simple for CPU engineers, and its raw power could be matched and overtaken by far, but they can't do it just for the purpose of being the best at number crunching; they would build it as a video card and then have a reason to build a big, dumb calculator.
The reason Nvidia fears Intel so much is that they know the GPU is nothing groundbreaking; it's not fine-tuned like a CPU is. The old rumors that Nvidia wanted to build a CPU are hilarious: they could never compete with Intel or AMD, considering they don't have fabs and couldn't fine-tune a CPU to that level even if they did. Plus, many companies have made CPUs with extraordinary numbers (teraflops), but as always they can't be profitable for the desktop or even just as powerful (IBM made lots of so-called powerful CPUs, the latest being the ones in the Xbox 360 and PS3).
They are cocky before they fall; they want to die with pride (Nvidia).
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
You are ignorant and take your information as it is served; you don't actually analyze it.
The video shows a stupid thing that can be done by the CPU too, just as fast or faster; there are just a few dots there. Can you actually contradict me and say the CPU can't do a picture like the Mona Lisa that fast?
You are missing the fact that all those presentations are driven by a CPU; anything involved there needs a CPU at its base. The GPU is a simple, powerful raw calculator, but it is dumb, and anything complex makes it useless.
You are talking like some fanboy who doesn't know anything beyond what he reads on forums and the internet. The GPU is not a miracle of computing power; they don't even have a fab to build these things in. The CPU business has fabs, and if they wanted they could build a CPU made from 4 billion transistors, organized into 1000-2000 threads, with the usual fine-tuning only a company with fabs can do (lots of MHz). Then it would encode a movie so fast your head would spin. But they can't do that, because they must build a CPU which works well on all kinds of software, and most of it doesn't need this kind of parallelization or take any advantage from it.
I'm disappointed I'm talking with people who defend whatever hype is going around, and a few years later discover how blown out of proportion some things were and how manipulative some companies are about any stupid feature they have and how much it can supposedly help people.
Why do you think it's so hard to make anything work on a GPU (CPU software, I mean)? Because it's so basic in what it can do, and developers hit the limitations so often they always end up doing more on the CPU, or the limitations are so big there is no advantage in using the GPU for a particular piece of software.
As a piece of hardware, the GPU is very simple for CPU engineers, and its raw power could be matched and overtaken by far, but they can't do it just for the purpose of being the best at number crunching; they would build it as a video card and then have a reason to build a big, dumb calculator.
The reason Nvidia fears Intel so much is that they know the GPU is nothing groundbreaking; it's not fine-tuned like a CPU is. The old rumors that Nvidia wanted to build a CPU are hilarious: they could never compete with Intel or AMD, considering they don't have fabs and couldn't fine-tune a CPU to that level even if they did. Plus, many companies have made CPUs with extraordinary numbers (teraflops), but as always they can't be profitable for the desktop or even just as powerful (IBM made lots of so-called powerful CPUs, the latest being the ones in the Xbox 360 and PS3).
They are cocky before they fall; they want to die with pride (Nvidia).

You are the only ignorant one if you truly believe that a CPU can match the performance per die area of a GPU, or that a CPU can do any picture rendering. And I'm talking about a $100 CPU and x86. Other architectures like the Cell microprocessor are much, much closer to being able to do it, and yet are still far, far away from doing the same as a $10 GPU does.

It doesn't matter how difficult it might be for developers to program outside of x86; that is only because they are not used to it, because there's no documentation or past work to build on, not because the architecture itself is any harder to work with. And as for the kind of work a GPU can do: of course it can't do many things, but the things it DOES do are the ones that require the most power. Again, no one said that a PC can run without a CPU, but it CAN run with a CPU that completely lacks FP capabilities, for example, with all its die area devoted to integer, I/O operations and coherency.

What I think about parallel computing is not based on anything Nvidia, AMD, Intel or whoever says in their PR BS; it's taken directly from the work they publish at Stanford on parallel computing. And on that front, they are the vanguard. Interestingly enough, they took the GPGPU approach, first with Ati's X1900 architecture and later with Nvidia through CUDA and Ati/AMD through their Stream (Brook+) approach.

YOU have no idea what parallel computing is, nor anything about its benefits, something no CPU can dream of, and that is exactly why Intel had no choice but to make their own "GPU". A CPU with all that power is USELESS if it's going to be idling all the time; that's the reason CPUs have only a few ALUs. On the other hand, the GPU has no choice but to sit unused most of the time, because when you do need the power, you need a lot of it.

So don't try to make a point out of an idea with no foundation, one that even technology trends are ignoring. And don't call people ignorant when you make it clear you don't have a clue what you are talking about. You are so ignorant, in fact, that you are unable to understand that what MythBusters did was only a representation, lol.

First show me a CPU rendering at 1680x1050 with 4xAA at acceptable framerates, and then we can continue talking. My God. :shadedshu

And BTW, AMD has been fabless since yesterday, I think, since the shareholders agreed to The Foundry Company.

EDIT: Now that I think of it, isn't it stupid that I have to explain these things to a guy with $45 worth of CPU and a $180 GPU? A guy who spent more on the GPU than on CPU + mobo + RAM?
 
Last edited:

leonard_222003

New Member
Joined
Jan 29, 2006
Messages
241 (0.04/day)
System Name Home
Processor Q6600 @ 3300
Motherboard Gigabyte p31 ds3l
Cooling TRUE Intel Edition
Memory 4 gb x 800 mhz
Video Card(s) Asus GTX 560
Storage WD 1x250 gb Seagate 2x 1tb
Display(s) samsung T220
Case no name
Audio Device(s) onboard
Power Supply chieftec 550w
Software Windows 7 64
You are the only ignorant one if you truly believe that a CPU can match the performance per die area of a GPU, or that a CPU can do any picture rendering.
Oh yeah baby, priceless words that I will keep here forever to laugh out loud. You just said an immensely stupid thing that will forever hover over you.

You are so ignorant, in fact, that you are unable to understand that what MythBusters did was only a representation, lol.
So now you go back on your word? Previously you said:

That video just goes to show the truth. A GPU can do ANY parallel work that fast comparatively, there's no BS there.

So which is it? Can those few dots be made by the CPU or not? How is it a representation when the MythBusters constantly said "this is made with a CPU" and "this is made with a GPU"?
I bet you there was no GPU in that machine and it was a staged presentation, like those prototype cars with no engine; the GPU was missing.
Enough with this. After you said a CPU can't render a picture, I think I'm wasting my time arguing with you. Go read some Nvidia newsletters and slides, they tell the "real truth".
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,362 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Please, no getting personal. Put your argument forward and nothing more.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Oh yeah baby, priceless words that I will keep here forever to laugh out loud. You just said an immensely stupid thing that will forever hover over you.

We'll see who laughs louder. BTW, Larrabee is NOT a CPU. Just in case. lol.

So now you go back on your word ? previously you said



So which is it? Can those few dots be made by the CPU or not? How is it a representation when the MythBusters constantly said "this is made with a CPU" and "this is made with a GPU"?
I bet you there was no GPU in that machine and it was a staged presentation, like those prototype cars with no engine; the GPU was missing.
Enough with this. After you said a CPU can't render a picture, I think I'm wasting my time arguing with you. Go read some Nvidia newsletters and slides, they tell the "real truth".

As I said, show me a CPU doing the rendering IN REAL TIME, of course. I REALLY hope you are not arguing with production rendering in mind. I really hope so, because otherwise it would just go to show... well, I'll stop here.

Second, the MythBusters thing. Man, I don't think it's that difficult to understand. I said:

That video just goes to show the truth. A GPU can do ANY parallel work that fast comparatively, there's no BS there.

Because it's a representation, it just goes to show the difference in speed at which both can perform a highly parallel task. For example, rendering. A CPU can render an image in 3ds Max, Maya, Lightwave, etc. (lol, I'm laughing at the fact that you thought I don't know about "offline"/production rendering, when I ACTUALLY work in that field :laugh:), but it takes seconds or minutes to render what a GPU does 60 times per second. Is this clear enough for you to understand?
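The reason rendering maps so well to a GPU is that each pixel is computed independently of its neighbours, so the work partitions across hundreds of shader cores with no communication between them. A toy Python sketch of that property; the `shade` function is a made-up example, not any real pipeline:

```python
# Per-pixel work is "embarrassingly parallel": the same function runs for
# every pixel with no dependence on any other pixel's result.

def shade(x, y, width, height):
    """Toy per-pixel shader: a simple horizontal/vertical colour gradient."""
    r = int(255 * x / (width - 1))
    g = int(255 * y / (height - 1))
    return (r, g, 0)

def render_serial(width, height):
    """Render every pixel in one loop, like a single-threaded CPU would."""
    return [shade(x, y, width, height) for y in range(height) for x in range(width)]

def render_split(width, height, parts=4):
    """Render the same image with rows split into independent chunks.

    Each chunk could go to a different core (or shader cluster) with no
    communication needed; here the chunks just run one after another.
    """
    rows = list(range(height))
    chunks = [rows[i::parts] for i in range(parts)]
    out = {}
    for chunk in chunks:              # pretend each iteration is a worker
        for y in chunk:
            for x in range(width):
                out[(x, y)] = shade(x, y, width, height)
    return [out[(x, y)] for y in range(height) for x in range(width)]

# Same image either way -- which is exactly why a GPU can throw hundreds
# of cores at it while a production CPU renderer grinds through serially.
assert render_serial(8, 8) == render_split(8, 8)
```

An offline renderer does vastly more work per pixel, but the structure is the same, which is why render farms and GPUs both attack it by splitting pixels across processors.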
 
Last edited:

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Is the CPU able to do the same rendering as the GPU does, or not? And by extension, is the CPU able to do other parallel tasks just as well (which implies just as fast)? e.g. F@H, physics, applying filters to a picture...
 
Last edited by a moderator:

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.79/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
Is the CPU able to do the same rendering as the GPU does, or not? And by extension, is the CPU able to do other parallel tasks just as well (which implies just as fast)? e.g. F@H, physics, applying filters to a picture...
Doesn't matter; the CPU still needs to feed the information to the GPU. And the GPU is terrible at integer calculations compared to a CPU.

How about this: my quad is faster than my 8800GT at encoding H.264 whenever I enable any advanced filters.

No matter how you look at it, the CPU is a very important piece of the computer puzzle. Nvidia is downplaying its importance for marketing purposes.
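The point about the CPU still feeding the GPU can be put in numbers with Amdahl's law: whatever fraction of an encode stays on the CPU (advanced filters, bitstream work) caps the overall gain, no matter how fast the GPU part runs. A rough sketch, with purely illustrative fractions:

```python
# Amdahl's law: the serial (CPU-bound) share of a job limits total speedup.

def amdahl_speedup(parallel_fraction, parallel_speedup):
    """Overall speedup when only part of the work is accelerated.

    parallel_fraction: share of total runtime the GPU can take over.
    parallel_speedup:  how much faster the GPU runs that share.
    """
    serial = 1.0 - parallel_fraction          # work stuck on the CPU
    return 1.0 / (serial + parallel_fraction / parallel_speedup)

# GPU does its share 50x faster, but 40% of the encode stays on the CPU:
print(round(amdahl_speedup(0.60, 50), 2))  # 2.43 -- nowhere near 50x

# With light filters, 95% offloads and the picture changes:
print(round(amdahl_speedup(0.95, 50), 2))  # 14.49
```

Which matches the H.264 experience above: when advanced filters keep a big chunk of the pipeline on the CPU, the GPU's raw-throughput advantage mostly evaporates.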
 
Last edited by a moderator:

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Doesn't matter; the CPU still needs to feed the information to the GPU. And the GPU is terrible at integer calculations compared to a CPU.

How about this: my quad is faster than my 8800GT at encoding H.264 whenever I enable any advanced filters.

No matter how you look at it, the CPU is a very important piece of the computer puzzle. Nvidia is downplaying its importance for marketing purposes.

Funny, because mine (8800GT) is MUCH, MUCH faster than my quad. The more filters I enable in TMPGEnc 4.6, the bigger the difference is.

And once again, I'm not saying the CPU is not an essential part, and I have never interpreted Nvidia's words as saying that either. But think about this: Atom can't do anything on its own, but pair it up with a more-than-modest GPU and it does wonders. What Nvidia said is that the days of $1000 PCs with a $500 CPU and a $50 GPU are gone. A 50%/50% split ($250 each) gives much better results, and not only for gamers.
 
Top