
Intel Losing CPU Market-Share to AMD

SparkyJJO

New Member
Joined
Apr 2, 2009
Messages
311 (0.06/day)
System Name Dominator
Processor Core i7-2600k @ 4.5GHz
Motherboard Gigabyte Z77X-D3H
Cooling XSPC Raystorm, MCW82, MCR320, DDC 3.2 Petras top, EK150 res
Memory G.Skill Ripjaws 2x4GB DDR3-1600
Video Card(s) Radeon 7950
Storage 2x Crucial M4 256GB SSD, 500GB and 1TB Samsung storage
Display(s) Samsung 46" HDTV :D
Case Gigabyte 3D Aurora 570 (modded for internal WC)
Audio Device(s) Audigy 2 ZS
Power Supply Corsair HX620
Software Windows 7 Pro x64
Joined
May 19, 2007
Messages
7,662 (1.24/day)
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5
Good news.

And a little FYI for you all: the i5 is basically a Core 2 chip with an IMC. The performance isn't that much better; in fact, many benches don't show any significant gain at all.

The i5 is also planned to be very hard to overclock. Intel wants to block overclocking unless you buy their "enthusiast" platforms, and they have been working on ways to make the CPU fail if you overclock it (no joke).

So I will stick with my AMD rigs. I can't see AMD removing overclocking from their CPUs; hell, their Black Edition chips are a great buy IMHO.

Core 2 is still competitive, so no problems there.
 

pepsi71ocean

New Member
Joined
Nov 7, 2007
Messages
1,471 (0.24/day)
Location
The Peoples Republic of New South Jersey
System Name The Grand Phoenix Clusterflop
Processor AMD Phenom II X4 965 Black Edition Deneb @3.4GHz
Motherboard ASRock 870 EXTREME3
Cooling Xigmatec S1284 (Lapped)1x200mm, 4x120mm
Memory Muskin Silverline 4GB DDR3 1333 (PC3 10666) 9-9-9-24
Video Card(s) eVGA GTX 470 SC Edition 1280mb RAM (C/S/M)(640/1280/1705)
Storage 2x500GB Seagate, 32MB Cache 1xWD 40GB UMD IDE Hdd.
Display(s) SAMSUNG 22" LCDTV HD Monitor and Samsung 24"
Case COOLER MASTER RC-690
Audio Device(s) USB 2.0 Sound (USB out to my Stero System)
Power Supply Thermaltake XT TPX-775M 775W
Software Windows XP Home SP3
Thank god AMD is making a comeback; hopefully it will force Intel to cut their prices to remain competitive.
 
Joined
May 19, 2007
Messages
7,662 (1.24/day)
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5
Thank god AMD is making a comeback; hopefully it will force Intel to cut their prices to remain competitive.

This is what everyone should look forward to
 
Joined
Apr 2, 2009
Messages
3,505 (0.64/day)
Is that surprising? :D

Well?

I thought all CEOs did was play golf and stamp documents. How naive of me!
j/k, of course. I know better than that.

Still, it's quite surprising that once Hector was gone, AMD started to get better. Hector was bad mojo, or... you know, he just sucked at what he was supposed to do.

How he got his fame is beyond me.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.80/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
Assassin's Creed's developer was gagged and slapped by NVIDIA when ATI owned NVIDIA in the game :p

You mean the whole 10.1 issue? No, it wasn't. 10.1 didn't give the gains; they accidentally left out part of the code in the patch, so the 10.1 cards weren't actually having to render as much as the 10.0 cards. That's where the performance gains came from. Once that code was corrected, the gains disappeared. The conspiracy was made up by ATI fanboys.
 

HammerON

The Watchful Moderator
Staff member
Joined
Mar 2, 2009
Messages
8,397 (1.52/day)
Location
Up North
System Name Threadripper
Processor 3960X
Motherboard ASUS ROG Strix TRX40-XE
Cooling XSPC Raystorm Neo (sTR4) Water Block
Memory G. Skill Trident Z Neo 64 GB 3600
Video Card(s) PNY RTX 4090
Storage Samsung 960 Pro 512 GB + WD Black SN850 1TB
Display(s) Dell 32" Curved Gaming Monitor (S3220DGF)
Case Corsair 5000D Airflow
Audio Device(s) On-board
Power Supply EVGA SuperNOVA 1000 G5
Mouse Roccat Kone Pure
Keyboard Corsair K70
Software Win 10 Pro
Benchmark Scores Always changing~
I remember the first computer I built with an Athlon 64 3200. Then a 3500, then a 3700, then a 4000, and finally an FX-53. Those were all great CPUs compared to Intel's (at the time). Then I switched to the Core 2 Duo.
It is nice to see AMD back in the "game" and applying pressure to Intel :toast:
 

FryingWeesel

New Member
Joined
Mar 15, 2009
Messages
132 (0.02/day)
You mean the whole 10.1 issue? No, it wasn't. 10.1 didn't give the gains; they accidentally left out part of the code in the patch, so the 10.1 cards weren't actually having to render as much as the 10.0 cards. That's where the performance gains came from. Once that code was corrected, the gains disappeared. The conspiracy was made up by ATI fanboys.

Wrong. One of the main features of 10.1 was/is to remove that extra rendering pass, so no code was left out and no effects were missing. The fact is that was an excuse to explain the patch, not a conspiracy made up by ATI fanboys. In this case the game really was patched to keep NVIDIA happy, since they had dumped money (or in this case hardware) into helping develop the game.

There are plenty of links about it, and the ones that go into depth explain it quite well: 10.1 removes the need for extra rendering passes for some effects, the same effects that gave the performance boost to ATI cards.

So you can read up on this and get the FACTS, not the excuses Ubi used to placate NVIDIA.

http://techreport.com/discussions.x/14707

.....So we have confirmation that the performance gains on Radeons in DirectX 10.1 are indeed legitimate. The removal of the rendering pass is made possible by DX10.1's antialiasing improvements and should not affect image quality. Ubisoft claims it's pulling DX10.1 support in the patch because of a bug, but is non-committal on whether DX10.1 capability will be restored in a future patch for the game....

Basically it was removed to erase the advantage ATI had shown because their cards support 10.1, when NOTHING NVIDIA had, or even has today, can support 10.1 (true DX10).
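
For anyone curious what "reusing the depth buffer" actually looks like on the API side, here is a rough sketch (mine, not Ubisoft's code; the helper name and setup are illustrative) of the D3D10.1 trick being described: create the depth target with a typeless format, then build a shader resource view over the multisampled depth buffer so a later pass can read the depth that is already there instead of rendering the scene's depth a second time. It assumes a device created against the 10.1 interfaces and skips all error handling.

Code:

#include <d3d10_1.h>

// Sketch only: create an MSAA depth buffer that a later pass can sample directly.
// Returns the shader resource view; the depth-stencil view used while rendering
// geometry is handed back through outDsv.
ID3D10ShaderResourceView1* CreateReusableDepth(ID3D10Device1* dev,
                                               UINT width, UINT height,
                                               ID3D10DepthStencilView** outDsv)
{
    // Typeless format so the same texture can be both a depth target and a texture.
    D3D10_TEXTURE2D_DESC td = {};
    td.Width            = width;
    td.Height           = height;
    td.MipLevels        = 1;
    td.ArraySize        = 1;
    td.Format           = DXGI_FORMAT_R24G8_TYPELESS;
    td.SampleDesc.Count = 4;   // 4x MSAA: the case plain DX10 cannot sample
    td.Usage            = D3D10_USAGE_DEFAULT;
    td.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

    ID3D10Texture2D* depthTex = nullptr;
    dev->CreateTexture2D(&td, nullptr, &depthTex);

    // View used while laying down depth in the normal geometry pass.
    D3D10_DEPTH_STENCIL_VIEW_DESC dsv = {};
    dsv.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dsv.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    dev->CreateDepthStencilView(depthTex, &dsv, outDsv);

    // The 10.1-only part: a shader resource view over the *multisampled* depth
    // buffer, so the AA/post pass can read it instead of re-rendering depth.
    D3D10_SHADER_RESOURCE_VIEW_DESC1 srv = {};
    srv.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srv.ViewDimension = D3D10_1_SRV_DIMENSION_TEXTURE2DMS;
    ID3D10ShaderResourceView1* depthSrv = nullptr;
    dev->CreateShaderResourceView1(depthTex, &srv, &depthSrv);

    depthTex->Release(); // the views keep the texture alive
    return depthSrv;
}

On plain DX10 that last view simply can't be created for a multisampled depth buffer, which is why the gain only shows up with AA enabled, exactly the case Ubisoft's own interview describes.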
 

FryingWeesel

New Member
Joined
Mar 15, 2009
Messages
132 (0.02/day)
A little more info:
The 10.1 path in Assassin's Creed is actually legitimate, because in DX10.1 you can just reuse the depth buffer instead of doing a second pass.

Again, that's something NVIDIA cards can't do, because NVIDIA didn't want to support true DX10 (hence MS cutting the DX10 spec down and having to bring out DX10.1 later).

ATI, on the other hand, had true DX10 (now called 10.1) support with the HD 2000 cards, but... well, NVIDIA didn't want to follow MS's spec and cried enough that MS backed down and removed the stuff NVIDIA couldn't/wouldn't support.

Mind you, I'm on an 8800 GTS 512... so don't say I'm an NVIDIA hater. I love this card, but I don't love the actions of the company behind it.

http://www.pcgameshardware.de/aid,6...Assassins-Creed-planned/Assassins-Creed/News/

You might remember: The enormously successful Assassin's Creed was the first game to have DX 10.1 support. But the patch 1.02 removed this feature.

PCGH was able to get some more information about this business. Below you will find an email interview with Ubisoft.

PCGH: D3D 10.1 support in Assassin's Creed was a hidden feature. Why did you choose not to announce this groundbreaking technology?
Ubisoft: The support for DX10.1 was minimal. When investigating the DX10 performance, we found that we could optimize a pass by reusing an existing buffer, which was only possible with DX10.1 API.


PCGH: What features from Direct 3D 10.1 do you use in the release version? Why do they make Assassin's Creed faster? And why does FSAA work better on D3D 10.1 hardware?
Ubisoft: The re-usage of the depth buffer makes the game faster. However, the performance gains that were observed in the retail version are inaccurate since the implementation was wrong and a part of the rendering pipeline was broken.
This optimization pass is only visible when selecting anti-aliasing. Otherwise, both DX10 and DX10.1 use the same rendering pipeline.


PCGH: Why do you plan to remove the D3D 10.1 support?
Ubisoft: Unfortunately, our original implementation on DX10.1 cards was buggy and we had to remove it.


PCGH: Are there plans to implement D3D 10.1 again?
Ubisoft: There is currently no plan to re-implement support for DX10.1.

If that doesn't look like somebody just making excuses for patching out something that benefits the "other team," I don't know what you've been smoking...
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.64/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Let me put this in bullets:

1) Most games are still designed for DirectX 9.0c so they don't lose the enormous customer potential of Windows 98 through Windows XP.

2) DirectX 10 support is usually coded as an alternate path in software (it's easier on the hardware when run on Vista). That is, it is more or less the same as DirectX 9.0c. Very, very few developers go out of their way to focus on DirectX 10 support (that is, games released exclusively for DirectX 10).

3) DirectX 10, being mostly useless from the sales and development standpoint, carries over to DirectX 10.1; however, even fewer people have DirectX 10.1 support than DirectX 10.

4) Ubisoft developed the game with DirectX 10.1 in mind. First, they saw that NVIDIA announced they had no plans to support DirectX 10.1. Then they ran into problems with the DirectX 10.1 code path after the game launched. They figured that only about 1 out of every 10 cards playing the game could handle DirectX 10.1 and decided it would cost too much to fix the botched code compared to just removing it altogether.

And that's pretty much it. It wasn't worth fixing so they removed it. NVIDIA's dominance and them saying they won't support DirectX 10.1 may have something to do with deciding it wasn't worth fixing but, as with most publishers, it comes down to cost. The cost to fix it exceeded the amount they were willing to pay so they just got rid of it.
 
Joined
Jun 27, 2008
Messages
96 (0.02/day)
4) is just for the Assassin's Creed case, right?

Since Ubi put DX10.1 in Tom Clancy's HAWX
 

ShadowFold

New Member
Joined
Dec 23, 2007
Messages
16,918 (2.84/day)
Location
Omaha, NE
System Name The ShadowFold Draconis (Ordering soon)
Processor AMD Phenom II X6 1055T 2.8ghz
Motherboard ASUS M4A87TD EVO AM3 AMD 870
Cooling Stock
Memory Kingston ValueRAM 4GB DDR3-1333
Video Card(s) XFX ATi Radeon HD 5850 1gb
Storage Western Digital 640gb
Display(s) Acer 21.5" 5ms Full HD 1920x1080P
Case Antec Nine-Hundred
Audio Device(s) Onboard + Creative "Fatal1ty" Headset
Power Supply Antec Earthwatts 650w
Software Windows 7 Home Premium 64bit
Benchmark Scores -❶-❸-❸-❼-
HAWX runs awesome with DX10.1 on, and so does STALKER, but that's an AMD game, so. I really like the boosts I get in games that have 10.1; too bad NVIDIA doesn't use it. I don't get why they don't, it really is great...
 
Joined
Nov 21, 2007
Messages
3,688 (0.62/day)
Location
Ohio
System Name Felix777
Processor Core i5-3570k@stock
Motherboard Biostar H61
Memory 8gb
Video Card(s) XFX RX 470
Storage WD 500GB BLK
Display(s) Acer p236h bd
Case Haf 912
Audio Device(s) onboard
Power Supply Rosewill CAPSTONE 450watt
Software Win 10 x64
Exactly. I mean, why in the hell would publishers even care what NVIDIA had to say? I sure as hell wouldn't. All TWIMTBP games have that because NVIDIA sponsors them and gives them cash, right? If games were DX10.1, I'd bet my entire system that ATI and NVIDIA would have switched roles, with ATI on top.
 

FryingWeesel

New Member
Joined
Mar 15, 2009
Messages
132 (0.02/day)
Let me put this in bullets:

1) Most games are still designed for DirectX 9.0c so they don't lose the enormous customer potential of Windows 98 through Windows XP.

2) DirectX 10 support is usually coded as an alternate path in software (it's easier on the hardware when run on Vista). That is, it is more or less the same as DirectX 9.0c. Very, very few developers go out of their way to focus on DirectX 10 support (that is, games released exclusively for DirectX 10).

3) DirectX 10, being mostly useless from the sales and development standpoint, carries over to DirectX 10.1; however, even fewer people have DirectX 10.1 support than DirectX 10.

4) Ubisoft developed the game with DirectX 10.1 in mind. First, they saw that NVIDIA announced they had no plans to support DirectX 10.1. Then they ran into problems with the DirectX 10.1 code path after the game launched. They figured that only about 1 out of every 10 cards playing the game could handle DirectX 10.1 and decided it would cost too much to fix the botched code compared to just removing it altogether.

And that's pretty much it. It wasn't worth fixing so they removed it. NVIDIA's dominance and them saying they won't support DirectX 10.1 may have something to do with deciding it wasn't worth fixing but, as with most publishers, it comes down to cost. The cost to fix it exceeded the amount they were willing to pay so they just got rid of it.

Read my posts + links. The fact is there was no botched code; it was just an excuse to remove something that was making TWIMTBP look bad. There was no "need" to remove it, the need came from greed. NVIDIA was PISSED that an NVIDIA game was running better on ATI hardware due to a FEATURE of DX10 (the ability to avoid a second rendering pass by reusing the depth buffer).

I have personally seen the game on ATI hardware vs. my 8800 GTS; it looks/runs better on a 3850/3870 or even a 2900 XT than it runs for me on my 8800 GTS 512 MB (755/1900/2200) once AA is enabled.

The R600 and higher are TRUE DX10 (what's now called 10.1) cards; the 4000 series cards add back some features of DX9 cards (hardware AA support instead of doing it all in shaders).

Had NVIDIA not refused to support true DX10 and convinced MS to dumb it down, they would have benefited from one less rendering pass being needed. But NV refuses to support 10.1, and when it showed a benefit for ATI on a game NV supported (either with cash, advertising or hardware), NV was PISSED and got Ubi to remove it...

It's not a conspiracy theory, it's just business, and NV doing what I would call a dirty trick on the public at large, even their own customers.
 
Joined
Nov 21, 2007
Messages
3,688 (0.62/day)
Location
Ohio
System Name Felix777
Processor Core i5-3570k@stock
Motherboard Biostar H61
Memory 8gb
Video Card(s) XFX RX 470
Storage WD 500GB BLK
Display(s) Acer p236h bd
Case Haf 912
Audio Device(s) onboard
Power Supply Rosewill CAPSTONE 450watt
Software Win 10 x64
lol, just the fact that they wouldn't get DX10.1 going on their hardware makes me laugh at them. I mean, I wonder how much better performance we would have in games like STALKER and Crysis if they were DX10.1 and not 10. Not to mention that is probably what Microsoft had in mind when they said DX10 would run better than DX9; they were referring to what we now call DX10.1. Well, at least that's my theory. NVIDIA is so pathetic :roll:
 

FryingWeesel

New Member
Joined
Mar 15, 2009
Messages
132 (0.02/day)
a_ump, that's exactly what they were referring to. There are features that can improve both performance and quality that were removed from "10" to placate NVIDIA, so we are not getting the best possible game experience; instead we get DX9c games with some DX10 shader effects tacked on. And when a company puts out a true DX10.1 path on a TWIMTBP title, giving 10.1 hardware better performance, NVIDIA has it removed because it makes them look bad.

Hell, the G80, G92, GT200, and we still don't see DX10.1 out of NVIDIA. They COULD do it, but it would take more work than just reusing stuff they already have :/
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.80/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
How do you know MS removed them to placate NV? How do you know there wasn't botched code in Assassin's Creed? Nothing you are claiming has any solid evidence behind it.

Plain and simple, it's a conspiracy theory made up by ATI fanboys to make themselves feel better. Nv never paid off the multitude of other vendors whose TWIMTBP titles ran better on ATI hardware.

It's all a bunch of BS.
 

FryingWeesel

New Member
Joined
Mar 15, 2009
Messages
132 (0.02/day)
MS cut back DX10 at NVIDIA's request; there have been a couple of articles about it online over the last couple of years. The G80 CAN'T do some of the stuff the original DX10 spec called for, so MS pulled those parts out, since at the time NVIDIA was the only maker with a DX10 card (the 2900 wasn't available yet, as you full well know).

MS, I'm sure, hoped that by cutting back DX10 and allowing the G80 to be a true DX10 card (by changing what DX10 really was) they would be able to draw more people to Vista and DX10. It didn't work, mostly due to bad press and the fact that pre-SP1 Vista was a buggy pain in the ass to deal with.

You can compare the image quality of DX9, 10 and 10.1 in Assassin's Creed yourself and see that there's no problem. You can also read the DX10.1 spec and see that what they referred to (the "missing rendering pass") is a SPECIFIC FEATURE of DX10.1 that makes it more efficient than DX10 by allowing the depth buffer to be reused instead of needing a second rendering pass.

Again, if you look at the statements Ubi made when interviewed about it, they don't hold up; they are vague or use double talk to avoid telling people what the real reason is.

To me it comes off as them saying whatever they have to in order to justify removing something that works fine for ATI owners.

It doesn't affect me directly, as through this whole time I have had a G92 card, yet you say I'm an ATI fanboy because I don't just accept the excuses Ubi and NVIDIA put out for their actions.

Like NVIDIA saying they didn't put 10.1 support in the GTX 260 and GTX 280 because "nobody's using 10.1." Then why even bother supporting DX10 at all? NOBODY is truly using DX10, because it would cut off too large a portion of the market: the people running 2K/XP with DX9 hardware. They could have just made a really bitchin' DX9 card since nobody's really using 10... but that would look really insane... (hell, it looks insane to me that they put out extremely high-priced cards with no DX10.1...).

But hey, you must be right, NVIDIA can do no wrong after all... :rolleyes:

Personally, I have seen the stuff nV has pulled over the years, and despite really liking my current card and being impressed by NVIDIA's current driver development, I don't think they are what you seem to think they are. They are not flawless, and they are not above bribery and other dirty tricks to keep their lead in benchmarks.

I guess you also think the Doom 3 "conspiracy" was thought up by ATI fanboys?

To refresh your memory: NVIDIA and id worked together and intentionally put in code that would run like SHIT on ATI hardware. They used "texture lookups" instead of shader math; NVIDIA hardware did texture lookups insanely well back then, and ATI's hardware did shader math insanely well. By editing one file and replacing the texture lookup code with equivalent shader math, ATI cards became FASTER than NVIDIA cards with no quality difference (though those changes also slowed NVIDIA cards down even more than texture lookups slowed ATI cards down).

In the end ATI put a fix in their drivers to get around the "problem." Clearly, looking at what they did, it wouldn't have been hard to put both paths in the game and have it auto-detect ATI vs. NVIDIA and use the proper path for each card, but they didn't...

This stuff has happened many times over the years.

The first Tiger Woods golf game, for example, wouldn't run in 3D mode on non-NVIDIA cards; you could trick it into running in full 3D mode with all features by using an app to change the device ID to that of an NVIDIA card.
That was an early TWIMTBP title, and they have continued to do that kind of stuff over the years. Hey, it's a good marketing move if you don't get caught, as they did with AC, Doom 3 and Tiger Woods (just three examples).

I mean, if you can keep your performance higher than the competitor's for the first months of benching, you're set; if you can keep it going longer, you're golden.

If you get caught, you just get the company to say the game/app needs to be patched because of flawed code or some other excuse.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.80/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
MS cut back DX10 at NVIDIA's request; there have been a couple of articles about it online over the last couple of years. The G80 CAN'T do some of the stuff the original DX10 spec called for, so MS pulled those parts out, since at the time NVIDIA was the only maker with a DX10 card (the 2900 wasn't available yet, as you full well know).

MS, I'm sure, hoped that by cutting back DX10 and allowing the G80 to be a true DX10 card (by changing what DX10 really was) they would be able to draw more people to Vista and DX10. It didn't work, mostly due to bad press and the fact that pre-SP1 Vista was a buggy pain in the ass to deal with.

You can compare the image quality of DX9, 10 and 10.1 in Assassin's Creed yourself and see that there's no problem. You can also read the DX10.1 spec and see that what they referred to (the "missing rendering pass") is a SPECIFIC FEATURE of DX10.1 that makes it more efficient than DX10 by allowing the depth buffer to be reused instead of needing a second rendering pass.

Again, if you look at the statements Ubi made when interviewed about it, they don't hold up; they are vague or use double talk to avoid telling people what the real reason is.

To me it comes off as them saying whatever they have to in order to justify removing something that works fine for ATI owners.

It doesn't affect me directly, as through this whole time I have had a G92 card, yet you say I'm an ATI fanboy because I don't just accept the excuses Ubi and NVIDIA put out for their actions.

Like NVIDIA saying they didn't put 10.1 support in the GTX 260 and GTX 280 because "nobody's using 10.1." Then why even bother supporting DX10 at all? NOBODY is truly using DX10, because it would cut off too large a portion of the market: the people running 2K/XP with DX9 hardware. They could have just made a really bitchin' DX9 card since nobody's really using 10... but that would look really insane... (hell, it looks insane to me that they put out extremely high-priced cards with no DX10.1...).

But hey, you must be right, NVIDIA can do no wrong after all... :rolleyes:

Personally, I have seen the stuff nV has pulled over the years, and despite really liking my current card and being impressed by NVIDIA's current driver development, I don't think they are what you seem to think they are. They are not flawless, and they are not above bribery and other dirty tricks to keep their lead in benchmarks.

I guess you also think the Doom 3 "conspiracy" was thought up by ATI fanboys?

To refresh your memory: NVIDIA and id worked together and intentionally put in code that would run like SHIT on ATI hardware. They used "texture lookups" instead of shader math; NVIDIA hardware did texture lookups insanely well back then, and ATI's hardware did shader math insanely well. By editing one file and replacing the texture lookup code with equivalent shader math, ATI cards became FASTER than NVIDIA cards with no quality difference (though those changes also slowed NVIDIA cards down even more than texture lookups slowed ATI cards down).

In the end ATI put a fix in their drivers to get around the "problem." Clearly, looking at what they did, it wouldn't have been hard to put both paths in the game and have it auto-detect ATI vs. NVIDIA and use the proper path for each card, but they didn't...

This stuff has happened many times over the years.

The first Tiger Woods golf game, for example, wouldn't run in 3D mode on non-NVIDIA cards; you could trick it into running in full 3D mode with all features by using an app to change the device ID to that of an NVIDIA card.
That was an early TWIMTBP title, and they have continued to do that kind of stuff over the years. Hey, it's a good marketing move if you don't get caught, as they did with AC, Doom 3 and Tiger Woods (just three examples).

I mean, if you can keep your performance higher than the competitor's for the first months of benching, you're set; if you can keep it going longer, you're golden.

If you get caught, you just get the company to say the game/app needs to be patched because of flawed code or some other excuse.
I didn't call you a fanboy. I said fanboys made it up. Did you make it up?

And Ubi never said a render pass was missing, like the DX10.1 feature you are referring to. They said their implementation was buggy. If you want to take that as a conspiracy against ATI by nV and Ubi, be my guest.

And none of what you are saying has any solid backing in terms of evidence. No proof of motive exists. No, NV is not an angel of a company, nor is MS, Intel, or AMD. They are all guilty of something shady at any given point in time, but just because a game has the TWIMTBP tag on it does not mean the developer is doing anything to hurt ATI. Yes, they optimize for nV, because nV provides them the means to do so, but they don't sabotage ATI like so many want to believe.
 

HammerON

The Watchful Moderator
Staff member
Joined
Mar 2, 2009
Messages
8,397 (1.52/day)
Location
Up North
System Name Threadripper
Processor 3960X
Motherboard ASUS ROG Strix TRX40-XE
Cooling XSPC Raystorm Neo (sTR4) Water Block
Memory G. Skill Trident Z Neo 64 GB 3600
Video Card(s) PNY RTX 4090
Storage Samsung 960 Pro 512 GB + WD Black SN850 1TB
Display(s) Dell 32" Curved Gaming Monitor (S3220DGF)
Case Corsair 5000D Airflow
Audio Device(s) On-board
Power Supply EVGA SuperNOVA 1000 G5
Mouse Roccat Kone Pure
Keyboard Corsair K70
Software Win 10 Pro
Benchmark Scores Always changing~
Well stated Wile E
 
Joined
Feb 21, 2008
Messages
4,985 (0.85/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
People who buy Intel have had a good, competitive processor for a year or three now. The cutting-edge types still upgrade, though.

People who only buy AMD (for whatever reason) finally got a great, competitive processor with the Phenom II.

So the "AMD only" group wasn't very motivated to upgrade until the Phenom II. Most upgraded from the dismal original Phenom or the good old groundbreaking X2 on 939 or AM2.

I buy AMD and Intel. Why would you limit yourself to only one or the other? It's not a sports team... it's a processor.


PS: I am just saying that Intel was ahead of the game by a lot from the Core 2 launch until the Phenom II finally caught up, though it's still behind the Core i7.
 

FryingWeesel

New Member
Joined
Mar 15, 2009
Messages
132 (0.02/day)
Some do, some don't do things to hamper performance on ATI/NVIDIA cards in titles tied to their hardware. As you should fully know, some companies are well known for it; most aren't so blatant about it, though.

Many times you see "unexplainable" performance issues with one or the other company's hardware for no apparent reason. I mean, HL2 vs. Doom 3: ATI is just better at D3D, and the game/engine was optimized (at least at the time) for ATI, BUT it also had rendering path optimizations that helped some NVIDIA cards run better as well. Doom 3 had a specific piece of code that ran VERY poorly on ATI hardware; somebody found the fix and posted it (and ATI's driver department then figured out how to fix it in the drivers with that info).

id is one of those companies I used to have nothing but respect for. They used to be very even-handed; they would add optimizations for most commonly available hardware: 3dfx, ATI, NVIDIA, hell, even PowerVR got support in Quake 1 and 2. Then came Doom 3...

There are things I will accept as optimizations and things I won't accept as purely being optimizations. Doom 3 is one title that was clearly coded with extreme bias toward NVIDIA (it would have been easy to put both code paths in). AC, well, from what I've read myself it's very clear that nV pressured Ubi to "fix" their problem, and the easiest fix was to just disable/remove DX10.1 and say it was flawed/borked...
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.80/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
Some do, some don't do things to hamper performance on ATI/NVIDIA cards in titles tied to their hardware. As you should fully know, some companies are well known for it; most aren't so blatant about it, though.

Many times you see "unexplainable" performance issues with one or the other company's hardware for no apparent reason. I mean, HL2 vs. Doom 3: ATI is just better at D3D, and the game/engine was optimized (at least at the time) for ATI, BUT it also had rendering path optimizations that helped some NVIDIA cards run better as well. Doom 3 had a specific piece of code that ran VERY poorly on ATI hardware; somebody found the fix and posted it (and ATI's driver department then figured out how to fix it in the drivers with that info).

id is one of those companies I used to have nothing but respect for. They used to be very even-handed; they would add optimizations for most commonly available hardware: 3dfx, ATI, NVIDIA, hell, even PowerVR got support in Quake 1 and 2. Then came Doom 3...

There are things I will accept as optimizations and things I won't accept as purely being optimizations. Doom 3 is one title that was clearly coded with extreme bias toward NVIDIA (it would have been easy to put both code paths in). AC, well, from what I've read myself it's very clear that nV pressured Ubi to "fix" their problem, and the easiest fix was to just disable/remove DX10.1 and say it was flawed/borked...
Adding all those optimizations costs more development money, something the devs' parent companies take very seriously nowadays. Dev teams no longer get the time or budget they used to.

And again, still no proof exists that Ubi pulled 10.1 as a favor to nV.
 
Joined
Feb 21, 2008
Messages
4,985 (0.85/day)
Location
Greensboro, NC, USA
System Name Cosmos F1000
Processor i9-9900k
Motherboard Gigabyte Z370XP SLI, BIOS 15a
Cooling Corsair H100i, Panaflo's on case
Memory XPG GAMMIX D30 2x16GB DDR4 3200 CL16
Video Card(s) EVGA RTX 2080 ti
Storage 1TB 960 Pro, 2TB Samsung 850 Pro, 4TB WD Hard Drive
Display(s) ASUS ROG SWIFT PG278Q 27"
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR HXi HX1000i 1000watt
Mouse G400s Logitech
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
Software Win10 Pro, Win7 x64 Professional
Adding all those optimizations costs more development money, something the devs' parent companies take very seriously nowadays. Dev teams no longer get the time or budget they used to.

And again, still no proof exists that Ubi pulled 10.1 as a favor to nV.

ATI's DX10.1 hardware compliance isn't compatible with the latest DirectX 10.1 software Microsoft provides to developers. I remember reading about that and thinking no wonder nobody bothers with 10.1. :laugh:

I know that affected the ATI 3xxx series, but maybe the 4xxx series fixed the mistake?
 

FryingWeesel

New Member
Joined
Mar 15, 2009
Messages
132 (0.02/day)
There's no proof they didn't, either, and their comments when interviewed don't lead me to believe they removed it for any reason other than that it gave ATI an advantage.

And the optimization for Doom 3 took a user very little time to figure out. If you would like, I could link the post on MegaGames about it...
http://www.megagames.com/news/html/pc/doom3enhancetheexperiencept2.shtml

Enhance the ATI Experience


It is, of course, a well known fact that Doom 3 is a game which performs best on boards by nVidia. This has left ATI fans frustrated and eager for a driver update or some other fix. Since ATI has not yet responded, a way of improving how Doom 3 handles on ATI cards has been posted on the Beyond3D forums. According to the author, the fix can increase frame rate from 34 fps at 1280x1024 to 48 fps. Results will, of course, depend on each individual setup. A further suggestion from the forum is that the fix really kicks in if vsync is enabled. Please feel free to post your experience with the fix on the MegaGames Forums.

The fix involves changing some code which can be found in the Doom 3 pak000.pk4 file. For those not interested in the technical side of the fix, an already-changed file is available by following the download tab above. Extract it so that the shader file goes under doom3\base\glprogs. This replaces a dependent texture read with equivalent math, which runs better on ATI cards but seems to run slower on NV boards, so only apply it if you have an ATI card.

...this should be good enough proof that ATI hardware can run Doom 3 just as well as, if not better than, nVidia hardware, and that we can toss all the "ATI sucks in OpenGL", "ATI's drivers suck" etc. into the trashcan where they belong.

The full, do-it-yourself, fix is as follows:

I picked up Doom 3 today, and let me begin by saying it's a kickass game so far. A few minuses, like weapon reload (which I find adds nothing to a game except annoyance, so I don't know why many devs keep adding it to their games), but overall it's much above my expectations.

Anyway, on to the fun part: exploring the technology.
I think I've found the source of why this game runs comparatively slowly on ATI hardware vs. nVidia at the moment, and found a solution to the problem.

First, open your doom3\base folder. Double-click on the pak000.pk4 file. In the "Windows can't open this file... bla bla" dialog, go ahead and associate the file with an app like WinRAR. With this file open in WinRAR, go to the glprogs directory inside it. In there you'll find the shaders. The interaction.vfp file seems to be the main rendering shader. Altering this shader to output a constant color turns most objects into that constant color, except for stuff like computer screens, etc.

So double-click the interaction.vfp file to open it (you may have to associate the .vfp extension with a text editor like Notepad or WordPad first, since we're going to edit the file). Scroll down to the fragment shader. You'll find these lines:

Code:

PARAM subOne = { -1, -1, -1, -1 };
PARAM scaleTwo = { 2, 2, 2, 2 };


Add this right below them:

Code:

PARAM specExp = { 16, 0, 0, 0 };


Now scroll down to this:

Code:

# perform a dependent table read for the specular falloff
TEX R1, specular, texture[6], 2D;


Comment out that line by adding a "#" to it, and add another line that will do the same thing with math instead, so it should look like this:

Code:

# perform a dependent table read for the specular falloff
# TEX R1, specular, texture[6], 2D;
POW R1, specular.x, specExp.x;
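# (added note: R1 = specular.x ^ specExp.x, i.e. specular^16 -- the same value the table read above produced, just computed with math)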


Save the file and close your text editor. WinRAR will ask if you want to update the file in the archive; select yes. Close WinRAR and enjoy about 40% higher performance in Doom 3. I haven't done extensive testing yet, but my performance went from 34 fps at 1280x1024 to 48 fps.

Conclusion and discussion:
I don't want to complain about Carmack's work; I still consider him to be the industry leader in graphics engines. But when I read the shader, it struck me how many texture accesses it did compared to how relatively short the shader is, even for stuff that could just as well be done with math for a small cost in instructions. Using a dependent texture lookup for POW evaluation makes a lot of sense for R200-level hardware due to instruction set limits, but for R300 and up it's much better to just spend the three cycles it takes to evaluate POW with math instead of risking texture cache thrashing with a dependent texture read, which may be much more costly, especially since the access pattern in this case will be far from linear. Also, using math improves the quality too, even though it may not be very noticeable in this game.

I should point out, though, that I'm not sure the constant specular factor of 16 that I chose is the one the game uses, so output may be slightly different. But if this solution is worked into the game in a future patch, that's easily configurable by the game so there won't be a difference, except that it will be a lot faster.

An interesting follow-up discussion may be why this dependent texture lookup is much slower on our hardware than on nVidia's. Maybe there's an architectural difference that's to blame, or maybe something else? The main point here, though, is that this should be good enough proof that ATI hardware can run Doom 3 just as well as, if not better than, nVidia hardware, and that we can toss all the "ATI sucks in OpenGL", "ATI's drivers suck" etc. into the trashcan where they belong.

There are more advanced versions, but the MegaGames one is easy to find; that's why I use it :)

The fact is, as you can see, the changes were EASY to make and made a HUGE difference in performance for ATI cards, but id didn't include such shader/math-based code, because NVIDIA cards did texture lookups faster than they did math (at the time).
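
To make the lookup-versus-math trade-off concrete, here's a tiny self-contained sketch (mine, not from the Doom 3 files) of the same idea on the CPU: a specular falloff read from a small table, like the TEX into texture[6], next to the same value computed directly with pow(), like the POW line in the fix. The table size and the exponent of 16 only mirror the guesses in the Beyond3D post, so treat the numbers as illustrative.

Code:

#include <cmath>
#include <cstdio>

int main() {
    // Stand-in for "texture[6]": a small 1D table of x^16, indexed by the
    // specular term (N.H) in [0, 1].
    const int kSize = 256;
    float lut[kSize];
    for (int i = 0; i < kSize; ++i) {
        float x = static_cast<float>(i) / (kSize - 1);
        lut[i] = std::pow(x, 16.0f);
    }

    float specular = 0.83f; // example N.H value

    // "TEX R1, specular, texture[6], 2D;" -- fetch the precomputed falloff.
    float viaTable = lut[static_cast<int>(specular * (kSize - 1))];

    // "POW R1, specular.x, specExp.x;" -- compute it directly instead.
    float viaMath = std::pow(specular, 16.0f);

    // The two agree up to the table's quantization; on the GPUs of that era the
    // fetch was the cheap option for NVIDIA and the math the cheap one for ATI.
    std::printf("table: %.4f  math: %.4f\n", viaTable, viaMath);
    return 0;
}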
 