
AMD Now Almost Worth A Quarter of What it Paid for ATI

Joined
Apr 16, 2010
Messages
2,067 (0.40/day)
System Name The Stone that the Builders Refused / iJayo
Processor R5 1600/ R7 3700X
Motherboard Asrock AB350 Pro4 / Asus Rog Strix B450-F gaming
Cooling Cryorig M9 / Noctua NH-D14
Memory G skill 16 Gigs ddr4 / 16 gigs PNY ddr4
Video Card(s) Nvidia GTX 660 / Nvidia RTX 2070 Super
Storage 120gig 840 evo, 120gig adata sp900 / 1tb Mushkin M.2 ssd 1 & 3 tb seagate hdd, 120 gig Hyper X ssd
Display(s) 42" Nec retail display monitor/ 34" Dell curved 165hz monitor
Case Pink Enermax Ostrog / Phanteks Enthoo Evolv Tempered Glass edition
Audio Device(s) Altec Lansing Expressionist Bass/ M-Audio monitors
Power Supply Corsair450 / Be Quiet Dark Power Pro 650
Mouse Corsair Vengeance M65 / Zalman Knossos
Keyboard corsair k95 / Roccat Vulcan 121
Software Windows 10 Pro / Windows 10 Pro
Benchmark Scores meh... feel me on the battle field!
I like the idea of a Samsung merger/buyout... but Samsung probably sees


[attached image: download.jpg]
 
Joined
Nov 18, 2010
Messages
7,125 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
Holy Crap... 100 comments about nothing :D

WHO CARES... As long as they stay afloat with their design team, they will sell their own product, just as ARM does.
 
Joined
Apr 19, 2011
Messages
2,198 (0.46/day)
Location
So. Cal.
If the stock drops you can just wait for it to go back up.
Exactly - and where would you be had you bought Apple stock just before the BoD brought back Steve Jobs in 1997? Those were grim days, and remember, it took something like seven years for the stock to begin its march up. You just buy some, kiss that money good-bye, and only revisit it after 10-15 years (if you live that long) to judge. At that point you're either pleasantly surprised or, like those who gamble a few hundred, you kissed it good-bye the moment it left your hand. Folks who lose 500-1k at blackjack don't normally come back 10-15 years later and say it was stupid... is this any different? Well, other than the instant gratification if you win (some peg the odds at ~18%), and that's only if you walk away right then. What are the odds that AMD can move beyond this and double your money? IDK.
 
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
I think we can safely say that AMD won't die off. Intel wouldn't allow it because that would mean potentially huge anti-trust litigation against Intel.
Unlikely. Anti-trust litigation applies to wrongdoing. Intel can't be held directly responsible for the failure of a company brought about by its own mismanagement - if it could, AMD's board of directors would have carte blanche to throw money away like used Kleenex........oh, nvm.
In the event of AMD going under (extremely unlikely, since the company could exist - in an extreme situation - as a design house drawing revenue solely from IP), all that would happen is that Intel would be forced to operate under a consent decree to ensure that the company did not take advantage of its position. A consent decree is basically the only thing that kept IBM in check for decades, when it could quite easily have squeezed the "seven dwarfs" out of the mainframe market.
Honestly, I would rather see AMD focus on something. They don't have the money (never did) to compete with Intel and Nvidia at the same time.
It would have been better if AMD themselves had come to this same decision voluntarily, rather than having it forced upon them in a series of painful amputations - although I'm not sure AMD's BoD have the stones to set a long term goal based upon core competency. AMD have a tendency to follow trends, not set them.
I know that AMD's x86 license isn't transferable in an acquisition, and how it plays out in the case of a merger is another story as well. I'm not so sure about shader technology though, as AMD bought ATI and all of the IP seemed to go with it.
A merger is still a change of ownership (as an entity) as far as the agreement is concerned. Section 5.2 of the agreement has the relevant clauses. You're right about the IP complications - and it isn't just ATI. A large part of ATI's IP originated from 3DLabs, who in turn had acquired Dynamic Pictures, Chromatic Research, and Intergraph's graphics division. All these companies had IP-sharing agreements in place with other companies that ended up being subsumed by other players (Nvidia's acquisition of SGI's graphics IP, for example) - in addition, Nvidia and ATI had a very non-adversarial relationship once they'd seen off the other graphics players (S3, 3dfx, Matrox, Rendition, etc.)
This is a case where Samsung can't get into x86, but they could partner with AMD to produce and develop the GPU side of APUs, which I would imagine would give huge benefits to both sides.
Graphics and parallelization is where it's at (or will be). As far as x86 is concerned, I doubt anyone would actually want it - imagine how far behind a potential "new" player would be. Intel is too entrenched in the high-margin x86 enterprise markets, and the low end is a dogfight between x86 and ARM for the lowest pricing/power envelope on a limited feature set, at razor-thin margins.
So I would like to see more focus out of AMD. In all seriousness, I think AMD needs help; a partner that could offer assistance in one market....
Basically, you're saying that AMD needs management vision and a defined strategic plan that can be augmented by new technologies/markets, without wavering from it at the first sign of something shiny. Taking the decision-making out of AMD's BoD's hands? I fully agree if that is the case, although just bringing in someone with a proven track record of success and an understanding of strategic planning might suffice (Lisa Su's previous work experience doesn't indicate that she is the ONE). Renée James will be looking for a CEO position next year and would be a great catch for AMD - whether AMD could tempt her is another matter entirely.
 
Joined
Jul 31, 2014
Messages
480 (0.14/day)
System Name Diablo | Baal | Mephisto | Andariel
Processor i5-3570K@4.4GHz | 2x Xeon X5675 | i7-4710MQ | i7-2640M
Motherboard Asus Sabertooth Z77 | HP DL380 G6 | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Cooling Swiftech H220-X | Chassis cooled (6 fans + HS) | dual-fanned heatpipes | small-fanned heatpipe
Memory 32GiB DDR3-1600 CL9 | 96GiB DDR3-1333 ECC RDIMM | 32GiB DDR3L-1866 CL11 | 8GiB DDR3L-1600 CL11
Video Card(s) Dual GTX 670 in SLI | Embedded ATi ES1000 | Quadro K2100M | Intel HD 3000
Storage many, many SSDs and HDDs....
Display(s) 1 Dell U3011 + 2x Dell U2410 | HP iLO2 KVMoIP | 3200x1800 Sharp IGZO | 1366x768 IPS with Wacom pen
Case Corsair Obsidian 550D | HP DL380 G6 Chassis | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Audio Device(s) Auzentech X-Fi HomeTheater HD | None | On-board | On-board
Power Supply Corsair AX850 | Dual 750W Redundant PSU (Delta) | Dell 330W+240W (Flextronics) | Lenovo 65W (Delta)
Mouse Logitech G502, Logitech G700s, Logitech G500, Dell optical mouse (emergency backup)
Keyboard 1985 IBM Model F 122-key, Ducky YOTT MX Black, Dell AT101W, 1994 IBM Model M, various integrated
Software FAAAR too much to list
Which is part of the reason both AMD and Intel are so fond of extending the instruction set regularly. There is absolutely no efficiency reason to do so, it just adds complexity to the chip from a performance per watt perspective.

Haha. No. There are absolutely both efficiency and performance reasons. Have you seen any of the AVX performance boosts compared to non-AVX code doing literally the same thing? Around 100% improvement as I recall, for around 30% higher power usage - and you can be damn sure there are a few patents on that.
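
To make that concrete, here's a toy sketch of the kind of loop AVX speeds up (my own illustration, untested, assuming GCC/Clang with -mavx and 32-byte-aligned buffers; real gains depend heavily on the workload):

#include <immintrin.h>
#include <stddef.h>

/* Scalar baseline: one float per iteration. */
void add_scalar(const float *a, const float *b, float *out, size_t n) {
    for (size_t i = 0; i < n; i++)
        out[i] = a[i] + b[i];
}

/* AVX version: eight floats per iteration in one 256-bit register.
   Assumes n is a multiple of 8 and the pointers are 32-byte aligned. */
void add_avx(const float *a, const float *b, float *out, size_t n) {
    for (size_t i = 0; i < n; i += 8) {
        __m256 va = _mm256_load_ps(a + i);
        __m256 vb = _mm256_load_ps(b + i);
        _mm256_store_ps(out + i, _mm256_add_ps(va, vb));
    }
}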

If you want to go older, MMX and SSE are basically the backbone of modern multimedia (high-quality music and graphics) being a thing at all on PCs (you could probably do it all using plain old i486 binary-compatible code given how highly clocked CPUs are now, but the speed boosts from using MMX and SSE are real). Go even older, and the math coprocessor/FPU "extension" is why 3D games are a thing at all - it was absolutely needed for id's Quake engine (Wolfenstein 3D and DOOM still got by on fixed-point integer math).

If you want more recent, widely-used stuff, just compare an AES-NI-enabled AES encryption program against plain-x86 AES - even with the full complement of MMX, SSE and AVX extensions, software AES is nowhere near AES-NI.
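
For flavour, this is roughly what that looks like in code - a minimal sketch of just the round primitive, not a full AES implementation (assumes a CPU with AES-NI and something like gcc -maes to build):

#include <wmmintrin.h>

/* One full AES round (SubBytes + ShiftRows + MixColumns + AddRoundKey)
   in a single instruction. A pure-software implementation does the same
   work with table lookups, shifts and XORs - dozens of instructions,
   plus cache-timing side-channel headaches the hardware version avoids. */
__m128i aes_round(__m128i state, __m128i round_key) {
    return _mm_aesenc_si128(state, round_key);
}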

If you want more essential, look at the various virtualization extensions across the whole CPU industry (VT-x/AMD-V, VT-d/IOMMU, similar features on ARM and POWER) that made "the cloud" a thing by making fast x86 virtualization possible in the first place.

So please, do tell me again how instruction set extensions don't add to efficiency, or improve performance per watt, or improve all-out performance.

I still maintain that AMD's biggest issue is that they have utterly and completely lost the big x86 server market (Bulldozer and Thuban being unscalable is why, and that hasn't changed since 2012!) and have essentially zero GPGPU compute presence compared to the sheer number of Tesla deployments in the wild. Combine CUDA being well-known and loved by the scientific community with Xeon Phi's x86 nature, and going AMD with only OpenCL can be a hard pill to swallow for many. Plus AMD GPUs run hot and use a lot of power - fine for a desktop, a deal-breaker when you're trying to fit a few hundred in a single room and have to pay the bills. Oh, and the Fury X is hopeless as a compute card: the radiator is impossible to fit in most servers.

If, and I really mean IF AMD can become competitive again in the server market, they will see a return to profitability. If they are unable to do so, their future is looking pretty bleak.
 

Fx

Joined
Oct 31, 2008
Messages
1,332 (0.24/day)
Location
Portland, OR
Processor Ryzen 2600x
Motherboard ASUS ROG Strix X470-F Gaming
Cooling Noctua
Memory G.SKILL Flare X Series 16GB DDR4 3466
Video Card(s) EVGA 980ti FTW
Storage (OS)Samsung 950 Pro (512GB), (Data) WD Reds
Display(s) 24" Dell UltraSharp U2412M
Case Fractal Design Define R5
Audio Device(s) Sennheiser GAME ONE
Power Supply EVGA SuperNOVA 650 P2
Mouse Mionix Castor
Keyboard Deck Hassium Pro
Software Windows 10 Pro x64
That's what Nvidia wanted everyone to believe. They lost the consoles because they couldn't make an x86 APU - they didn't have x86 cores. Intel, on the other hand, didn't have a good GPU. So Microsoft and Sony went with the only company that had an x86 CPU AND a good GPU. The fact that AMD was not in a position to negotiate higher prices was also a nice bonus for both Microsoft and Sony.

I have always tended to believe that was a BS statement by Nvidia as well for the same reasons. Those were legit wins for AMD.
 
Joined
Jul 31, 2014
Messages
480 (0.14/day)
System Name Diablo | Baal | Mephisto | Andariel
Processor i5-3570K@4.4GHz | 2x Xeon X5675 | i7-4710MQ | i7-2640M
Motherboard Asus Sabertooth Z77 | HP DL380 G6 | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Cooling Swiftech H220-X | Chassis cooled (6 fans + HS) | dual-fanned heatpipes | small-fanned heatpipe
Memory 32GiB DDR3-1600 CL9 | 96GiB DDR3-1333 ECC RDIMM | 32GiB DDR3L-1866 CL11 | 8GiB DDR3L-1600 CL11
Video Card(s) Dual GTX 670 in SLI | Embedded ATi ES1000 | Quadro K2100M | Intel HD 3000
Storage many, many SSDs and HDDs....
Display(s) 1 Dell U3011 + 2x Dell U2410 | HP iLO2 KVMoIP | 3200x1800 Sharp IGZO | 1366x768 IPS with Wacom pen
Case Corsair Obsidian 550D | HP DL380 G6 Chassis | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Audio Device(s) Auzentech X-Fi HomeTheater HD | None | On-board | On-board
Power Supply Corsair AX850 | Dual 750W Redundant PSU (Delta) | Dell 330W+240W (Flextronics) | Lenovo 65W (Delta)
Mouse Logitech G502, Logitech G700s, Logitech G500, Dell optical mouse (emergency backup)
Keyboard 1985 IBM Model F 122-key, Ducky YOTT MX Black, Dell AT101W, 1994 IBM Model M, various integrated
Software FAAAR too much to list
That's what Nvidia wanted everyone to believe. They lost the consoles because they couldn't make an x86 APU - they didn't have x86 cores. Intel, on the other hand, didn't have a good GPU. So Microsoft and Sony went with the only company that had an x86 CPU AND a good GPU. The fact that AMD was not in a position to negotiate higher prices was also a nice bonus for both Microsoft and Sony.

I have always tended to believe that was a BS statement by Nvidia as well for the same reasons. Those were legit wins for AMD.

I disagree. Why would Nvidia bother with $20 per console when they can sell Tegra for automotive purposes for way more, and then add massive service contracts for their machine-learning stuff on top? There's a reason they exited mobile even though they had very interesting chips: much like IBM with PCs, and recently with x86 servers, it comes down to profit margins.

The x86 CPU is a red herring: if they wanted to, the console makers could easily have moved to ARM or (more likely) used POWER again, especially when you factor in that Nvidia is one of the founding members of OpenPOWER, together with IBM, Mellanox (high-bandwidth, low-latency networking), Google and Tyan (motherboard manufacturer). And if the hint isn't obvious enough: Nvidia could have made either an ARM- or a POWER-based APU for the console makers; they declined to do so, citing margins that were too small. As for the software side: software will compile for both. More work is spent on porting engines to each console's API/ABI than on fiddling with compilers, because compilers are genuinely good enough. Besides, outside of MS, pretty much everyone uses GCC or LLVM/Clang, so support is just as good all around.
 
Joined
Sep 6, 2013
Messages
2,978 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7

Fx

Joined
Oct 31, 2008
Messages
1,332 (0.24/day)
Location
Portland, OR
Processor Ryzen 2600x
Motherboard ASUS ROG Strix X470-F Gaming
Cooling Noctua
Memory G.SKILL Flare X Series 16GB DDR4 3466
Video Card(s) EVGA 980ti FTW
Storage (OS)Samsung 950 Pro (512GB), (Data) WD Reds
Display(s) 24" Dell UltraSharp U2412M
Case Fractal Design Define R5
Audio Device(s) Sennheiser GAME ONE
Power Supply EVGA SuperNOVA 650 P2
Mouse Mionix Castor
Keyboard Deck Hassium Pro
Software Windows 10 Pro x64
I have always tended to believe that was a BS statement by Nvidia as well for the same reasons. Those were legit wins for AMD.

If, and I really mean IF AMD can become competitive again in the server market, they will see a return to profitability. If they are unable to do so, their future is looking pretty bleak.

Agreed. The server business is the real profit in both healthy margins and numbers.
 
Joined
Jul 31, 2014
Messages
480 (0.14/day)
System Name Diablo | Baal | Mephisto | Andariel
Processor i5-3570K@4.4GHz | 2x Xeon X5675 | i7-4710MQ | i7-2640M
Motherboard Asus Sabertooth Z77 | HP DL380 G6 | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Cooling Swiftech H220-X | Chassis cooled (6 fans + HS) | dual-fanned heatpipes | small-fanned heatpipe
Memory 32GiB DDR3-1600 CL9 | 96GiB DDR3-1333 ECC RDIMM | 32GiB DDR3L-1866 CL11 | 8GiB DDR3L-1600 CL11
Video Card(s) Dual GTX 670 in SLI | Embedded ATi ES1000 | Quadro K2100M | Intel HD 3000
Storage many, many SSDs and HDDs....
Display(s) 1 Dell U3011 + 2x Dell U2410 | HP iLO2 KVMoIP | 3200x1800 Sharp IGZO | 1366x768 IPS with Wacom pen
Case Corsair Obsidian 550D | HP DL380 G6 Chassis | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Audio Device(s) Auzentech X-Fi HomeTheater HD | None | On-board | On-board
Power Supply Corsair AX850 | Dual 750W Redundant PSU (Delta) | Dell 330W+240W (Flextronics) | Lenovo 65W (Delta)
Mouse Logitech G502, Logitech G700s, Logitech G500, Dell optical mouse (emergency backup)
Keyboard 1985 IBM Model F 122-key, Ducky YOTT MX Black, Dell AT101W, 1994 IBM Model M, various integrated
Software FAAAR too much to list
I already posted why, posts #80 and #92.

Thanks for partially quoting me and removing all the context, I guess? My point was that Nvidia can make much more with Tegra in automotive than they can with consoles, not that they consider $20 too little (else they wouldn't have bothered with things like the GT 720).

Now, onwards to your posts:

3. Most important: it keeps Nvidia out of the consoles, and considering that many top games are ports from consoles, it keeps AMD's GPUs alive.
That's also the reason why Nvidia would kill to be able to supply an x86-APU-like chip, even with zero margins. But they can't.

We can all see the effects of PhysX and GameWorks on AMD GPUs. We can even see the effects of GameWorks on older Nvidia GPUs. If Nvidia controlled the GPUs in consoles, its proprietary tech would already be a de facto standard. Every game programmed for consoles would have PhysX and GameWorks at its core. It would have been close to impossible for AMD to write drivers that ran without problems and poor performance even on the simplest console ports. Every game would have been a Project Cars AT BEST.

Keeping Nvidia out of consoles is an incredibly minor win for AMD compared to Nvidia completely and utterly dominating AMD in the HPC and server space: $20 per chip shipped vs thousands of dollars per Tesla card, more in support contracts, and even more in fully built solutions from Nvidia like GRID.

PhysX works partially even on AMD systems, and of the two it is the bigger vendor lock-in risk.

GameWorks, on the other hand, is a much more traditional closed-source code-licensing affair, with no restrictions on running it on non-Nvidia hardware. It runs slowly on everything because it's heavy (you know, a bit like Crysis back in 2007... except now it has a pretty marketing name instead of being nothing more than a meme). Why does it run particularly slowly on AMD GPUs? Quite simply because AMD GPUs are designed quite differently from Nvidia's. If most games had GameWorks, AMD would simply respond by designing a new GPU that looks a lot more like the Fermi/Kepler/Maxwell evolutionary family than GCN. No more, no less.

Much the same happened with the GeForce 8 series and Radeon HD 2000 series when Direct3D 10 changed the rendering pipeline completely: the industry as a whole moved from pixel pipelines to much more general-purpose shader processors.

Much the same also happens on the CPU side of things, where Intel and AMD ship vastly different CPU designs that perform vastly differently depending on the workload: currently Bulldozer vs Haswell/Broadwell, before that NetBurst vs K8, and even further back, K6 vs Pentium II/P6.

Nothing to see here with GameWorks/PhysX, so move along and stop bringing it up - unless you're ripping apart AMD's driver team's excuses, in which case, do bring it up as much as possible.

Now, if you say that GameWorks is bad from a conflict-of-interest point of view, then remember that TressFX is also around, as well as various other AMD-centric stuff under AMD Gaming. Besides, GameWorks has always existed, albeit less well-marketed, as the "The Way It's Meant to Be Played" program from way back - but you didn't see people whining about that beyond the first 3-4 months of suspicion, and even then, much less loudly than now.

Consoles were not x86 PCs before this generation. Also, Nvidia didn't have almost 80% of the discrete graphics card market on PCs back then. Not to mention that it was not aggressively pushing proprietary tech like GameWorks, PhysX, G-Sync etc. as it does today. Games for PCs also were not ports from consoles. All of this, combined with Nvidia's deeper pockets and its stronger relationships with game developers, would have given them an absolute advantage over AMD. And Mantle couldn't become the de facto standard for many reasons: no money, no market share, the competition had much bigger influence on game developers, and I don't think the consoles use Mantle anyway.

You have to realize something first. Mantle was not meant to give a big advantage to AMD's GPUs. It was made to make that awful Bulldozer architecture look better in games. It was meant to close the gap between Intel CPUs and FX CPUs, and to give an extra push to APU performance. That's why AMD gave Mantle to Khronos, and that's why they stopped developing it when Microsoft announced DX12. The day Microsoft announced DX12, AMD's plan succeeded. Windows 10 could have come without DX12, like Windows 8 did. You don't know that Microsoft was always going to come out with DX12; I don't know that. The only company that needed DX12 yesterday was AMD, with its mediocre DX11 drivers and that useless Bulldozer architecture (Thuban at 32nm, you morons. Thuban at 32nm). Intel, Nvidia, even Microsoft were happy with the situation; none of those three cared whether DX12 came out or not. AMD, on the other hand, was desperate for a low-level API. Mantle was the best "wasted" money AMD ever spent.

Those benchmarks did show AMD's problem with its DX11 drivers. But I guess the core of those drivers couldn't be changed. They should have fixed that problem the day they decided to follow the "more cores" route on the CPU front.

I am not going to repeat myself here. We just see a few things completely differently :)

As I said before, the CPU architecture is an irrelevant argument. Console makers would have been just as happy with ARM, POWER, or even MIPS. Obviously nobody besides AMD found it profitable enough to bother custom-engineering the silicon for the console makers.

Mantle was a push mostly from DICE (Johan Andersson specifically, which is probably also why he/DICE got the first Fury X, ahead of reviewers :)), not from AMD, though AMD was by far the more responsive company, likely because it would make CPUs less of an argument in games. And sure, while Microsoft was happy with D3D11 as it was, with no real plans for major re-architecting in the works, Nvidia, AMD and game devs would keep pushing new features, and MS and Khronos (OpenGL/Vulkan) would oblige by extending the APIs as needed, as they largely have since D3D10/OGL4. Before Johan pushed and AMD noticed, AMD was happy to coast along and extend D3D10-11/OGL4.x just like Nvidia.

Oh, and no: given where it came from, it's blindingly obvious that Mantle was all about getting better 3D performance. Propping up Bulldozer and Jaguar was just an excellent side benefit as far as Johan (DICE) was concerned, and excellent marketing material for AMD if they could make it stick. And try they did, getting a decent amount of support from all corners and, intentionally or not, keeping it closed-source for a fair while despite their open-source claims.

AMD then gave Mantle to Khronos because not only was Mantle's job as a proof of concept done, they had also finally sanitized the code and manuals enough that it could be handed over and open-sourced. Besides, D3D12 was on the way and Khronos had started NGOGL to look into a lower-level API for OpenGL users; suddenly Mantle was no longer something AMD could use as a competitive advantage, so they handed it over.

However, it is irrelevant in the grand scheme of things: AMD failed to scale Bulldozer and GCN, partly because basically everyone besides Intel failed to deliver 22/20nm, but mostly because they made foolish bets. On the CPU side, they tried to build a better, lower-clocked, wider, higher-core-count NetBurst and smacked right into the same problems Intel failed to solve over 4 (FOUR!) whole manufacturing nodes; on the GPU side, GCN is basically AMD's Fermi - their first truly compute-oriented design - and, rather amusingly, it runs similarly hot.

Still largely irrelevant in the scheme of consoles, though: all three consoles run modified versions of various existing APIs, all with very low-level access to the hardware. Effectively, they already had Mantle - and would have had whatever Nvidia cooked up, had Nvidia accepted.
 
Joined
Sep 6, 2013
Messages
2,978 (0.77/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 16GB G.Skill RIPJAWS 3600 / 16GB G.Skill Aegis 3200 / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, NVMes everywhere / NVMes, more NVMes / Various storage, SATA SSD mostly
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / Sharkoon Rebel 9 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / Coolermaster Devastator / Logitech
Software Windows 10 / Windows 10 / Windows 7
Thanks for partially quoting me and removing all the context I guess?
Are you serious? Everyone can read your post. It's directly above mine! The same is true for this one: your post is DIRECTLY ABOVE THIS POST. I just wanted to inform you that I had already posted about this. There WERE two more posts by me that YOU WERE TOO LAZY TO READ BECAUSE THEY WERE ON THE 4TH PAGE.

You also never explained why that link you posted on the second page was even relevant to the consoles. I quoted you about it on page three; you never answered. You could probably add post #61 to my other two posts, but never mind.

Next time, try to be polite and not falsely accuse others. I read that post of yours; I have NO intention of reading your last post after the way you started it. Have a nice day.
 
Joined
Mar 24, 2011
Messages
2,356 (0.49/day)
Location
VT
Processor Intel i7-10700k
Motherboard Gigabyte Aurorus Ultra z490
Cooling Corsair H100i RGB
Memory 32GB (4x8GB) Corsair Vengeance DDR4-3200MHz
Video Card(s) MSI Gaming Trio X 3070 LHR
Display(s) ASUS MG278Q / AOC G2590FX
Case Corsair X4000 iCue
Audio Device(s) Onboard
Power Supply Corsair RM650x 650W Fully Modular
Software Windows 10
Nvidia said after the PS3 that they were done with consoles because the razor-thin margins were not worth the R&D overhead and the arguing with Microsoft\Sony over costs. Nintendo has used AMD\ATi for years, so it's not surprising they stuck with them. I have no doubt MS\Sony would have done an Intel\Nvidia setup if the cost hadn't been astronomical and both companies had been willing. Microsoft had already done such a setup in the original Xbox, and the costs nearly destroyed their entry into the market (Nvidia was basically ripping them off).

AMD happened to have APUs that fit the bill, but I think an Intel\Nvidia setup could easily have worked. A cut-down 4-core Intel CPU and something like a cut-down 960 would have made for quite an efficient and powerful console. On paper the consoles seem great with 8 cores, but those cores are so weak it would probably have been better to go with fewer, much more powerful cores.

AMD getting the contracts for the consoles was not necessarily about making a lot of money, they knew they wouldn't. It was about fattening up their revenue stream to increase their perceived value for a potential buyer.
 
Joined
Aug 20, 2007
Messages
20,787 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
Haha. No. There are absolutely both efficiency and performance reasons. Have you seen any of the AVX performance boosts compared to non-AVX code doing literally the same thing? Around 100% improvement as I recall, for around 30% higher power usage - and you can be damn sure there are a few patents on that.

If you want to go older, MMX and SSE are basically the backbone of modern multimedia (high-quality music and graphics) being a thing at all on PCs (you could probably do it all using plain old i486 binary-compatible code given how highly clocked CPUs are now, but the speed boosts from using MMX and SSE are real). Go even older, and the math coprocessor/FPU "extension" is why 3D games are a thing at all - it was absolutely needed for id's Quake engine (Wolfenstein 3D and DOOM still got by on fixed-point integer math).

If you want more recent, widely-used stuff, just compare an AES-NI-enabled AES encryption program against plain-x86 AES - even with the full complement of MMX, SSE and AVX extensions, software AES is nowhere near AES-NI.

If you want more essential, look at the various virtualization extensions across the whole CPU industry (VT-x/AMD-V, VT-d/IOMMU, similar features on ARM and POWER) that made "the cloud" a thing by making fast x86 virtualization possible in the first place.

So please, do tell me again how instruction set extensions don't add to efficiency, or improve performance per watt, or improve all-out performance.

I still maintain that AMD's biggest issue is that they have utterly and completely lost the big x86 server market (Bulldozer and Thuban being unscalable is why, and that hasn't changed since 2012!) and have essentially zero GPGPU compute presence compared to the sheer number of Tesla deployments in the wild. Combine CUDA being well-known and loved by the scientific community with Xeon Phi's x86 nature, and going AMD with only OpenCL can be a hard pill to swallow for many. Plus AMD GPUs run hot and use a lot of power - fine for a desktop, a deal-breaker when you're trying to fit a few hundred in a single room and have to pay the bills. Oh, and the Fury X is hopeless as a compute card: the radiator is impossible to fit in most servers.

If, and I really mean IF AMD can become competitive again in the server market, they will see a return to profitability. If they are unable to do so, their future is looking pretty bleak.

You do realize you could probably clock the snot out of a generalized chip were it not for the additional die complexity added by these precious extensions?

More than 200%, I'd dare wager.

You are basically arguing a CISC-over-RISC design philosophy... And it's been well established that the gains from higher-clocked simple processors are usually greater than those from lower-clocked complex ones - all without having to code for anything special.

There is far more to the goal of extensions than performance. It actually has almost nothing to do with that. If anything, it's about catering to a select application... And locking code in.
 
Joined
Jul 31, 2014
Messages
480 (0.14/day)
System Name Diablo | Baal | Mephisto | Andariel
Processor i5-3570K@4.4GHz | 2x Xeon X5675 | i7-4710MQ | i7-2640M
Motherboard Asus Sabertooth Z77 | HP DL380 G6 | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Cooling Swiftech H220-X | Chassis cooled (6 fans + HS) | dual-fanned heatpipes | small-fanned heatpipe
Memory 32GiB DDR3-1600 CL9 | 96GiB DDR3-1333 ECC RDIMM | 32GiB DDR3L-1866 CL11 | 8GiB DDR3L-1600 CL11
Video Card(s) Dual GTX 670 in SLI | Embedded ATi ES1000 | Quadro K2100M | Intel HD 3000
Storage many, many SSDs and HDDs....
Display(s) 1 Dell U3011 + 2x Dell U2410 | HP iLO2 KVMoIP | 3200x1800 Sharp IGZO | 1366x768 IPS with Wacom pen
Case Corsair Obsidian 550D | HP DL380 G6 Chassis | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Audio Device(s) Auzentech X-Fi HomeTheater HD | None | On-board | On-board
Power Supply Corsair AX850 | Dual 750W Redundant PSU (Delta) | Dell 330W+240W (Flextronics) | Lenovo 65W (Delta)
Mouse Logitech G502, Logitech G700s, Logitech G500, Dell optical mouse (emergency backup)
Keyboard 1985 IBM Model F 122-key, Ducky YOTT MX Black, Dell AT101W, 1994 IBM Model M, various integrated
Software FAAAR too much to list
You do realize you could probably clock the snot out of a generalized chip were it not for the additional die complexity added by these precious extensions?

More than 200%, I'd dare wager.

You are basically arguing a CISC-over-RISC design philosophy... And it's been well established that the gains from higher-clocked simple processors are usually greater than those from lower-clocked complex ones - all without having to code for anything special.

There is far more to the goal of extensions than performance. It actually has almost nothing to do with that. If anything, it's about catering to a select application... And locking code in.

At low core counts, sure. At high core counts, not really: scaling frequency requires scaling voltage, and put together you get anywhere from quadratic (^2) to quartic (^4) power scaling. Scaling die area, on the other hand, is only a little more than linear. So while games have development cycles too short for the massively-threaded approach, other workloads like scientific computing, databases and virtualization have much longer development cycles, and thread out as needed to fill highly-threaded CPUs.
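
Back-of-envelope, using the classic dynamic-power approximation (a simplification that ignores leakage, so treat the exponents as rough):

    P ≈ C·V^2·f

Since V has to rise roughly in step with f once you're past the efficiency sweet spot, power ends up scaling with something like f^3: a 30% overclock costs about 1.3^3 ≈ 2.2x the power, while doubling the core count at fixed clocks costs roughly 2x the power for roughly 2x the throughput (if the workload threads out).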

RISC vs CISC is entirely irrelevant: no modern high-performance core executes instructions directly; they decode them into their own internal instructions, called micro-ops, and execute those instead. Combine that with out-of-order execution, simultaneous multithreading (Hyper-Threading is an implementation of that with 2 threads per core), branch predictors and the like, and you get far more from extensions than you do from scaling clock speeds.

If you don't believe me, compare Broadwell-M to a 1.3GHz Pentium 3. The Broadwell-M will kick the crap out of the P3 while running at a lower clock, in single-core mode, just from having better x86 + SSE... and then you turn on AVX and it just flies.

Even RISC cores are not immune to such effects: the ARM ISA has grown from ARMv1 to ARMv8, gaining extra instructions with every release and, much like x86, specialised instructions as the market demands, since they are much faster than the general-purpose stuff. The difference is that on x86 I have core x86 (binary-compatible back to at least the 486) plus a bajillion extensions, while on ARM I have 8 versions of the ISA, plus custom additions if I tweak the core.
 
Joined
Aug 20, 2007
Messages
20,787 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
If you don't believe me, compare Broadwell-M to a 1.3GHz Pentium 3. The Broadwell-M will kick the crap out of the P3 while running at a lower clock, in single-core mode, just from having better x86 + SSE... and then you turn on AVX and it just flies.

Of course it would; it's doing much more via the additional die space spent implementing those instructions. But if that Pentium 3 core were built on the same process, it could likely clock beyond 6GHz. At least, that's my understanding of it.

I'll concede I am not certain about this, as my knowledge comes from the old days when IBM was doing research into this. IBM now only makes extremely high-end servers, and even they have made the PowerPC instruction set pretty beefy. My main point was not that you shouldn't use proprietary extensions at all, but rather that they have limited benefit beyond keeping the instruction set proprietary. Now, there may be SOME benefit in select instances (vector math and AVX are a great example of specialization), but they certainly keep x86 patents from expiring in a useful way - and don't underestimate Intel's valuation of that.
 
Joined
May 18, 2010
Messages
3,427 (0.67/day)
System Name My baby
Processor Athlon II X4 620 @ 3.5GHz, 1.45v, NB @ 2700Mhz, HT @ 2700Mhz - 24hr prime95 stable
Motherboard Asus M4A785TD-V EVO
Cooling Sonic Tower Rev 2 with 120mm Akasa attached, Akasa @ Front, Xilence Red Wing 120mm @ Rear
Memory 8 GB G.Skills 1600Mhz
Video Card(s) ATI ASUS Crossfire 5850
Storage Crucial MX100 SATA 2.5 SSD
Display(s) Lenovo ThinkVision 27" (LEN P27h-10)
Case Antec VSK 2000 Black Tower Case
Audio Device(s) Onkyo TX-SR309 Receiver, 2x Kef Cresta 1, 1x Kef Center 20c
Power Supply OCZ StealthXstream II 600w, 4x12v/18A, 80% efficiency.
Software Windows 10 Professional 64-bit
...and yet even AMD help Intel's cause by using Intel system builds for their GPU press deck benchmarks. If AMD don't have faith in their own platform, should you expect OEMs to?

This has nothing to do with the fact that Eroldru's AMD performs identically to his Intel while running cooler and costing less.

Fanboys aside, the share-price drop is indicative of the mass perception and market belief that AMD is no longer delivering a solid product.
The geeks here are an insignificant minority in the global market. AMD's decline can only be blamed on fanboys if you are one yourself. Stocks don't listen to fanboys; they listen to market presence and profitability.
It's brutally naive to assume AMD's decline is anything other than lack of product and lack of product perception, combined with possible major mismanagement.
This is not a third party's fault. Similarly, Intel do just enough to stay far out in front: AMD's lack of a threatening product means Intel can get by on minimal R&D expenditure with minimal product improvement.
Don't get me wrong (if you do, you're being illogical): I don't want AMD to disappear. Competition is required for better service and products. I want to see AMD taken over and used properly to create desirable products across a whole range of market segments.
They're not done yet, but they're getting really close.

I would agree overall, but there are so many variables that influence a share price. AMD have made poor business decisions along the way.
 
Joined
Sep 7, 2011
Messages
2,785 (0.60/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
This has nothing to do with the fact that Eroldru's AMD performs identically to his Intel while running cooler and costing less.
So what? This thread is about AMD and its financial position, which is predicated upon its marketing, its brand, and its products.
If AMD had the same mindset, why would they use Intel systems to benchmark their graphics cards for public consumption? Are you saying AMD have an Intel bias? That AMD don't know how to get the best out of their own graphics benchmarking? Why use a competitor's product to showcase your own and, by inference, indicate that that system provides the best results?

So you are basically asking me to believe a random forum member over the company that makes the hardware. Either Eroldru is correct and AMD don't know what they're doing, giving Intel free publicity and torpedoing their own enthusiast platform, or AMD did some comparative benchmarking and went with the system that provided the best numbers. I guess only Eroldru and AMD know for sure... oh, and the reviewers of course:
GTA V CPU performance
Battlefield Hardline CPU performance
Evolve CPU performance
Far Cry 4 CPU performance
Dragon Age: Inquisition CPU performance

As the54thvoid intimated, perception is reality in business - and AMD highlighted a competitor's product over their own at every flagship graphics launch event of the past few years.
Now, what kind of perception does that engender amongst potential customers?
 
Joined
Jul 31, 2014
Messages
480 (0.14/day)
System Name Diablo | Baal | Mephisto | Andariel
Processor i5-3570K@4.4GHz | 2x Xeon X5675 | i7-4710MQ | i7-2640M
Motherboard Asus Sabertooth Z77 | HP DL380 G6 | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Cooling Swiftech H220-X | Chassis cooled (6 fans + HS) | dual-fanned heatpipes | small-fanned heatpipe
Memory 32GiB DDR3-1600 CL9 | 96GiB DDR3-1333 ECC RDIMM | 32GiB DDR3L-1866 CL11 | 8GiB DDR3L-1600 CL11
Video Card(s) Dual GTX 670 in SLI | Embedded ATi ES1000 | Quadro K2100M | Intel HD 3000
Storage many, many SSDs and HDDs....
Display(s) 1 Dell U3011 + 2x Dell U2410 | HP iLO2 KVMoIP | 3200x1800 Sharp IGZO | 1366x768 IPS with Wacom pen
Case Corsair Obsidian 550D | HP DL380 G6 Chassis | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Audio Device(s) Auzentech X-Fi HomeTheater HD | None | On-board | On-board
Power Supply Corsair AX850 | Dual 750W Redundant PSU (Delta) | Dell 330W+240W (Flextronics) | Lenovo 65W (Delta)
Mouse Logitech G502, Logitech G700s, Logitech G500, Dell optical mouse (emergency backup)
Keyboard 1985 IBM Model F 122-key, Ducky YOTT MX Black, Dell AT101W, 1994 IBM Model M, various integrated
Software FAAAR too much to list
(FF autoscrolled too far and I missed john_'s post... time to fix that!)

Are you serious? Everyone can read your post. It's directly above mine! The same is true for this one: your post is DIRECTLY ABOVE THIS POST. I just wanted to inform you that I had already posted about this. There WERE two more posts by me that YOU WERE TOO LAZY TO READ BECAUSE THEY WERE ON THE 4TH PAGE.

You also never explained why that link you posted on the second page was even relevant to the consoles. I quoted you about it on page three; you never answered. You could probably add post #61 to my other two posts, but never mind.

Next time, try to be polite and not falsely accuse others. I read that post of yours; I have NO intention of reading your last post after the way you started it. Have a nice day.

If you wanted to only partially quote me, you should at least have quoted the comparison part. As it is, no matter how close my post is, people (myself included) won't scroll up, and will see just a bald statement that Nvidia doesn't care about consoles because the revenue is small - not that the revenue is small compared to something else. Welcome to twisting people's words to mean something else. I'll be nice, give you the benefit of the doubt, and take it as accidental.

I read the two posts you mentioned specifically at the time. I also read the rest of page 4 many hours earlier, and that's well and truly gone off the stack. If you wanted post #61 included, you should have included it when giving specific examples, since I treated every other post as not part of your response. Welcome to how references work.

As for relevance to consoles: the only relevant bit is that Nvidia decided not to pursue consoles and AMD did. All I did was explain ways in which Nvidia could have provided a competing chip (either by integrating POWER or ARM with GeForce in a SoC, or by pairing an extremely wide link with an existing external CPU - which is where the link to the article about Sierra and Summit is relevant, since, out of necessity, they built a really wide, high-bandwidth link).

Now, let's have a look at the famous post #61:

Considering that Nvidia was already thinking of lowering prices at the high end, this is exactly what AMD needed. I don't expect Nvidia to do a huge price cut after this news; they will do the price cuts they were already considering and stop there. Also, the prices affect all the cards, not just the high end, and at the low/mid range AMD is more than competitive. The price reduction also affects console processors.

Nvidia would have offered to make console chips even with zero margins. The reason is simple: all console games would be using PhysX and GameWorks by now, and AMD GPUs would be considered faulty at best today, with bugs all over the place and poor performance. I think Nvidia tried, but both Sony and Microsoft had cheap x86 consoles in mind that would start bringing profits from day one.

Nvidia controlling console GPUs would not have resulted in GameWorks and PhysX being everywhere, and even if it had, studios would still have ported games over anyway - remember, GameWorks runs on any GPU, not just Nvidia's, and AMD would simply have launched a GPU that looked a lot more like Maxwell 2 than GCN. If PhysX had become commonplace, AMD and game devs would have found a way to replicate the functionality on non-GeForce platforms; it just hasn't been necessary so far.

And what exactly is the link you showed me? I don't find it relevant to consoles or x86, and even if it is, the article is from 11/2014; the PS4 and Xbox One were introduced at the end of 2013. So the question is: what did Nvidia have to offer in 2012? Looking at the wiki, Tegra 3, and that's not an x86 SoC. x86 on all platforms (Xbox One, PS4, PC) makes game development cheaper and faster. The latest rumors say that the Nintendo NX will also use AMD chips, which makes sense, because the biggest problem for Nintendo consoles today is the lack of third-party games.

x86 vs anything else, as I have explained several times now, is irrelevant: programmers in all industries no longer work in assembly, and will not again outside of a bit of hardware init and hand-optimizing certain HPC and crypto routines - and even that is falling out of favour. So you're coding in C/C++, because that's what the SDKs favour by and large for fast games, and C/C++ and most of its various libraries and OSes have been ported to all the major platforms (x86, ARM, POWER, MIPS).
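
A trivial sketch of what I mean (the source is ISA-agnostic; the cross-compiler names below are the usual GNU ones and are my assumption about the installed toolchain):

/* This file has zero architecture-specific code; the exact same source
   builds for all of:
     gcc vec.c                        (native x86-64)
     aarch64-linux-gnu-gcc vec.c      (ARMv8)
     powerpc64le-linux-gnu-gcc vec.c  (POWER)
   What does NOT port for free is the platform API layer (Direct3D on
   Xbox vs Sony's GNM, etc.) - which is where real porting effort goes. */
#include <stdio.h>

typedef struct { float x, y, z; } vec3;

static vec3 vec3_add(vec3 a, vec3 b) {
    return (vec3){ a.x + b.x, a.y + b.y, a.z + b.z };
}

int main(void) {
    vec3 v = vec3_add((vec3){1, 2, 3}, (vec3){4, 5, 6});
    printf("%.1f %.1f %.1f\n", v.x, v.y, v.z);
    return 0;
}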

The only, and I mean ONLY, relevant part of x86 being in consoles is that it makes porting to PC marginally easier (especially from the XBOne, or from PS4 OpenGL). The bulk of the effort is still cross-API/ABI compatibility and translation layers/shims, as it has been since the X360/PS3 generation.

I think Sony and Microsoft know how to negotiate a deal. Also, AMD had its back against the wall, and both companies knew it. So I am pretty sure they explained to AMD that they had many alternative options for their next-gen consoles, and that all those options were bad for AMD. AMD would therefore have had to offer a great deal from the beginning, to guarantee that Sony and/or Microsoft would not turn to Nvidia and/or Intel for the main parts. I don't think AMD was willing to gamble its future just to secure a better deal.

Based on how Nvidia walked away from all three, I think AMD managed to raise the price of its SoC by being the only viable platform left. Intel doesn't have the GPU power, and neither do Qualcomm (Adreno), ARM (Mali) or Imagination (PowerVR) - and then there's the driver situation of those latter three, which looks hopeless judging by Android. Based on HumanSmoke's link, AMD is charging $100-110 per unit, with a 20% margin (hence the $20 number). If AMD were not the only choice, MS and Sony would have pushed for a race to the bottom and driven that price even lower.

This is pure speculation though, so it's probably wrong, though I suspect the truth isn't that far away based on NV's public statements.

Of course it would; it's doing much more via the additional die space spent implementing those instructions. But if that Pentium 3 core were built on the same process, it could likely clock beyond 6GHz. At least, that's my understanding of it.

I'll concede I am not certain about this, as my knowledge comes from the old days when IBM was doing research into this. IBM now only makes extremely high-end servers, and even they have made the PowerPC instruction set pretty beefy. My main point was not that you shouldn't use proprietary extensions at all, but rather that they have limited benefit beyond keeping the instruction set proprietary. Now, there may be SOME benefit in select instances (vector math and AVX are a great example of specialization), but they certainly keep x86 patents from expiring in a useful way - and don't underestimate Intel's valuation of that.

I meant that a Broadwell-M at the same frequency as a P3, running identical code, would be faster - and that the extensions make it faster still.

IBM's server CPUs are actually the last really high-frequency chips out there. If they could push for even more cores at lower clocks, they would be outperforming Intel's Haswell-EX platform - but they can't, simply because no fab can make chips bigger than what they're already shipping (each POWER8 core is bigger than a Haswell x86 core), and then you have the monstrous memory configuration of the POWER8 chips. And that's on 22nm SOI, not the 22nm FinFET CMOS + high-K that Intel uses; SOI allows much higher clock speeds, at the cost of absolutely insane power consumption: Tyan's lower-clocked POWER8 CPUs are 190W-247W TDP, IBM's higher-clocked parts go even higher (300W is a pretty conservative estimate by AT), while Intel's E5-2699v3 and E7-8890v3 are a "mere" 165W.

Keeping the ISA proprietary, while a bit of a nasty thing to do, is a status quo neither Intel, AMD nor anyone else really wants to change: they get to quash any upstart competition without needing to lift a finger. And if you think AMD is nice about it, think again - Intel sued AMD over AMD64 because AMD did not want to license it, and why would they: Opteron with AMD64 had single-handedly smashed into the datacenter, cleared out MIPS and SPARC, and was well on its way to pushing POWER out of everything but the highest of the high-end systems. Meanwhile, Itanium (Intel and HP's 64-bit CPU, probably one of the nicest architectures ever built, with no compiler ever managing to use it properly) was floundering hard, and was later killed off by GPUs doing the only thing Itanium ended up being good for: fast, very parallelisable math. Eventually, after a lot of counter-suing, AMD and Intel settled and cross-licensed a lot of technology, and continue to do so with all the new ISA extensions.
 
Joined
Aug 20, 2007
Messages
20,787 (3.41/day)
System Name Pioneer
Processor Ryzen R9 7950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage 2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
I can't really disagree with that. The main point I was trying to make (though we got on a bit of a tangent there due to my misunderstanding of modern tech) was that Intel/AMD/whoever love keeping extensions in-house if possible, and we seem to have reached agreement there. ;)
 
Joined
Jul 31, 2014
Messages
480 (0.14/day)
System Name Diablo | Baal | Mephisto | Andariel
Processor i5-3570K@4.4GHz | 2x Xeon X5675 | i7-4710MQ | i7-2640M
Motherboard Asus Sabertooth Z77 | HP DL380 G6 | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Cooling Swiftech H220-X | Chassis cooled (6 fans + HS) | dual-fanned heatpipes | small-fanned heatpipe
Memory 32GiB DDR3-1600 CL9 | 96GiB DDR3-1333 ECC RDIMM | 32GiB DDR3L-1866 CL11 | 8GiB DDR3L-1600 CL11
Video Card(s) Dual GTX 670 in SLI | Embedded ATi ES1000 | Quadro K2100M | Intel HD 3000
Storage many, many SSDs and HDDs....
Display(s) 1 Dell U3011 + 2x Dell U2410 | HP iLO2 KVMoIP | 3200x1800 Sharp IGZO | 1366x768 IPS with Wacom pen
Case Corsair Obsidian 550D | HP DL380 G6 Chassis | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Audio Device(s) Auzentech X-Fi HomeTheater HD | None | On-board | On-board
Power Supply Corsair AX850 | Dual 750W Redundant PSU (Delta) | Dell 330W+240W (Flextronics) | Lenovo 65W (Delta)
Mouse Logitech G502, Logitech G700s, Logitech G500, Dell optical mouse (emergency backup)
Keyboard 1985 IBM Model F 122-key, Ducky YOTT MX Black, Dell AT101W, 1994 IBM Model M, various integrated
Software FAAAR too much to list
I can't really disagree with that. The main point I was trying to make (though we got on a bit of a tangent there due to my misunderstanding of modern tech) was that Intel/AMD/whoever love keeping extensions in-house if possible, and we seem to have reached agreement there. ;)

The patents are a by-product of the extensions, not really the true objective. Mind you, as I said, Intel and AMD are more than happy to keep it nice and closed :).
 

64K

Joined
Mar 13, 2014
Messages
6,104 (1.65/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Holy Crap... 100 comments about nothing :D

WHO CARES... As long as they stay afloat with their design team, they will sell their own product, just as ARM does.

Shareholders care, I assure you, and people buying AMD GPUs may well be wondering whether AMD will be around to update the drivers for their cards.

AMD shares are down 36% in the last month.

2015 Q2 Net Profit Margin

Intel 20.51%
Nvidia 11.64%
AMD -19.21%
 
Joined
Aug 29, 2008
Messages
88 (0.02/day)
System Name Master & Blaster & BlastFromThePast
Processor Xeon 1241v3 *103x39 | Xeon L5408 @3.2GHz | P4 2.8 GHz
Motherboard Z97 PRO-GAMER / P45-DS3R | P4MDPT
Cooling Freezer / S775 Copper / S478 Stock
Memory 16GB DDR3-2270 / 8 GB DDR2-800 / 2GB DDR PC3200
Video Card(s) Nitro RX570 8GB 1411 MHz| HD 5770 | Radeon 9600 XT 128 MB
Storage NVME 250+1TB SM2263XT | 256 GB SM2259XT | IDE 250GB
Display(s) MANY
Case SANDBLASTED
Power Supply APS-550 Cap-Modded / 350W Cap-modded
Customers were buying Intel CPUs even when AMD had the better products, and not only in value-for-money terms. Back in ~2004, when the P4 was in the same position the FX CPUs are in now, the press/reviewers were nowhere near as vicious toward Intel's products; check reviews from 2012 and read how horrible the FX CPUs and their architecture supposedly are. Not only that, Intel got a nice $1.4b antitrust fine from the EU, among other similar cases, let alone the famous "Intel finally agrees to pay $15 to Pentium 4 owners" settlement.

In fact, even if we could go back to 2005, when the Athlon X2 was rofl-stomping the room-heater P4s, AMD would not have taken the ~80% market share Intel holds now, because customers are not all that well informed about real-life performance. People pay $400 for an i7 when its production cost, let alone the innovation since 2011's Sandy Bridge, is minimal. Intel is good at many things, but they are the best at running you dry of money: chipsets, new sockets, tiny updates... call it whatever you like. From LGA1156 to LGA1151 in ~4.5 years, while LGA775 lasted longer than all of those sockets together. Even if Zen arrives and can compete against Intel's products, we may not get CPU price wars; it might go the other way around, with Zen being overpriced and the i5 becoming the new value-for-money king while still costing over ~$200. Sometimes customers have to say no to overpriced, recycled tech with just fancier I/O and ±1 pin each year.
 
Joined
Jul 31, 2014
Messages
480 (0.14/day)
System Name Diablo | Baal | Mephisto | Andariel
Processor i5-3570K@4.4GHz | 2x Xeon X5675 | i7-4710MQ | i7-2640M
Motherboard Asus Sabertooth Z77 | HP DL380 G6 | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Cooling Swiftech H220-X | Chassis cooled (6 fans + HS) | dual-fanned heatpipes | small-fanned heatpipe
Memory 32GiB DDR3-1600 CL9 | 96GiB DDR3-1333 ECC RDIMM | 32GiB DDR3L-1866 CL11 | 8GiB DDR3L-1600 CL11
Video Card(s) Dual GTX 670 in SLI | Embedded ATi ES1000 | Quadro K2100M | Intel HD 3000
Storage many, many SSDs and HDDs....
Display(s) 1 Dell U3011 + 2x Dell U2410 | HP iLO2 KVMoIP | 3200x1800 Sharp IGZO | 1366x768 IPS with Wacom pen
Case Corsair Obsidian 550D | HP DL380 G6 Chassis | Dell Precision M4800 | Lenovo Thinkpad X220 Tablet
Audio Device(s) Auzentech X-Fi HomeTheater HD | None | On-board | On-board
Power Supply Corsair AX850 | Dual 750W Redundant PSU (Delta) | Dell 330W+240W (Flextronics) | Lenovo 65W (Delta)
Mouse Logitech G502, Logitech G700s, Logitech G500, Dell optical mouse (emergency backup)
Keyboard 1985 IBM Model F 122-key, Ducky YOTT MX Black, Dell AT101W, 1994 IBM Model M, various integrated
Software FAAAR too much to list
Customers were buying Intel CPUs even when AMD had the better products, and not only in value-for-money terms. Back in ~2004, when the P4 was in the same position the FX CPUs are in now, the press/reviewers were nowhere near as vicious toward Intel's products; check reviews from 2012 and read how horrible the FX CPUs and their architecture supposedly are. Not only that, Intel got a nice $1.4b antitrust fine from the EU, among other similar cases, let alone the famous "Intel finally agrees to pay $15 to Pentium 4 owners" settlement.

In fact, even if we could go back to 2005, when the Athlon X2 was rofl-stomping the room-heater P4s, AMD would not have taken the ~80% market share Intel holds now, because customers are not all that well informed about real-life performance. People pay $400 for an i7 when its production cost, let alone the innovation since 2011's Sandy Bridge, is minimal. Intel is good at many things, but they are the best at running you dry of money: chipsets, new sockets, tiny updates... call it whatever you like. From LGA1156 to LGA1151 in ~4.5 years, while LGA775 lasted longer than all of those sockets together. Even if Zen arrives and can compete against Intel's products, we may not get CPU price wars; it might go the other way around, with Zen being overpriced and the i5 becoming the new value-for-money king while still costing over ~$200. Sometimes customers have to say no to overpriced, recycled tech with just fancier I/O and ±1 pin each year.

That was 2004; today it's 2015, and there's a lot less of that nonsense going on. Remember Nvidia and ATi taking turns faking 3DMark results too? Or AMD threatening not to license AMD64 to Intel? ALL of those companies were raked over the coals good and proper and/or sat through some protracted legal wrangling for that nastiness, and it's just not there anymore, not since VIA bowed out and Transmeta got glomped by Nvidia (who still has an x86 license they agreed not to use in exchange for a very large sum of cash from Intel).

Why is Intel winning? Well, it's simple: nobody can in good conscience recommend anything AMD on the CPU side right now. Intel simply has the better performance at near enough the same price on the consumer side, and AMD has nothing competitive server-side. Oh, and let's not forget, AMD has never managed to build a decent mobile CPU to compete with Pentium M and Core. Say what you want, but not showing up means you lose.

As for the constant socket changes: I for one do NOT want a redux of LGA775, where you had to match chipset, VRM and BIOS versions just to figure out which CPUs you could run. No thanks, I'll take constant socket changes, where I can blindly drop in any CPU that matches the socket, over that particular mess.

As for why Intel isn't releasing the big 30% improvements anymore, it's quite simple: they've run out of big improvements, and as a result they're scaling core counts and power instead, because by and large that's all they can still do. You can read more in my posts here and here, as well as in my conversation with @R-T-B in this very thread, a few posts above your own.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
7,566 (2.34/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling U12A + T30┃AXP120x67
Memory 64GB 6000CL30┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Display(s) 43" QN90B / 32" M32Q / 27" S2721DGF
Case Caselabs S3┃Lazer3D HT5
Power Supply Corsair HX1000┃HDPlex
That was 2004; today it's 2015, and there's a lot less of that nonsense going on. Remember Nvidia and ATi taking turns faking 3DMark results too? Or AMD threatening not to license AMD64 to Intel? ALL of those companies were raked over the coals good and proper and/or sat through some protracted legal wrangling for that nastiness, and it's just not there anymore, not since VIA bowed out and Transmeta got glomped by Nvidia (who still has an x86 license they agreed not to use in exchange for a very large sum of cash from Intel).

Why is Intel winning? Well, it's simple: nobody can in good conscience recommend anything AMD on the CPU side right now. Intel simply has the better performance at near enough the same price on the consumer side, and AMD has nothing competitive server-side. Oh, and let's not forget, AMD has never managed to build a decent mobile CPU to compete with Pentium M and Core. Say what you want, but not showing up means you lose.

As for the constant socket changes: I for one do NOT want a redux of LGA775, where you had to match chipset, VRM and BIOS versions just to figure out which CPUs you could run. No thanks, I'll take constant socket changes, where I can blindly drop in any CPU that matches the socket, over that particular mess.

As for why Intel isn't releasing the big 30% improvements anymore, it's quite simple: they've run out of big improvements, and as a result they're scaling core counts and power instead, because by and large that's all they can still do. You can read more in my posts here and here, as well as in my conversation with @R-T-B in this very thread, a few posts above your own.

Spot on. In addition, AMD could not build a CPU to keep up with the evolution of Core. It was clear that K10 was falling behind, but Bulldozer went and beheaded K10 in a bloody mess, and because of Bulldozer's limitations, Opteron has virtually died. All of these companies, from IBM to AMD to Nvidia to Intel, have engaged in shady dealings in the past; Intel's are simply the only ones that are well known, because everyone attributes AMD's recent failures to Intel.

Also, a wonderful argument regarding sockets. LGA775 was a gong show: DDR2 and DDR3, plus FSB 800, 1066 and 1333, all on one socket, made for a hell of a mess. The current socket strategy makes it much easier for less knowledgeable consumers to get something that works.
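
To make the socket point concrete, here's a toy sketch (every board/CPU entry in it is made up for illustration, not real support data) of why LGA775 compatibility was a multi-way lookup while the current one-socket-per-generation approach collapses it to a single check:

```python
# Toy model of the LGA775-era problem: a CPU fitting the socket did NOT
# mean the board supported it. All entries below are hypothetical.

from dataclasses import dataclass

@dataclass
class Cpu:
    name: str
    socket: str
    fsb_mhz: int

@dataclass
class Board:
    name: str
    socket: str
    supported_fsb: set        # depends on the chipset
    bios_supports_45nm: bool  # depends on VRM spec and BIOS updates

def lga775_compatible(cpu: Cpu, board: Board, cpu_is_45nm: bool) -> bool:
    # Socket fit is only the first of several independent checks.
    return (cpu.socket == board.socket
            and cpu.fsb_mhz in board.supported_fsb
            and (not cpu_is_45nm or board.bios_supports_45nm))

def modern_compatible(cpu: Cpu, board: Board) -> bool:
    # Post-LGA775 strategy: matching socket generation roughly implies support.
    return cpu.socket == board.socket

q9550 = Cpu("Core 2 Quad Q9550", "LGA775", 1333)
old_board = Board("early i945 board", "LGA775", {800, 1066}, False)

print(lga775_compatible(q9550, old_board, cpu_is_45nm=True))  # False: fits, won't run
print(modern_compatible(q9550, old_board))                    # True (socket-only view)
```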
 