Friday, July 17th 2015

AMD Now Almost Worth A Quarter of What it Paid for ATI

It's been gloomy on the markets in the wake of the European economic crisis. This, along with a revised quarterly outlook released by the company, hit AMD very hard over the past week. AMD stock opened at $1.87, down $0.09 (-4.59%) at the time of writing this report, which sets the company's market capitalization at $1.53 billion. That is almost a quarter of what AMD paid to acquire ATI Technologies about a decade ago ($5.60 billion). Earlier this month, AMD took a steep fall of 15.59%, seeing its market cap drop by a quarter.

By comparison, Intel is now worth $140.8 billion (92 times more), and NVIDIA $10.7 billion (7 times more). Among the issues affecting AMD are a decline in PC sales and stiff competition. However, the reasonably positive earnings put out by Intel disprove AMD's excuse that the market is to blame for its poor performance, and the company could slide even further, hitting its all-time low on the financial markets. The company will host an earnings call later today.
Source: Google Finance

136 Comments on AMD Now Almost Worth A Quarter of What it Paid for ATI

#101
Ferrum Master
Holy Crap... 100 comments about nothing :D

WHO CARES... As long as they are afloat with their design team, they will sell their own product, just as ARM does.
#102
Casecutter
buildzoid: If the stock drops you can just wait for it to go back up.
Exactly. Where would you be had you bought Apple stock just before the BoD brought back Steve Jobs in 1997? Those were grim days, and remember, it took like 7 years to see the stock begin to march up. You just buy some and kiss that money good-bye, only to revisit it after 10-15 years (for those who live on) to judge. At that point you're pleasantly surprised, or, like those who gamble several hundred, you kissed it good-bye the moment it left your hand. Folk who lose $500-1k on blackjack don't normally go on about it 10-15 years later and say that was stupid... is it any different? Well, other than the instant gratification if you win (some peg it at ~18%), and that's if you walk away right then. What are the odds that AMD can move beyond this and double your money? IDK.
#103
HumanSmoke
Aquinus: I think we can safely say that AMD won't die off. Intel wouldn't allow it because that would mean potentially huge anti-trust litigation against Intel.
Unlikely. Anti-trust litigation applies to wrongdoing. Intel can't be held directly responsible for the failure of a company based on its own mismanagement - if it could, AMD's board of directors would have carte blanche to throw money away like used Kleenex........oh, nvm.
In the event of AMD going under (extremely unlikely, since the company could exist - in an extreme situation - as a design house drawing revenue solely from IP), all that would happen is that Intel would be forced to operate under a consent decree to ensure that the company did not take advantage of its position. A consent decree is basically the only thing that kept IBM in check for decades when it could quite easily have squeezed the "seven dwarfs" out of the mainframe market.
Aquinus: Honestly, I would rather see AMD focus on something. They don't have the money (never did) to compete with Intel and nVidia at the same time.
It would have been better if AMD themselves had come to this same decision voluntarily, rather than having it forced upon them in a series of painful amputations - although I'm not sure AMD's BoD have the stones to set a long-term goal based upon core competency. AMD have a tendency to follow trends, not set them.
Aquinus: I know that AMD's x86 license isn't transferable in an acquisition, and how it plays in the case of a merger is another story as well. I'm not so sure about shader technology though, as AMD bought ATI and all of the IP seemed to go with it.
A merger is still an ownership change (as an entity) as far as the agreement is concerned. Section 5.2 of the agreement has the relevant clauses. You're right about the IP complications - and it isn't just ATI. A large part of ATI's IP originated from 3DLabs - who in turn had acquired Dynamic Pictures, Chromatic Research, and Intergraph's graphics division. All these companies had IP sharing in place with other companies that ended up being subsumed by other players (Nvidia's acquisition of SGI's graphics IP, for example) - in addition, Nvidia and ATI had a very non-adversarial relationship once they'd seen off the other graphics players (S3, 3dfx, Matrox, Rendition etc.)
Aquinus: This is a case where Samsung can't get into x86, but they could partner with AMD to produce and develop the GPU side of APUs, which would give huge benefits to both sides I would imagine
Graphics and parallelization is where it's at (or will be). As far as x86 is concerned, I doubt anyone would actually want it - imagine how far behind a potential "new" player would be. Intel is too entrenched in the high-margin x86 enterprise markets, and the low end is a dogfight for the lowest pricing/power envelope for a limited feature set between x86 and ARM on razor-thin margins.
Aquinus: So I would like to see more focus out of AMD. In all seriousness, I think AMD needs help; a partner that could offer assistance in one market....
Basically what you're saying is that AMD needs management vision and a defined strategic plan that can be augmented by new technologies/markets, and not waver from it at the first sign of something shiny distracting them? Taking the decision-making out of AMD's BoD's hands? I fully agree if that is the case, although just bringing in someone with a proven track record of success and an understanding of strategic planning might suffice (Lisa Su's previous work experience doesn't indicate that she is the ONE). Renée James will be looking for a CEO position next year, and would be a great catch for AMD - whether AMD could tempt her is another matter entirely.
#104
ZeDestructor
R-T-B: Which is part of the reason both AMD and Intel are so fond of extending the instruction set regularly. There is absolutely no efficiency reason to do so, it just adds complexity to the chip from a performance per watt perspective.
Haha. No. There are absolutely both efficiency and performance reasons. Have you seen any of the AVX performance boosts compared to non-AVX code doing literally the same thing? Around 100% improvement as I recall, with around 30% higher power usage, and you can be damn sure there are a few patents on that.
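
As a rough illustration of what that looks like in practice (a minimal sketch of my own, not taken from any particular benchmark), here is the same reduction written as a plain scalar loop and again with AVX intrinsics - the 8-wide version is where that kind of throughput gain comes from, with the real-world figure depending on the workload and memory bandwidth (compile with -mavx):

#include <immintrin.h>  /* AVX intrinsics */
#include <stddef.h>

/* Plain scalar sum - the "non-AVX code doing literally the same thing". */
float sum_scalar(const float *a, size_t n)
{
    float s = 0.0f;
    for (size_t i = 0; i < n; ++i)
        s += a[i];
    return s;
}

/* AVX sum - 8 floats per instruction instead of 1. */
float sum_avx(const float *a, size_t n)
{
    __m256 acc = _mm256_setzero_ps();
    size_t i = 0;
    for (; i + 8 <= n; i += 8)
        acc = _mm256_add_ps(acc, _mm256_loadu_ps(a + i));

    /* Horizontal reduction of the 8 partial sums. */
    float lanes[8];
    _mm256_storeu_ps(lanes, acc);
    float s = lanes[0] + lanes[1] + lanes[2] + lanes[3]
            + lanes[4] + lanes[5] + lanes[6] + lanes[7];

    for (; i < n; ++i)  /* tail elements that don't fill a full vector */
        s += a[i];
    return s;
}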

If you want to go older, MMX and SSE are basically the backbone of modern multimedia (high-quality music and graphics) being a thing at all on PCs by and large (you could probably do it all using plain old i486 binary-compatible code with how highly-clocked CPUs are now, but the speed boosts from using MMX and SSE are real), or if you go even older, the math coprocessor/FPU "extension" is why 3D games are a thing at all, being absolutely needed for the id1 engine powering Wolfenstein 3D and DOOM.

If you want more recent widely-used stuff, just compare an AES-NI-enabled AES encryption program vs x86 AES - even with the full complement of MMX, SSE and AVX extensions, x86 is nowhere near AES-NI.
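
To make the AES-NI comparison concrete, here's a minimal sketch assuming the 11 AES-128 round keys have already been expanded (the key schedule is omitted): each round of the cipher collapses into a single instruction, which is exactly the work a pure-software implementation has to grind through with table lookups and XORs (compile with -maes):

#include <wmmintrin.h>  /* AES-NI intrinsics */

/* Encrypt one 16-byte block with AES-128.
 * rk[0..10] are the expanded round keys (key schedule not shown). */
__m128i aes128_encrypt_block(__m128i block, const __m128i rk[11])
{
    block = _mm_xor_si128(block, rk[0]);         /* initial AddRoundKey */
    for (int i = 1; i < 10; ++i)
        block = _mm_aesenc_si128(block, rk[i]);  /* SubBytes+ShiftRows+MixColumns+AddRoundKey in one instruction */
    return _mm_aesenclast_si128(block, rk[10]);  /* final round, no MixColumns */
}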

If you want more essential, look at the various virtualization extensions across the whole CPU industry (VT-x/AMD-V, VT-d/IOMMU, similar stuff on ARM and POWER) that made "the cloud" a thing at all by making fast x86 virtualization possible at all.

So please, do tell me again how instruction set extensions don't add to efficiency, or improve performance per watt, or improve all out performance.

I still maintain that AMD's biggest issue is that they have utterly and completely lost the big x86 server market (Bulldozer and Thuban being unscalable is why, and that has remained unchanged since 2012!) and have essentially zero GPGPU compute presence compared to the sheer amount of Tesla deployments in the wild. Combine CUDA being decently well-known and loved by the scientific community, as well as Xeon Phi's x86 nature, and going AMD using only OpenCL can be a hard pill to swallow for many. Plus AMD GPUs run hot and use a lot of power. Fine for a desktop, a deal-breaker when you're trying to fit a few hundred in a single room and have to pay the bills. Oh, and the Fury X is hopeless as a compute card: the radiator is impossible to fit in most servers.

If, and I really mean IF AMD can become competitive again in the server market, they will see a return to profitability. If they are unable to do so, their future is looking pretty bleak.
#105
Fx
john_: That's what Nvidia wanted everyone to believe. They lost the consoles because they couldn't make an x86 APU. They didn't have x86 cores. On the other hand, Intel didn't have a good GPU. So Microsoft and Sony went with the only company that had an x86 CPU AND a good GPU. The fact that AMD is not in a position to negotiate higher prices was also a nice bonus for both Microsoft and Sony.
I have always tended to believe that was a BS statement by Nvidia as well for the same reasons. Those were legit wins for AMD.
#106
ZeDestructor
john_: That's what Nvidia wanted everyone to believe. They lost the consoles because they couldn't make an x86 APU. They didn't have x86 cores. On the other hand, Intel didn't have a good GPU. So Microsoft and Sony went with the only company that had an x86 CPU AND a good GPU. The fact that AMD is not in a position to negotiate higher prices was also a nice bonus for both Microsoft and Sony.
Fx: I have always tended to believe that was a BS statement by Nvidia as well for the same reasons. Those were legit wins for AMD.
I disagree. Why would Nvidia bother with $20 per console when they can sell Tegra for automotive purposes for way more, and then add on massive service contracts for their machine-learning stuff? There's a reason they exited mobile even though they have very interesting chips and could have pushed the industry, and much like IBM with PCs, and recently x86 servers, it comes down to profit margins.

The x86 CPU is a red herring argument: if they wanted to, all the console makers could have easily moved to ARM or (more likely) used POWER again (especially when you factor in that Nvidia is one of the founding members of OpenPOWER together with IBM, Mellanox (high-bandwidth, low-latency networking provider), Google and Tyan (mobo manufacturer)). And if the hint isn't obvious enough: Nvidia could have made either an ARM- or a POWER-based APU for the console people; they declined to do so, citing too-small margins. As for the software side... software will compile for both. More work is spent on porting engines to each console's API/ABI than on fiddling with the compilers, because compilers are genuinely good enough. Besides, outside of MS, pretty much everyone uses GCC or LLVM/Clang, so support is just as good all around.
#107
john_
ZeDestructor: Why would Nvidia bother with $20 per console
I already posted why, posts #80 and #92.
#108
Fx
Fx: I have always tended to believe that was a BS statement by Nvidia as well for the same reasons. Those were legit wins for AMD.
ZeDestructor: If, and I really mean IF AMD can become competitive again in the server market, they will see a return to profitability. If they are unable to do so, their future is looking pretty bleak.
Agreed. The server business is where the real profit is, in both healthy margins and volume.
#109
ZeDestructor
john_: I already posted why, posts #80 and #92.
Thanks for partially quoting me and removing all the context, I guess? My point was that Nvidia can make much more with Tegra in automotive than they can with consoles, not that they consider $20 too little (else they wouldn't have bothered with things like the GT 720).

Now, onwards to your posts:
john_: 3. Most important. It keeps Nvidia out of the consoles, and considering that many top games are ports from consoles, it keeps AMD's GPUs alive.
That's also the reason why Nvidia would kill to be able to supply an x86 APU-like chip, even with zero margins. But they can't.

We can all see the effects of PhysX and GameWorks on AMD GPUs. We can even see the effects of GameWorks on older Nvidia GPUs. If Nvidia were controlling the GPUs in consoles, then its proprietary techs would already have become a de facto standard. Every game programmed on consoles would have PhysX and GameWorks at its core. It would have been close to impossible for AMD to create drivers that would perform without problems or bad performance even on the simplest console game ports. Every game would have been a Project Cars AT BEST.
Keeping Nvidia out of consoles is an incredibly minor win for AMD compared to Nvidia completely and utterly dominating AMD in the HPC and server space: $20 per chip shipped vs $1000s per Tesla card, more in support contracts, and even more in fully built solutions by Nvidia like GRID.

PhysX works partially even on AMD systems, and of the two it is the bigger risk for vendor lock-in.

GameWorks, on the other hand, is a much more traditional closed-source code licensing affair, with no restrictions on running it on non-Nvidia hardware. It runs slowly on everything because it's heavy (you know, a bit like Crysis back in 2006... except now it has a pretty marketing name instead of being nothing more than a meme). Why does it run particularly slowly on AMD GPUs? Well, quite simply because AMD GPUs are designed quite differently from Nvidia's. If most games had GameWorks, AMD would simply respond by designing a new GPU that looks a lot more like the Fermi/Kepler/Maxwell evolutionary family than GCN. No more, no less.

Much the same happened with the GeForce 8000 and Radeon HD 2000 when Direct3D 10 changed the rendering pipeline completely: the industry as a whole moved from pixel pipelines to much more general-purpose shader processors instead.

Much the same also happens on the CPU side of things, with how Intel and AMD have vastly different CPU designs that perform vastly differently depending on the workload - the current matchup being Bulldozer vs Haswell/Broadwell, before that NetBurst vs K8, and even further before that, K6 vs Pentium II/P6.

Nothing to see here in Gameworks/Physx, so move along and stop bringing it up unless you're ripping apart AMD's driver teams' excuses, in which case, do bring it up as much as possible.

Now, if you say that GameWorks is bad from a more conflict-of-interest point of view, then remember that TressFX is also around, as well as various other AMD-centric stuff under AMD Gaming. Besides, GameWorks has always existed, albeit less well marketed, under the "The Way It's Meant to be Played" program from way back, but you didn't see people whining about it after the first 3-4 months of being suspicious, and even then, much less loudly than now.
john_: Consoles were not x86 PCs before this generation. Also, Nvidia didn't have almost 80% of the discrete graphics card market on PCs. Not to mention that it was not aggressively pushing proprietary techs like GameWorks, PhysX, G-Sync etc. as they do today. Games for PCs also were not ports from consoles. All these, combined with the deeper pockets of Nvidia and the stronger relations they have with the game developers, would give them the absolute advantage over AMD. And Mantle couldn't become the de facto standard for many reasons. No money, no market share, the competition had much bigger influence on game developers, and I also think consoles don't use Mantle anyway.

You have to realize something first. Mantle was not meant to give a big advantage to AMD's GPUs. It was made to make that awful Bulldozer architecture look better at games. It was meant to close the gap between Intel CPUs and FX CPUs. To give an extra push to APU performance. That's why AMD gave Mantle to Khronos, that's why they stopped developing it when Microsoft announced DX12. The day Microsoft announced DX12, AMD's plan succeeded. Windows 10 could have come without DX12, like Windows 8. You don't know that Microsoft was going to come out with DX12. I don't know that. The only company that needed DX12 yesterday was AMD, with its mediocre DX11 drivers and that useless Bulldozer architecture (Thuban at 32nm you morons. Thuban at 32nm). Intel, Nvidia, even Microsoft were happy with the situation. No one from those three cared if DX12 would come out or not. On the other hand, AMD was desperate for a low-level API. Mantle was the best wasted money AMD had spent.

Those benchmarks did show AMD's problem with the DX11 drivers. But I guess the core of their drivers couldn't change. They should have fixed that problem the day they decided to follow the "more cores" route on the CPU front.

I am not going to repeat myself here. We just see a few things completely differently :)
As I said before, the CPU architecture is an irrelevant argument. Console makers would have been just as happy with ARM or POWER or even MIPS. Obviously nobody besides AMD found it profitable enough to bother custom engineering the silicon for the console makers.

Mantle was a push mostly from DICE (Johan Andersson specifically, which is probably also why he/DICE got the first Fury X, ahead of reviewers :)), not from AMD, though AMD was the more responsive company by far, likely because it would make CPUs less of an argument in games. And sure, while Microsoft was happy with D3D11 as it was, with no real plans for major re-architecting in the works, Nvidia, AMD and game devs would keep pushing new features, and MS and Khronos (OpenGL/Vulkan) would oblige by extending the APIs as needed, as they largely have since D3D10/OGL4. Before Johan pushed and AMD noticed, AMD was happy to coast along and extend D3D10-11/OGL4.x just like Nvidia.

Oh, and no, given where it came from, it's blindingly obvious that Mantle was all about getting better 3D performance. Propping up Bulldozer and Jaguar were just excellent side benefits as far as Johan (DICE) was concerned, but excellent marketing material for AMD if they could make it stick. And try they did, getting a decent amount of support from all corners and, whether intentionally or not, spending a fair bit of time keeping it closed-source despite their open-source claims.

AMD then gave Mantle to Khronos because not only was Mantle's job as a proof of concept done, they had also finally sanitized the code and manuals enough that it could be both handed over and made open-source. Besides, D3D12 was on the way and Khronos had started NGOGL to look into a lower-level API for OpenGL users - suddenly Mantle was not something AMD could use as a competitive advantage anymore, so they handed it over.

However, it is irrelevant in the grand scheme of things: AMD failed to scale Bulldozer and GCN, partly because basically everyone besides Intel failed to deliver 22/20nm, but mostly because they made foolish bets. On the CPU side, they tried to build a better, lower-clocked, wider, higher-core-count NetBurst and smacked right into the same problems Intel failed to solve over 4 (FOUR!) whole manufacturing nodes, and on the GPU side... GCN is basically AMD's Fermi, their first truly compute-oriented design, and runs similarly hot, rather amusingly.

Still irrelevant in the scheme of consoles by and large, though: all three consoles run modified versions of various existing APIs, all with very low-level access to things, so effectively they already had Mantle - and would have had whatever Nvidia cooked up if Nvidia had accepted.
#110
john_
ZeDestructor: Thanks for partially quoting me and removing all the context I guess?
Are you serious? Everyone can read your post. It's directly over mine! The same is true with this one. Your post is DIRECTLY OVER THIS POST. I just wanted to inform you that I already posted about this. There were two more posts by me that YOU WERE TOO LAZY TO READ BECAUSE THEY WERE ON THE 4TH PAGE.

You also never explained why that link you posted on the second page was even relevant to the consoles. I quoted you on page three about that; you never answered. Probably you could add post #61 to my other two posts, but never mind.

Next time try to be polite and not incorrectly accuse others. I read that post of yours; I have NO intention of reading your last post after the way you started it. Have a nice day.
#111
xenocide
Nvidia said after the PS3 that they were done with consoles because the razor-thin margins were not worth the overhead in R&D and arguing with Microsoft/Sony over costs. Nintendo has used AMD/ATi for years, so it's not surprising they stuck with them. I have no doubt MS/Sony would have done an Intel/Nvidia setup if the cost hadn't been astronomical and both companies were willing. Microsoft had already done such a setup in the original Xbox, and the costs nearly destroyed their entry into the market (Nvidia was basically ripping them off).

AMD happened to have APUs that fit the bill, but I think an Intel/Nvidia setup could have easily worked. Using a cut-down 4-core Intel CPU and something like a cut-down 960 would have resulted in quite the efficient and powerful console. On paper the consoles seem great with 8 cores, but those cores are horrible, so it would probably be better to go with fewer cores that are a lot more powerful.

AMD getting the contracts for the consoles was not necessarily about making a lot of money, they knew they wouldn't. It was about fattening up their revenue stream to increase their perceived value for a potential buyer.
#112
R-T-B
ZeDestructor: Haha. No. There are absolutely both efficiency and performance reasons. Have you seen any of the AVX performance boosts compared to non-AVX code doing literally the same thing? Around 100% improvement as I recall, with around 30% higher power usage, and you can be damn sure there are a few patents on that.

If you want to go older, MMX and SSE are basically the backbone of modern multimedia (high-quality music and graphics) being a thing at all on PCs by and large (you could probably do it all using plain old i486 binary-compatible code with how highly-clocked CPUs are now, but the speed boosts from using MMX and SSE are real), or if you go even older, the math coprocessor/FPU "extension" is why 3D games are a thing at all, being absolutely needed for the id1 engine powering Wolfenstein 3D and DOOM.

If you want more recent widely-used stuff, just compare an AES-NI-enabled AES encryption program vs x86 AES - even with the full complement of MMX, SSE and AVX extensions, x86 is nowhere near AES-NI.

If you want more essential, look at the various virtualization extensions across the whole CPU industry (VT-x/AMD-V, VT-d/IOMMU, similar stuff on ARM and POWER) that made "the cloud" a thing at all by making fast x86 virtualization possible at all.

So please, do tell me again how instruction set extensions don't add to efficiency, or improve performance per watt, or improve all out performance.

I still maintain that AMD's biggest issue is that they have utterly and completely lost the big x86 server market (Bulldozer and Thuban being unscalable is why, and that has remained unchanged since 2012!) and have essentially zero GPGPU compute presence compared to the sheer amount of Tesla deployments in the wild. Combine CUDA being decently well-known and loved by the scientific community, as well as Xeon Phi's x86 nature, and going AMD using only OpenCL can be a hard pill to swallow for many. Plus AMD GPUs run hot and use a lot of power. Fine for a desktop, a deal-breaker when you're trying to fit a few hundred in a single room and have to pay the bills. Oh, and the Fury X is hopeless as a compute card: the radiator is impossible to fit in most servers.

If, and I really mean IF AMD can become competitive again in the server market, they will see a return to profitability. If they are unable to do so, their future is looking pretty bleak.
You do realize you could probably clock the snot out of the generalized chip were it not for the additional die complexity added by these precious extensions?

More than 200% I'd dare wager.

You are basically arguing a CISC over RISC design philosophy... And it's been well established that the gains from higher-clocked simple processors are usually more than from lower-clocked complex ones. All without having to code for anything special.

There is far more to the goal of extensions than performance. It actually has almost nothing to do with that. If anything, it's about catering to a select application... and locking code in.
#113
ZeDestructor
R-T-B: You do realize you could probably clock the snot out of the generalized chip were it not for the additional die complexity added by these precious extensions?

More than 200% I'd dare wager.

You are basically arguing a CISC over RISC design philosophy... And it's been well established that the gains from higher-clocked simple processors are usually more than from lower-clocked complex ones. All without having to code for anything special.

There is far more to the goal of extensions than performance. It actually has almost nothing to do with that. If anything, it's about catering to a select application... and locking code in.
At low core counts, sure. At high core counts, not really: scaling frequency requires scaling voltage too, and put together you get anywhere from quadratic (^2) to quartic (^4) power scaling. Scaling die area, on the other hand, is a little more than linear, so while games have development cycles too short to do the massively-threaded thing, other stuff like scientific computing, databases, virtualization and the like have much longer development cycles, and thread out as needed to fully fill highly-threaded CPUs.
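
A back-of-the-envelope version of that scaling argument (a toy model with illustrative numbers, not measurements): dynamic power goes roughly as cores x V^2 x f, and since higher clocks also demand higher voltage, doubling the clock costs considerably more power than doubling the cores:

#include <stdio.h>

/* Toy dynamic-power model: P ~ cores * V^2 * f (constant factor dropped).
 * The voltage and frequency numbers below are illustrative, not measured. */
static double power(double cores, double volts, double ghz)
{
    return cores * volts * volts * ghz;
}

int main(void)
{
    double base    = power(4, 1.00, 3.0);  /* 4 cores at 3 GHz, 1.00 V          */
    double clocked = power(4, 1.30, 6.0);  /* double the clock, needs ~1.30 V   */
    double wider   = power(8, 1.00, 3.0);  /* double the cores, same V and f    */

    printf("2x clock: %.1fx the power, 2x cores: %.1fx the power\n",
           clocked / base, wider / base);  /* roughly 3.4x vs 2.0x */
    return 0;
}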

RISC vs CISC is entirely irrelevant: no modern high-performance core executes the instructions directly - they decode them into their own internal instructions, called micro-ops, and execute those instead. Combine that with out-of-order execution, simultaneous multithreading (HyperThreading is an implementation of that with 2 threads per core), branch predictors and the like, and you get way more from extensions than you do from scaling clock speeds.

If you don't believe me, compare Broadwell-M to a 1.3GHz Pentium 3. The Broadwell-M will kick the crap out of the P3 while running slower, in single-core mode, just from having better x86 + SSE... and then you turn on AVX and it just flies.

Even RISC cores are not immune to such effects: the ARM ISA has grown from ARMv1 to ARMv8, and has had extra instructions added with every new release and, much like x86, specialised instructions as the market demands, since they are much faster than the general-purpose stuff. The difference is that on x86 I have core x86 (binary-compatible back to at least the 486) plus a bajillion extensions, while on ARM I have 8 versions of ARM, plus whatever custom instructions I add if I tweak the core.
#114
R-T-B
ZeDestructor: If you don't believe me, compare Broadwell-M to a 1.3GHz Pentium 3. The Broadwell-M will kick the crap out of the P3 while running slower, in single-core mode, just from having better x86 + SSE... and then you turn on AVX and it just flies.
Of course it would; it's doing much more via the additional space wasted on implementing those instructions, but if that Pentium M core were built on the same process it could likely clock beyond 6GHz. At least from my understanding of it.

I'll concede I am not certain about this, as my knowledge comes from the old days when IBM was doing research into this. IBM now only makes extremely high-end servers, and even they have made the PowerPC instruction set pretty beefy. My main point was not that you should not use proprietary extensions at all, but rather that they have a limited benefit vs keeping the instruction set proprietary. Now, there may be SOME benefit in select instances (vector math and AVX are a great example of specialization), but they certainly keep x86 patents from expiring in a useful way, and don't underestimate Intel's evaluation of that.
#115
Dent1
HumanSmoke: ...and yet even AMD help Intel's cause by using Intel system builds for their GPU press deck benchmarks. If AMD don't have faith in their own platform, should you expect OEMs to?
This has nothing to do with the fact that Eroldru's AMD performs identically to his Intel setup, cooler and cheaper.
the54thvoid: Fanboys aside, the share price drop is indicative of the mass perception and market belief that AMD are no longer delivering a solid product.
The geeks here are an insignificant minority in the global market. AMD's decline can only be blamed on fanboys if you are one yourself. Stocks don't listen to fanboys, they listen to market presence and profitability.
It's brutally naive to assume AMD's decline is anything other than lack of product and lack of product perception, combined with possible major mismanagement.
This is not a third party fault. Similarly, Intel do just enough to stay far out in front. AMD's lack of product threat means they can work on minimal R&D expenditure with minimal product improvement.
Don't get me wrong (if you do you're illogical) I don't want AMD to disappear. Competition is required for better service and product. I want to see AMD taken over and used properly to create segment desirable products, across a whole range of applications.
They're not done yet but they're getting really close to it.
I would agree overall but there are so many variables that influence a share price. AMD have made poor business decisions along the way.
#116
HumanSmoke
Dent1: This has nothing to do with the fact that Eroldru's AMD performs identically to his Intel setup, cooler and cheaper.
So what? This thread is about AMD and its financial position, which is predicated upon its marketing, its brand, and its products.
If AMD had the same mindset, why would they use Intel systems to benchmark their graphics cards for public consumption? Are you saying AMD have an Intel bias? That AMD don't know how to get the best out of their own graphics benchmarking? Why use a competitor's product to showcase your own and, by inference, indicate that the system used would provide the best results?

So you are basically asking me to believe a random forum member over the company that makes the hardware. So either Eroldru is correct and AMD don't know what they're doing in giving Intel free publicity and torpedoing their own enthusiast platform, or AMD did some comparative benchmarking and went with the system that provided the best numbers. I guess only Eroldru and AMD know for sure...oh, and the reviewers of course:
GTA V CPU performance
Battlefield Hardline CPU performance
Evolve CPU performance
Far Cry 4 CPU performance
Dragon Age: Inquisition CPU performance

As the54thvoid intimated, perception is reality in business - and AMD highlighted a competitor's product over its own in every flagship graphics launch event of the past few years.
Now, what kind of perception does that engender amongst potential customers?
#117
ZeDestructor
(FF autoscrolled too far and I missed john_'s post... time to fix that!)
john_: Are you serious? Everyone can read your post. It's directly over mine! The same is true with this one. Your post is DIRECTLY OVER THIS POST. I just wanted to inform you that I already posted about this. There were two more posts by me that YOU WERE TOO LAZY TO READ BECAUSE THEY WERE ON THE 4TH PAGE.

You also never explained why that link you posted on the second page was even relevant to the consoles. I quoted you on page three about that; you never answered. Probably you could add post #61 to my other two posts, but never mind.

Next time try to be polite and not incorrectly accuse others. I read that post of yours; I have NO intention of reading your last post after the way you started it. Have a nice day.
If you wanted to only partially quote me, you should have quoted at least the comparison part. As it is, no matter how close my post is, people (myself included) won't scroll up, and will see just a blank statement of Nvidia not caring about consoles because the revenue is small - not that the revenue is small compared to something else. Welcome to twisting people's words to mean something else. I'll be nice, give you the benefit of the doubt and take it as being accidental.

I read the two posts you mentioned specifically at the time. I also read the rest of page 4 many hours earlier, and that's well and truly gone off the stack. If you wanted post #61 included, you should have included it when giving specific examples, since I treated every other post as not part of your response. Welcome to how references work.

As for relevance to consoles, the only relevant bit is how Nvidia decided not to pursue consoles and AMD did. All I did was explain ways in which Nvidia could have provided a competing chip (either by integrating POWER or ARM with GeForce in an SoC, or by using an extremely wide link combined with an existing external CPU, which is where the link to an article about Sierra and Summit is relevant, since, due to their needs, they built a really wide, high-bandwidth link).

Now, let's have a look at the famed post #61:
john_: Considering that Nvidia was already thinking of lowering prices at the high end, this is exactly what AMD needed. I don't expect Nvidia to do a huge price cut after this news. They will do the price cuts they were planning and stop there. Also, the prices affect all the cards, not just the high end, and at the low/mid range AMD is more than competitive. Also, the price reduction affects console processors.

Nvidia would have offered to make console chips even with zero margins. The reason is simple. All console games would be using PhysX and GameWorks by now. AMD GPUs would have been considered faulty at best today, with many bugs all over the place and poor performance. I think Nvidia tried, but both Sony and Microsoft were having cheap x86 consoles in their minds that would start bringing profits from day one.
Nvidia controlling the console GPUs would not have resulted in GameWorks and PhysX being everywhere, and even if it had, studios would still have ported them over anyway - remember, GameWorks works on any GPU, not just Nvidia, and AMD would have launched a GPU that looked a lot more like Maxwell 2 than GCN. If PhysX had become commonplace, AMD and game devs would've found a way to replicate the functionality on non-GeForce platforms. It just hasn't been necessary so far.
john_: And what exactly is the link you show me? I don't find it relevant to consoles or x86, and even if it is, the article is from 11/2014. The PS4 and Xbox One were introduced at the end of 2013. So the question is, what did Nvidia have to offer in 2012? Looking at wiki, Tegra 3, and that's not an x86 SoC. x86 on all platforms (Xbox One, PS4, PC) makes game development cheaper and faster. The latest rumors say that the Nintendo NX will also use AMD's chips, which makes sense because the biggest problem for the Nintendo consoles today is the lack of third-party games.
x86 vs blah, as I have explained several times now, is irrelevant: programmers in all industries no longer work in assembly, and will not do so again outside of a bit of hardware init and hand-optimizing certain HPC and crypto programs, though even that is falling out of favour. So now you're coding in C/C++, because that's what the SDKs like by and large for fast games, and, well, C/C++ and most of its various libraries and OSes have been ported to all the major platforms (x86, ARM, POWER, MIPS).

The only, and I mean ONLY, relevant part of x86 being in consoles is that it makes porting to PC slightly easier (especially from the XBOne or PS4-OpenGL). The bulk of the effort is still cross-API/ABI compatibility/translation layers/shims, as it has been since the X360/PS3 generation.
john_: I think Sony and Microsoft know how to negotiate a deal. Also, AMD was with its back against the wall, and both of those companies knew it. So I am pretty sure they explained to AMD that they had many alternative options for their next-gen consoles, and all those options were bad for AMD. So AMD would have to offer them a great deal from the beginning, to guarantee that Sony and/or Microsoft would not turn to Nvidia and/or Intel for the main parts. I don't think AMD was willing to gamble its future just so it could secure a better deal.
Based on how Nvidia walked away from all three, I think AMD managed to raise the price of their SoC by being the only viable platform left. Intel doesn't have the GPU power, and neither do Qualcomm (Adreno), ARM (Mali) or Imagination (PowerVR), and then you have the driver state of the latter three... which is just hopeless from what we can see on Android. Based on HumanSmoke's link, AMD is charging $100-110 per unit, with a 20% margin (hence the $20 number). If AMD were not the only choice, MS and Sony would have pushed for a race to the bottom and dropped that price even lower.

This is pure speculation though, so it's probably wrong, though I suspect the truth isn't that far away based on NV's public statements.
R-T-B: Of course it would; it's doing much more via the additional space wasted on implementing those instructions, but if that Pentium M core were built on the same process it could likely clock beyond 6GHz. At least from my understanding of it.

I'll concede I am not certain about this, as my knowledge comes from the old days when IBM was doing research into this. IBM now only makes extremely high-end servers, and even they have made the PowerPC instruction set pretty beefy. My main point was not that you should not use proprietary extensions at all, but rather that they have a limited benefit vs keeping the instruction set proprietary. Now, there may be SOME benefit in select instances (vector math and AVX are a great example of specialization), but they certainly keep x86 patents from expiring in a useful way, and don't underestimate Intel's evaluation of that.
I meant that a Broadwell-M at the same frequency as a P3, running identical code, would be faster - but that the extensions make it even faster.

IBM server CPUs are actually the last really high-frequency chips out there. If they could push for even more cores and lower the speeds, they would be outperforming Intel's Haswell-EX platform, but they can't - simply because no fab can make chips bigger than the ones they're already shipping (each POWER8 core is bigger than a Haswell x86 core), and then you have the monstrous memory config of the POWER8 chips. And that's on 22nm SOI, not 22nm FinFET CMOS + high-K (what Intel is using), which allows for the much higher clock speeds at the cost of absolutely insane power consumption: Tyan's lower-clocked POWER8 CPUs are 190W-247W TDP, IBM's higher-clocked parts go even higher (300W is a pretty conservative estimate by AT), while Intel's E5-2699 v3 and E7-8890 v3 are a "mere" 165W.

Keeping the ISA proprietary, while a bit of a nasty thing to do, is a status quo neither Intel, AMD nor anyone else really wants to change: they get to quash any upstart competition without needing to lift a finger. And if you think AMD is nice about it, think again - Intel sued AMD over AMD64 because AMD did not want to license it, and why would they... Opteron with AMD64 had single-handedly smashed into the datacenter, cleared out MIPS and SPARC, and was well on its way to clearing POWER out of everything but the highest of the high-end systems. Meanwhile, Itanium (Intel and HP's 64-bit CPU... probably one of the nicest architectures ever built... with no compiler ever built to use it properly) was floundering hard, and was later killed off by GPUs doing the only thing Itanium ended up being good for: fast, very parallelisable math. Eventually, after a lot of counter-suing, AMD and Intel settled and cross-licensed a lot of stuff, and continue to do so with all the new ISA extensions.
#118
R-T-B
I can't really disagree with that. The main point I was trying to make (though we got off on a bit of a tangent there due to my misunderstanding of modern tech) was that Intel/AMD/whoever loves keeping extensions in-house if possible, and we seem to have reached agreement there. ;)
#119
ZeDestructor
R-T-B: I can't really disagree with that. The main point I was trying to make (though we got off on a bit of a tangent there due to my misunderstanding of modern tech) was that Intel/AMD/whoever loves keeping extensions in-house if possible, and we seem to have reached agreement there. ;)
The patents are a by-product of the extensions, not really the true objective. Mind you, as I said, Intel and AMD are more than happy to keep it nice and closed :).
#120
64K
Ferrum Master: Holy Crap... 100 comments about nothing :D

WHO CARES... As long as they are afloat with their design team, they will sell their own product, just as ARM does.
Shareholders care, I assure you, and people buying AMD GPUs are probably wondering if AMD will be around to update drivers for their GPUs.

AMD shares are down 36% in the last month.

2015 Q2 Net Profit Margin

Intel 20.51%
Nvidia 11.64%
AMD -19.21%
#122
50eurouser
Customers were buying Intel CPUs even when AMD had better products out there, and not only value-for-money wise. Back in ~2004, when the P4 was where FX CPUs are now, the press/reviewers were not that vicious against Intel products. Check reviews from 2012 etc. and read how horrible FX CPUs and their architecture are. Not only that, Intel got a nice 1.4b fine from the EU for antitrust and other similar cases, let alone the famous case when "Intel finally agrees to pay $15 to Pentium 4 owners" etc. In fact, even if we could go back to 2005, when the Athlon X2 was rolfstomping the room-heater P4s, AMD would not gain that current 80% market share of Intel's, because customers are not all that informed about real-life performance. People pay $400 for an i7 when its production cost, let alone the innovation since 2011's Sandy, is minimal. Intel are good at many things, but they are the best if they want to run you dry of money: chipsets / new sockets / tiny updates... call it whatever. From 1156 to 1151 in ~4.5 years... LGA775 lasted longer than all these sockets together. Even if Zen comes and can compete against Intel products, we may not have CPU price wars; it might be the other way around, with AMD Zen being overpriced and the i5 becoming the new VFM king CPU while still being over ~$200. Sometimes customers have to say no to overpriced recycled tech with just fancier I/O and +/- 1 pin each year.
#123
ZeDestructor
50eurouser: Customers were buying Intel CPUs even when AMD had better products out there, and not only value-for-money wise. Back in ~2004, when the P4 was where FX CPUs are now, the press/reviewers were not that vicious against Intel products. Check reviews from 2012 etc. and read how horrible FX CPUs and their architecture are. Not only that, Intel got a nice 1.4b fine from the EU for antitrust and other similar cases, let alone the famous case when "Intel finally agrees to pay $15 to Pentium 4 owners" etc. In fact, even if we could go back to 2005, when the Athlon X2 was rolfstomping the room-heater P4s, AMD would not gain that current 80% market share of Intel's, because customers are not all that informed about real-life performance. People pay $400 for an i7 when its production cost, let alone the innovation since 2011's Sandy, is minimal. Intel are good at many things, but they are the best if they want to run you dry of money: chipsets / new sockets / tiny updates... call it whatever. From 1156 to 1151 in ~4.5 years... LGA775 lasted longer than all these sockets together. Even if Zen comes and can compete against Intel products, we may not have CPU price wars; it might be the other way around, with AMD Zen being overpriced and the i5 becoming the new VFM king CPU while still being over ~$200. Sometimes customers have to say no to overpriced recycled tech with just fancier I/O and +/- 1 pin each year.
That was 2004; today it's 2015, and there's a lot less of that nonsense going on. Remember Nvidia and ATi taking turns at faking 3DMark results too? Or AMD refusing to license AMD64 to Intel? ALL of the companies were raked over the coals good and proper and/or sat through some protracted legal wrangling for that nastiness, and it's just not there anymore, not since VIA bowed out and Transmeta got glomped by Nvidia (who still has an x86 license they agreed not to use in exchange for a very large sum of cash from Intel).

Why is Intel winning? Well, it's simple: nobody can in good conscience recommend anything AMD right now CPU-side to anyone. Intel simply has the better performance at near enough the same price on the consumer side, and AMD has nothing competitive server-side. Oh, and let's not forget, AMD has never managed to build a decent mobile CPU to compete with Pentium M and Core. You can say what you want, but not showing up means you lose.

As for the constant socket changes, I for one do NOT want a redux of LGA775, where you had to match chipsets to VRMs and BIOS in order to figure out which CPUs you could run. No thanks, I'll take constant socket changes, where I can blindly drop in a matching CPU/socket combo, over that particular mess.

As for why Intel isn't releasing the big 30% improvements anymore, it's quite simple: they've run out of big improvements, and as a result are scaling core counts and power instead, because that's what they can still do by and large. You can read up more if you want in my posts here and here, as well as my conversation with @R-T-B in this very thread, a few posts above your own.
#124
tabascosauz
ZeDestructor: That was 2004; today it's 2015, and there's a lot less of that nonsense going on. Remember Nvidia and ATi taking turns at faking 3DMark results too? Or AMD refusing to license AMD64 to Intel? ALL of the companies were raked over the coals good and proper and/or sat through some protracted legal wrangling for that nastiness, and it's just not there anymore, not since VIA bowed out and Transmeta got glomped by Nvidia (who still has an x86 license they agreed not to use in exchange for a very large sum of cash from Intel).

Why is Intel winning? Well, it's simple: nobody can in good conscience recommend anything AMD right now CPU-side to anyone. Intel simply has the better performance at near enough the same price on the consumer side, and AMD has nothing competitive server-side. Oh, and let's not forget, AMD has never managed to build a decent mobile CPU to compete with Pentium M and Core. You can say what you want, but not showing up means you lose.

As for the constant socket changes, I for one do NOT want a redux of LGA775, where you had to match chipsets to VRMs and BIOS in order to figure out which CPUs you could run. No thanks, I'll take constant socket changes, where I can blindly drop in a matching CPU/socket combo, over that particular mess.

As for why Intel isn't releasing the big 30% improvements anymore, it's quite simple: they've run out of big improvements, and as a result are scaling core counts and power instead, because that's what they can still do by and large. You can read up more if you want in my posts here and here, as well as my conversation with @R-T-B in this very thread, a few posts above your own.
Spot on. In addition, AMD could not build a CPU to keep up with the evolution of Core. It was clear that K10 was falling behind, but Bulldozer just went and beheaded K10 in a bloody mess. Now, because of the limitations of Bulldozer, Opteron has virtually died. And all companies, from IBM to AMD to Nvidia and Intel, have played in shady dealings in the past. Intel's are simply the only ones that are well-known, because everyone attributes AMD's recent failures to Intel.

Also, a wonderful argument regarding sockets. LGA775 was a gong show. DDR2 and DDR3, FSB 800, 1066 and 1333 all existing on one socket made for a hell of a mess. The current socket strategy makes it much easier for less-knowledgeable consumers to get something that works.
#125
medi01
cyneater: AMD needs to make products people want to buy....

And processors that can perform well and are priced well....

The Athlon 64 was released over 10 years ago. It was a Pentium 4 killer...

AMD needs another Athlon 64.... Or they could go the way of MIPS and SGI
AMD created Carrizo, which I want to buy.
No notebook manufacturer gives a damn.

Even before Carrizo, I'd rather have gone with AMD's APU than Intel's overpriced CPUs that suck at gaming, but alas, even that was not possible.

Both Microsoft and Sony had good reasons to go with AMDs APU in their current gen consoles.

So, no, it's not really about the product.