Monday, March 11th 2019

NVIDIA to Acquire Mellanox Technologies for $6.9 Billion

NVIDIA and Mellanox today announced that the companies have reached a definitive agreement under which NVIDIA will acquire Mellanox. Pursuant to the agreement, NVIDIA will acquire all of the issued and outstanding common shares of Mellanox for $125 per share in cash, representing a total enterprise value of approximately $6.9 billion. Once complete, the combination is expected to be immediately accretive to NVIDIA's non-GAAP gross margin, non-GAAP earnings per share and free cash flow.

The acquisition will unite two of the world's leading companies in high performance computing (HPC). Together, NVIDIA's computing platform and Mellanox's interconnects power over 250 of the world's TOP500 supercomputers and have as customers every major cloud service provider and computer maker. The data and compute intensity of modern workloads in AI, scientific computing and data analytics is growing exponentially and has put enormous performance demands on hyperscale and enterprise datacenters. While computing demand is surging, CPU performance advances are slowing as Moore's law has ended. This has led to the adoption of accelerated computing with NVIDIA GPUs and Mellanox's intelligent networking solutions.

Datacenters in the future will be architected as giant compute engines with tens of thousands of compute nodes, designed holistically with their interconnects for optimal performance.

An early innovator in high-performance interconnects, Mellanox pioneered InfiniBand, which along with its high-speed Ethernet products is now used in over half of the world's fastest supercomputers and in many leading hyperscale datacenters.

With Mellanox, NVIDIA will optimize datacenter-scale workloads across the entire computing, networking and storage stack to achieve higher performance, greater utilization and lower operating cost for customers.

"The emergence of AI and data science, as well as billions of simultaneous computer users, is fueling skyrocketing demand on the world's datacenters," said Jensen Huang, founder and CEO of NVIDIA. "Addressing this demand will require holistic architectures that connect vast numbers of fast computing nodes over intelligent networking fabrics to form a giant datacenter-scale compute engine.

"We're excited to unite NVIDIA's accelerated computing platform with Mellanox's world-renowned accelerated networking platform under one roof to create next-generation datacenter-scale computing solutions. I am particularly thrilled to work closely with the visionary leaders of Mellanox and their amazing people to invent the computers of tomorrow."

"We share the same vision for accelerated computing as NVIDIA," said Eyal Waldman, founder and CEO of Mellanox. "Combining our two companies comes as a natural extension of our longstanding partnership and is a great fit given our common performance-driven cultures. This combination will foster the creation of powerful technology and fantastic opportunities for our people."

The companies have a long history of collaboration and joint innovation, reflected in their recent contributions in building the world's two fastest supercomputers, Sierra and Summit, operated by the U.S. Department of Energy. Many of the world's top cloud service providers also use both NVIDIA GPUs and Mellanox interconnects. NVIDIA and Mellanox share a common performance-centric culture that will enable seamless integration.

Once the combination is complete, NVIDIA intends to continue investing in local excellence and talent in Israel, one of the world's most important technology centers. Customer sales and support will not change as a result of this transaction.

Additional Transaction Details
Post-close, the transaction is expected to be immediately accretive to NVIDIA's non-GAAP gross margin, non-GAAP earnings per share and free cash flow. NVIDIA intends to fund the acquisition through cash on its balance sheet. In addition, there is no change to its previously announced capital return program for the rest of fiscal 2020. The transaction has been approved by both companies' boards of directors and is expected to close by the end of calendar year 2019, subject to regulatory approvals as well as other customary closing conditions, including approval of the merger agreement by Mellanox shareholders.

Advisors
Goldman Sachs & Co. LLC served as exclusive financial advisor to NVIDIA and Jones Day served as legal advisor. Credit Suisse Group and J.P. Morgan Chase & Co. served as financial advisors to Mellanox, and Latham & Watkins LLP and Herzog Fox & Neeman served as legal advisors.
Source: NVIDIA

12 Comments on NVIDIA to Acquire Mellanox Technologies for $6.9 Billion

#1
Space Lynx
Astronaut
datacenters and goldman sachs.... I guess Nvidia is finally looking at the gaming PC market as niche and easy to exploit since they have no competitors, might as well focus their attention elsewhere.

Come on Navi baby... I need you to do well!!!
#2
notb
lynx29 said: "datacenters and goldman sachs...."
Big investments are always taken care of by investment banks. You shouldn't suspect anything evil just because an article mentions a bank that's unpopular among laymen. ;-)
lynx29 said: "I guess Nvidia is finally looking at the gaming PC market as niche and easy to exploit since they have no competitors, might as well focus their attention elsewhere."
Nvidia is making products for both the PC gaming and datacenter markets (and a few others as well). They want to expand and they're looking for new possibilities.
This event has no implications for their PC branch.
#3
Litzner
lynx29 said: "Come on Navi baby... I need you to do well!!!"
I don't think Navi is going to be bad, but I don't think it will be all that great, since it is still just a re-hash of GCN. At the top end, if it even equals a 2070's performance I would be surprised. I don't think we have much chance of seeing AMD competitive in higher-end cards until they release a new architecture after Navi. The Radeon VII actually has some promise, though: from what I have seen from the modding guys, once you get that card set up right and attached to an appropriate cooler (the stock cooler apparently uses a thermal pad on the GPU :eek:), it seems to have a lot of headroom.
#4
moproblems99
Litzner said: "I don't think Navi is going to be bad, but I don't think it will be all that great, since it is still just a re-hash of GCN."
Has this been confirmed? I haven't seen anything to suggest that, but I could have missed it.
Litzner said: "At the top end, if it even equals a 2070's performance I would be surprised."
They have that now with the VII, so they ought to be able to best it. I would surmise that 2080 performance is the best-case scenario.
#5
Space Lynx
Astronaut
notb said: "Big investments are always taken care of by investment banks. You shouldn't suspect anything evil just because an article mentions a bank that's unpopular among laymen. ;-)
Nvidia is making products for both the PC gaming and datacenter markets (and a few others as well). They want to expand and they're looking for new possibilities.
This event has no implications for their PC branch."
Doesn't change the fact they used to have all their R&D 100% on PC gaming; that changed for a reason - no longer any money in it, combined with being a monopoly with 70% market share for 7+ years now. And now R&D is shifting to datacenters and AI tech... which means slower progress for gamers.
#6
hat
Enthusiast
lynx29 said: "Doesn't change the fact they used to have all their R&D 100% on PC gaming; that changed for a reason - no longer any money in it, combined with being a monopoly with 70% market share for 7+ years now. And now R&D is shifting to datacenters and AI tech... which means slower progress for gamers."
And AMD is doing the same... if you can break into a new market, why wouldn't you? This is the same reason Intel is looking at making GPUs now.
#7
Jism
Datacenters were always the cash cow of both Nvidia and AMD.
#8
notb
lynx29 said: "Doesn't change the fact they used to have all their R&D 100% on PC gaming; that changed for a reason - no longer any money in it, combined with being a monopoly with 70% market share for 7+ years now. And now R&D is shifting to datacenters and AI tech... which means slower progress for gamers."
How can you even say that when Nvidia basically owns all the pro markets? Datacenters, workstations and even business laptops are built around Nvidia GPUs - AMD almost doesn't exist there. It's a dominance unheard of in the gaming market, where AMD still manages to keep a 20-30% market share in dedicated GPUs.
In 2019 AMD is pretty much a custom GPU designer for Apple, Sony and Microsoft.

Nvidia spent billions on their pro segment, but they've spent very wisely - for example on fortifying CUDA as the prime GPGPU environment. And their chips have been fantastically efficient. And they have tensor cores.
#9
Space Lynx
Astronaut
notb said: "How can you even say that when Nvidia basically owns all the pro markets? Datacenters, workstations and even business laptops are built around Nvidia GPUs - AMD almost doesn't exist there. It's a dominance unheard of in the gaming market, where AMD still manages to keep a 20-30% market share in dedicated GPUs.
In 2019 AMD is pretty much a custom GPU designer for Apple, Sony and Microsoft.

Nvidia spent billions on their pro segment, but they've spent very wisely - for example on fortifying CUDA as the prime GPGPU environment. And their chips have been fantastically efficient. And they have tensor cores."
I was talking about the original outset of Nvidia's first few years... as companies grow, they forget their original base. It happens to all of them.
#10
notb
lynx29 said: "I was talking about the original outset of Nvidia's first few years... as companies grow, they forget their original base. It happens to all of them."
Which still means your previous statement misses the point.
When Nvidia joined the GPU market, it was all about gaming. The idea of GPGPU was still being developed. Some scientific or industrial computing was done using ASICs, i.e. people were building chips for a particular scenario, be it image rendering or designing bridges.

The whole "innovation" done in gaming graphics over the years had only one aim: to make fake graphics look good. The technology wasn't good enough to do it properly.
And this pretty much ended with RTX. We have ray tracing in games. Not very fast and not very cheap, but it works. There's not much more that can be done; it's the end of that innovation. From now on we can only make GPUs faster, so RTRT can process more light effects. And making GPUs faster is a common problem for gaming and computing, so why separate them?

You're certainly right about one thing. We reached the gaming market's limits a long time ago. More people won't start gaming (in fact, this group is shrinking).
But on the datacenter/AI/IoT side we're merely touching the surface. So yes, these markets are way more interesting for Nvidia.
#11
moproblems99
notb said: "The whole 'innovation' done in gaming graphics over the years had only one aim: to make fake graphics look good."
We have a long way to go to make fake graphics look good. RTX is but a blip on the radar for that.
#12
notb
Litzner said: "I don't think Navi is going to be bad, but I don't think it will be all that great, since it is still just a re-hash of GCN. At the top end, if it even equals a 2070's performance I would be surprised. I don't think we have much chance of seeing AMD competitive in higher-end cards until they release a new architecture after Navi. The Radeon VII actually has some promise, though: from what I have seen from the modding guys, once you get that card set up right and attached to an appropriate cooler (the stock cooler apparently uses a thermal pad on the GPU :eek:), it seems to have a lot of headroom."
Navi is designed for PS5, which was meant to offer 4K@60fps. Maybe that's why Radeon VII exists, as a statement they can deliver the performance.
On the other hand, it's very unlikely Sony or MS would accept a GPU pulling over 200W, so AMD will have to get close to Pascal's efficiency (20-30% better than what they have today)...
moproblems99 said: "We have a long way to go to make fake graphics look good. RTX is but a blip on the radar for that."
No. We have all the technologies needed for this.
RT is just a framework, a model of light. It describes how light travels through a vacuum and how it interacts with surfaces and substances. All of this is well known and defined - we've been doing it for decades.
When you can run it (or some of it) in real time, you get RTRT. No magic there.
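That "model of light" is compact enough to sketch in a few lines. The toy Python below (illustrative only, nothing like how real RT hardware is implemented) casts a single ray at a sphere and shades the hit point with Lambert's cosine law - the same intersection-and-shading math that RTRT runs billions of times per second:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance t along the ray to the nearest sphere intersection, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    assuming direction is a unit vector (so the quadratic's a == 1).
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    """Diffuse brightness (0..1) where the ray hits the sphere, else 0."""
    t = ray_sphere_hit(origin, direction, center, radius)
    if t is None:
        return 0.0
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    # Lambert's cosine law: brightness = max(0, N . L)
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# Ray straight down the z-axis toward a unit sphere, light shining back at it:
b = shade((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0, (0, 0, -1))
print(round(b, 3))  # head-on hit, fully lit -> 1.0
```

A real renderer just repeats this per pixel, with many bounces and many light samples - which is exactly why the remaining problem is raw GPU speed.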

Now it's just a matter of making faster GPUs. And when they get 4x faster than what we have today, games should become fairly photorealistic (Matrix-level graphics).