NVIDIA Closes Mellanox Acquisition, Adds High-Speed Networking to Tech Portfolio

Just over a year ago, NVIDIA announced its intention to acquire Mellanox, a leading datacenter networking and interconnect provider. And, after clearing some prolonged regulatory hurdles, including approval by the Chinese government as well as a waiting period in the United States, NVIDIA has now closed the deal as of this morning. All told, NVIDIA is paying roughly $7 billion for the company, all in cash.

Overall, in the intervening year, NVIDIA's reasoning for acquiring the networking provider has not changed: the company believes that a more vertically integrated product stack, one that includes high-speed networking hardware, will allow them to further grow their business, especially as GPU-powered supercomputers and other HPC clusters become more prominent. To that end, it's hard to get more prominent than Mellanox, whose Ethernet and InfiniBand gear is used in over half of the TOP500-listed supercomputers in the world, as well as countless datacenters.

Ultimately, acquiring the company not only gives NVIDIA leading-edge networking products and IP, but it also lets them develop in-house the high-performance interconnects needed to scale their own high-performance compute products. NVIDIA already has significant dealings with Mellanox: the company's DGX-2 systems incorporate Mellanox's network controllers for multi-node scaling, and Mellanox hardware is used in both the Summit and Sierra supercomputers, both of which are also powered by NVIDIA GPUs. So this acquisition is in many respects just the latest step in NVIDIA's ongoing efforts to grow their datacenter presence.

Source: AnandTech (https://ift.tt/359SxWb)