Nvidia is acquiring Run:ai, a Tel Aviv-based company that makes it easier for developers and operations teams to manage and optimize their AI hardware infrastructure, for an undisclosed sum.
Ctech reported earlier this morning that the companies were in “advanced negotiations” that could see Nvidia pay upwards of $1 billion for Run:ai. Evidently, those negotiations went off without a hitch.
Nvidia says that it’ll continue to offer Run:ai’s products “under the same business model” for the immediate future, and invest in Run:ai’s product roadmap as part of Nvidia’s DGX Cloud AI platform.
“Run:ai has been a close collaborator with Nvidia since 2020 and we share a passion for helping our customers make the most of their infrastructure,” Omri Geller, Run:ai’s CEO, said in a statement. “We’re thrilled to join Nvidia and look forward to continuing our journey together.”
Geller co-founded Run:ai with Ronen Dar in 2018 after the two studied together at Tel Aviv University under Professor Meir Feder, Run:ai’s third co-founder. Geller, Dar and Feder sought to build a platform that could “break up” AI models into fragments that run in parallel across hardware, whether on-premises, in the cloud or at the edge.
While Run:ai has relatively few direct competitors, other startups are applying the concept of dynamic hardware allocation to AI workloads. For example, Grid.ai offers software that allows data scientists to train AI models across GPUs, processors and more in parallel.
But relatively early in its life, Run:ai managed to establish a large customer base of Fortune 500 companies, which in turn attracted VC investment. Prior to the acquisition, Run:ai had raised $118 million in capital from backers including Insight Partners, Tiger Global, S Capital and TLV Partners.