TL;DR:
- Microsoft has signed a multi-year agreement with startup CoreWeave, potentially worth billions, to acquire cloud computing infrastructure.
- CoreWeave offers simplified access to Nvidia’s top-of-the-line GPUs for running AI models.
- The deal ensures that OpenAI’s ChatGPT chatbot, reliant on Microsoft’s Azure cloud infrastructure, has adequate computing power.
- Microsoft’s agreement with CoreWeave allows it to tap into Nvidia’s GPUs and meet the growing demand for AI infrastructure.
- CoreWeave recently raised $200 million in funding, with a valuation of $2 billion, and has experienced significant revenue growth.
- Nvidia’s stock price has surged by 170% this year, driven by demand for data center solutions for generative AI and large language models.
- CoreWeave provides computing power at a significantly lower cost compared to legacy cloud providers.
- Microsoft’s engagement with CoreWeave underscores its push to lead the AI market and stay at the forefront of the technology.
Main AI News:
The colossal investment by Microsoft in OpenAI has positioned the tech giant at the forefront of the burgeoning artificial intelligence (AI) industry. However, Microsoft’s quest to cater to the skyrocketing demand for AI-powered services extends beyond its collaboration with OpenAI.
Insider sources reveal to CNBC that Microsoft has recently entered into a multi-year agreement with CoreWeave, a startup that has secured $200 million in funding. The deal, estimated to be worth billions of dollars, encompasses the acquisition of cloud computing infrastructure from CoreWeave. Notably, CoreWeave achieved a valuation of $2 billion just over a month ago.
CoreWeave specializes in providing simplified access to Nvidia’s state-of-the-art graphics processing units (GPUs), widely regarded as the best chips available for running AI models. By partnering with CoreWeave, Microsoft secures additional computing power for OpenAI, operator of the highly popular ChatGPT chatbot, which relies on Microsoft’s Azure cloud infrastructure to meet its substantial computational requirements.
Representatives from both Microsoft and CoreWeave have opted not to disclose any further details regarding the agreement.
The race to capitalize on generative AI gained momentum after OpenAI introduced ChatGPT to the public last year. This groundbreaking achievement demonstrated the capability of AI to generate sophisticated responses based on human input. Since then, numerous companies, including Google, have swiftly integrated generative AI into their own products. Microsoft has been actively deploying chatbots across its services, such as Bing and Windows.
With demand for its infrastructure outstripping supply, Microsoft needs additional sources of Nvidia GPU capacity. Although CoreWeave CEO Michael Intrator refrained from commenting on the Microsoft deal during a recent interview, he acknowledged that the company’s revenue had grown substantially from 2022 to 2023.
CoreWeave’s latest funding announcement on Wednesday, featuring hedge fund Magnetar Capital, extends the $221 million round the company closed in April. Nvidia invested $100 million in that earlier financing. Founded in 2017, CoreWeave currently employs 160 people.
With a 170% surge in its stock price this year, Nvidia’s market capitalization briefly surpassed $1 trillion for the first time. This achievement follows Nvidia’s forecast for the July quarter, which exceeded Wall Street estimates by over 50%.
During a recent earnings call, Colette Kress, Nvidia’s finance chief, emphasized that the company’s growth will predominantly be driven by the data center segment, reflecting the substantial demand for generative AI and large language models. OpenAI’s GPT-4 large language model, which extensively utilizes Nvidia GPUs, forms the core of the ChatGPT chatbot.
Kress explicitly mentioned CoreWeave during the call, and Nvidia CEO Jensen Huang also referenced the company in his presentation at Nvidia’s GTC conference in March.
CoreWeave proudly asserts on its website that its computing power is “80% less expensive than legacy cloud providers.” In addition to Nvidia’s A100 GPUs, CoreWeave offers an array of services that developers can also access through leading cloud providers like Amazon, Google, and Microsoft.
Furthermore, CoreWeave provides the cost-effective Nvidia A40 GPUs, specifically designed for visual computing, alongside the A100 GPUs that cater to AI, data analytics, and high-performance computing. Intrator revealed that certain CoreWeave clients had encountered challenges in obtaining sufficient GPU power on major cloud platforms. In response, CoreWeave has recommended the A40 GPUs when prospects have requested the A100 or newer H100 GPUs from Nvidia.
According to Intrator, the A40 GPUs deliver exceptional performance at an attractive price point.
Microsoft has also discussed mutually renting servers with Oracle to address additional capacity requirements, The Information reported earlier this month, citing an anonymous source.
Conclusion:
Microsoft’s multi-year agreement with CoreWeave for AI computing power signifies its dedication to meeting the surging demand for AI-powered services. The deal secures the infrastructure OpenAI’s ChatGPT requires and gives Microsoft additional access to Nvidia’s leading GPUs. It positions Microsoft to maintain its leadership in the AI market and capitalize on the growing opportunities presented by generative AI and large language models.