- io.net partners with AI startup WOMBO to supply computing power from decentralized Apple silicon chips.
- WOMBO’s ecosystem, including its app and ML models, will benefit from io.net’s GPU compute network.
- The partnership addresses operational cost challenges by providing cost-effective GPU compute options through io.net’s DePIN platform.
- Leveraging Apple silicon chips, WOMBO aims to tap into millions of consumer devices to bolster its machine learning models.
- Since its inception, io.net has amassed over 520,000 GPUs and CPUs, valued at over $2 billion.
Main AI News:
WOMBO, the AI avatar app, has entered into a strategic partnership with io.net, a decentralized physical infrastructure network (DePIN), to bolster its machine learning (ML) models. By tapping into additional compute capacity from io.net’s decentralized GPU network, built in part on Apple silicon chips, WOMBO aims to enhance its ecosystem, which includes the WOMBO app, Dream, and WOMBO Me.
This collaboration follows io.net’s announcement in February positioning itself as the premier cloud service provider to support Apple silicon chip clustering for ML applications, with the goal of offering millions of Apple users and ML engineers more affordable and accessible GPU compute options.
As a pioneering partner in this endeavor, WOMBO, the generative AI startup, is poised to harness the power of Apple silicon chip clusters to drive its ML models forward.
“We are thrilled to partner with io.net to leverage previously untapped computing power for groundbreaking AI applications. Together, our teams have the potential to significantly alleviate the GPU supply shortage,” remarked WOMBO CEO Ben-Zion Benkhin.
io.net’s Mission to Reduce Operational Costs
Cloud computing is a significant operational expense for AI and ML companies like WOMBO. With demand for cloud capacity surging and hardware in finite supply, running and storing workloads has become costly and inefficient, driving up expenses and stretching timelines for companies.
According to a spokesperson, the partnership with io.net promises to address these challenges for WOMBO. The DePIN platform offers solutions by aggregating decentralized and geographically distributed GPUs, enabling companies to deploy clusters on demand at a fraction of the cost.
io.net’s network comprises more than 100,000 nodes of high-performance GPUs, letting machine learning engineers deploy Ray and Kubernetes clusters rapidly and cost-effectively. With WOMBO’s 200+ million application downloads, the partnership with io.net is poised to fuel its growth by minimizing operational costs.
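For context on what deploying a Ray cluster looks like from an ML engineer’s side, here is a minimal, hypothetical sketch in Python. It only illustrates the general Ray pattern of reserving GPUs per task; the resource counts and the workload are placeholders, and it does not reflect io.net’s actual onboarding flow or API.

```python
import ray

# Start a local Ray runtime and declare 4 logical GPUs purely for demonstration.
# Against a real remote cluster you would instead connect with
# ray.init(address="ray://<head-node>:10001")  # placeholder address
ray.init(num_gpus=4)

@ray.remote(num_gpus=1)  # each task reserves one GPU from the scheduler
def run_batch(batch):
    # Stand-in for a GPU-bound step, e.g. a model forward pass.
    return sum(batch)

# Fan out four tasks; Ray schedules them across the declared GPU resources.
futures = [run_batch.remote(list(range(i, i + 8))) for i in range(0, 32, 8)]
print(ray.get(futures))

ray.shutdown()
```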
WOMBO Embraces Apple’s Silicon Chips
io.net’s February innovation opened the door for hundreds of millions of Apple users worldwide to contribute unused Apple chip compute for AI/ML applications. WOMBO is among the latest platforms to capitalize on these capabilities, using that compute to bolster its machine learning models.
The initiative aims to pair the Neural Engine in Apple’s chips with io.net’s mega-clustering capabilities, tapping into millions of consumer devices for AI workloads.
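To ground the Neural Engine reference, the following hedged sketch shows one common way a model is converted with Apple’s coremltools so that inference can be scheduled on the CPU and Neural Engine. The toy model and output path are illustrative assumptions, not WOMBO’s or io.net’s actual code, and running it requires macOS with coremltools and PyTorch installed.

```python
import torch
import coremltools as ct

# A tiny stand-in model, purely illustrative of an image-style workload.
class TinyNet(torch.nn.Module):
    def forward(self, x):
        return torch.relu(x).mean(dim=1)

example_input = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(TinyNet().eval(), example_input)

# Convert to a Core ML program and request scheduling on the CPU and
# the Apple Neural Engine when the model runs.
mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="x", shape=example_input.shape)],
    convert_to="mlprogram",
    compute_units=ct.ComputeUnit.CPU_AND_NE,
)
mlmodel.save("TinyNet.mlpackage")  # hypothetical output path
```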
Commenting on WOMBO’s partnership and utilization of Apple silicon chips, Benkhin noted: “With over 74 million users across more than 180 countries at its peak, WOMBO exemplifies a consumer AI application at scale.”
Since its inception in December, io.net has amassed an impressive array of over 520,000 GPUs and CPUs, with an infrastructure value exceeding $2 billion.
Conclusion:
The collaboration between io.net and WOMBO marks a significant advancement in the AI infrastructure market. By leveraging decentralized Apple silicon chips and io.net’s GPU compute network, companies like WOMBO can reduce operational costs and access vast computing resources. This partnership not only enhances the capabilities of AI applications but also sets a precedent for future collaborations in the evolving AI ecosystem.