Winning with WEKA
New Customer Spotlight: Sustainable Metal Cloud
We’re continually inspired by our customers and the complex data challenges they’re overcoming to make the previously impossible possible. In this blog series, we’ll spotlight the stories of some of our newest customers and why they chose WEKA.
As artificial intelligence (AI) continues to proliferate in global enterprises, one thing is for certain: data center-related energy consumption is exploding.
When we launched our Sustainable AI Initiative last year, the sparse research we could find on this topic grossly underestimated how quickly AI would accelerate data center-related energy consumption. The IEA now estimates that data center energy consumption is on track to double by 2026.
WEKA is on a mission to develop powerful, efficient, and sustainable enterprise data stacks that can power the AI era without accelerating the global climate and energy crises. So when we have the opportunity to work with truly innovative, like-minded firms like our customers at Sustainable Metal Cloud, we get pretty excited.
Sustainable Metal Cloud (SMC) is an AI cloud and GPU service provider based in Singapore. It offers NVIDIA Omniverse GPU clusters for deep learning applications, along with robust enterprise AI training and inference tools at scale, including NVIDIA's AI Enterprise, NeMo, and NIM microservices offerings.
Why They’re Cool
SMC is on a mission to enable anyone to innovate with AI sustainably. Its infrastructure arm, Firmus, was launched nearly a decade ago as a research project focused on building the world's first Sustainable AI Factory: HyperCube.
Inside SMC’s HyperCube
HyperCube’s proprietary immersion-based cooling technology dramatically decreases the energy needed to host large-scale GPU clusters and support performance-intensive generative AI and machine learning workloads while providing best-in-class performance and scale.
This advanced data center cooling approach reduces the power and costs associated with running current-generation GPU clusters at scale by up to 50% and significantly lowers CO2e emissions without sacrificing performance. In its Singapore H100 Region, SMC is saving an incredible 3,800 tons of CO2e annually—a previously unheard-of breakthrough in efficiency.
A peek under the hood
Why They Chose WEKA
When we asked SMC Co-CEO Tim Rosenfield why they chose to partner with WEKA, he cited three deciding factors: the WEKA® Data Platform’s sustainability benefits; its flexible, software-defined approach, which allowed SMC to leverage its preferred data center hardware; and its high-performance, data pipeline-oriented architecture.
“Highly performant data infrastructure is essential to ensuring our customers’ demanding AI workloads operate at peak efficiency. Within SMC’s Sustainable AI factories, a steady, frictionless data pipeline is also essential to further ensure ongoing energy savings,” Rosenfield said. “Because of this, WEKA was a natural choice for SMC’s data storage partner. Their high-performance data platform, combined with our energy-efficient GPU clusters, embodies our mutual commitment to not just innovate but to do so responsibly.
“The WEKA Data Platform allows us to offer our customers the world’s best storage performance at scale, with guaranteed power efficiency and zero performance compromises. The partnership will enable us to support sustainable AI practices for our customers now and in the future. We’re excited to grow our partnership with WEKA and develop new value-added solutions for SMC customers in the future.”
Watch This Space
To meet customer demand for sustainable AI computing resources, SMC expects to scale its APAC GPU Availability Zones to an eye-popping 16,000 GPUs across Singapore, Thailand, and India in 2024, with plans to expand its services to new markets by 2025. To learn more about SMC or sign up, visit https://smc.co/.
WEKA is excited to support SMC’s sustainable AI mission. Be sure to keep an eye out for new opportunities to learn more about how SMC and WEKA are working together coming soon.