While companies are increasingly leveraging machine learning to run their businesses more efficiently, building, testing, and running machine learning models also requires enormous amounts of energy. Neu.ro, an early-stage startup offering a full-stack MLOps (machine learning operations) platform, is working on a greener approach with less environmental impact.
On November 22, the company announced a zero-emission AI cloud solution with Icelandic cloud infrastructure partner atNorth.
According to the company, atNorth will provide a Tier 3 compliant, ISO 27001 certified data center where it will run DGX and HGX systems equipped with NVIDIA A100 GPUs. With a power capacity of 80 MW, the data center runs entirely on geothermal and hydroelectric energy. In addition, its Arctic location allows it to cool the facility virtually free of charge, providing an energy-efficient foundation for customers building machine learning models with Neu.ro’s solutions.
According to Neu.ro co-founder Max Prasolov, the company’s research into the issue found that computing and telecommunications account for about 9% of the world’s total energy consumption, a figure that could double in the next 10 years. Prasolov and his colleagues believe that training machine learning models plays a significant part in that, and decided to partner with atNorth to reduce their carbon emissions.
“We decided to move all operations and all experiments to a zero-emission cloud. The goal is not carbon neutrality, where you can buy credits to offset your usage. The question is, how do you get to zero emissions? We found that we were spending a great deal of energy and computing power training machine learning models for our customers, and we realized that this was undoubtedly the largest part of the carbon footprint we were emitting,” says Prasolov.
Meanwhile, the company has also worked out ways to build models more efficiently through its software, reducing the energy required and offering a more sustainable solution.
As for the product itself, the company offers a flexible, cloud-native service: it provides some of the tools, but leaves plenty of room for companies to fill in whatever they think works best for them.
“Rather than building every tool that needs to exist, from data ingestion to monitoring, accountability, and the pipeline engine, our approach focuses on interoperability. We build what isn’t there and connect it to the existing universe of Kubernetes tools,” explains the company’s co-founder Arthur McCallum.
The startup currently offers a commercial solution, but it is also working on an open-source stack, which will likely be released later this year. The company’s goal is to bring cloud-based AI solutions to smaller cloud vendors beyond the big three of Amazon, Microsoft, and Google, including regional vendors around the world.
Image credits: a-image / Getty Images
(Article: Ron Miller, Translation: Hirokazu Kusakabe)