AI Development May Leave Us Without Electricity, Elon Musk Warns
In a recent interview with Nicolai Tangen, Musk reflected on the hurdles Big Tech encountered in the past year and speculated about future challenges. Discussing the industry's hardware woes, he noted that the chip constraints that plagued the industry in 2022-2023 are now a thing of the past; this year's primary obstacle is a shortage of voltage transformers, which many AI developers need to power their data centers. Looking ahead, Musk warned that next year's crisis could be a shortage of electricity altogether, given the immense power required to train AI.
Just how much power, you might ask? According to the CEO, training the Grok 2 model required xAI to run about 20,000 NVIDIA H100 GPUs, and Grok 3 will need at least 100,000 H100 chips.
Based on calculations from Tom's Hardware, NVIDIA's H100 GPU draws approximately 700W at full utilization, meaning that 100,000 GPUs alone would draw 70 megawatts; add the servers and cooling a data center needs to operate, and the total climbs to an astounding 100 megawatts of continuous power. That's enough to sustain a small city, all just to train an AI model.
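The arithmetic behind that estimate can be sketched in a few lines. The per-GPU draw comes from the article; the overhead multiplier for servers and cooling is an assumption chosen to land near the 100 MW figure quoted, not a published value.

```python
# Back-of-the-envelope check of the power estimate above.
GPU_COUNT = 100_000      # H100 GPUs Musk cites for Grok 3
WATTS_PER_GPU = 700      # approximate H100 draw at full utilization
OVERHEAD = 1.4           # assumed multiplier for servers + cooling (illustrative)

gpu_power_mw = GPU_COUNT * WATTS_PER_GPU / 1_000_000
total_power_mw = gpu_power_mw * OVERHEAD

print(f"GPUs alone: {gpu_power_mw:.0f} MW")       # 70 MW
print(f"With overhead: {total_power_mw:.0f} MW")
```

Note that megawatts measure instantaneous power, not energy; running such a cluster for a day would consume on the order of 2,400 megawatt-hours.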
Source: 80.lv