How Generative AI Is Fueling a Growing Environmental Footprint

Generative AI has exploded in recent years, transforming industries by creating art, writing code, and answering questions. But as the technology grows, so do its environmental costs.

Behind every AI-generated image and chatbot reply lies a massive network of data centres, where tens of thousands of machines run around the clock, devouring electricity and water.

Training a single model like GPT-3 consumes over 1,200 megawatt-hours of electricity, enough to power roughly 100 homes for a year. And training is only the beginning: each time a chatbot answers a prompt, the inference step uses about five times as much electricity as a basic web search.
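
As a rough sanity check on that comparison, here is a minimal sketch of the arithmetic. The 1,200 MWh figure comes from the paragraph above; the average household consumption (about 11,000 kWh per year, roughly a North American average) is an assumption added for illustration, not a figure from this post.

```python
# Back-of-envelope check: GPT-3 training energy vs. household electricity use.
TRAINING_ENERGY_MWH = 1_200        # training energy cited above
HOME_USE_KWH_PER_YEAR = 11_000     # assumed average household consumption (illustrative)

training_energy_kwh = TRAINING_ENERGY_MWH * 1_000
homes_powered_for_a_year = training_energy_kwh / HOME_USE_KWH_PER_YEAR

print(f"roughly {homes_powered_for_a_year:.0f} homes powered for a year")  # ~109, on the order of 100
```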

Energy is only part of the story: cooling these systems also demands water. On average, every kilowatt-hour of data centre energy requires about two litres of water, straining local supplies and ecosystems. Multiplied across billions of queries and prompts, the environmental toll is enormous.
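
To give a sense of that scale, here is a simple illustrative calculation combining the figures above. Only the five-times multiplier and the two litres per kilowatt-hour come from this post; the energy of a basic web search (about 0.3 watt-hours) and the daily prompt volume are assumptions chosen for illustration.

```python
# Illustrative scale-up of per-prompt energy and cooling water.
WEB_SEARCH_WH = 0.3                # assumed energy per basic web search (illustrative)
AI_MULTIPLIER = 5                  # from the post: an AI prompt uses ~5x a web search
WATER_L_PER_KWH = 2.0              # from the post: ~2 litres of cooling water per kWh
QUERIES_PER_DAY = 1_000_000_000    # assumed daily prompt volume (illustrative)

energy_per_query_kwh = WEB_SEARCH_WH * AI_MULTIPLIER / 1_000
daily_energy_mwh = energy_per_query_kwh * QUERIES_PER_DAY / 1_000
daily_water_litres = energy_per_query_kwh * QUERIES_PER_DAY * WATER_L_PER_KWH

# Under these assumptions: ~1,500 MWh and ~3 million litres of water every day.
print(f"~{daily_energy_mwh:,.0f} MWh and ~{daily_water_litres:,.0f} L of water per day")
```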

Worse still, we are nowhere near the peak. The race to build ever more powerful AI models breeds rapid obsolescence: as new models replace their predecessors, each more complex generation requires even more energy to train.

Furthermore, the hardware itself carries an immense carbon footprint: GPUs are built through resource-intensive mining, toxic chemical processing, and energy-hungry manufacturing.

AI's current growth trajectory is unsustainable; we must find ways to innovate while limiting the environmental impact.

At Borrum Energy Solutions, we believe innovation shouldn’t come at the Earth’s expense. Our self-assembled microgeneration wind turbines offer individuals and communities a clean, reliable option to power their future. As the digital world grows more energy-hungry, we make it easier than ever to generate your own electricity—without a data centre.
