Power-hungry intelligence systems?

Why the enormous environmental price of training and running artificial intelligence systems matters for our climate goals

Artificial intelligence (AI) is often introduced to us as something light, almost invisible. We are told it lives in the “cloud”, runs silently in the background, and improves our lives without demanding much in return. Over the last decade, AI has been presented as the engine of efficiency: smarter healthcare, cleaner governance, faster science, better decisions. Yet as AI systems quietly move into every sphere of human activity, a harder truth is beginning to surface. These systems are not abstract or weightless. They are built on heavy, resource-hungry infrastructures that draw deeply from the earth’s energy, water, and material reserves. At a time when climate instability and water stress are no longer distant threats but lived realities, this contradiction can no longer be ignored.

The environmental burden of AI begins with computation itself. Modern machine-learning models, especially large generative systems, demand extraordinary processing power. Training them is not a one-time act: models are retrained and fine-tuned repeatedly, and once deployed they keep drawing power with every query. Unlike earlier digital tools that stored information and waited passively for human input, AI systems remain active at all times, learning, predicting, responding. This constant activity locks societies into a permanent rise in electricity demand. Once AI becomes embedded in daily services such as search engines, public records, education platforms, and financial systems, energy use stops being occasional and becomes structural.

Scientific work has now confirmed what engineers and system designers sensed long ago. A major study published in Patterns demonstrated that the energy use, carbon emissions, and water consumption of AI systems are already significant and are likely to grow rapidly if current development paths continue (De Vries-Gao, 2025). What is especially important in this research is its broader framing. AI’s environmental cost does not end at electricity meters. It extends into cooling systems, hardware supply chains, mineral extraction, and land use. This challenges the comforting belief that digital progress automatically reduces material pressure on the planet.

Water use deserves particular attention because it remains largely hidden from public discussion. High-density servers generate intense heat, and water-based cooling is still one of the most effective ways to manage it. As a result, large data centres withdraw enormous quantities of freshwater, often from local sources. In regions already facing water stress, this creates quiet competition between technological infrastructure and human or ecological needs. The troubling part is not only the volume of water used, but the fact that ordinary users remain unaware of it. Each online query or AI-generated response feels immaterial, yet at scale these interactions consume substantial energy and water and leave behind a measurable carbon footprint.
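To make the hidden footprint concrete, a rough back-of-envelope sketch can show how a tiny per-query cost compounds at scale. All figures below are illustrative placeholder assumptions for the sake of the arithmetic, not measurements from the studies cited in this article:

```python
# Back-of-envelope estimate of aggregate energy and water use for AI queries.
# Every constant here is an illustrative assumption, not a measured value.

ENERGY_PER_QUERY_WH = 3.0      # assumed energy per generative-AI query, in watt-hours
WATER_PER_KWH_L = 1.8          # assumed data-centre water use per kWh, in litres
QUERIES_PER_DAY = 100_000_000  # assumed daily query volume for a popular service

# Convert watt-hours to kilowatt-hours, then scale up to the daily total.
daily_energy_kwh = QUERIES_PER_DAY * ENERGY_PER_QUERY_WH / 1000
daily_water_l = daily_energy_kwh * WATER_PER_KWH_L

print(f"Daily energy: {daily_energy_kwh:,.0f} kWh")   # 300,000 kWh under these assumptions
print(f"Daily water:  {daily_water_l:,.0f} litres")   # 540,000 litres under these assumptions
```

Under these assumed numbers, a service handling a hundred million queries a day would draw hundreds of thousands of kilowatt-hours and litres of water daily, even though each individual query feels weightless to the user.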

Researchers at the Massachusetts Institute of Technology have pointed out that generative AI represents a qualitatively different kind of digital infrastructure, marked by unusually high resource intensity (MIT Technology Review, 2025). Unlike conventional cloud services, generative AI requires sustained, high-density computation even after deployment. Improvements in hardware efficiency, while real, have not reduced total resource use because the scale of AI adoption keeps expanding. This reflects an old lesson from industrial history, sometimes called the Jevons paradox: efficiency alone does not guarantee sustainability. Often, it accelerates consumption instead of containing it.

The climate implications of this expansion are difficult to dismiss. As AI data centres multiply, national power grids face new and unpredictable loads. In regions where renewable energy supply is limited or inconsistent, this demand is often met by fossil-fuel-based power generation. The result is a growing emissions footprint that remains partly concealed within corporate accounting. A Reuters investigation into the technology sector revealed that indirect emissions linked to data centres and AI workloads have risen sharply in recent years, even as companies publicly reaffirmed their climate commitments (Reuters, 2025). This gap between promise and practice raises uncomfortable questions about responsibility and transparency.

The rural siting of AI infrastructure raises questions of environmental justice. While the benefits of AI (profit, convenience, and technological prestige) flow upward to cities and corporate centres, the environmental costs stay local to rural areas. This pattern resembles older extractive economies, where peripheries absorbed damage while centres enjoyed growth.

Supporters of AI often respond that the same technology can help address climate change by optimizing energy systems, improving logistics, and advancing climate science. This claim is not entirely unfounded. AI can indeed contribute to environmental solutions. But this potential depends on how and why AI is deployed. At present, most AI development is driven by competition, speed, and market dominance rather than ecological necessity. When performance and scale are the primary goals, environmental considerations become secondary, sometimes even invisible.

Taken together, these findings puncture an illusion: that AI inhabits a separate world, untouched by nature and environment. The truth is that every algorithm runs on hardware. Every calculation consumes energy. Every server needs cooling. Forgetting these basics leads to policy blind spots where AI expansion proceeds without serious environmental accounting. In this sense, AI exposes a broader failure to reconcile technological ambition with ecological responsibility.

None of this suggests that AI should be rejected outright. The issue is not technology itself, but direction and restraint. Sustainability cannot be an afterthought added through glossy reports or voluntary pledges. It must be built into the design, regulation, and deployment of AI systems. Transparent reporting of energy and water use, environmental assessments for large models, and careful decisions about where data centres are located are no longer optional.

Most importantly, societies must ask harder questions about necessity. Not every AI application carries equal social value. Using water-intensive systems to refine advertising algorithms is not the same as deploying AI for healthcare diagnostics or disaster forecasting. When environmental costs are real and rising, prioritization becomes unavoidable.

The central question, then, is not whether AI is inherently anti-environment. It is whether existing political and economic systems allow AI to grow without ecological limits. The evidence increasingly suggests that, without restraint, AI risks becoming another driver of environmental degradation disguised as progress. A truly intelligent society would ensure that its most powerful technologies operate within the boundaries of the planet that sustains them.