
The Hidden Cost of Artificial Intelligence: Resource Consumption and Its Global Impact
Jul 3
6 min read

Artificial Intelligence (AI) is rapidly transforming industries and daily life, offering unprecedented capabilities and efficiencies.
However, this technological advancement comes with a significant, often overlooked, environmental and social cost: its immense resource consumption.
This article delves into the energy and water demands of AI data centers, their environmental impacts, and the disproportionate effect on resource access for vulnerable populations. We will also examine the efficiency of using AI for small tasks and whether the benefits outweigh the environmental footprint.
AI Data Center Resource Consumption: A Growing Thirst for Power
The backbone of AI operations lies in vast data centers, which require enormous amounts of electricity to power their servers and cooling systems.
The demand for energy from these facilities is escalating at an alarming rate.
Escalating Electricity Demand
Projections indicate a dramatic increase in electricity consumption by data centers globally. The International Energy Agency (IEA) forecasts that global data center electricity demand will more than double by 2030, reaching approximately 945 terawatt-hours (TWh). This surge is largely driven by the expanding use of AI.
Some analyses suggest that data centers could account for up to 21% of overall global energy demand by 2030. Goldman Sachs Research further supports this trend, predicting a 50% increase in global power demand from data centers by 2027, and a staggering 165% rise by the end of the decade.
Currently, AI's energy use already constitutes a significant portion of global data center power demand, estimated at as much as 20%. In the United States, data centers account for an estimated 4.4% of all electricity consumed.
Looking ahead, AI data center power consumption could account for 8-12% of total US electricity demand by 2030, a substantial increase from the current 3-4%. To put this into perspective, by 2030, AI alone could consume as much electricity annually as 22% of all US households.
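As a rough cross-check of the household comparison above, a few lines of arithmetic suffice. The household count and average annual consumption below are ballpark assumptions in line with recent US figures, not numbers from this article:

```python
# Sanity-check of the "22% of US households" comparison. The household
# count (~131 million) and average use (~10,500 kWh/yr) are assumed
# ballpark figures, not data from the article.

US_HOUSEHOLDS = 131_000_000
KWH_PER_HOUSEHOLD_YEAR = 10_500

def twh_for_household_share(share: float) -> float:
    """TWh/year equal to a given share of US household electricity."""
    return share * US_HOUSEHOLDS * KWH_PER_HOUSEHOLD_YEAR / 1e9

# 22% of US household electricity works out to roughly 300 TWh/yr
print(f"{twh_for_household_share(0.22):.0f} TWh/yr")
```

That order of magnitude, several hundred terawatt-hours per year, is consistent with the IEA's projection that global data center demand will approach 945 TWh by 2030.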
Water Usage and Cooling Systems in AI Data Centers: A Thirsty Business
Beyond electricity, AI data centers are significant consumers of water, primarily for cooling their high-performing servers. The intense heat generated by AI computations necessitates robust cooling solutions, many of which rely heavily on water.
Water Consumption for Cooling
Data centers require immense amounts of water to prevent their servers from overheating. One striking estimate holds that each string of AI prompts can consume roughly 16 ounces (about half a liter) of water at the data centers housing these systems.
The most common cooling methods, particularly evaporative cooling, are highly water-intensive. Cooling towers, a staple in many large data centers, can demand millions of gallons of water annually.
This direct water consumption for AI server cooling, along with indirect water usage (estimated by the IEA to be 60% of data center water consumption), contributes significantly to the overall water footprint.
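To see how quickly the per-prompt figure adds up, here is a minimal back-of-envelope sketch. The 16-ounce figure comes from the statistic above; the daily session count is purely an illustrative assumption:

```python
# Back-of-envelope water footprint using the article's figure of
# roughly 16 oz (~0.47 L) of cooling water per string of AI prompts.
# The daily session count below is an illustrative assumption, not data.

OZ_PER_SESSION = 16
LITERS_PER_OZ = 0.0295735

def annual_water_liters(daily_sessions: int) -> float:
    """Estimated annual cooling-water use in liters."""
    return daily_sessions * OZ_PER_SESSION * LITERS_PER_OZ * 365

# Hypothetical: 100 million prompt sessions per day worldwide
liters = annual_water_liters(100_000_000)
print(f"{liters / 1e9:.1f} billion liters per year")
```

Even at this hypothetical volume, the result runs to tens of billions of liters a year, which is why siting decisions in water-stressed regions matter so much.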
Cooling Methods
Various cooling methods are employed in data centers, each with its own water implications:
Evaporative Cooling: This widely used method involves evaporating water to dissipate heat, making it a major contributor to water consumption.
Liquid-Based Cooling: This more efficient method circulates chilled water or other coolants directly over equipment, or immerses the equipment entirely. While highly efficient and space-saving, it still requires water.
Direct Water-Cooling Solutions: Some advanced systems recirculate warm water in closed loops to cool data center equipment, potentially reducing water consumption by up to 40% compared with traditional methods.
Adiabatic Cooling: This method pre-cools intake air by evaporating a small amount of water into it, and is considered highly efficient for data centers, though it still draws on water supplies.
Open-System Water Cooling: Some data centers draw in water, use it for cooling, and then discharge it, raising concerns about water waste.
Data centers can source water from various supplies, including potable water, treated effluent, or reclaimed/recycled water.
However, the sheer volume required still places a strain on local water resources.
Environmental and Social Impacts: A Disproportionate Burden
The escalating resource consumption by AI data centers has profound environmental consequences, contributing to climate change and pollution. More critically, these impacts often disproportionately affect poorer populations, exacerbating existing inequalities in resource access.
Environmental Impacts
The energy consumed by data centers, much of which still comes from non-renewable sources, directly contributes to carbon emissions and air pollution. This pollution can have direct negative impacts on human health, particularly in communities located near these facilities.
The environmental footprint extends beyond energy; the proliferation of data centers also generates significant electronic waste. The training of a single AI model, such as a large language model (LLM), can be incredibly resource-intensive, consuming thousands of megawatt-hours of electricity and emitting hundreds of tons of CO2. Furthermore, AI tools are far more energy-intensive than traditional computing tasks; a single AI query can require 5 to 10 times more energy than a standard web search.
Social Impacts: Resource Access for Poorer Populations
The water demands of AI data centers present a critical social justice issue. Many data centers are strategically located in regions that are already water-stressed or prone to scarcity. The surging demand for water from these facilities exacerbates local water shortages, imperiling the socio-economic well-being of these regions.
For instance, arid regions like Saudi Arabia and the United Arab Emirates are welcoming more data centers, despite their inherent water scarcity. In the United States, drought-prone areas such as Arizona, Texas, and the upper Midwest are seeing a boom in data center construction, leading to concerns about water availability for agriculture and residential use.
Globally, big tech companies are building new data centers in water-scarce parts of five continents, including Latin American countries like Chile and Uruguay, where citizens have protested against planned data centers tapping into drinking water reservoirs.
Even in Europe, dry regions are facing increased pressure on their water reserves.
This can lead to increased competition for water, potentially driving up prices and making it harder for poorer communities to access this essential resource for drinking, agriculture, and sanitation.
Is Our Use of OpenAI for Small Tasks Worth the Data and Resource Consumption?
The question of whether using AI, specifically models like those from OpenAI, for small tasks is worth the associated data and resource consumption is complex.
While AI offers undeniable benefits in automation and efficiency, its environmental cost, even for seemingly minor queries, cannot be ignored.
Resource Consumption for Small AI Queries
Each query to a large language model (LLM), such as those powering ChatGPT or similar AI services, consumes significantly more energy than a typical web search.
Estimates suggest a single generative AI query can consume several times the energy of a standard search engine request, with commonly cited figures ranging from roughly five to ten times. While the carbon footprint of an individual query might seem small, the cumulative effect of widespread and frequent use across billions of users becomes substantial.
Furthermore, queries that demand more complex logical reasoning or extensive content generation from AI chatbots require even greater energy input.
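The cumulative effect described above can be sketched with simple arithmetic. The 0.3 Wh baseline for a single web search is a commonly cited figure assumed here, the five-to-ten-times multiplier reflects the range discussed in this article, and the daily query volume is purely illustrative:

```python
# Rough cumulative energy estimate for AI queries versus web searches.
# The 0.3 Wh baseline per web search is an assumed, commonly cited
# figure; the 5-10x multiplier reflects the range cited in the article.

WEB_SEARCH_WH = 0.3  # assumed energy per standard web search, in Wh

def annual_energy_gwh(daily_queries: int, multiplier: float) -> float:
    """Annual energy in gigawatt-hours for AI queries drawing a given
    multiple of web-search energy."""
    wh_per_query = WEB_SEARCH_WH * multiplier
    return daily_queries * wh_per_query * 365 / 1e9

# Hypothetical: 1 billion AI queries per day, at 5x and 10x a web search
low = annual_energy_gwh(1_000_000_000, 5)
high = annual_energy_gwh(1_000_000_000, 10)
print(f"{low:.0f}-{high:.0f} GWh per year")
```

Under these assumptions the total lands in the hundreds to thousands of gigawatt-hours per year, a useful sense of scale when weighing small-task convenience against aggregate cost.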
Worthiness for Small Tasks
On the one hand, AI excels at automating repetitive tasks, streamlining workflows, and improving decision-making.
For businesses and individuals, this can translate into increased productivity and efficiency.
Generative AI, for instance, can be highly effective for tasks like generating computer code, identifying and fixing bugs, or enhancing code quality. Many users find AI invaluable for quickly handling mundane or annoying tasks that would otherwise consume significant human time and effort.
OpenAI's newer, smaller models like o4-mini are specifically optimized for fast, cost-efficient reasoning, making them suitable for certain smaller tasks.
However, it's important to acknowledge that generative AI models can be slow for some tasks and may not always deliver precise results, requiring human oversight and refinement.
The environmental cost, particularly the energy and water consumption, must be weighed against these benefits. For simple tasks that could be accomplished with less resource-intensive methods, the environmental trade-off might not always be justified.
The decision to use AI for small tasks should consider the actual efficiency gains versus the environmental impact, encouraging a mindful approach to AI utilization.
Don't hesitate to contact me at emilie.cotenceau@gmail.com if you want to know more about the best use of AI.
References
Energy Consumption & Projections
IEA on AI’s surging electricity demand:
https://www.iea.org/news/ai-is-set-to-drive-surging-electricity-demand
Goldman Sachs on data center power demand forecasts:
https://www.goldmansachs.com/insights/articles/ai-to-drive-165-increase
Water Usage & Cooling
Bloomberg on AI data centers and water scarcity:
https://www.bloomberg.com/graphics/2025-ai-impacts-data-centers-water
MIT Sloan on solutions for data center energy/water costs:
https://mitsloan.mit.edu/ideas-made-to-matter/ai-has-high-data-center-energy-costs
Environmental & Social Justice Impacts
The Guardian on Big Tech’s water footprint:
https://www.theguardian.com/environment/2025/apr/09/big-tech-datacentres-water
Lawfare on AI’s threat to global water security:
https://www.lawfaremedia.org/article/ai-data-centers-threaten-global-water-security
AI Efficiency & Alternatives
OpenAI on smaller, efficient models (o4-mini):
MIT News on generative AI’s environmental impact:
https://news.mit.edu/2025/explained-generative-ai-environmental-impact