A transparent, citation-backed look at what AI compute actually costs the planet — and why regenerative contribution matters.
“How much CO₂ does one person emit from using AI?” The honest answer is: it varies wildly. Per-interaction impacts depend on model choice, response length, and especially context length (big pasted docs, repo-scale code context), plus data-center efficiency and grid carbon intensity.
Using widely cited estimates, a typical frontier-chatbot text query plausibly falls in the range 0.3–2.9 Wh, with long-context requests climbing from ~2.5 Wh at 10k input tokens to ~40 Wh at 100k (Epoch AI; IEEE Spectrum). At a global-average grid intensity of about 445 g CO₂/kWh (2024) (IEA), that’s roughly 0.13–1.29 g CO₂ per typical query, with long-context outliers far higher.
Beyond CO₂, AI has material ecological impacts via water use (cooling and electricity generation) and hardware lifecycle (mining, manufacturing, e‑waste). These matter because some impacts are place-based and not captured by carbon arithmetic alone (Li et al., 2023).
Epoch AI (Feb 2025) estimates ~0.3 Wh for a “typical” GPT‑4o text query using updated assumptions about utilization and token counts, and highlights that long input contexts can dominate: ~2.5 Wh (10k) to ~40 Wh (100k) (Epoch AI).
IEEE Spectrum reports an analysis implying ~2.9 Wh per query and discusses sector-scale growth in demand (IEEE Spectrum).
Throughout, we use 445 g CO₂/kWh as a global-average reference for 2024 (IEA).
For country-specific factors, the IEA provides emissions-factor datasets (IEA Emissions Factors).
- 10–30 prompts/day
- 50–200 prompts/day
- 0.3–3 kWh/day inference
Water impacts arise from direct cooling and electricity generation. Academic work argues AI’s water footprint is underreported (Li et al., 2023). More recent scenario modeling examines global water consumption in AI-driven data centers (Journal of Cleaner Production, 2025).
Operational energy is only part of the picture: accelerators carry embodied emissions and upstream ecological impacts. A 2025 cradle-to-grave assessment examines AI accelerator lifecycle emissions (arXiv:2502.01671), and broader lifecycle reviews emphasize supply-chain and end-of-life burdens (LCA review, 2025).
This is why Regenerative Compute frames its work as regenerative contribution, not carbon offsetting. We fund verified ecological regeneration alongside AI usage — covering carbon, biodiversity, marine, and species stewardship credits — because the real impacts of AI go well beyond CO₂.
Annual CO₂ (g) = queries/day × 365 × (Wh/query ÷ 1000) × g CO₂/kWh, where Wh/query = 0.3–2.9 (typical) and g CO₂/kWh = 445 (global average) or a country-specific value.
Example: 50 queries/day × 365 days × 0.3 Wh = 5,475 Wh = 5.475 kWh/yr → ~2.44 kg CO₂/yr at 445 g/kWh.
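The annual estimate can be packaged as a small calculator. A sketch under the article's assumptions (the 0.3 Wh default and 445 g/kWh intensity are reference figures, not measurements):

```python
def annual_co2_kg(queries_per_day: float,
                  wh_per_query: float = 0.3,
                  grid_g_per_kwh: float = 445.0) -> float:
    """Estimated kg CO2 per year for a given AI usage pattern."""
    kwh_per_year = queries_per_day * 365 * wh_per_query / 1000  # Wh -> kWh
    return kwh_per_year * grid_g_per_kwh / 1000                 # g -> kg

# Worked example from the text: 50 queries/day at 0.3 Wh each.
print(round(annual_co2_kg(50), 2))  # -> 2.44
```

Raising `wh_per_query` toward the 2.9 Wh upper bound, or using a coal-heavy country factor in place of 445, scales the result linearly, so the same function covers the whole estimate range.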
Ready to account for your AI footprint?
Choose Your Plan