Training and running large AI models requires enormous and growing amounts of energy, with data centre power demand projected to double or triple by 2030, raising environmental and infrastructure concerns.
Verified: 1 March 2026 · Last updated: 1 March 2026 · Region: international
The energy demands of AI are growing rapidly:
- Training a single frontier AI model can consume as much electricity as thousands of homes use in a year
- The International Energy Agency projects data centre electricity consumption could double by 2030
- Major tech companies are investing in nuclear power and other large-scale energy sources specifically for AI
- This growth is happening while countries are trying to reduce carbon emissions
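The "thousands of homes" comparison above can be sanity-checked with simple arithmetic. The figures below are illustrative assumptions, not measured values: the training-run energy is in the range of public estimates for recent frontier models, and the household figure is the approximate US average reported by the EIA.

```python
# Back-of-envelope check of the "thousands of homes" claim.
# Both inputs are assumptions for illustration only.

TRAINING_RUN_KWH = 50_000_000    # assumed ~50 GWh for one frontier training run
HOUSEHOLD_KWH_PER_YEAR = 10_500  # approximate average US household electricity use

home_years = TRAINING_RUN_KWH / HOUSEHOLD_KWH_PER_YEAR
print(f"Roughly {home_years:,.0f} household-years of electricity")
```

Under these assumptions a single training run works out to several thousand household-years, which is consistent with the claim; a smaller run of a few GWh would correspond to hundreds of homes instead.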
The energy implications of AI development are rarely discussed in governance debates but represent a significant policy issue affecting climate commitments, energy security, and infrastructure planning.
Tags: energy · environment · infrastructure · data centres