CH Health Tech Advisory

28 January 2025 · 2 min read



Last updated: 6 May 2026

Everybody is talking about DeepSeek and how it just erased a sizeable chunk of NVIDIA's market cap. A good friend of mine from the energy sector just asked me what that means for the future energy consumption that AI was expected to drive.

Instead of pulling out my own crystal ball, I just asked DeepSeek (R1) itself. Here is what it came up with, and I'm quite impressed:

The emergence of efficient AI training methods, such as those used by DeepSeek, suggests potential reductions in energy consumption per model. However, the overall impact on global energy demand is nuanced and depends on several interrelated factors:

  1. Algorithmic Efficiency: DeepSeek's use of optimized algorithms and simpler chips could lower energy use per model by reducing computational complexity. This efficiency might set a precedent for the industry, encouraging adoption of similar techniques.

  2. Jevons Paradox: Increased efficiency often leads to expanded usage. Cheaper, faster training could incentivize more organizations to develop AI models, potentially increasing the total number of models in use and offsetting per-unit energy savings.

  3. Hardware Innovation: Specialized, energy-efficient chips (e.g., TPUs, ASICs) could reduce operational energy demands. However, the environmental cost of manufacturing these chips and their lifecycle emissions must also be considered.

  4. Scalability and Adoption: Widespread adoption of efficient methods across the AI industry would amplify energy savings. If such practices remain niche, their global impact will be limited.

  5. Energy Sources: The carbon footprint of AI depends on the energy mix powering data centers. Renewables would mitigate environmental impact, even with higher demand, while fossil fuels would exacerbate it.

  6. Regulatory and Market Pressures: Policies mandating energy efficiency or carbon reporting could drive greener practices. Absent regulation, market forces might prioritize performance over sustainability.

  7. Historical Precedent: Past tech efficiency gains (e.g., Moore's Law) often led to increased consumption through expanded applications. AI could follow this pattern, with efficiency gains enabling broader use cases.

Conclusion: While DeepSeek-like advancements may temper energy demand growth compared to worst-case projections, the overall trajectory depends on whether efficiency gains outpace the expansion of AI deployment. Energy consumption could still rise significantly if adoption accelerates, but proactive measures—such as renewable energy integration, regulatory frameworks, and industry-wide efficiency standards—could help align AI growth with sustainability goals. The projected energy increase might be less severe than feared, but it is unlikely to be negligible without systemic changes.
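The Jevons paradox point is really just arithmetic, so here is a back-of-envelope sketch in Python. All the numbers are hypothetical, chosen only to illustrate the mechanism: a 5x per-model efficiency gain can still mean higher total demand if cheaper training invites enough new entrants.

```python
# Toy illustration of the Jevons paradox for AI training energy.
# All figures below are made up for illustration; they are not estimates.

def total_energy(models_trained: int, energy_per_model_gwh: float) -> float:
    """Total training energy (GWh) for a given number of training runs."""
    return models_trained * energy_per_model_gwh

# Baseline scenario: 100 large training runs at 10 GWh each.
baseline = total_energy(100, 10.0)

# Efficiency scenario: training becomes 5x cheaper per model...
efficient_per_model = 10.0 / 5

# ...but lower cost attracts more players: 8x as many training runs.
expanded = total_energy(800, efficient_per_model)

print(f"baseline: {baseline:.0f} GWh")   # baseline: 1000 GWh
print(f"expanded: {expanded:.0f} GWh")   # expanded: 1600 GWh
```

Despite each model using a fifth of the energy, total demand in this toy scenario rises 60 percent. Whether reality looks like this depends entirely on how adoption growth compares to the efficiency gain, which is exactly the open question in the answer above.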