Every time you ask an AI assistant a question, a data center somewhere uses roughly as much energy as keeping a lightbulb on for a few minutes. Multiply that by billions of queries per day, and you start to understand why AI's energy consumption is becoming a serious conversation.

The numbers. Training a single large AI model can consume as much electricity as 100 US households use in a year. Running that model — serving millions of users daily — consumes even more over its lifetime. The data centers operated by the major AI companies are now among the fastest-growing sources of new electricity demand in the world.
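To see how the per-query and per-household figures above relate at scale, here is a back-of-envelope sketch. Every number in it is an illustrative assumption (per-query energy, daily query volume, average household usage), not a measured value:

```python
# Back-of-envelope estimate of aggregate AI query energy.
# All constants are illustrative assumptions, not measured figures.
WH_PER_QUERY = 0.3               # assumed energy per AI query, in watt-hours
QUERIES_PER_DAY = 1e9            # assumed daily query volume
HOUSEHOLD_KWH_PER_YEAR = 10_500  # rough US average annual household usage

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6  # Wh -> MWh
household_years_per_day = daily_mwh / (HOUSEHOLD_KWH_PER_YEAR / 1e3)

print(f"{daily_mwh:.0f} MWh per day")                          # 300 MWh per day
print(f"~{household_years_per_day:.0f} household-years per day")  # ~29
```

Even at a modest assumed cost per query, a billion queries a day adds up to the annual electricity use of dozens of households — every single day — which is why the conversation is about industry-scale infrastructure rather than individual usage.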

Why this matters beyond the headlines. Energy demand from AI data centers is driving real infrastructure investment — new power plants, grid upgrades, and a renewed interest in nuclear energy. For businesses, this means: (1) energy is a rising component of the price of AI services, even as raw compute costs fall, and (2) the physical infrastructure buildout is creating entire new industries and investment opportunities.

What SMBs should know. Your individual AI usage has negligible energy impact — this is an industry-scale challenge, not a per-business one. But if sustainability is part of your brand, it's worth understanding which AI providers are investing in renewable energy. Anthropic, Google, and Microsoft have all made significant commitments, though the gap between commitment and execution varies.

The bigger picture. AI's energy demand is accelerating investment in energy infrastructure that benefits everyone — more efficient data centers, new clean energy capacity, and grid modernization. The technology that uses the most energy may ultimately drive the innovations that solve the energy problem.