March 25, 2026 — As AI rapidly becomes core infrastructure for modern businesses, its energy footprint is scaling just as quickly. Global data center electricity consumption is expected to more than double this decade, rising from roughly 415 TWh in 2024 to nearly 945 TWh by 2030¹. At the same time, the majority of this demand is driven not by model training but by inference — the everyday queries and workloads that power real-world applications.

Yet despite this rapid growth, most organizations still have little to no visibility into the resources behind their AI usage. Inference is typically priced per token, not per unit of compute or energy, leaving companies unable to measure, compare, or optimize the real cost of running AI at scale. 

Neuralwatt and GreenPT today announced a strategic partnership to set a new global standard for sustainable AI inference. The collaboration is designed as a direct response to this gap, introducing infrastructure where energy usage is measurable, transparent, and tied to how AI is actually consumed.

Most AI inference today operates as a black box. Organizations pay per token with no visibility into the energy their workloads consume, no way to compare efficiency across models, and no connection between what they spend and the resources behind it. As inference workloads now account for an estimated 80–90% of AI compute usage, this lack of transparency is no longer just a technical limitation; it is becoming an operational and financial risk. Neuralwatt and GreenPT are building the modern alternative.

“For too long, the AI industry has treated energy as an afterthought, with little visibility into real costs,” said Chad Gibson, co-founder and CEO of Neuralwatt. “We built Neuralwatt Cloud to change that, and together with GreenPT, we’re showing that energy-efficient, transparent inference is where the industry is headed.” 

The partnership is centered on complementary expertise and a joint commitment to bring more sustainable AI products to market. GreenPT’s deep experience in renewable infrastructure, carbon measurement, and privacy-first AI, combined with Neuralwatt’s GPU-level energy optimization and energy-based pricing model, creates a foundation for building AI inference that holds itself to a higher standard than what the industry offers today. 

“We’ve built an infrastructure that proves sustainable AI isn’t a compromise, but a competitive advantage,” said Robert Keus, co-founder and CEO of GreenPT. “With Neuralwatt, we’re taking this a step further by gaining deeper insight into energy usage and how it can be optimized. Together, we’re moving toward AI systems where performance, cost, and energy are part of the same decision.” 

Engineering AI for a sustainable future 

Both companies were built on the same conviction: AI’s energy challenge isn’t solely a supply problem, it’s an engineering one, and the solution starts with how inference is built and delivered, not just how much power is available. Too many AI companies start with great technology and then look for the market. Neuralwatt and GreenPT started from the people and places affected by it — customers who need to understand what their AI consumes, communities that are already feeling the strain of unchecked energy demand, and an environment that can’t absorb the cost of scaling AI in its current state.

Building a measurable, energy‑transparent AI infrastructure 

The pressure on AI infrastructure is growing worldwide. Grids are aging, electricity costs are rising, and data center demand is expected to more than double by 2030. New regulations like the EU AI Act are introducing energy disclosure requirements², while in the US, aging infrastructure and local resistance are slowing expansion. Yet many organizations still scale AI without clear insight into resource use.

At the same time, AI is shifting from experimental to always-on infrastructure, making inference efficiency a key constraint alongside performance. Neuralwatt and GreenPT are responding by deepening collaboration on energy transparency, model optimization, and tools to measure and reduce AI’s energy footprint. The partnership reflects a broader shift toward accountable AI systems, where each request has a measurable impact on energy, carbon, and infrastructure. 

¹ ²: International Energy Agency
