California leaders discuss ways to upgrade grid — why AI data centres could raise your bill

Lawmakers, regulators and utilities met in Sacramento as AI‑hungry data centres strain local grids. Here’s what planners are proposing — and what consumers might end up paying.

San Jose, Sacramento and the awkward question the grid didn’t expect

On Tuesday in Sacramento, a short meeting that sounded boring on paper felt oddly tense: California leaders discussed ways to upgrade the grid as AI drives power demand — and the numbers behind the problem keep getting bigger. Lawmakers, utility planners and climate researchers traded blunt lines about rising electricity use, battery costs and the risk that ratepayers could shoulder billions for infrastructure that may never be used. The backdrop was twofold: the immediate, practical fixes pushed at the session — more batteries, virtual power plants and utility incentives — and a much larger planning headache bubbling up in Silicon Valley, where data‑centre proposals could multiply local demand many times over.

The meeting reflected a familiar Californian contradiction. The state prides itself on ambitious clean‑energy targets but now faces a sudden, private‑sector surge of power demand from AI compute clusters that were never part of the 2045 plan. That collision is pushing regulators to decide how to forecast growth, which projects to approve, and who pays when utilities need new wires or extra capacity.

California leaders discuss ways to upgrade grid — the forecast gap and the data‑centre wildcard

Nobody disputes one simple fact: demand is changing. Utilities report a planning pipeline that, on paper, requests roughly 18.7 gigawatts of service for new data‑centre projects — by common rules of thumb, enough power for more homes than the state's utilities currently serve. Regulators and independent analysts say not all of those projects will be built, and the working forecast used by state agencies currently points toward something smaller: a few gigawatts of new load over the coming decades rather than the full planning ask.
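A back‑of‑envelope calculation shows why the 18.7‑gigawatt figure dominates the debate. The average‑household draw below is an illustrative assumption (roughly 1.2 kW of continuous demand, about 10,500 kWh per year), not a number from the article or from state data:

```python
# Rough scale check on the 18.7 GW data-centre planning pipeline.
# AVG_HOME_KW is an assumed illustrative figure, not official data.

PIPELINE_GW = 18.7     # total service requests on paper, in gigawatts
AVG_HOME_KW = 1.2      # assumed average continuous draw per household, in kilowatts

# Convert gigawatts to kilowatts, then divide by per-home draw.
homes_equivalent = PIPELINE_GW * 1_000_000 / AVG_HOME_KW
print(f"{homes_equivalent / 1e6:.1f} million home-equivalents")
```

Under that assumption the pipeline corresponds to demand from more than 15 million homes — larger than California's residential customer base — which is why planners treat the raw number with suspicion.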

That uncertainty is what Stanford’s Bits & Watts researchers described at the panel as the real problem. Liang Min told the audience that the growth in AI is not a steady ramp but a string of bets on new applications. Models built to forecast conventional electricity demand struggle with a business model that can change overnight if a new machine‑learning workload goes viral. "Right now we’re really struggling," he said. "The risk is extremely high in the application layers."

Local governments see both opportunity and risk. San Jose officials have started to estimate that planned projects could push the city's electricity needs toward multiples of current peaks, forcing a choice: pause and require stricter proof that a project will actually draw the power requested, or move fast to win investment and jobs. Both options carry costs.

California leaders discuss ways to upgrade grid: batteries, virtual power plants and market tweaks

The immediate toolkit discussed in Sacramento is familiar — because it works. Storage is cheaper, small‑scale batteries are more accessible to commercial customers, and the concept known as a virtual power plant (VPP) can aggregate thousands of home batteries, EV chargers and smart loads and present them to the grid as a dispatchable resource. Jigar Shah of Deploy Action noted the rapid fall in installed costs for these systems — "Five years ago it would have cost you $15,000 to put them in, today it's less than $5,000" — and pitched efficiency plus aggregation as the lowest‑cost path to absorb growth.
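The virtual‑power‑plant idea can be sketched in a few lines: many small batteries, EV chargers and smart loads are pooled and answer a grid dispatch request as one resource. The asset names and kilowatt figures below are invented for illustration:

```python
# Toy sketch of VPP aggregation: pooled small assets answer one grid
# dispatch request. Names and sizes are invented, not from any real fleet.

from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    available_kw: float  # power this asset can shed or inject right now

def dispatch(assets, request_kw):
    """Greedily allocate a grid request across the pooled assets."""
    remaining = request_kw
    allocations = {}
    for a in assets:
        take = min(a.available_kw, remaining)
        if take > 0:
            allocations[a.name] = take
            remaining -= take
    return allocations, remaining  # remaining > 0 means the VPP fell short

fleet = [
    Asset("home-battery-1", 5.0),
    Asset("ev-charger-7", 7.2),
    Asset("smart-hvac-3", 3.0),
]
alloc, shortfall = dispatch(fleet, 12.0)
```

A real aggregator adds telemetry, market bidding and per‑customer constraints, but the core move — presenting thousands of small devices as one dispatchable block — is this simple.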

That leads to the thornier part of the debate: whether California needs more "clean firm" power—geothermal, nuclear or natural gas with carbon capture—to guarantee reliability while keeping emissions down. Several panelists, including Stanford and PG&E‑adjacent voices, argued that without some form of firm, dispatchable low‑carbon power the state risks leaning on more fossil generation as data‑centre loads grow.

Who pays? The political fight over cost allocation and transparency

Perhaps the loudest disagreement was about money. The California Public Advocates Office has warned that if utilities build large upgrades for new customers and those customers don’t materialize, ratepayers could be left paying for stranded equipment. That concern is especially acute because many data‑centre projects file large requests for capacity without committing to final construction timelines or to long‑term offtake.

PG&E’s public case is different: adding big new customers can spread fixed grid costs across a larger base and reduce average rates. The math is real — but it depends on geography and timing. A data centre connected to a lightly used substation is not the same as a cluster all asking for power in the same overloaded industrial corridor.
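The cost‑spreading argument is simple arithmetic: fixed grid costs divided over a larger sales base lower the average rate, but only if the new customer doesn't trigger large new fixed costs of its own. All numbers below are invented for illustration:

```python
# Illustration of the rate-spreading argument. Every figure here is an
# invented round number, not PG&E data.

FIXED_COSTS = 10_000_000_000       # $/year of existing fixed grid costs (assumed)
BASE_SALES_KWH = 80_000_000_000    # current annual sales in kWh (assumed)
NEW_DC_KWH = 8_000_000_000         # a large new data-centre load (assumed)
NEW_UPGRADE_COSTS = 0              # set > 0 to model upgrades the load triggers

rate_before = FIXED_COSTS / BASE_SALES_KWH
rate_after = (FIXED_COSTS + NEW_UPGRADE_COSTS) / (BASE_SALES_KWH + NEW_DC_KWH)
print(f"fixed-cost share: {rate_before*100:.2f} -> {rate_after*100:.2f} cents/kWh")
```

With zero triggered upgrades the average fixed‑cost share falls; raise `NEW_UPGRADE_COSTS` above roughly $1 billion in this toy example and the new load starts pushing average rates up instead — which is exactly the geography‑and‑timing caveat in the paragraph above.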

Some states have begun to split the bill. Oregon adopted tighter rules to prevent household bills from shouldering certain connection costs, and Minnesota has created a billing category to keep giant data‑centre costs separate from residential charges. California has so far refrained from aggressive legal limits, though the legislature and several commissions are watching closely and debating new transparency requirements that stalled earlier.

Can AI actually help the grid? Forecasting, demand response and smarter operations

There’s an irony: the technology stressing the grid might also help manage it. AI can improve short‑term load forecasting, optimise battery dispatch and detect grid faults faster. Panelists explained that better analytics can reduce reserve margins and improve utilisation of renewables — but only if utilities and operators adopt new tools and open the right data channels.

Liang Min framed it bluntly: the unpredictable, fast‑moving AI application layer is the forecasting problem. But the same suites of models that power AI services can be retooled for probabilistic demand forecasts and to optimise VPP behaviour. The state and utilities are beginning pilot projects to test these approaches, but governance, data access and privacy remain barriers.
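A probabilistic forecast of the kind the panel discussed replaces one growth number with a range of plausible build‑out scenarios. The sketch below samples how much of the 18.7 GW pipeline actually gets built; the distribution and its parameters are illustrative assumptions, not state forecasts:

```python
# Minimal probabilistic load-forecast sketch: Monte Carlo over uncertain
# data-centre build-out. The beta(2, 8) distribution (skewed low, because
# most service requests never materialise) is an assumed illustration.

import random

random.seed(0)  # reproducible sampling

PIPELINE_GW = 18.7     # total service requests on paper
N_SCENARIOS = 10_000

def sample_built_gw():
    # Fraction of the pipeline that gets built in one scenario.
    return PIPELINE_GW * random.betavariate(2, 8)

samples = sorted(sample_built_gw() for _ in range(N_SCENARIOS))
p10, p50, p90 = (samples[int(N_SCENARIOS * q)] for q in (0.10, 0.50, 0.90))
print(f"P10 {p10:.1f} GW, median {p50:.1f} GW, P90 {p90:.1f} GW")
```

Under these assumed parameters the median scenario lands at a few gigawatts — roughly the shape of the working state forecast — while the tail scenarios stay large enough to justify planning caution.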

Environmental trade‑offs and local impacts

Not all solutions are equally popular. The conversation at the CalMatters event echoed environmentalists’ concerns: diesel backup generators on data‑centre sites create concentrated air‑pollution risk; water‑intensive cooling methods conflict with local water stress; and proposals for carbon capture and other controversial fixes raise community distrust. Regulators recognise that meeting new loads without increasing system emissions will require both more storage and reliable low‑carbon firm power — a mix that could include nuclear and geothermal in addition to large battery projects, depending on political choices.

Panelists repeatedly emphasised the need for transparency. The lack of consistent, mandatory data on planned demand makes it impossible for communities to know what they’re being asked to pay for or how local environmental burdens will change.

A few precedent moves and the narrow road ahead

Some practical steps are already under consideration: stricter disclosure on proposed load, pilot VPP programmes, utility incentives to prioritise upgrades where they reduce congestion, new billing categories for hyperscale customers and regional market changes to share capacity across a wider western footprint. California’s decision to join a broader Western power market is itself a market‑level response to more volatile, geographically concentrated demand.

But the state’s political economy matters: local governments want jobs and tax base, utilities want clear signals from regulators, environmental advocates want clean power and community groups want protection from localized pollution and bill shocks. That confluence means decisions will be slow, negotiated and imperfect.

Sources

  • Stanford University, Bits & Watts Initiative
  • UC Davis Energy & Efficiency Institute
  • California Public Advocates Office (California Public Utilities Commission)
  • California Independent System Operator (CAISO) — preliminary data‑center forecasts
  • Next 10 / University of California, Riverside (report on data‑centre emissions)

There is a tidy European‑adjacent lesson: Germany has the machinery for renewables and batteries, Brussels has the paperwork, and California now has to decide whether it will write a clean‑power playbook or outsource the problem to someone who will sell it gas. Either route is expensive — the only question is who signs the invoice.

Mattias Risberg

Cologne-based science & technology reporter tracking semiconductors, space policy and data-driven investigations.

University of Cologne (Universität zu Köln) • Cologne, Germany
