For investors in AI, these 5 questions can help unpack environmental risk
Published: December 16, 2025 by Jenny Mandel, Senior Manager, Strategic Initiatives, EDF, and Andrew Howell, CFA, Senior Director and Head of Research for Sustainable Finance, EDF
The rush to commercialize AI is transforming technology companies from asset-light, knowledge-based businesses into capital- and energy-intensive infrastructure developers. Enabled by an unprecedented surge in capital, AI spending is projected to exceed $400 billion in 2025, and McKinsey projects that data centers will absorb nearly $7 trillion of capital expenditure worldwide by 2030.
The industry’s focus on speed and scale translates to an urgent need for investors to understand the physical footprint and supply chain risks of AI and its ecosystem – the utilities powering AI data centers, the chipmakers providing hardware, real estate firms developing massive data complexes, and the suppliers of critical minerals and other commodities for tech hardware. Failure to do so could put invested capital at risk.
The “build at any cost” approach of the past few years presents challenges for communities and the public interest. AI infrastructure has real implications for human health and the environment, particularly through emissions from the electricity that runs data centers and the water often used to cool them. How that infrastructure is planned, built and paid for affects both nearby communities and electricity customers across wide regions – not to mention the global impacts of climate change driven by AI-related greenhouse gas emissions.
If developed in the right way, AI investments could usher in a clean, modernized power system that lowers electricity costs and helps tame climate change. But the current path points in another direction: a breakneck pace that stresses the environment in myriad ways, ignores efficiency and innovations that can improve environmental outcomes and reduce costs, and risks a backlash – from regulators, customers, and the public at large – with a host of negative consequences for tech companies and their backers. Now is the time for investors to engage with the sector on how companies are incorporating sustainability drivers into data center growth.
Here are five questions that investors can ask technology companies to start the conversation:
1. How are you leveraging efficiency, innovation and zero-carbon electricity across the AI supply chain?
The intense energy needs of AI computing are outpacing the supply of clean electricity, putting a strain on tech companies’ climate targets. A priority for any tech company should be maximizing efficiency – from chip and model design through the design and operation of the data center itself. Similarly, a range of innovative customer-side approaches can shift, manage or reduce power demand, and tech companies should make full use of these to reduce their environmental footprint and costs.
Beyond these measures, data centers should be powered wherever possible from carbon-free energy resources. Developers should engage utilities to accelerate the deployment of clean energy and enabling technologies like energy storage, and support decarbonizing the grid over time to improve the long-term environmental performance of their data center assets.
A range of promising practices use efficiency and innovative technologies to manage energy use and associated costs. Google uses grid emissions data to shift some large computing workloads to times when renewable power is available. Apple has invested heavily in battery-paired solar projects to expand clean power supply at times of peak demand. Tech firms should be able to show how they’re systematically addressing the need for zero carbon power throughout their data center planning, offer load flexibility to support the grid, and maximize opportunities to reduce energy consumption.
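As a rough illustration of how this kind of carbon-aware scheduling can work, the sketch below picks the lowest-emission window in a day of grid carbon-intensity data for a deferrable batch workload. The hourly figures and the scheduling logic are illustrative assumptions, not a description of any company’s actual system.

```python
# Minimal sketch of carbon-aware workload scheduling, using hypothetical
# hourly grid carbon-intensity data (gCO2/kWh). Not any company's real system.

def pick_lowest_carbon_window(hourly_intensity, window_hours):
    """Return the start hour and average intensity of the contiguous window
    with the lowest average grid carbon intensity, for a deferrable batch job."""
    best_start, best_avg = 0, float("inf")
    for start in range(len(hourly_intensity) - window_hours + 1):
        window = hourly_intensity[start:start + window_hours]
        avg = sum(window) / window_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start, best_avg

# Illustrative day: higher intensity overnight, a dip at midday when solar peaks.
intensity = [520, 510, 500, 495, 490, 480, 450, 400,
             340, 280, 230, 200, 190, 195, 220, 270,
             330, 400, 460, 500, 520, 530, 525, 520]

start, avg = pick_lowest_carbon_window(intensity, window_hours=4)
print(f"Run the 4-hour batch job starting at hour {start} (~{avg:.0f} gCO2/kWh average)")
```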
2. Where your power is supplied by natural gas, either onsite or indirectly through a utility company, what measures are you taking to minimize carbon dioxide and methane emissions?
Natural gas accounts for over 40% of US power generation, and powering data centers from existing grid infrastructure or newly added generation can translate to increased fossil fuel pollution. Research from Goldman Sachs suggests natural gas use for AI could increase global carbon emissions by a cumulative 220 million tons – equivalent to Spain’s yearly emissions – through 2030 before carbon-free energy supply catches up with demand growth.
Where natural gas use is unavoidable in the near term, companies should address the associated CO₂ emissions through long-term planning for low-carbon alternatives and explore mitigation technologies such as high-quality carbon capture, utilization and storage.
Upstream from the utility, leaks and intentional venting of methane – a greenhouse gas with more than 80 times the warming power of CO₂ over 20 years – drive up data center climate impacts. Methane emissions differ significantly between regions: aerial surveys find methane loss rates varying by a factor of eight between the highest- and lowest-emitting sourcing areas.
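To see why those loss rates matter, the back-of-the-envelope calculation below converts assumed upstream methane losses into CO₂-equivalent terms on a 20-year basis. The gas consumption figure and the two loss rates are illustrative assumptions chosen to reflect the range observed between sourcing regions.

```python
# Back-of-the-envelope estimate of upstream methane emissions in CO2-equivalent
# terms for a gas-supplied data center. All inputs are illustrative assumptions.

GWP20_METHANE = 82.5      # 20-year global warming potential of fossil methane (IPCC AR6)
KG_CH4_PER_MMBTU = 19.3   # approx. methane content per million Btu of natural gas

def upstream_ch4_co2e_tonnes(gas_mmbtu_per_year, loss_rate):
    """CO2e (tonnes/yr, 20-year basis) from methane leaked or vented upstream,
    given annual gas consumption and a fractional supply-chain loss rate."""
    leaked_kg_ch4 = gas_mmbtu_per_year * KG_CH4_PER_MMBTU * loss_rate
    return leaked_kg_ch4 * GWP20_METHANE / 1000.0

# Hypothetical facility drawing roughly 2 million MMBtu/yr of gas-fired power.
gas_use = 2_000_000
for loss_rate in (0.01, 0.08):   # low- vs high-emitting sourcing regions
    print(f"{loss_rate:.0%} loss rate -> ~{upstream_ch4_co2e_tonnes(gas_use, loss_rate):,.0f} t CO2e/yr")
```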
AI companies that are serious about their climate impact should account for and seek to reduce these potentially significant upstream methane emissions. Companies should track and report upstream methane emissions from data center energy use, and engage with their fuel and electricity suppliers to source lower-methane-emissions natural gas. This could include prioritizing gas supplied by members of the Oil and Gas Methane Partnership 2.0, a framework for measuring and reducing methane emissions, or supporting credible efforts to measure, report, and verify lower-emissions gas.
3. What steps are you taking to ensure that data center neighbors and ratepayers are not burdened with additional costs or pollution?
The large power demands of data centers can strain local grids and trigger costly investments in system upgrades – with costs passed along to other utility customers and even to other utilities across the region. Federal data shows that US average electricity prices increased 13% over the past three years and will continue to climb, and studies show that growing demands on the power grid from data centers are contributing to the run-up.
In addition, due to gaps in the patchwork of federal, state and local emissions rules, local communities can bear the brunt of air pollution such as smog-producing nitrogen oxides and particulates from backup diesel generators and other new fossil fuel equipment deployed to meet data center load. These impacts often fall disproportionately on lower-income communities and in areas with elevated industrial air pollution.
Tech companies should be expected to pay the full cost of the generation and associated infrastructure they require to offer AI products and services. Investors should consider how companies are managing the local and regional impacts of data center growth and preserving their social license to operate, including how they engage with communities, utilities and regulators to take full responsibility for their power system costs and avoid imposing additional pollution burdens.
4. How do you incorporate water availability and water stress into your data center design and siting processes?
The density of computing power within AI data centers requires sophisticated cooling systems to keep chips functioning. Many cooling systems rely on circulating liquids to manage this heat, and there are often tradeoffs between the energy intensity and water intensity of cooling system design. Analysts further distinguish between the water directly used within data center operations, which can be on the scale of water use by a small town, and the indirect water use associated with generating electricity, which can be far higher – 80% or more of a data center’s water footprint.
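A simple illustration of how these two components combine is sketched below, using assumed figures for a facility’s energy use, its on-site water usage effectiveness, and the water intensity of its electricity supply; none of the numbers describe a real data center.

```python
# Illustrative split of a data center's water footprint into direct (on-site
# cooling) and indirect (power generation) components. All figures are
# assumptions chosen to show the calculation, not measurements of any facility.

ANNUAL_ENERGY_MWH = 500_000      # hypothetical annual facility energy use
ONSITE_WUE_L_PER_KWH = 0.4       # assumed on-site water usage effectiveness
GRID_WATER_L_PER_KWH = 2.0       # assumed water intensity of the generation mix

direct_m3 = ANNUAL_ENERGY_MWH * 1000 * ONSITE_WUE_L_PER_KWH / 1000
indirect_m3 = ANNUAL_ENERGY_MWH * 1000 * GRID_WATER_L_PER_KWH / 1000
total_m3 = direct_m3 + indirect_m3

print(f"Direct:   {direct_m3:,.0f} m3/yr ({direct_m3 / total_m3:.0%})")
print(f"Indirect: {indirect_m3:,.0f} m3/yr ({indirect_m3 / total_m3:.0%})")
```

Under these assumed inputs, indirect water use accounts for roughly 80% of the total, consistent with the scale described above.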
A recent Ceres report found that data center water use in Phoenix is expected to increase by 400% in the coming years, driving increased water stress year-round with higher seasonal spikes. With projected industry development heavily concentrated in a limited number of states, assessing both regional water availability and local water stress at proposed data center locations is necessary to develop a clear picture of siting trade-offs. Investments in water resilience, for example through water conservation, infrastructure improvements and contaminant clean-up, can reduce local water stress and bring value to neighboring communities.
As data center deployment accelerates – including in water-stressed regions like Arizona, parts of Texas, and southern California – investors should evaluate operational risks from water availability and water stress, as well as regulatory risks and community concerns linked to the social and ecosystem impacts of water stress.
5. How are you engaging policymakers and regulators on enabling policy priorities?
The pace of data center growth has made long-standing bottlenecks in the US power system more apparent, and solutions more urgent. In some cases, changes in policy or regulation could make the power system more efficient, effective, and responsive to the people and businesses that rely on it.
One urgent policy issue is the lengthy delays that affect interconnection – the process of bringing new energy supply like solar and wind farms onto the grid. Across most of the country, long interconnection queues mean that projects take years to come online, while new load from data centers can be connected more quickly. Tech companies can engage on regulatory approaches to better match those timelines or connect clean energy resources onsite at data centers to speed access.
Another topic ripe for policy engagement is large load flexibility. Some data center operators are exploring ways to shift computing load, and the associated energy use, between times of day or geographic location to reduce strain on the grid. Incorporating large load flexibility into regulatory structures, with appropriate controls for power system management and safety, could support the use of these approaches to meet near-term electricity needs through efficient use of existing resources. Tech company engagement in policy design can support regulatory updates that align technical capabilities with system-wide benefits.
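As a rough sketch of the geographic side of that flexibility, the example below routes a movable batch job to the lowest-carbon region that has spare grid capacity to absorb it. The region names, carbon intensities and capacity figures are entirely hypothetical.

```python
# Minimal sketch of geographic load shifting: route a movable batch job to the
# region with spare grid capacity and the lowest carbon intensity.
# All region data below is hypothetical.

regions = {
    "region_a": {"carbon_gco2_per_kwh": 480, "spare_capacity_mw": 30},
    "region_b": {"carbon_gco2_per_kwh": 210, "spare_capacity_mw": 55},
    "region_c": {"carbon_gco2_per_kwh": 150, "spare_capacity_mw": 5},
}

def route_job(regions, required_mw):
    """Pick the lowest-carbon region that can absorb the job's power draw."""
    feasible = {name: r for name, r in regions.items()
                if r["spare_capacity_mw"] >= required_mw}
    if not feasible:
        return None   # no region has headroom; the job waits
    return min(feasible, key=lambda name: feasible[name]["carbon_gco2_per_kwh"])

print(route_job(regions, required_mw=20))   # -> region_b (region_c lacks headroom)
```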
Conclusion
By integrating climate questions into investment decision-making and stewardship, capital providers can direct finance toward companies that have aligned near-term competitive drivers with long-term value within a low-carbon economy. In doing so, investors can differentiate between AI firms that are single-mindedly growing capacity and those that are scaling clean, resilient, and financially sustainable AI for the future.