Capitalizing On The U.S. Energy Surge With Natural Gas And Copper

Did you know that every time you type a query into ChatGPT, it requires about 10 times as much electricity to process as a Google search?

That’s according to Goldman Sachs, which writes in a new report that U.S. electricity consumption is poised for its first major surge in years, driven in large part by the rapid buildout of data centers that power AI platforms such as ChatGPT. Goldman projects electricity demand to grow approximately 2.4% from 2022 to 2030, with data centers representing the largest growth segment at 0.9 percentage points, or more than a third of total new demand.


Goldman isn’t the only firm forecasting huge changes to the U.S. energy grid. The Electric Power Research Institute (EPRI), a nonprofit headquartered in Palo Alto, California, estimates that data centers could consume up to 9% of U.S. electricity generation by 2030, more than double their current share.

To put things in perspective, ChatGPT currently has more than 180 million users, while there are roughly 5.3 billion internet users worldwide. Imagine if each of them became a regular user of energy-intensive ChatGPT, whose servers are located in the U.S., according to its creator, OpenAI.