AI Energy Demand Is Soaring but Not Because of Consumer Queries

  • Nearly half of U.S. electricity demand growth by 2030 will come from AI-driven data centers, with consumers absorbing higher costs.
  • AI companies provide little to no transparency on energy use or emissions, leaving regulators and consumers in the dark.
  • While AI could eventually offset emissions through innovation, today it is fueling both rising utility bills and climate concerns.


Artificial intelligence (AI) is eating up more and more energy as large language models become increasingly complex and pervasive. In the United States, nearly half of all growth in electricity demand between now and 2030 will come from data centers, driven by the AI boom. But the problem isn’t your daily queries to ChatGPT – it’s indiscriminate AI integration in technologies and services that are far outside end users’ control. Yet it’s consumers who are footing the bill for soaring energy demand.

We don’t know exactly how much energy large language models are consuming, because AI companies aren’t required to disclose the information. As a result, the vast majority of them do not, and the sector is characterized by opacity when it comes to environmental impact. As of May, 84 percent of all large language model traffic was conducted on AI models with zero environmental disclosure. While many researchers are trying to calculate AI’s energy footprint, it’s a difficult task – especially because models are changing all the time, generating shifts in terms of both increased complexity and increased efficiency.

“It blows my mind that you can buy a car and know how many miles per gallon it consumes, yet we use all these AI tools every day and we have absolutely no efficiency metrics, emissions factors, nothing,” says Sasha Luccioni, climate lead at an AI company called Hugging Face. “It’s not mandated, it’s not regulatory. Given where we are with the climate crisis, it should be top of the agenda for regulators everywhere,” she went on to say.

But while we don’t know exactly how much energy AI models use, we do know that it’s a lot. “AI’s integration into almost everything from customer service calls to algorithmic ‘bosses’ to warfare is fueling enormous demand,” reports the Washington Post. “Despite dramatic efficiency improvements, pouring those gains back into bigger, hungrier models powered by fossil fuels will create the energy monster we imagine.”

That being said, there are many things we as consumers do every day that contribute far more to global greenhouse gas emissions. A handful of AI queries per day is negligible compared to other common and under-scrutinized practices. If your lifestyle is anything close to the average American’s, watching TV and streaming videos online likely consumes far more energy, and your work commute surely generates far more greenhouse gas emissions.

Put simply, the spike in energy demand from AI models is not consumers’ fault – but it is their problem. While tech companies are consuming more and more energy each year to power their AI ambitions, common consumers are footing the bill. Not only are consumers paying the literal price for AI expansion, but they will also have to bear the burden of the sector’s environmental impacts. Silicon Valley’s backtracking on climate pledges, for example, will directly impact global communities, whether or not they ever benefit from AI.

“We are witnessing a massive transfer of wealth from residential utility customers to large corporations: data centers and large utilities and their corporate parents, which profit from building additional energy infrastructure,” Maryland People’s Counsel David Lapp recently told Business Insider. “Utility regulation is failing to protect residential customers, contributing to an energy affordability crisis.”

On the other hand, AI is gaining efficiency all the time and will be instrumental to reshaping global industries, including the energy sector, to be greener. Large language models can help advance technological breakthroughs for significant emissions gains, with noted potential for innovations in batteries and solar power. The International Energy Agency reports that increased emissions from data centers could even eventually be offset if AI is used to lower emissions from other sectors.

We’re currently in the messy exploration stages of a global transformation, and the up-front costs will be – and already are – high. Training large language models is incredibly energy- and resource-intensive. But as AI advances, we will get much better at optimizing it, and it could be a net benefit – even in terms of emissions – further down the road. Until then, consumers will be paying the price.

https://oilprice.com/Energy/Energy-General/AI-Energy-Demand-Is-Soaring-but-Not-Because-of-Consumer-Queries.amp.html