Can the climate keep up with the insatiable energy demands of the AI arms race?


The rise of artificial intelligence has driven the prices of major tech stocks to fresh highs, but at the expense of the sector's climate ambitions.

Google acknowledged on Tuesday that the technology is jeopardizing its environmental goals, after revealing that data centers, a key part of its AI infrastructure, have helped increase its greenhouse gas emissions by 48 percent since 2019. It said the path to its net zero goal for 2030 – reducing the total amount of CO2 emissions it is responsible for to zero – includes “uncertainty around the future environmental impact of AI, which is complex and difficult to predict”.

So will tech be able to mitigate AI's environmental costs, or will the industry plow on regardless because the reward for dominance is so great?


Why is AI a threat to tech companies' green goals?

Data centers are a core component of training and operating AI models such as Google's Gemini or OpenAI's GPT-4. They consist of sophisticated computing devices, or servers, that sift through the vast reams of data that underpin AI systems. They require large amounts of electricity to run, which produces CO2 depending on the energy source, as well as “embedded” CO2 from the cost of manufacturing and transporting the necessary equipment.

According to the International Energy Agency, total electricity consumption by data centers could double from 2022 levels to 1,000 TWh (terawatt hours) in 2026, equivalent to the energy demand of Japan, while research firm SemiAnalysis calculates that AI will account for 4.5% of global energy generation by 2030. Water use is also significant, with one study estimating that AI could account for 6.6 billion cubic meters of water use by 2027 – around two-thirds of England's annual consumption.
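For a sense of scale, the figures in this section can be cross-checked with a quick back-of-envelope sketch. The global generation figure used below is an illustrative assumption, not a number from the article:

```python
# Back-of-envelope check of the data-center energy figures above.
# All other inputs are the article's claims, not new data.

dc_2026_twh = 1_000            # IEA projection for data centers in 2026 (TWh)
dc_2022_twh = dc_2026_twh / 2  # "could double from 2022 levels" implies ~500 TWh

# SemiAnalysis: AI at 4.5% of global energy generation by 2030.
# ~30,000 TWh of global electricity generation by 2030 is a rough,
# illustrative assumption for this sketch only.
global_gen_2030_twh = 30_000
ai_2030_twh = 0.045 * global_gen_2030_twh

# Water: 6.6 bn cubic meters by 2027 said to be ~two-thirds of
# England's annual consumption, implying England's total.
england_m3 = 6.6e9 / (2 / 3)

print(f"Implied 2022 data-center demand:  ~{dc_2022_twh:.0f} TWh")
print(f"Implied AI demand in 2030:        ~{ai_2030_twh:.0f} TWh")
print(f"Implied England annual water use: ~{england_m3 / 1e9:.1f} bn m^3")
```

On these assumptions, AI alone in 2030 would draw more than twice what all data centers drew in 2022.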


What do experts say about environmental impacts?

A recent UK government-backed report on AI safety says the carbon intensity of the energy source used by tech firms is “a key variable” in working out the environmental cost of the technology. However, it adds that a “significant part” of AI model training still relies on fossil fuel-fired energy.

Tech firms are indeed striking renewable energy deals in an effort to meet their environmental goals. Amazon, for example, is the world's largest corporate buyer of renewable energy. But some experts say this pushes other energy consumers toward fossil fuels, because there isn't enough clean energy to go around.

Alex de Vries, founder of Digiconomist, a website that monitors the environmental impact of new technologies, says: “Not only is energy consumption increasing, but Google is also struggling to meet this growing demand with sustainable energy sources.”


Is there enough renewable energy to go around?

World governments plan to triple the world's renewable energy resources by the end of the decade to reduce fossil fuel consumption in line with climate goals. But the ambitious pledge, agreed at last year's COP28 climate talks, is already in doubt, and experts fear that the rapid increase in energy demand from AI data centers could put it further out of reach.

The world's energy watchdog, the IEA, has warned that although global renewable energy capacity grew in 2023 at the fastest rate recorded in the past 20 years, the world is on track only to double – not triple – its renewable capacity by 2030 under current government plans.

The answer to AI's energy hunger could be for tech companies to invest more in building new renewable energy projects to meet their growing electricity demand.


How soon can we build new renewable energy projects?

Onshore renewable energy projects such as wind and solar farms are relatively fast to build, sometimes taking less than six months. However, the global logjam in connecting new projects to the power grid, as well as slow planning systems in many developed countries, can add years to the process. Offshore wind farms and hydropower schemes face similar challenges, with construction times of between two and five years.

This has raised concerns about whether renewable energy can keep pace with the expansion of AI. According to the Wall Street Journal, big tech companies have already held talks with the owners of about a third of US nuclear power plants about supplying low-carbon electricity to their data centers. But without investment in new sources of electricity, such deals would divert low-carbon electricity away from other consumers, leading to more fossil fuel consumption to meet aggregate demand.


Will AI's demand for power keep growing?

The general principles of supply and demand suggest that, as AI consumes more electricity, the price of energy will rise and the industry will be forced to economize. But the peculiar nature of the industry means the world's biggest companies may instead decide to absorb spiking electricity prices, burning billions of dollars in the process.

The largest and most expensive data centers in the AI sector are those used to train “frontier” AI: systems like GPT-4o and Claude 3.5 that are more powerful and capable than any that came before. The leader in the field has changed over the years, but OpenAI is generally at the top, battling for position with Anthropic, the maker of Claude, and Google's Gemini.

Competition at the frontier is already thought to be “winner takes all”, with little stopping users from switching to the latest leader. This means that if one business spends $100 million training a new AI system, its competitors must decide whether to spend even more or drop out of the race altogether.

Worse, the race to so-called “AGI” – AI systems capable of doing anything a human can do – means that spending hundreds of billions of dollars on a single training run might be worth it, if doing so allows your company to monopolize a technology that, as OpenAI puts it, “can elevate humanity”.


Won't AI firms learn to use less electricity?

Every month brings new advances in AI technology that enable companies to do more with less. In March 2022, for example, a DeepMind project called Chinchilla showed researchers how to train frontier AI models with far less computing power, by changing the ratio between the amount of training data and the size of the resulting model.
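Chinchilla's central finding is often summarized as a rule of thumb: for a fixed compute budget, train roughly 20 tokens of data per model parameter, using the common approximation that training compute is about 6 × parameters × tokens. A minimal sketch of that trade-off, assuming those two rules of thumb (they are standard in the scaling-law literature, not figures from this article):

```python
def chinchilla_split(compute_flops, tokens_per_param=20):
    """Split a fixed training-compute budget between model size and data.

    Uses the common approximations C ~ 6 * N * D (compute for N
    parameters and D training tokens) and the Chinchilla rule of
    thumb D ~ 20 * N, giving N = sqrt(C / (6 * 20)).
    """
    n_params = (compute_flops / (6 * tokens_per_param)) ** 0.5
    n_tokens = tokens_per_param * n_params
    return n_params, n_tokens

# Illustrative budget of 1e23 FLOPs: roughly a ~29B-parameter model
# trained on ~0.6T tokens, rather than a much larger, under-trained one.
n, d = chinchilla_split(1e23)
print(f"~{n / 1e9:.0f}B parameters trained on ~{d / 1e12:.1f}T tokens")
```

The point is that the same budget buys a smaller but better-trained model – the efficiency gain the next section argues gets reinvested rather than saved.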

But this did not result in the same AI systems using less power. Instead, it resulted in the same amount of electricity being used to build even better AI systems. In economics, this phenomenon is known as the “Jevons paradox”, after the economist who noted that James Watt's improvements to the steam engine, which allowed far less coal to be used, instead dramatically increased the amount of fossil fuel burned in England. As the cost of steam power fell after Watt's invention, new uses were found for it that would not have been profitable when power was expensive.
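The rebound effect described above can be made concrete with a toy elasticity model: if an efficiency gain cuts the cost per unit of a service, and demand for that service is sufficiently price-elastic, total resource use rises rather than falls. A hedged illustration – the numbers and the constant-elasticity demand curve are invented for the example:

```python
def total_energy(base_demand, efficiency_gain, elasticity):
    """Toy rebound-effect (Jevons) model with constant-elasticity demand.

    efficiency_gain: fractional cut in energy per unit of service
        (0.5 means each unit of service now needs half the energy).
    elasticity: price elasticity of demand for the service (positive).
    Cheaper service means more demand; if elasticity > 1, total
    energy use goes UP despite the efficiency gain.
    """
    cost_ratio = 1 - efficiency_gain                  # new energy (and cost) per unit
    demand = base_demand * cost_ratio ** (-elasticity)  # demand response
    return demand * cost_ratio                        # total energy consumed

print(total_energy(100, 0.5, 0.5))  # inelastic demand: total energy falls
print(total_energy(100, 0.5, 2.0))  # elastic demand: total energy rises
```

With inelastic demand the 50% efficiency gain saves energy; with elastic demand – arguably the situation for frontier AI training – the same gain increases total consumption.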
