Generative artificial intelligence is impossible to ignore online. AI-generated summaries may appear at the top of the results whenever you do a Google search. You might be prompted to try Meta's AI tool while browsing Facebook. And that ever-present sparkle emoji haunts my dreams.
This rush to incorporate AI into more and more online interactions can be traced back to OpenAI's boundary-pushing release of ChatGPT in late 2022. Silicon Valley soon became obsessed with generative AI, and roughly two years later, AI tools powered by large language models permeate the online user experience.
An unfortunate side effect of this proliferation is that the computing processes required to run generative AI systems are extremely resource-intensive. This has ushered in an era of hyper-consumption on the internet, defined by the spread of a new kind of computing that demands inordinate amounts of electricity and water to build and operate.
“On the back end, these algorithms that need to be running for any generative AI model are fundamentally very, very different from the traditional kind of Google search or email,” says Sajjad Moazeni, a computer engineering researcher at the University of Washington. “For basic services, those were very light in terms of the amount of data that needed to go back and forth between the processors.” By comparison, Moazeni estimates that generative AI applications are around 100 to 1,000 times more computationally intensive.
The technology's energy needs for training and deployment are no longer AI's dirty little secret: last year, expert after expert predicted surging energy demand at the data centers where companies work on AI applications. Almost as if on cue, Google recently stopped considering itself carbon neutral, and Microsoft may be missing its sustainability goals in the ongoing race to build the biggest, best AI tools.
“The carbon footprint and the energy consumption will be proportional to the amount of computing you do, because these data centers are powered in proportion to the amount of computing they do,” says Junchen Jiang, a networked systems researcher at the University of Chicago. The larger the AI model, the more computing is required, and these frontier models are getting quite large.
Although Google's total energy consumption doubled from 2019 to 2023, company spokeswoman Corina Standiford said it would not be fair to say that Google's energy consumption spiked during the AI race. “Reducing emissions from our suppliers is extremely challenging, as they make up 75 percent of our footprint,” she said in an email. Among the suppliers Google points to are makers of servers, networking equipment, and other technical infrastructure for data centers—an energy-intensive process required to produce the physical parts that power frontier AI models.