US academics warn of huge AI energy requirements

AI’s potential pitfalls are becoming clear, with worries growing over job losses and privacy; the world can now add environmental damage to the list

The enormous energy requirements of artificial intelligence and machine learning technologies are an “800-pound gorilla” that needs to be tackled, US academics have warned. If nothing is done to correct this, by 2040 all the power produced worldwide will be needed just for computing.

The University of Pennsylvania’s Deep Jariwala, assistant professor in the Department of Electrical and Systems Engineering at the School of Applied Science and Engineering, and Benjamin Lee, professor in the Department of Electrical and Systems Engineering and the Department of Computer and Information Science, issued the warnings in a recent interview.

“We take it for granted, but all the tasks our machines perform are transactions between memory and processors, and each of these transactions requires energy,” says Jariwala. “As these tasks become more elaborate and data-intensive, two things begin to scale up exponentially: the need for more memory storage and the need for more energy.”

An estimate from the Semiconductor Research Corporation, a consortium of all the major semiconductor companies, says that if the world continues to scale data at this rate, with silicon used to store it, demand will outpace the global amount of silicon produced every year, says Jariwala.

“So, pretty soon we will hit a wall where our silicon supply chains won’t be able to keep up with the amount of data being generated,” he says. “Couple this with the fact that our computers currently consume roughly 20-25% of the global energy supply, and we see another cause for concern. If we continue at this rate, by 2040 all the power we produce will be needed just for computing, further exacerbating the current energy crisis.”
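To put the 2040 claim in perspective, a rough back-of-the-envelope check shows the compound growth rate it implies. The 22.5% starting share (the midpoint of the quoted 20-25%) and the baseline year of 2023 are assumptions for illustration only, not figures from the interview:

```python
# Illustrative sketch: what constant annual growth rate would take computing
# from roughly 22.5% of global energy supply today to 100% by 2040?
# Starting share and baseline year are assumptions, not article figures.

def implied_annual_growth(share_now: float, share_then: float, years: int) -> float:
    """Compound annual growth rate needed to move from share_now to share_then."""
    return (share_then / share_now) ** (1 / years) - 1

# From ~22.5% of global energy to 100% over the 17 years from 2023 to 2040:
rate = implied_annual_growth(0.225, 1.0, 17)
print(f"Implied growth in computing's share of energy: {rate:.1%} per year")
```

Under these assumptions, computing’s share of global energy would need to grow by roughly 9% a year, every year, to consume all power produced by 2040 — a steep but not implausible trajectory given the data-centre growth figures cited below.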

Concerns over computation carbon emissions

There is also concern about the operational carbon emissions from computation, explains Lee. “So even before products like ChatGPT started getting a lot of attention, the rise of AI led to significant growth in data centres, facilities dedicated to housing IT infrastructure for data processing, management, and storage.”

Companies like Amazon, Google, and Meta have been building massive facilities nationwide, says Lee, who also points out that the power consumption and carbon emissions associated with data centres doubled between 2017 and 2020.

“Each facility consumes on the order of 20 to 40 megawatts of power, and most of the time data centres are running at 100% utilisation, meaning all the processors are being kept busy with some work,” says Lee. “So, a 20-megawatt facility probably draws 20 megawatts fairly consistently — enough to power roughly 16,000 households — computing as much as it can to amortise the costs of the data centre, its servers, and power delivery systems.”
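Lee’s household comparison is easy to sanity-check. The sketch below assumes an average household draw of about 1.25 kW (roughly 11,000 kWh per year, in line with typical US figures); that per-household value is an assumption for illustration, not from the interview:

```python
# Sanity check of the "20 MW ~= 16,000 households" comparison.
# Assumes an average continuous household draw of ~1.25 kW.

def households_powered(facility_mw: float, household_kw: float = 1.25) -> int:
    """Number of average households a facility's continuous draw could supply."""
    return int(facility_mw * 1000 / household_kw)

print(households_powered(20))  # a 20 MW facility
print(households_powered(40))  # a 40 MW facility at the top of Lee's range
```

At 1.25 kW per household, a 20 MW facility does indeed map to 16,000 households, and the 40 MW facilities at the top of Lee’s range would power twice that.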

The problem should be clear, says Jariwala, and it is “an 800-pound gorilla in the room”.

“Our computers and other devices are becoming insatiable energy beasts that we continue to feed,” he says. “That’s not to say AI and advancing it needs to stop because it’s incredibly useful for important applications like accelerating the discovery of therapeutics. We just need to remain cognizant of the effects and keep pushing for more sustainable approaches to design, manufacturing, and consumption.”
