Energy Estimates Across Layers of Computing: From Devices to Large-Scale Applications in Machine Learning for Natural Language Processing, Scientific Computing, and Cryptocurrency Mining


Abstract:

Estimates of energy usage across the layers of computing, from devices to algorithms, have been determined and analyzed. Building on a previous analysis [3], the energy needed by single devices and systems has been estimated for three large-scale computing applications: Artificial Intelligence (AI)/Machine Learning (ML) for Natural Language Processing, Scientific Simulations, and Cryptocurrency Mining. In contrast to bit-level switching, where transistors achieved energy efficiency through geometrical scaling, higher energy is expended at both the instruction and simulation levels of an application. Additionally, the analysis of AI/ML accelerators indicates that architectural changes implemented in an older semiconductor technology node can achieve energy efficiency comparable to that of a different architecture in a newer technology. Further comparison of the energy used in computing systems with thermodynamic and biological limits indicates that the total simulation of an application requires 27–36 orders of magnitude more energy. These energy estimates underscore the need to treat energy as a first-class design parameter in computing, enabling the growing needs of compute-intensive applications in a digital world.
Date of Conference: 25-29 September 2023
Date Added to IEEE Xplore: 25 December 2023
Conference Location: Boston, MA, USA

I. Introduction

As computing becomes ubiquitous, from intelligent sensing at the edge to the digitalization of systems, and with the wider adoption of Artificial Intelligence and Machine Learning (AI/ML), the energy used in computing is expected to increase non-linearly [1], [2]. Continuing the earlier analysis [3], this work provides additional quantitative analysis spanning hardware components from the transistor level up to the Application level, where ‘Application’ is defined as the entire simulation carried out by the process of computation. Energy per Application is a metric that includes algorithms and software, analogous to energy per instruction at the hardware system level and energy per bit switch at the transistor level. Specific examples of computational simulations included in this analysis are the training of a large language model (LLM) and the inference needed for machine learning applications [5], the large-scale simulation of a single COVID virion particle [6], and the mining of a single cryptocurrency coin.
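To make the orders-of-magnitude comparison against the thermodynamic limit concrete, the following is a minimal Python sketch. It computes the Landauer limit (kT ln 2, the minimum energy to erase one bit) at an assumed room temperature of 300 K and compares it with purely illustrative application-level energy totals; the per-application figures below are hypothetical placeholders for demonstration, not the paper's estimates.

import math

# Boltzmann constant (J/K) and an assumed room temperature of 300 K.
K_B = 1.380649e-23
T = 300.0

# Landauer limit: minimum thermodynamic energy to erase one bit, kT*ln(2).
landauer_j_per_bit = K_B * T * math.log(2)  # ~2.87e-21 J

# Hypothetical application-level energy totals, for illustration only;
# the paper's actual estimates are not reproduced here.
apps = {
    "LLM training (assumed ~1 GWh)": 1e9 * 3600,       # 1 GWh in joules
    "Virion simulation (assumed ~1 MWh)": 1e6 * 3600,  # 1 MWh in joules
    "Mining one coin (assumed ~100 MWh)": 1e8 * 3600,  # 100 MWh in joules
}

print(f"Landauer limit at {T:.0f} K: {landauer_j_per_bit:.3e} J/bit")
for name, energy_j in apps.items():
    gap = math.log10(energy_j / landauer_j_per_bit)
    print(f"{name}: {energy_j:.3e} J -> ~{gap:.0f} orders of magnitude above kT*ln2")

Under these assumed totals the gap comes out to roughly 30–33 orders of magnitude per bit-equivalent, consistent in spirit with the 27–36 range quoted in the abstract.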
