I. Introduction
Machine learning (ML) technologies have brought unparalleled capabilities in data processing and analysis. However, this progress comes with a hidden cost: a significant increase in carbon emissions driven by the high computational requirements of ML models. The intensive use of resources such as CPUs, GPUs, and RAM in complex model development, as seen in generative AI, contributes substantially to environmental concerns. For instance, prominent large language models (LLMs) such as T5 [1], Meena [2], and GPT [3] require extensive computation, leading to high carbon emissions during their training and testing phases [4], [5], [6].