Abstract:
Optimizing Python code is essential for improving performance and efficiency. This project investigates the use of large pre-trained language models, specifically GPT (Generative Pre-trained Transformer), for Python code optimization. By leveraging the advanced capabilities of these models, we aim to improve on traditional optimization techniques. Our approach involves preprocessing Python code, feeding it into the pre-trained model, and generating optimized code sequences. Experimental results show significant improvements in code efficiency and execution time, validating the effectiveness of our method. Additionally, we explore the practical implications, challenges, and future directions of incorporating large pre-trained models into code optimization. Our goal is to bridge the gap between standard optimization techniques and Python programming, enabling more intelligent and efficient software development.
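A minimal sketch of the pipeline the abstract describes (preprocess the code, feed it to a pre-trained model, and decode an optimized sequence), assuming a Hugging Face GPT-family checkpoint; the model name, prompt format, and generation settings here are illustrative placeholders, not the authors' actual configuration:

from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"  # placeholder; the paper uses an unspecified GPT-family model

def optimize(source: str) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
    # Preprocess: wrap the original code in an instruction-style prompt
    # (one plausible format; the paper's preprocessing is not specified).
    prompt = f"# Original Python code:\n{source}\n# Optimized Python code:\n"
    inputs = tokenizer(prompt, return_tensors="pt")
    # Generate an optimized code sequence with greedy decoding.
    outputs = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=False,
        pad_token_id=tokenizer.eos_token_id,
    )
    completion = tokenizer.decode(outputs[0], skip_special_tokens=True)
    # Naively strip the prompt to keep only the generated code.
    return completion[len(prompt):]

if __name__ == "__main__":
    snippet = "result = []\nfor i in range(10):\n    result.append(i * i)\n"
    print(optimize(snippet))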
Date of Conference: 04-06 October 2024
Date Added to IEEE Xplore: 20 March 2025