
Ensuring Dataset Accountability in Machine Learning: Insights from Software Engineering

Abstract:

Machine learning is facing a crisis of accountability. On some tasks, deep learning can match or surpass human performance. Yet the datasets that are crucial to machine learning (ML) are typically produced through opaque creation processes, maintained and documented poorly, and subject to little answerability, which frequently results in errors. This study examines several problems related to software engineering tasks, including performance evaluation metrics, software metrics, failure prediction, and data quality. The paper highlights methodological problems and difficulties associated with these software fault prediction tasks. Feature extraction and classification are commonly used to address the high dimensionality and class imbalance of data linked to software quality issues.
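The abstract names feature extraction and class-imbalance handling as common tools for software fault prediction. A minimal, purely illustrative sketch of those two steps (not the paper's actual method; all function names and data here are hypothetical) might look like:

```python
# Illustrative sketch only: variance-based feature selection and
# inverse-frequency class weighting, two common responses to the
# high dimensionality and class imbalance the abstract describes.

def select_features(X, k):
    """Keep the k columns of X with the highest variance
    (a simple filter-style feature extraction)."""
    n, d = len(X), len(X[0])
    variances = []
    for j in range(d):
        col = [row[j] for row in X]
        mean = sum(col) / n
        var = sum((v - mean) ** 2 for v in col) / n
        variances.append((var, j))
    # Indices of the k highest-variance columns, in original order.
    keep = sorted(j for _, j in sorted(variances, reverse=True)[:k])
    return [[row[j] for j in keep] for row in X], keep

def class_weights(y):
    """Inverse-frequency weights, so the rare (faulty) class
    counts more during training."""
    counts = {}
    for label in y:
        counts[label] = counts.get(label, 0) + 1
    total = len(y)
    return {label: total / (len(counts) * c) for label, c in counts.items()}

# Tiny synthetic example: 3 software metrics, one of them constant.
X = [[1, 0, 10], [2, 0, 20], [3, 0, 30], [4, 0, 40]]
X_reduced, kept = select_features(X, 2)   # drops the constant column
weights = class_weights([0] * 8 + [1] * 2)  # 8 clean modules, 2 faulty
```

These weights would typically be passed to a classifier's loss function so that misclassifying the rare faulty class is penalized more heavily.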
Date of Conference: 18-20 September 2024
Date Added to IEEE Xplore: 15 January 2025
Conference Location: Greater Noida, India
