Active Learning Efficiency Benchmark for Coreference Resolution Including Advanced Uncertainty Representations



Abstract:

Active learning is a powerful technique that accelerates model training by iteratively expanding the training data based on the model's feedback. The approach has proven particularly relevant in natural language processing and other machine learning domains. While active learning has been extensively studied for conventional classification tasks, its application to more specialized tasks, such as neural coreference resolution, leaves room for improvement. In our research, we apply active learning to neural coreference resolution and set a benchmark of a 39% reduction in the annotations required for the training data, while preserving performance compared to the original model trained on the full data. We compare various uncertainty sampling techniques, along with Bayesian modifications of coreference resolution models, and conduct a comprehensive analysis of annotation effort. The results demonstrate that the best-performing techniques maximize label annotation within previously chosen documents, which both improves efficiency and preserves performance.
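To make the selection step concrete, the following is a minimal, illustrative sketch of entropy-based uncertainty sampling over Monte Carlo dropout passes, one common way to combine uncertainty sampling with a Bayesian modification of a neural model. It is not the paper's actual method: the model, the `mc_dropout_probs` stand-in, and all shapes and parameter names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_probs(n_passes, n_docs, n_classes):
    # Hypothetical stand-in for T stochastic forward passes of an
    # MC-dropout coreference model: each pass yields a probability
    # distribution over candidate labels for each document.
    logits = rng.normal(size=(n_passes, n_docs, n_classes))
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def entropy_acquisition(probs):
    # Predictive entropy of the MC-averaged distribution,
    # H[mean_t p_t(y|x)]: a standard uncertainty-sampling score.
    mean_p = probs.mean(axis=0)
    return -(mean_p * np.log(mean_p + 1e-12)).sum(axis=-1)

def select_for_annotation(probs, k):
    # Send the k most uncertain documents to the annotators.
    scores = entropy_acquisition(probs)
    return np.argsort(-scores)[:k]

probs = mc_dropout_probs(n_passes=10, n_docs=100, n_classes=3)
chosen = select_for_annotation(probs, k=5)
print(chosen)
```

In an active learning loop, the selected documents would be labeled, added to the training set, and the model retrained before the next acquisition round.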
Date of Conference: 24-26 November 2023
Date Added to IEEE Xplore: 25 March 2024
Conference Location: Xi’an, China

