Abstract:
In the Quantum-Train (QT) framework, mapping quantum state measurements to classical neural network weights is a critical challenge that affects the scalability and efficiency of hybrid quantum-classical models. The traditional QT framework employs a multi-layer perceptron (MLP) for this task, but it struggles with scalability and interpretability. To address these issues, we propose replacing the MLP with a tensor network-based model and introducing a distributed circuit ansatz designed for large-scale quantum machine learning with multiple small quantum processing unit nodes. This approach enhances scalability, efficiently represents high-dimensional data, and maintains a compact model structure. Our enhanced QT framework retains the benefits of reduced parameter count and independence from quantum resources during inference. Experimental results on benchmark datasets demonstrate that the tensor network-based QT framework achieves competitive performance with improved efficiency and generalization, offering a practical solution for scalable hybrid quantum-classical machine learning.
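To make the idea concrete, the sketch below illustrates (in broad strokes, not as the authors' implementation) a QT-style mapping in which an n-qubit state's measurement probabilities are converted into classical network weights by a small matrix-product-state (MPS) contraction instead of an MLP. The feature map, bond dimension, random cores, and the stand-in "quantum state" are all assumptions made purely for illustration.

```python
# Minimal, assumption-laden sketch of a tensor-network mapping from quantum
# measurement probabilities to classical weights (not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

n_qubits = 4          # 2**n_qubits computational basis states
n_weights = 10        # number of classical weights to generate (<= 2**n_qubits)
bond_dim = 3          # assumed MPS bond dimension

# Stand-in for measurement probabilities |<b|psi>|^2 of a trained circuit.
amplitudes = rng.normal(size=2**n_qubits) + 1j * rng.normal(size=2**n_qubits)
probs = np.abs(amplitudes) ** 2
probs /= probs.sum()

def site_features(bits, p):
    """Assumed per-site feature map: each bit b_k becomes [1-b_k, b_k],
    scaled so the basis-state probability p enters the model."""
    return [np.array([1.0 - b, b]) * (1.0 + p) for b in bits]

# Random MPS cores standing in for the trainable tensor-network mapping.
cores = [rng.normal(scale=0.5, size=(bond_dim, 2, bond_dim)) for _ in range(n_qubits)]
boundary_l = rng.normal(size=bond_dim)
boundary_r = rng.normal(size=bond_dim)

def mps_map(bits, p):
    """Contract the MPS against the per-site features to obtain one scalar weight."""
    vec = boundary_l
    for core, feat in zip(cores, site_features(bits, p)):
        # Contract the physical (size-2) leg with the feature, then the bond leg.
        mat = np.tensordot(core, feat, axes=([1], [0]))  # shape: (bond_dim, bond_dim)
        vec = vec @ mat
    return float(vec @ boundary_r)

# Generate classical weights from the first n_weights basis states.
weights = []
for idx in range(n_weights):
    bits = [(idx >> k) & 1 for k in range(n_qubits)]
    weights.append(mps_map(bits, probs[idx]))

print(np.round(weights, 3))
```

In this toy setting the MPS plays the role that the MLP plays in the original QT framework: it maps a basis label and its measured probability to one classical weight, while its parameter count grows only linearly in the number of qubits for a fixed bond dimension.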
Published in: ICASSP 2025 - 2025 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date of Conference: 06-11 April 2025
Date Added to IEEE Xplore: 07 March 2025