arXiv:2410.18618 [quant-ph]
Adiabatic training for Variational Quantum Algorithms
Ernesto Acosta, Carlos Cano Gutierrez, Guillermo Botella, Roberto Campos
Published 2024-10-24 (Version 1)
This paper presents a new hybrid Quantum Machine Learning (QML) model composed of three elements: a classical computer in charge of data preparation and interpretation; a gate-based quantum computer running the Variational Quantum Algorithm (VQA) that represents the Quantum Neural Network (QNN); and an adiabatic quantum computer that executes the optimization function to find the best parameters for the VQA. At the time of writing, most QNNs are trained with gradient-based classical optimizers, which must contend with the barren-plateau effect. Gradient-free classical approaches, such as Evolutionary Algorithms, have also been proposed to mitigate this effect, but to the authors' knowledge, adiabatic quantum models have not been used to train VQAs. The paper compares gradient-based classical algorithms against adiabatic optimizers, showing the feasibility of integrating gate-based and adiabatic quantum computing models and opening the door to modern hybrid QML approaches for High Performance Computing.
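The hybrid loop described above can be sketched in miniature: a toy one-qubit "VQA" whose single rotation angle is optimized not by gradients but by a discrete annealing-style search over a bitstring encoding of the parameter, standing in for the adiabatic optimizer. All names, the 8-bit encoding, and the annealing schedule below are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def vqa_cost(theta):
    """Expectation <Z> after RY(theta)|0>; minimum -1 at theta = pi."""
    return math.cos(theta)

def bits_to_theta(bits):
    """Decode a bitstring (the 'adiabatic' variables) into an angle in [0, 2*pi)."""
    value = int("".join(map(str, bits)), 2)
    return 2 * math.pi * value / (2 ** len(bits))

def anneal(n_bits=8, steps=2000, t_start=1.0, t_end=0.01, seed=0):
    """Metropolis-style annealing over the discretized VQA parameter."""
    rng = random.Random(seed)
    bits = [rng.randint(0, 1) for _ in range(n_bits)]
    cost = vqa_cost(bits_to_theta(bits))
    for step in range(steps):
        # geometric temperature schedule from t_start down to t_end
        t = t_start * (t_end / t_start) ** (step / steps)
        flip = rng.randrange(n_bits)
        bits[flip] ^= 1
        new_cost = vqa_cost(bits_to_theta(bits))
        if new_cost < cost or rng.random() < math.exp((cost - new_cost) / t):
            cost = new_cost          # accept the move
        else:
            bits[flip] ^= 1          # revert the flip
    return bits_to_theta(bits), cost

theta, cost = anneal()
print(theta, cost)  # cost should land close to the minimum of -1 (theta near pi)
```

On real hardware the bitstring search would instead be posed as a QUBO and handed to an adiabatic annealer, while the cost evaluations would come from circuit executions on a gate-based device; here both roles are simulated classically to keep the sketch self-contained.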