arXiv:2410.18618 [quant-ph]

Adiabatic training for Variational Quantum Algorithms

Ernesto Acosta, Carlos Cano Gutierrez, Guillermo Botella, Roberto Campos

Published 2024-10-24Version 1

This paper presents a new hybrid Quantum Machine Learning (QML) model composed of three elements: a classical computer in charge of data preparation and interpretation; a gate-based quantum computer running the Variational Quantum Algorithm (VQA) that represents the Quantum Neural Network (QNN); and an adiabatic quantum computer on which the optimization function is executed to find the best parameters for the VQA. At the time of writing, most QNNs are trained with gradient-based classical optimizers, which must contend with the barren-plateau effect. Some gradient-free classical approaches, such as Evolutionary Algorithms, have also been proposed to overcome this effect. To the authors' knowledge, adiabatic quantum models have not previously been used to train VQAs. The paper compares the results of gradient-based classical algorithms against adiabatic optimizers, demonstrating the feasibility of integrating gate-based and adiabatic quantum computing models and opening the door to modern hybrid QML approaches for High Performance Computing.

Comments: 12 pages, 6 figures, Euro PAR 2024 EuroQHPC Workshop
Categories: quant-ph, cs.ET
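The three-stage loop described in the abstract can be illustrated with a toy example. Everything below is our own illustrative assumption, not the authors' implementation: a one-parameter "circuit" whose cost landscape is simulated classically, a parameter-shift gradient baseline standing in for the gradient-based classical optimizers the paper compares against, and an exhaustive search over a discretized parameter grid standing in for the adiabatic optimizer (a real annealer would sample such an encoded landscape rather than enumerate it).

```python
import numpy as np

# Toy "QNN": a single-qubit circuit RY(theta)|0>, with cost <Z> = cos(theta).
# This is a hypothetical stand-in for the paper's gate-based VQA.
def vqa_cost(theta: float) -> float:
    return float(np.cos(theta))

# Gradient-based classical baseline: vanilla gradient descent using the
# parameter-shift rule to estimate d(cost)/d(theta) from circuit evaluations.
def gradient_train(theta0: float = 0.1, lr: float = 0.4, steps: int = 100) -> float:
    theta = theta0
    for _ in range(steps):
        grad = 0.5 * (vqa_cost(theta + np.pi / 2) - vqa_cost(theta - np.pi / 2))
        theta -= lr * grad
    return theta

# Crude stand-in for the adiabatic optimizer: discretize theta onto a grid
# (each point encodable as an n_bits binary string, as an annealer would
# require) and pick the minimum-energy candidate by exhaustive evaluation.
def annealer_train(n_bits: int = 8) -> float:
    grid = np.linspace(0.0, 2.0 * np.pi, 2 ** n_bits)
    energies = np.array([vqa_cost(t) for t in grid])
    return float(grid[np.argmin(energies)])
```

Both routes should recover the same optimum (theta near pi, where cos(theta) = -1), which mirrors the comparison the paper carries out between gradient-based and adiabatic training.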