QuACK: Accelerating Gradient-Based Quantum Optimization with Koopman Operator Learning
arXiv (2022)
Abstract
Quantum optimization, a key application of quantum computing, has
traditionally been stymied by the complexity of gradient calculations, which
grows linearly with the number of parameters. This work bridges the gap
between Koopman operator theory, which has found utility in applications
because it allows for a linear representation of nonlinear dynamical systems,
and natural gradient methods in quantum optimization, leading to a significant
acceleration of gradient-based quantum optimization. We present Quantum-circuit
Alternating Controlled Koopman learning (QuACK), a novel framework that
leverages an alternating algorithm for efficient prediction of gradient
dynamics on quantum computers. We demonstrate QuACK's remarkable ability to
accelerate gradient-based optimization across a range of applications in
quantum optimization and machine learning. In fact, our empirical studies,
spanning quantum chemistry, quantum condensed matter, quantum machine learning,
and noisy environments, have shown speedups of more than 200x in
the overparameterized regime, 10x in the smooth regime, and 3x
in the non-smooth regime. With QuACK, we offer a robust advancement that
harnesses the advantage of gradient-based quantum optimization for practical
benefits.
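
To make the core idea concrete, the following is a minimal sketch, not the authors' implementation: a short burn-in trajectory of gradient-descent updates is recorded, a linear operator K is fit to it by least squares (a DMD-style approximation of the Koopman operator on linear observables), and future parameters are predicted by repeatedly applying K, avoiding further gradient evaluations during the predicted span. The toy cost function, window length, and learning rate are illustrative assumptions; QuACK's alternating algorithm and quantum-circuit gradients are not reproduced here.

```python
import numpy as np

# Hypothetical toy objective standing in for a variational quantum cost function.
def cost(theta):
    return float(np.sum(np.sin(theta) ** 2 + 0.1 * theta ** 2))

def grad(theta):
    return 2.0 * np.sin(theta) * np.cos(theta) + 0.2 * theta

rng = np.random.default_rng(0)
theta = rng.normal(size=8)
lr = 0.1

# 1) Short burn-in of ordinary gradient descent; record the parameter trajectory.
burn_in = 20
traj = [theta.copy()]
for _ in range(burn_in):
    theta = theta - lr * grad(theta)
    traj.append(theta.copy())
traj = np.stack(traj, axis=1)  # shape: (n_params, burn_in + 1)

# 2) Fit a linear (Koopman-style) operator K by least squares / DMD so that
#    traj[:, 1:] is approximately K @ traj[:, :-1].
X, Y = traj[:, :-1], traj[:, 1:]
K = Y @ np.linalg.pinv(X)

# 3) Extrapolate the dynamics many steps ahead without new gradient evaluations;
#    exact optimization can then resume from the predicted point.
pred = traj[:, -1].copy()
for _ in range(100):
    pred = K @ pred

print("cost after burn-in:   ", cost(traj[:, -1]))
print("cost after prediction:", cost(pred))
```

In practice the prediction horizon and observable choice govern how far the linear surrogate can be trusted before falling back to exact gradient steps, which is the trade-off the alternating scheme in the paper is designed to manage.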