Quantum Circuit Optimization through Iteratively Pre-Conditioned Gradient Descent

2023 IEEE International Conference on Quantum Computing and Engineering (QCE), 2023

Abstract
For typical quantum subroutines in the gate-based model of quantum computing, explicit decompositions of circuits in terms of single-qubit and two-qubit entangling gates may exist. However, they often lead to large-depth circuits that are challenging for noisy intermediate-scale quantum (NISQ) hardware. Additionally, exact decompositions might exist only for some modular quantum circuits. Therefore, it is essential to find gate combinations that approximate these circuits to high fidelity, potentially with low depth, for example using gradient-based optimization. Traditional optimizers often converge slowly, requiring many iterations, and perform poorly in the presence of noise. Here we present iteratively preconditioned gradient descent (IPG) for optimizing quantum circuits and demonstrate performance speedups for state preparation and implementation of quantum algorithmic subroutines. IPG is a noise-resilient, higher-order algorithm that has shown promising gains in convergence speed for classical optimization, converging locally at a linear rate for convex problems and superlinearly when the solution is unique. Specifically, we show an improvement in fidelity by a factor of $10^4$ for preparing a 4-qubit W state and a maximally entangled 5-qubit GHZ state compared to other commonly used classical optimizers tuning the same ansatz. We also show gains for optimizing a unitary for a quantum Fourier transform using IPG, and report results of running such optimized circuits on IonQ's quantum processing unit (QPU). Such faster convergence, with the promise of noise resilience, could provide advantages for quantum algorithms on NISQ hardware, especially since the cost of running each iteration on a quantum computer is substantially higher than that of a classical optimizer step.
Keywords
optimization, quantum state preparation, gradient descent
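
The abstract does not spell out the update rule, so the sketch below only illustrates the general iteratively preconditioned gradient descent form reported in the classical optimization literature: a preconditioner matrix is refined iteratively alongside the parameters rather than computed by an exact matrix inversion, and is then used to scale the gradient step. Everything here is an illustrative assumption rather than the authors' implementation: the function names, the hyperparameter values, the finite-difference derivative estimates, and the toy single-qubit state-preparation cost standing in for the paper's W-state, GHZ-state, and QFT objectives.

```python
import numpy as np

def ipg_minimize(cost, theta0, alpha=0.1, beta=1e-3, delta=0.5,
                 iters=300, eps=1e-4):
    """Illustrative IPG loop: the preconditioner K is nudged toward
    (Hessian + beta*I)^{-1} each iteration instead of being inverted
    exactly, then used to scale the gradient step."""
    theta = np.array(theta0, dtype=float)
    n = theta.size
    K = np.eye(n)  # running preconditioner estimate

    def grad(x):
        # central finite-difference gradient (assumes a smooth cost)
        g = np.zeros(n)
        for i in range(n):
            e = np.zeros(n)
            e[i] = eps
            g[i] = (cost(x + e) - cost(x - e)) / (2 * eps)
        return g

    def hess(x):
        # symmetrized finite-difference Hessian estimate
        H = np.zeros((n, n))
        for i in range(n):
            e = np.zeros(n)
            e[i] = eps
            H[:, i] = (grad(x + e) - grad(x - e)) / (2 * eps)
        return 0.5 * (H + H.T)

    for _ in range(iters):
        g = grad(theta)
        H = hess(theta)
        # iterative preconditioner refinement: drives K toward (H + beta*I)^{-1}
        K = K - alpha * ((H + beta * np.eye(n)) @ K - np.eye(n))
        # preconditioned gradient step on the parameters
        theta = theta - delta * (K @ g)
    return theta

# Toy stand-in for a state-preparation objective:
# tune Rz(t2) Ry(t1) |0> toward |+> = (|0> + |1>)/sqrt(2).
def infidelity(theta):
    t1, t2 = theta
    psi = np.array([np.exp(-1j * t2 / 2) * np.cos(t1 / 2),
                    np.exp(+1j * t2 / 2) * np.sin(t1 / 2)])
    target = np.array([1.0, 1.0]) / np.sqrt(2)
    return 1.0 - np.abs(np.vdot(target, psi)) ** 2

theta_opt = ipg_minimize(infidelity, [1.2, 0.2])
print(np.round(theta_opt, 4), infidelity(theta_opt))
```

In a variational-circuit setting the gradient and Hessian would come from circuit evaluations (for instance parameter-shift estimates) rather than the finite differences used here, which is what makes per-iteration cost expensive and fast, noise-tolerant convergence valuable, as the abstract argues.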