Arbitrary Polynomial Separations in Trainable Quantum Machine Learning
CoRR (2024)
Abstract
Recent theoretical results in quantum machine learning have demonstrated a general trade-off between the expressive power of quantum neural networks (QNNs) and their trainability; as a corollary, practical exponential separations in expressive power over classical machine learning models are believed to be infeasible, as such QNNs require training time exponential in the model size. Here we circumvent these negative results by constructing a hierarchy of efficiently trainable QNNs that exhibit unconditionally provable, polynomial memory separations of arbitrary constant degree over classical neural networks on a classical sequence modeling task. Furthermore, each unit cell of the introduced class of QNNs is computationally efficient, implementable in constant time on a quantum device. The classical networks we prove a separation over include well-known examples such as recurrent neural networks and Transformers. We show that quantum contextuality is the source of the expressivity separation, suggesting that other classical sequence learning problems with long-time correlations may be a regime where quantum machine learning offers practical advantages.
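To make the recurrent architecture concrete: the abstract describes QNNs whose per-token unit cell acts on a constant-size quantum register and runs in constant time on a quantum device. The following Python/NumPy snippet is a purely illustrative, hypothetical sketch of that computational shape, not the paper's actual construction and with none of its contextuality-based guarantees: the quantum state is the model's memory, a fixed input-conditioned unitary (here drawn at random) is applied once per sequence token, and a simple measurement at the end produces the output.

import numpy as np

def random_unitary(dim, rng):
    # Haar-random unitary via QR decomposition of a complex Gaussian matrix,
    # with a column-wise phase fix to make the distribution uniform.
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    d = np.diagonal(r)
    return q * (d / np.abs(d))

class RecurrentQNNCell:
    """Hypothetical sketch: one fixed unitary per input symbol; the quantum
    state of a constant-size register serves as the recurrent memory."""

    def __init__(self, n_qubits=2, alphabet=(0, 1), seed=0):
        rng = np.random.default_rng(seed)
        self.dim = 2 ** n_qubits
        # One randomly chosen unitary per input symbol (illustrative only;
        # the paper's unit cells are specific, structured circuits).
        self.unitaries = {a: random_unitary(self.dim, rng) for a in alphabet}

    def run(self, sequence):
        state = np.zeros(self.dim, dtype=complex)
        state[0] = 1.0  # start in |0...0>
        for symbol in sequence:
            # Constant-cost update: the register size never grows with
            # the sequence length.
            state = self.unitaries[symbol] @ state
        # Simple readout: probability of measuring |0...0> at the end.
        return abs(state[0]) ** 2

if __name__ == "__main__":
    cell = RecurrentQNNCell(n_qubits=2)
    print(cell.run([0, 1, 1, 0, 1]))

The only point of the sketch is the resource accounting: the memory carried between tokens is a fixed number of qubits regardless of sequence length, which is the quantity the paper's polynomial memory separations over classical recurrent models are stated in terms of.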