Quantum Graph Transformers

ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2023

We propose Quantum Graph Transformers (QGT), a novel approach for realizing the Transformer architecture for graph learning on quantum processors. QGT builds on the Graph Transformer (GT) architecture and addresses the main challenge of mapping GT's basic functions, such as node encodings, graph structure, all-to-all connectivity, and message passing, to quantum computing primitives and processors. We empirically demonstrate the training and inference efficacy of the proposed QGT architecture on the graph classification task, running on quantum devices over various graph datasets.
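The abstract does not give circuit-level details, so as a hypothetical illustration of one ingredient such an architecture must supply, encoding node features and graph structure into a quantum state, here is a minimal pure-Python statevector sketch. Each node is assigned one qubit whose feature is angle-encoded via an RY rotation, graph edges are encoded as CZ entangling gates, and a Pauli-Z expectation serves as a scalar readout for classification. All function names here are illustrative, not taken from the paper.

```python
import math

def apply_ry(state, qubit, theta):
    """Apply an RY(theta) rotation to `qubit` of a statevector (little-endian bits)."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    out = state[:]
    for i in range(len(state)):
        if (i >> qubit) & 1 == 0:
            j = i | (1 << qubit)          # partner basis state with the bit flipped
            out[i] = c * state[i] - s * state[j]
            out[j] = s * state[i] + c * state[j]
    return out

def apply_cz(state, q1, q2):
    """Apply a CZ gate between q1 and q2 (phase flip when both bits are 1)."""
    return [-a if ((i >> q1) & 1) and ((i >> q2) & 1) else a
            for i, a in enumerate(state)]

def z_expectation(state, qubit):
    """Expectation value of Pauli-Z on `qubit`: +1 for bit 0, -1 for bit 1."""
    return sum((1 if (i >> qubit) & 1 == 0 else -1) * abs(a) ** 2
               for i, a in enumerate(state))

def graph_circuit(features, edges):
    """Angle-encode one feature per node (qubit), then entangle along graph edges."""
    n = len(features)
    state = [0.0] * (2 ** n)
    state[0] = 1.0                        # start in |0...0>
    for q, x in enumerate(features):
        state = apply_ry(state, q, x)     # node encoding
    for u, v in edges:
        state = apply_cz(state, u, v)     # graph-structure entanglement
    return z_expectation(state, 0)        # scalar readout for classification
```

For example, `graph_circuit([math.pi], [])` rotates the single qubit to |1⟩ and returns -1, while zero features leave the state at |0...0⟩ and return +1. A trainable model in the spirit of the paper would add parameterized layers on top of this encoding; that part is omitted here.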
Keywords
graph classification task, graph datasets, graph learning, graph structure, Graph Transformer architecture, mapping GT basic functions, QGT architecture, quantum computing primitives, quantum devices, Quantum Graph Transformers, quantum processors, Transformer architecture