Unleashing the Potential of Fractional Calculus in Graph Neural Networks with FROND
arXiv (2024)
Abstract
We introduce the FRactional-Order graph Neural Dynamical network (FROND), a
new continuous graph neural network (GNN) framework. Unlike traditional
continuous GNNs that rely on integer-order differential equations, FROND
employs the Caputo fractional derivative to leverage the non-local properties
of fractional calculus. This approach enables the capture of long-term
dependencies in feature updates, moving beyond the Markovian update mechanisms
in conventional integer-order models and offering enhanced capabilities in
graph representation learning. We offer an interpretation of the node feature
updating process in FROND from a non-Markovian random walk perspective when the
feature updating is particularly governed by a diffusion process. We
demonstrate analytically that oversmoothing can be mitigated in this setting.
Experimentally, we validate the FROND framework by comparing the fractional
adaptations of various established integer-order continuous GNNs, demonstrating
their consistently improved performance and underscoring the framework's
potential as an effective extension to enhance traditional continuous GNNs. The
code is available at .
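To make the core idea concrete, here is a minimal sketch (not the authors' implementation) of how a fractional-order diffusion update on a graph differs from an integer-order one. It uses the explicit Grünwald–Letnikov discretization of the Caputo-type dynamics D^α X = −L X, where L is the symmetrically normalized graph Laplacian; the function names, step size, and the choice of this particular discretization are assumptions for illustration. The key point mirrors the abstract: each new state depends on the *entire* history of past states through the memory weights, not just the previous state as in a Markovian (integer-order) Euler step.

```python
import numpy as np

def gl_coeffs(alpha, n):
    # Grünwald–Letnikov weights c_j = (-1)^j * C(alpha, j), via the
    # stable recurrence c_j = c_{j-1} * (1 - (alpha + 1) / j).
    c = np.empty(n + 1)
    c[0] = 1.0
    for j in range(1, n + 1):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    return c

def fractional_graph_diffusion(X0, A, alpha=0.8, h=0.1, steps=50):
    # Explicit GL step for D^alpha X = -L X with
    # L = I - D^{-1/2} A D^{-1/2} (normalized Laplacian).
    deg = A.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    L = np.eye(A.shape[0]) - d_inv_sqrt @ A @ d_inv_sqrt
    c = gl_coeffs(alpha, steps)
    hist = [X0]
    for n in range(1, steps + 1):
        # Memory term: every past state contributes (non-Markovian update).
        # For alpha = 1, c_1 = -1 and c_j = 0 for j > 1, recovering the
        # standard (Markovian) forward Euler step.
        mem = sum(c[j] * hist[n - j] for j in range(1, n + 1))
        X_new = (h ** alpha) * (-(L @ hist[-1])) - mem
        hist.append(X_new)
    return hist[-1]
```

For example, diffusing a one-hot feature over a 3-node path graph smooths it toward the graph's dominant (constant-like) mode, while the fractional memory slows that decay relative to the integer-order case, which is the mechanism the abstract invokes for mitigating oversmoothing.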