Graph Neural Networks for Learning Equivariant Representations of Neural Networks
ICLR 2024
Abstract
Neural networks that process the parameters of other neural networks find
applications in domains as diverse as classifying implicit neural
representations, generating neural network weights, and predicting
generalization errors. However, existing approaches either overlook the
inherent permutation symmetry in the neural network or rely on intricate
weight-sharing patterns to achieve equivariance, while ignoring the impact of
the network architecture itself. In this work, we propose to represent neural
networks as computational graphs of parameters, which allows us to harness
powerful graph neural networks and transformers that preserve permutation
symmetry. Consequently, our approach enables a single model to encode neural
computational graphs with diverse architectures. We showcase the effectiveness
of our method on a wide range of tasks, including classification and editing of
implicit neural representations, predicting generalization performance, and
learning to optimize, while consistently outperforming state-of-the-art
methods. The source code is available at
https://github.com/mkofinas/neural-graphs.
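The central construction lends itself to a short illustration. Below is a minimal sketch, assuming a plain fully connected PyTorch network; the function name and the exact graph layout are hypothetical and not the authors' implementation (see the linked repository for that). It encodes an MLP's parameters as a graph with one node per neuron, biases as node features, and weights as edge features, so permuting hidden neurons merely permutes graph nodes and any permutation-equivariant GNN respects the symmetry.

```python
import torch
import torch.nn as nn

def mlp_to_neural_graph(mlp: nn.Sequential):
    """Hypothetical sketch: flatten an MLP into (node_feats, edge_index, edge_attr).

    Nodes are neurons (including inputs); a neuron's bias is its node feature
    (zero for input neurons), and each weight becomes a directed edge feature.
    """
    linears = [m for m in mlp if isinstance(m, nn.Linear)]
    sizes = [linears[0].in_features] + [lin.out_features for lin in linears]
    # Node-id offset of each layer within the flattened node list.
    offsets = [0]
    for s in sizes[:-1]:
        offsets.append(offsets[-1] + s)

    node_feats = torch.zeros(sum(sizes), 1)  # input nodes keep a zero feature
    src, dst, w = [], [], []
    for k, lin in enumerate(linears):
        # Bias of each output neuron becomes that node's feature.
        node_feats[offsets[k + 1]: offsets[k + 1] + lin.out_features, 0] = lin.bias.detach()
        # Weight[j, i] becomes an edge from input neuron i to output neuron j.
        for j in range(lin.out_features):
            for i in range(lin.in_features):
                src.append(offsets[k] + i)
                dst.append(offsets[k + 1] + j)
                w.append(lin.weight[j, i].item())
    edge_index = torch.tensor([src, dst], dtype=torch.long)  # shape (2, E)
    edge_attr = torch.tensor(w).unsqueeze(-1)                # shape (E, 1)
    return node_feats, edge_index, edge_attr

# Usage: the parameters of this 2-16-1 MLP become one graph sample.
mlp = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 1))
x, edge_index, edge_attr = mlp_to_neural_graph(mlp)
print(x.shape, edge_index.shape, edge_attr.shape)  # (19, 1), (2, 48), (48, 1)
```

Because the graph is independent of layer widths and depth, the same downstream GNN or transformer can, in principle, consume networks of diverse architectures, which is the property the abstract emphasizes.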
Keywords
Deep weight space, Graph neural networks, Transformers, Permutation equivariance, Implicit neural representations, Networks for networks, Neural graphs