Neural predictor-based automated graph classifier framework

Machine Learning (2023)

Graph Neural Architecture Search (Graph-NAS) methods have shown great potential for finding better graph neural network designs than handcrafted ones. However, existing Graph-NAS frameworks rely on complex algorithms and cannot keep costs low while delivering both high scalability and high performance. They require fully training thousands of graph neural networks to inform the search, incurring a prohibitive computational cost that many interested users cannot afford. To contain this cost, many researchers restrict the search space, which can trap the search in a locally optimal solution. In this paper, we propose a performance predictor-based graph neural architecture search (PGNAS) framework. The proposed approach consists of three conceptually much simpler, basic phases and can broadly explore a search space at a much lower computational cost. First, we train n architectures sampled from the search space to generate n (architecture, validation accuracy) pairs, which we use to train a performance distribution learner whose features are the architecture description and whose target is the validation accuracy. Next, we use this performance distribution learner to predict the validation accuracies of the remaining architectures in the search space. Finally, we train the top-K predicted architectures and choose the one with the best validation result. Although our approach seems simple, it is efficient and scalable; experimental results show that PGNAS outperforms both existing handcrafted and Graph-NAS models on four benchmark datasets.
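The three phases described above can be sketched in a few lines. The sketch below is purely illustrative and is not the paper's implementation: the encoded search space, the `evaluate_architecture` stand-in (which replaces actually training a GNN), and the nearest-neighbour surrogate (which replaces the paper's performance distribution learner) are all assumptions made for a self-contained example.

```python
import random

random.seed(0)

# Hypothetical encoded search space: each architecture is a 4-dim feature
# vector (e.g. one value per design choice such as aggregator or depth).
search_space = [[random.random() for _ in range(4)] for _ in range(500)]

def evaluate_architecture(arch):
    """Stand-in for fully training a GNN and reading off validation accuracy."""
    return sum(w * x for w, x in zip([0.4, 0.3, 0.2, 0.1], arch))

# Phase 1: fully train n sampled architectures to build
# (architecture, validation accuracy) training pairs.
n = 50
sampled = random.sample(search_space, n)
pairs = [(a, evaluate_architecture(a)) for a in sampled]

# Phase 2: a toy performance predictor (1-nearest-neighbour surrogate);
# the paper trains a learned performance distribution model instead.
def predict(arch):
    nearest = min(pairs,
                  key=lambda p: sum((x - y) ** 2 for x, y in zip(p[0], arch)))
    return nearest[1]

predicted = [(arch, predict(arch)) for arch in search_space]

# Phase 3: fully train only the top-K predicted architectures and keep
# the one with the best true validation accuracy.
K = 5
top_k = sorted(predicted, key=lambda p: p[1], reverse=True)[:K]
best = max(top_k, key=lambda p: evaluate_architecture(p[0]))
```

The key cost saving is that `evaluate_architecture` (full GNN training) is called only n + K times rather than once per candidate in the search space.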
Graph classification, Neural architecture search, Neural performance predictor, Graph neural network