NeurIPS 2020 Graph Machine Learning

Contains papers related to graph machine learning published at the NeurIPS 2020 conference.
NeurIPS 2020 (2020)
To support reproducible graph machine learning research, we introduce the Open Graph Benchmark: a diverse set of realistic graph datasets in terms of scales, domains, and task categories
We propose a theoretical framework to study the expressive power of classes of graph neural networks based on their ability to count substructures
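A concrete instance of the substructure-counting ability this framework studies is triangle counting, which has a closed form via the adjacency matrix (this toy example is mine, not from the paper): for an undirected simple graph, the number of triangles equals trace(A^3)/6.

```python
import numpy as np

def count_triangles(A: np.ndarray) -> int:
    """Count triangles in an undirected simple graph given its adjacency matrix.

    trace(A^3) counts closed walks of length 3; each triangle contributes
    one such walk per starting vertex (3) and per direction (2), hence /6.
    """
    A = np.asarray(A)
    return int(np.trace(np.linalg.matrix_power(A, 3)) // 6)

# A 4-cycle with one chord (edges 0-1, 1-2, 2-3, 3-0, 1-3): two triangles.
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 1],
    [0, 1, 0, 1],
    [1, 1, 1, 0],
])
print(count_triangles(A))  # 2
```

Whether a GNN class can compute such counts from node features alone is exactly the kind of question the paper's framework formalizes.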
We compare the results of our model, trained to optimize the F1 score, to a typical baseline algorithm used in particle physics, the Adaptive Vertex Reconstruction algorithm
We believe that our findings constitute a step towards establishing a hierarchy of models w.r.t. their expressive power and, in this sense, the Principal Neighbourhood Aggregation model appears to outperform the prior art in Graph Neural Networks layer design
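The core PNA idea can be sketched as combining several neighbourhood aggregators with degree-based scalers rather than committing to a single aggregator. The following is a minimal simplification of that idea (function and parameter names are mine, not the paper's API):

```python
import numpy as np

def pna_aggregate(neighbour_feats: np.ndarray, avg_degree: float = 2.0) -> np.ndarray:
    """Aggregate a (num_neighbours, dim) array of neighbour features.

    Concatenates four aggregators (mean, max, min, std) and applies three
    degree scalers (identity, amplification, attenuation) to each.
    """
    d = neighbour_feats.shape[0]
    aggs = np.concatenate([
        neighbour_feats.mean(axis=0),
        neighbour_feats.max(axis=0),
        neighbour_feats.min(axis=0),
        neighbour_feats.std(axis=0),
    ])
    # Degree scalers compare this node's degree to the dataset average.
    log_ratio = np.log(d + 1) / np.log(avg_degree + 1)
    scalers = np.array([1.0, log_ratio, 1.0 / log_ratio])
    # Every (scaler, aggregator) combination, flattened into one vector.
    return np.outer(scalers, aggs).ravel()

feats = np.array([[1.0, 0.0], [3.0, 2.0], [2.0, 4.0]])  # 3 neighbours, dim 2
out = pna_aggregate(feats)
print(out.shape)  # (24,) = 3 scalers x 4 aggregators x 2 feature dims
```

In the actual model this aggregate would be fed through a learned transformation; the point here is only the multi-aggregator, multi-scaler structure.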
Given a network with n users, we implement a GNN with the goal of predicting the ratings given by user 405, which is the user who has rated the most movies in the dataset
Studying the local symmetries of graphs, we propose a more general algorithm that uses different kernels on different edges, making the network equivariant to local and global graph isomorphisms and more expressive
We introduced a novel distance metric between graphs and an optimization routine that computes a coordinated pair of optimal transport maps simultaneously
We present the implicit graph neural network model, a framework of recurrent graph neural networks
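The implicit/recurrent formulation can be sketched as defining node states as the fixed point of an update map and solving for it by iteration. This is my own minimal form, not the authors' implementation; convergence is assumed via a contractive (small-norm) weight matrix:

```python
import numpy as np

def implicit_gnn(A: np.ndarray, B: np.ndarray, W: np.ndarray,
                 tol: float = 1e-10, max_iter: int = 1000) -> np.ndarray:
    """Solve X = tanh(A X W + B) by fixed-point iteration.

    A: (n, n) normalized adjacency, B: (n, d) input features,
    W: (d, d) weights; small ||W|| makes the map a contraction.
    """
    X = np.zeros_like(B)
    for _ in range(max_iter):
        X_new = np.tanh(A @ X @ W + B)
        if np.max(np.abs(X_new - X)) < tol:
            return X_new
        X = X_new
    return X

rng = np.random.default_rng(0)
n, d = 4, 3
A = rng.random((n, n))
A /= A.sum(axis=1, keepdims=True)        # row-stochastic adjacency
W = 0.2 * rng.standard_normal((d, d))    # small norm -> contraction
B = rng.standard_normal((n, d))
X = implicit_gnn(A, B, W)
print(np.allclose(X, np.tanh(A @ X @ W + B), atol=1e-6))  # True: a fixed point
```

Unlike a fixed-depth GNN, the effective depth here is unbounded: information propagates until the state equation is satisfied.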
We hope to investigate the interpretability of the edges learned by the Graph Finite-State Automaton layer to determine whether they correspond to useful general concepts, which might allow the GFSA edges to be shared between multiple tasks
We propose a path-integral-based graph neural network framework, which consists of self-consistent convolution and pooling units, the latter of which is closely related to subgraph centrality
We introduce the idea of Physical Scene Graphs, which represent scenes as hierarchical graphs, with nodes in the hierarchy corresponding intuitively to object parts at different scales, and edges to physical connections between parts
We introduced the multipole graph kernel network, a graph-based algorithm able to capture correlations in data at any length scale with a linear time complexity
The local subgraph approach is fundamentally different from prior work using entire graphs, which only captures broad structure at the expense of finer topological detail
Under the defined K-shot learning setting, Graph Extrapolation Networks learn to extrapolate the knowledge of a given graph to unseen entities, with a stochastic transductive layer to further propagate the knowledge between the unseen entities and model uncertainty in the link prediction
Simon Geisler, Daniel Zügner, Stephan Günnemann
We propose a robust aggregation function, Soft Medoid, for the internal use within Graph Neural Networks
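A rough sketch of a soft-medoid-style aggregation (my own simplification, not the authors' implementation): instead of a plain mean over neighbours, weight each neighbour by a softmax over its negative total distance to the others, so outliers receive negligible weight.

```python
import numpy as np

def soft_medoid(X: np.ndarray, temperature: float = 1.0) -> np.ndarray:
    """X: (n, d) neighbour features; returns a robust (d,) aggregate."""
    # Pairwise Euclidean distances between all neighbours, shape (n, n).
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # A point far from everything gets a very negative score.
    scores = -dists.sum(axis=1) / temperature
    w = np.exp(scores - scores.max())  # stable softmax
    w /= w.sum()
    return w @ X

# Three inliers near (1, 1) and one gross outlier.
X = np.array([[1.0, 1.0], [1.1, 0.9], [0.9, 1.1], [100.0, -50.0]])
agg = soft_medoid(X)
print(agg)  # stays near [1, 1]; the plain mean would be ~[25.75, -11.75]
```

The temperature interpolates between the (non-differentiable) medoid and the mean, which is what makes the aggregation usable inside a trained GNN.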
We proposed a novel proxy-based deep graph metric learning approach from the perspective of graph classification, which offers new insight into deep metric learning
Diego Mesquita, Amauri Souza, Samuel Kaski
We show that most graph neural network architectures employ convolutions that can quickly lead to smooth node representations
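This smoothing effect is easy to illustrate with an assumed toy setup (not from the paper): repeatedly applying a normalized graph convolution, here simple neighbour averaging with self-loops, drives all node features toward a common value.

```python
import numpy as np

def smooth(A: np.ndarray, X: np.ndarray, steps: int) -> np.ndarray:
    """Apply `steps` rounds of mean aggregation over neighbours (plus self)."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    P = A_hat / A_hat.sum(axis=1, keepdims=True)   # row-normalize
    for _ in range(steps):
        X = P @ X
    return X

# Path graph 0-1-2 with one scalar feature per node.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
X = np.array([[0.0], [5.0], [10.0]])
print(smooth(A, X, 1).ravel())   # [2.5, 5.0, 7.5]: already closer together
print(smooth(A, X, 50).ravel())  # essentially identical: ~[5, 5, 5]
```

After enough layers the node representations collapse to (a weighted average of) the same vector, which is why stacking many such convolutions hurts node-level tasks.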
Tailin Wu, Hongyu Ren, Pan Li, Jure Leskovec
We have demonstrated the efficacy of Graph Information Bottleneck by evaluating the robustness of the Graph Attention Networks model trained under the GIB principle on adversarial attacks
Edge streams are obtained by randomly permuting the edges in each graph, and the same edge order is used for all the methods
To utilize the strengths of both Euclidean and hyperbolic geometries, we develop a novel Geometry Interaction Learning method for graphs, a well-suited and efficient alternative for learning rich geometric properties in graphs
Authors (number of papers)
Marinka Zitnik (3)
Hongyu Ren (3)
Jure Leskovec (3)
Daniel Tarlow (2)
Yuxiao Dong (2)
Junzhou Huang (2)
Yu Rong (2)
Petar Veličković (2)
Samuel Vaiter (1)
Yunzhu Li (1)