
Seminar: Generalization and Efficient Training of Graph Neural Networks


Speaker: Juan Cerviño, postdoctoral researcher at the Laboratory for Information and Decision Systems, Massachusetts Institute of Technology, USA

Abstract: Graph Neural Networks (GNNs) extend convolutional neural networks to operate on graphs. Despite their impressive performance on various graph learning tasks, the theoretical understanding of their generalization capability is still lacking. In this talk, we take a manifold perspective to establish a statistical generalization theory of GNNs on graphs sampled from a manifold, working in the spectral domain. In agreement with empirical observations, we prove that the generalization bounds of GNNs decrease linearly with the size of the graphs on a logarithmic scale, and increase linearly with the spectral continuity constants of the filter functions. Notably, our theory covers both node-level and graph-level tasks. The second part of the talk deals with efficient GNN training, motivated by the fact that on large-scale graphs convolutions incur high computational costs, leading to scalability limitations. We propose a novel algorithm for learning GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training. The algorithm is further benchmarked on a decentralized control problem, where it retains performance comparable to its large-scale counterpart at a reduced computational cost. The talk will conclude with a discussion of ongoing and future projects.
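To give a flavor of the growing-graph training idea, here is a minimal numpy sketch, not the speaker's actual algorithm: a polynomial graph filter (the building block of a GNN layer) is fitted on induced subgraphs of successively larger size, so early iterations are cheap and only the final stage touches the full graph. The random graph, the filter taps `h_true`, and the least-squares fitting step are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def graph_filter(A, x, h):
    # Polynomial graph filter: y = sum_k h[k] * (A^k x)
    y = np.zeros_like(x)
    Ak_x = x.copy()
    for hk in h:
        y += hk * Ak_x
        Ak_x = A @ Ak_x
    return y

# Large "target" graph: a random symmetric adjacency on N nodes (illustrative)
N = 512
A_full = (rng.random((N, N)) < 0.05).astype(float)
A_full = np.triu(A_full, 1)
A_full = A_full + A_full.T
x_full = rng.standard_normal(N)
h_true = np.array([0.5, 0.3, 0.2])          # unknown filter taps to recover
y_full = graph_filter(A_full, x_full, h_true)

# Growing-graph training: refit the taps on induced subgraphs of increasing size,
# so most of the work happens on small, cheap graphs.
h = np.zeros(3)
for n in [64, 128, 256, 512]:
    idx = rng.choice(N, size=n, replace=False)
    A = A_full[np.ix_(idx, idx)]
    x = x_full[idx]
    y = y_full[idx]  # restricted targets; only approximate on strict subgraphs
    # Least-squares fit of the taps on this subgraph (features: x, Ax, A^2 x)
    F = np.stack([x, A @ x, A @ (A @ x)], axis=1)
    h, *_ = np.linalg.lstsq(F, y, rcond=None)

print(np.round(h, 3))  # final taps, fitted on the full-size graph
```

On the last iteration the subgraph is the whole (permuted) graph, so the fit recovers `h_true` exactly; the intermediate stages illustrate how training can start from a moderate number of nodes and scale up.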

Date: Monday, September 15, 14:00.

Venue: Meeting room of the Instituto de Computación. Instituto de Computación building, Senda Landoni s/n, third floor.