
Abstract
The majority of currently used graph neural networks fall into the category of message passing neural nets (MPNNs). MPNNs are equivariant to permutations of the vertices, but this comes at the expense of severe constraints on the form that the message passing operation can take, which ultimately limits their expressiveness. In response, a variety of proposals have been put forth that extend message passing to operate between subgraphs rather than just individual vertices, or that utilize higher order messages, or both. However, ensuring that such higher order MPNNs remain equivariant is not trivial. In this talk we describe a mathematical formalism called P-tensors that both guarantees equivariance for higher order MPNNs and makes the implementation of such networks in software more transparent and efficient. We also describe an extension of the formalism, called Schur-nets, that can explicitly account for the automorphism structure of the local topology.
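For reference, the equivariance property in question can be stated generically; the notation below is a standard illustration, not the talk's own formalism. A layer f acting on an adjacency matrix A and vertex features X is permutation equivariant if, for every permutation matrix P,

    f(P A P^\top,\, P X) = P\, f(A, X).

A typical MPNN update satisfying this condition has the form

    h_v^{(t+1)} = \phi\Big( h_v^{(t)},\ \bigoplus_{u \in \mathcal{N}(v)} \psi\big(h_u^{(t)}, h_v^{(t)}\big) \Big),

where the aggregator \bigoplus over the neighborhood \mathcal{N}(v) must itself be permutation invariant (e.g., a sum or mean over neighbors). This restriction on the form of the aggregation is the expressiveness constraint referred to above.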
The work described in this talk was done in collaboration with Andrew Hands, Richard Xu, Qingqi Zhang, Tianyi Sun and Ludwig Schneider.