CS224W: Machine Learning with Graphs | 2021 | Lecture 7.2 - A Single Layer of a GNN

Jure Leskovec, Computer Science, PhD

Under the general perspective on GNNs, we first introduce the concept of a general GNN layer. A general GNN layer consists of two main components: (1) message computation and (2) aggregation. In a GNN layer, each node first computes a message using the message function, sends that message to its neighboring nodes, and then each node aggregates the messages received from its neighbors using the aggregation function. Based on this idea, we discuss in detail how GCN, GraphSAGE, and GAT layers can be expressed as message computation plus aggregation. We further introduce how to design a GNN layer in practice, including how to incorporate Batch Normalization (BatchNorm), Dropout, and different activation functions / non-linearities in GNNs. To follow along with the course schedule and syllabus, visit:
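To make the two-component view concrete, below is a minimal sketch (not the lecture's reference implementation) of a single GNN layer in plain PyTorch. The class name SimpleGNNLayer and its parameters are illustrative assumptions; the message is a linear transform and the aggregation is a mean over neighbors (GCN-style), with the optional BatchNorm, Dropout, and non-linearity the lecture mentions applied afterward.

```python
import torch
import torch.nn as nn

class SimpleGNNLayer(nn.Module):
    """Illustrative GNN layer: (1) message computation, (2) aggregation."""
    def __init__(self, in_dim, out_dim, dropout=0.5):
        super().__init__()
        self.msg = nn.Linear(in_dim, out_dim)   # (1) message computation
        self.norm = nn.BatchNorm1d(out_dim)     # optional BatchNorm
        self.act = nn.ReLU()                    # non-linearity
        self.drop = nn.Dropout(dropout)         # optional Dropout

    def forward(self, x, adj):
        # x:   [num_nodes, in_dim] node features
        # adj: [num_nodes, num_nodes] dense adjacency matrix
        m = self.msg(x)                                   # each node computes its message
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)   # neighbor counts (avoid divide-by-zero)
        h = (adj @ m) / deg                               # (2) aggregation: mean over neighbors
        return self.drop(self.act(self.norm(h)))

# Usage sketch: a 4-node graph with 8-dimensional node features
x = torch.randn(4, 8)
adj = torch.tensor([[0., 1, 1, 0],
                    [1, 0, 0, 1],
                    [1, 0, 0, 1],
                    [0, 1, 1, 0]])
layer = SimpleGNNLayer(8, 16)
out = layer(x, adj)   # shape: [4, 16]
```

Swapping the linear message for other transforms, or the mean for sum/max/attention-weighted aggregation, is how one obtains GraphSAGE- or GAT-style layers from this same message-plus-aggregation template.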