# Background and Definitions

## Background

Early studies fall into the category of recurrent graph neural networks (RecGNNs). With the success of CNNs, a new approach, convolutional graph neural networks (ConvGNNs), was developed. ConvGNNs are divided into two main streams: spectral-based approaches and spatial-based approaches.

Network embedding aims at representing network nodes as low-dimensional vector representations, preserving both network topology structure and node content information.

Graph kernel methods employ a kernel function to measure the similarity between pairs of graphs. Graph kernels can embed graphs or nodes into vector spaces via a mapping function. The difference between GNNs and graph kernels is that in graph kernels this mapping function is deterministic rather than learnable.
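As a concrete illustration of such a deterministic mapping, the sketch below computes a degree-histogram feature map `phi` for each graph and takes the inner product of the two feature vectors as the kernel value. The degree-histogram map and the cap `max_deg` are illustrative choices, not a specific kernel from the literature.

```python
import numpy as np

# A graph kernel scores similarity between two graphs as an inner product
# of fixed (non-learned) feature maps. The degree-histogram map below is
# one simple, deterministic choice of phi (hypothetical example).
def phi(A, max_deg=4):
    degrees = A.sum(axis=1).astype(int)
    return np.bincount(degrees, minlength=max_deg + 1)

def kernel(A1, A2):
    return int(phi(A1) @ phi(A2))

A1 = np.array([[0, 1],
               [1, 0]])                     # one edge: both degrees are 1
A2 = np.array([[0, 1, 1],
               [1, 0, 0],
               [1, 0, 0]])                  # a path: degrees 2, 1, 1

print(kernel(A1, A2))  # 4
```

Because `phi` is fixed in advance, no training is involved: the kernel value depends only on the two adjacency matrices.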

## Definitions

Definition 1 (Graph): A graph is represented as $G = (V,E)$, where $V$ is the set of vertices or nodes and $E$ is the set of edges. Let $v_i \in V$ denote a node and $e_{ij} = (v_i,v_j) \in E$ denote an edge pointing from $v_j$ to $v_i$. The neighborhood of a node $v$ is defined as $N(v) = \{u \in V \mid (v,u) \in E\}$. The adjacency matrix $A$ is an $n \times n$ matrix with $A_{ij} = 1$ if $e_{ij} \in E$ and $A_{ij} = 0$ if $e_{ij} \notin E$. A graph may have node attributes $X$, where $X \in R^{n \times d}$ is a node feature matrix with $x_{v} \in R^d$ representing the feature vector of a node $v$. Meanwhile, a graph may have edge attributes $X^e$, where $X^e \in R^{m \times c}$ is an edge feature matrix with $x^{e}_{v,u} \in R^c$ representing the feature vector of an edge $(v,u)$.
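The objects in Definition 1 translate directly into arrays. The sketch below builds the adjacency matrix $A$ from an edge list, reads a neighborhood $N(v)$ off a row of $A$, and allocates a node feature matrix $X$; the particular graph and the feature dimension are made-up examples.

```python
import numpy as np

# A small graph with n = 4 nodes and an edge list E (hypothetical example).
# Per Definition 1, A[i, j] = 1 iff e_ij = (v_i, v_j) is in E.
n = 4
edges = [(0, 1), (1, 2), (2, 0), (3, 1)]   # each pair (i, j) is edge e_ij

A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1

# Neighborhood N(v) = {u in V | (v, u) in E}, read off row v of A.
def neighborhood(v):
    return {u for u in range(n) if A[v, u] == 1}

# Node feature matrix X in R^{n x d}: row v is the feature vector x_v.
d = 3
X = np.random.rand(n, d)

print(neighborhood(1))  # {2}
```

An edge feature matrix $X^e$ would be built the same way, with one row of length $c$ per edge, giving shape $(m, c)$ for $m$ edges.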

Definition 2 (Directed Graph): A directed graph is a graph with all edges directed from one node to another. An undirected graph is a special case of a directed graph in which every pair of connected nodes is joined by two edges with inverse directions. A graph is undirected if and only if its adjacency matrix is symmetric.
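The symmetry criterion above is a one-line check in practice: compare $A$ with its transpose. The two small adjacency matrices below are made-up examples.

```python
import numpy as np

# Undirected: each connected pair contributes edges in both directions,
# so A is symmetric (hypothetical 3-node example).
A_undirected = np.array([[0, 1, 1],
                         [1, 0, 0],
                         [1, 0, 0]])

# Directed: edge (0 -> 1) has no inverse counterpart, so A is asymmetric.
A_directed = np.array([[0, 1, 0],
                       [0, 0, 1],
                       [0, 0, 0]])

def is_undirected(A):
    return bool((A == A.T).all())

print(is_undirected(A_undirected))  # True
print(is_undirected(A_directed))    # False
```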

Definition 3 (Spatial-Temporal Graph): A spatial-temporal graph is an attributed graph whose node attributes change dynamically over time. The spatial-temporal graph is defined as $G^{(t)} = (V,E,X^{(t)})$ with $X^{(t)} \in R^{n \times d}$.
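Since the topology $(V,E)$ is fixed while $X^{(t)}$ varies, $T$ snapshots of a spatial-temporal graph can be stored as one tensor of shape $(T, n, d)$ alongside a single adjacency matrix. The sizes in this sketch are illustrative.

```python
import numpy as np

# Spatial-temporal graph: fixed adjacency A shared by all time steps,
# time-varying node features X_t, where X_t[t] is X^{(t)} in R^{n x d}.
n, d, T = 5, 2, 10                 # illustrative sizes
A = np.zeros((n, n), dtype=int)    # fixed topology (edges omitted here)
X_t = np.random.rand(T, n, d)      # T stacked feature matrices

print(X_t[3].shape)  # (5, 2) -- the node feature matrix at time t = 3
```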

# Categorization and Frameworks

Recurrent graph neural networks (RecGNNs) aim to learn node representations with recurrent neural architectures.
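The recurrent idea can be sketched as follows: every node repeatedly updates a hidden state from its own features and its neighbors' current states, applying the same update until the states stabilize. The tanh-of-a-linear-map update and the small random weights here are stand-ins; RecGNN variants differ in the exact parameterization.

```python
import numpy as np

# Minimal RecGNN-style sketch (illustrative update, not a specific model):
# H is iterated with a shared update combining node features X and
# neighbor states A @ H until it converges to a fixed point.
rng = np.random.default_rng(0)
n, d, h = 4, 3, 8
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 0],
              [1, 0, 0, 0]])
X = rng.standard_normal((n, d))
W_x = rng.standard_normal((d, h)) * 0.1   # small weights keep the
W_h = rng.standard_normal((h, h)) * 0.1   # update a contraction

H = np.zeros((n, h))
for _ in range(50):                        # iterate the shared update
    H = np.tanh(X @ W_x + A @ H @ W_h)

print(H.shape)  # (4, 8) -- one hidden state per node
```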

Convolutional graph neural networks (ConvGNNs) generate a node $v$'s representation by aggregating its own features $x_v$ and its neighbors' features $x_u$, where $u \in N(v)$.
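A single layer of this aggregation can be sketched in a few lines: average the neighbors' features, concatenate with the node's own features, and apply a learned transform. The mean aggregator, the concatenation, and the ReLU are illustrative choices; spectral- and spatial-based ConvGNNs differ in exactly this step.

```python
import numpy as np

# One spatial-style aggregation layer (illustrative, not a named model):
# h_v = ReLU(W [x_v ; mean of x_u for u in N(v)]).
rng = np.random.default_rng(1)
n, d = 4, 3
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])
X = rng.standard_normal((n, d))
W = rng.standard_normal((2 * d, d))

deg = A.sum(axis=1, keepdims=True)
neighbor_mean = (A @ X) / np.maximum(deg, 1)   # mean of x_u over N(v)
H = np.maximum(np.concatenate([X, neighbor_mean], axis=1) @ W, 0)

print(H.shape)  # (4, 3) -- one aggregated representation per node
```

Stacking several such layers lets each node's representation draw on progressively larger neighborhoods.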

Graph autoencoders (GAEs) learn network embeddings and graph generative distributions.
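A minimal sketch of the idea, assuming an inner-product decoder: an encoder maps nodes to embeddings $Z$, and edge probabilities are reconstructed as $\sigma(ZZ^\top)$. A real GAE would train the encoder to make the reconstruction match $A$; here the encoder is a fixed random linear map, purely for illustration.

```python
import numpy as np

# Toy graph autoencoder sketch (untrained, illustrative parameters):
# encode nodes into a k-dimensional latent space, then decode edge
# probabilities with an inner product followed by a sigmoid.
rng = np.random.default_rng(2)
n, d, k = 4, 3, 2
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])
X = rng.standard_normal((n, d))
W = rng.standard_normal((d, k))

Z = np.tanh((A + np.eye(n)) @ X @ W)        # toy one-layer encoder
A_hat = 1.0 / (1.0 + np.exp(-(Z @ Z.T)))    # decoded edge probabilities

print(A_hat.shape)  # (4, 4) -- a probability for every node pair
```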