# 20.6: Network Diffusion Kernels


Earlier, we defined a distance metric between two nodes as the weighted shortest path. This simple distance metric is sufficient for many purposes, but it notably does not use any information about the overall graph structure. Oftentimes, defining distance based on the number of possible paths between two nodes, weighted by the plausibility or likelihood of taking such paths, gives a better representation of the actual system we are modeling. We explore such alternative distance metrics in this section.

Diffusion kernel matrices help capture the global network structure of graphs, informing a more complex definition of distance.

Let A be our regular adjacency matrix and D the diagonal matrix of node degrees. We can then define L, the Laplacian matrix, as follows:

$L=D-A\nonumber$
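As a concrete illustration, the Laplacian can be built directly from the adjacency matrix. The following sketch uses NumPy and a small hypothetical four-node graph (chosen for illustration only):

```python
import numpy as np

# Adjacency matrix of a small example graph: path 0-1-2-3 plus edge 1-3
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 0, 1],
    [0, 1, 1, 0],
], dtype=float)

D = np.diag(A.sum(axis=1))  # diagonal matrix of node degrees
L = D - A                   # graph Laplacian

# Each row of L sums to zero, a defining property of the Laplacian
print(L.sum(axis=1))
```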

We then define a diffusion kernel K as

$K=\exp (-\beta L) \nonumber$
where $$\beta$$ is the diffusion parameter. Note that we are taking a matrix exponential, not an element-wise exponential; the matrix exponential is defined by the Taylor series expansion:

$K=\sum_{k=0}^{\infty} \frac{1}{k !}(-\beta L)^{k} \nonumber$
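A numerical sketch of this definition, assuming SciPy is available (the graph and the value of $$\beta$$ are arbitrary choices for illustration): we compute K with `scipy.linalg.expm` and cross-check it against a truncated Taylor series.

```python
import numpy as np
from scipy.linalg import expm

# Small illustrative graph: a triangle plus one pendant node
A = np.array([
    [0, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)
L = np.diag(A.sum(axis=1)) - A

beta = 0.3
K = expm(-beta * L)  # matrix exponential, NOT element-wise exp

# Cross-check against the Taylor series: sum_k (-beta L)^k / k!
K_taylor = np.zeros_like(L)
term = np.eye(len(L))              # k = 0 term
for k in range(1, 40):
    K_taylor += term
    term = term @ (-beta * L) / k  # next term: (-beta L)^k / k!
print(np.allclose(K, K_taylor))    # the two agree to numerical precision
```

Because each row of L sums to zero, each row of K sums to one, which is consistent with the random-walk interpretation below.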

So what does the matrix K represent? There are multiple ways to interpret K; we list the most relevant to us below:

Random Walks – One way to interpret K is as the result of a random walk. Assume that at each node of the graph we have a probability distribution over the outgoing edges, representing the probability that we move along each edge, as in the figure below:

Here $$\beta$$ is the transition probability along a specific edge, and there is also a probability that we do not move at all (represented here as a self-loop). Note that for the probability distribution to be valid, the probabilities must sum to 1.

With this setup, $$K_{ij}$$ equals the probability that a walk starting at node i is at node j after infinitely many time steps. To derive this result, we can write our graph as a Markov model and take the limit as $$t \rightarrow \infty$$.
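The connection between the discrete walk and the kernel can be sketched numerically. Under the setup above, one step of the lazy walk has transition matrix $$I-\beta L$$; splitting a fixed amount of diffusion into n ever-smaller steps converges to the matrix exponential, since $$\exp(-\beta L)=\lim_{n\to\infty}(I-\tfrac{\beta}{n}L)^{n}$$. The graph and parameter values below are arbitrary illustrations:

```python
import numpy as np
from scipy.linalg import expm

# Illustrative 3-node graph (a triangle)
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

beta, t = 0.2, 1.0
K = expm(-beta * t * L)  # diffusion kernel at time t

# Discretize time into n lazy-walk steps; the error shrinks as n grows
errors = []
for n in [1, 10, 100, 1000]:
    step = np.eye(3) - (beta * t / n) * L      # one-step transition matrix
    approx = np.linalg.matrix_power(step, n)   # n steps of the walk
    errors.append(np.abs(approx - K).max())
    print(n, errors[-1])
```

Note that $$I-\beta L$$ is a valid stochastic matrix only when $$\beta$$ is small enough that the diagonal entries $$1-\beta d_i$$ stay non-negative.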

Stochastic Process – Another way we can interpret the diffusion kernel is through a stochastic process.

• For each node i, consider a random variable $$Z_i(t)$$.
• Let $$Z_i(t)$$ be zero-mean with some defined variance $$\sigma^2$$.
• The covariance of $$Z_i(t)$$ and $$Z_j(t)$$ is zero at the start (the variables are independent of each other).
• At each time step, each variable sends a fraction of its value to its neighbors:

$\begin{array}{c} Z_{i}(t+1)=Z_{i}(t)+\beta \sum_{j \neq i}\left(Z_{j}(t)-Z_{i}(t)\right) \\ Z(t+1)=(I-\beta L) Z(t) \\ Z(t)=(I-\beta L)^{t} Z(0) \end{array} \nonumber$

Let the time evolution operator $$T(t)$$ be

$T(t)=(I-\beta L)^{t} \nonumber$

then the covariance is equal to

$\operatorname{Cov}_{i j}(t)=\sigma^{2} T_{i j}(2 t) \nonumber$
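This identity can be checked directly: since $$Z(t)=T(t)Z(0)$$ and the $$Z_i(0)$$ are independent with variance $$\sigma^2$$, we have $$\operatorname{Cov}(Z(t))=T(t)\,\sigma^2 I\,T(t)^{\top}=\sigma^2 T(2t)$$, because $$T(t)$$ is symmetric. A numerical sketch (graph, $$\beta$$, $$\sigma^2$$, and t are arbitrary illustrative choices):

```python
import numpy as np

# Illustrative 4-node graph (a 4-cycle)
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
beta, sigma2, t = 0.1, 1.0, 5

def T(s):
    """Time evolution operator T(s) = (I - beta L)^s."""
    return np.linalg.matrix_power(np.eye(4) - beta * L, s)

# Cov(Z(t)) = T(t) (sigma^2 I) T(t)^T, which equals sigma^2 T(2t)
cov = T(t) @ (sigma2 * np.eye(4)) @ T(t).T
print(np.allclose(cov, sigma2 * T(2 * t)))  # -> True
```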

Then, taking the continuous-time limit $$\Delta t \rightarrow 0$$, we get

$\operatorname{Cov}(t)=\sigma^{2} \exp (-2 \beta t L)\nonumber$

This page titled 20.6: Network Diffusion Kernels is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Manolis Kellis et al. (MIT OpenCourseWare) via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.