Recall that in the definition of a [[What is a Markov Chain?|Markov chain]], there is a probability transition matrix whose entries are indexed by pairs of states $s, s^\prime \in \mathcal{S}$ from the state space:
$\begin{align}
P_{s,s^\prime} &= \text{Probability to go from }s\text{ to }s^\prime\text{ in one step} \\
&= \mathbb{P}(S_{1} = s^\prime \mid S_{0} = s) \\
&= \mathbb{P}(S_{t+1} = s^\prime \mid S_{t} = s) \quad \text{for any }t\text{, by time-homogeneity}
\end{align}$
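As a concrete numerical sketch (the three-state chain and its transition probabilities below are made up for illustration), each row of $P$ holds the distribution of the next state given the current state, so every row sums to 1:
```python
import numpy as np

# Hypothetical 3-state chain with states indexed 0, 1, 2.
# Row s is the distribution of the next state given the current state s.
P = np.array([
    [0.9, 0.1, 0.0],   # from state 0
    [0.2, 0.5, 0.3],   # from state 1
    [0.0, 0.4, 0.6],   # from state 2
])

assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution
```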
There are two main ways to use this matrix, which correspond to multiplication on the left or multiplication on the right.
# Update rule for the Probability Vectors aka Kolmogorov Forward Equation
This can be used to update the probability vectors by matrix multiplication. The setup: let $\vec{p}_t = [\,\mathbb{P}(S_t = s_{0}),\ \mathbb{P}(S_t = s_{1}),\ \ldots,\ \mathbb{P}(S_t = s_{n-1})\,] \in \mathbb{R}^{n}$ be the row vector of probabilities at time $t$, where $n$ is the number of states in the state space $\mathcal{S}$. Then we have:
$\vec{p}_{t+1} = \vec{p}_{t} P $
where $P$ is the $n\times n$ transition matrix with entries $P_{s,s^\prime}$.
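A minimal sketch of this update, reusing the hypothetical 3-state matrix from above: the row vector $\vec{p}_t$ is multiplied on the left of $P$ at every step.
```python
import numpy as np

# Same hypothetical 3-state transition matrix as above.
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

# Probability vector at time 0: start in state 0 with certainty.
p = np.array([1.0, 0.0, 0.0])

# Kolmogorov forward equation: p_{t+1} = p_t P, applied repeatedly.
for t in range(5):
    p = p @ P                                   # left multiplication by the row vector
    print(f"t = {t + 1}: {p}, total mass = {p.sum():.3f}")
```
The total mass printed at each step stays at 1, since each row of $P$ sums to 1.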
## Why is this left matrix multiplication true?
The reason is the law of total probability, which you may remember from conditional probability. If $A_1, A_{2}, \ldots, A_{k}$ are events that are mutually exclusive ($A_i \cap A_j = \emptyset$ for $i \neq j$) and cover all of the probability space ($\cup_i A_i = \Omega$), then any event $B$ can be split up by conditioning on which of the $A_i$ occurred:
$\mathbb{P}(B) = \sum_{i=1}^{k} \mathbb{P}(B \mid A_i)\,\mathbb{P}(A_i)$
Applying this with $B = \{S_{t+1} = s^\prime\}$ and the events $\{S_t = s\}$ for each state $s$ gives
$\mathbb{P}(S_{t+1} = s^\prime) = \sum_{s \in \mathcal{S}} \mathbb{P}(S_{t+1} = s^\prime \mid S_t = s)\,\mathbb{P}(S_t = s) = \sum_{s \in \mathcal{S}} (\vec{p}_t)_s\, P_{s,s^\prime}$
which is exactly the $s^\prime$-th entry of the row vector $\vec{p}_t P$.
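A quick numerical check of this reasoning (again with the hypothetical $P$ from above and an arbitrary $\vec{p}_t$): summing $\mathbb{P}(S_{t+1}=s^\prime \mid S_t = s)\,\mathbb{P}(S_t = s)$ over $s$ by hand gives the same numbers as the matrix product $\vec{p}_t P$.
```python
import numpy as np

# Same hypothetical transition matrix, plus an arbitrary distribution p_t.
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])
p_t = np.array([0.5, 0.3, 0.2])

# Law of total probability, conditioning on the current state s:
#   P(S_{t+1} = s') = sum_s P(S_{t+1} = s' | S_t = s) * P(S_t = s)
p_next_by_hand = np.array([sum(p_t[s] * P[s, s_prime] for s in range(3))
                           for s_prime in range(3)])

# Each entry matches the corresponding entry of the row-vector product p_t P.
assert np.allclose(p_next_by_hand, p_t @ P)
```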