$\require{physics}\newcommand{\cbrt}[1]{\sqrt[3]{#1}}\newcommand{\sgn}{\text{sgn}}\newcommand{\ii}[1]{\textit{#1}}\newcommand{\eps}{\varepsilon}\newcommand{\EE}{\mathbb E}\newcommand{\PP}{\mathbb P}\newcommand{\Var}{\mathrm{Var}}\newcommand{\Cov}{\mathrm{Cov}}\newcommand{\pperp}{\perp\kern-6pt\perp}\newcommand{\LL}{\mathcal{L}}\newcommand{\pa}{\partial}\newcommand{\AAA}{\mathscr{A}}\newcommand{\BBB}{\mathscr{B}}\newcommand{\CCC}{\mathscr{C}}\newcommand{\DDD}{\mathscr{D}}\newcommand{\EEE}{\mathscr{E}}\newcommand{\FFF}{\mathscr{F}}\newcommand{\WFF}{\widetilde{\FFF}}\newcommand{\GGG}{\mathscr{G}}\newcommand{\HHH}{\mathscr{H}}\newcommand{\PPP}{\mathscr{P}}\newcommand{\Ff}{\mathcal{F}}\newcommand{\Gg}{\mathcal{G}}\newcommand{\Hh}{\mathbb{H}}\DeclareMathOperator{\ess}{ess}\newcommand{\CC}{\mathbb C}\newcommand{\FF}{\mathbb F}\newcommand{\NN}{\mathbb N}\newcommand{\QQ}{\mathbb Q}\newcommand{\RR}{\mathbb R}\newcommand{\ZZ}{\mathbb Z}\newcommand{\KK}{\mathbb K}\newcommand{\SSS}{\mathbb S}\newcommand{\II}{\mathbb I}\newcommand{\conj}[1]{\overline{#1}}\DeclareMathOperator{\cis}{cis}\newcommand{\abs}[1]{\left\lvert #1 \right\rvert}\newcommand{\norm}[1]{\left\lVert #1 \right\rVert}\newcommand{\floor}[1]{\left\lfloor #1 \right\rfloor}\newcommand{\ceil}[1]{\left\lceil #1 \right\rceil}\DeclareMathOperator*{\range}{range}\DeclareMathOperator*{\nul}{null}\DeclareMathOperator*{\Tr}{Tr}\DeclareMathOperator*{\tr}{Tr}\newcommand{\id}{1\!\!1}\newcommand{\Id}{1\!\!1}\newcommand{\der}{\ \mathrm {d}}\newcommand{\Zc}[1]{\ZZ / #1 \ZZ}\newcommand{\Zm}[1]{\left(\ZZ / #1 
\ZZ\right)^\times}\DeclareMathOperator{\Hom}{Hom}\DeclareMathOperator{\End}{End}\newcommand{\GL}{\mathbb{GL}}\newcommand{\SL}{\mathbb{SL}}\newcommand{\SO}{\mathbb{SO}}\newcommand{\OO}{\mathbb{O}}\newcommand{\SU}{\mathbb{SU}}\newcommand{\U}{\mathbb{U}}\newcommand{\Spin}{\mathrm{Spin}}\newcommand{\Cl}{\mathrm{Cl}}\newcommand{\gr}{\mathrm{gr}}\newcommand{\gl}{\mathfrak{gl}}\newcommand{\sl}{\mathfrak{sl}}\newcommand{\so}{\mathfrak{so}}\newcommand{\su}{\mathfrak{su}}\newcommand{\sp}{\mathfrak{sp}}\newcommand{\uu}{\mathfrak{u}}\newcommand{\fg}{\mathfrak{g}}\newcommand{\hh}{\mathfrak{h}}\DeclareMathOperator{\Ad}{Ad}\DeclareMathOperator{\ad}{ad}\DeclareMathOperator{\Rad}{Rad}\DeclareMathOperator{\im}{im}\renewcommand{\BB}{\mathcal{B}}\newcommand{\HH}{\mathcal{H}}\DeclareMathOperator{\Lie}{Lie}\DeclareMathOperator{\Mat}{Mat}\DeclareMathOperator{\span}{span}\DeclareMathOperator{\proj}{proj}$
There are $3$ main types of convergence of functions $(\Omega, \FFF, \mu)\to (\RR, \BBB_\RR)$ relevant for our purposes; each implies the next (the second implication requires $\mu$ to be finite, as it is for a probability measure). Given a sequence of functions $f_n$ and another function $f$,
1. $f_n\to f$ ==**pointwise**== if $f_n(\omega)\to f(\omega)$ for all $\omega$.
2. $f_n\to f$ ==**$\mu$-almost everywhere**== if $\mu\left( \left\{\omega: \lim_{n\to\infty} f_n(\omega) \neq f(\omega)\right\}\right) = 0$
3. $f_n\to f$ **==in $\mu$-measure==** if for all $\eps > 0$, $ \lim_{n\to \infty} \mu\left( \left\{\omega: \abs{f_n(\omega) - f(\omega)} \geq \eps\right\}\right) = 0$
In [[Probability]] theory, these are called pointwise, **==almost sure==**, and **==in probability==** convergence. Note that a limit in probability is unique up to almost-sure equality: if $X_n\to X$ and $X_n\to Y$ in probability, then $X = Y$ almost surely.
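To see that the last implication does not reverse, the classic "typewriter" sequence helps; here is a minimal numeric sketch (the function names are mine, not from any library): indicators of dyadic intervals sweep across $[0,1]$ with shrinking width, converging to $0$ in Lebesgue measure while converging at no single $\omega$.

```python
# Classic "typewriter" sequence on [0, 1]: f_n is the indicator of a
# dyadic interval that sweeps across [0, 1] with shrinking width.
# It converges to 0 in Lebesgue measure, but f_n(omega) = 1 infinitely
# often at EVERY omega, so it converges pointwise nowhere.
def typewriter(n, omega):
    # write n = 2^k + j with 0 <= j < 2^k; f_n = 1 on [j/2^k, (j+1)/2^k)
    k = n.bit_length() - 1
    j = n - (1 << k)
    return 1.0 if j / 2**k <= omega < (j + 1) / 2**k else 0.0

def level_set_measure(n):
    # Lebesgue measure of {f_n >= eps} for any 0 < eps <= 1:
    # just the width of the interval, 2^{-k} -> 0 (convergence in measure)
    return 2.0 ** -(n.bit_length() - 1)

# ... but at a fixed omega, f_n(omega) = 1 once in every dyadic block:
hits = [n for n in range(1, 1025) if typewriter(n, 0.3) == 1.0]
```

For each block $k$ the intervals partition $[0,1)$, so each of the ten blocks swept by $n \leq 1024$ contributes exactly one hit at $\omega = 0.3$.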
Other common types of convergence:
- Convergence in $L^2$ norm. This implies convergence in $\mu$-measure (for probability measures, and in fact on any measure space, by Chebyshev).
- [[Weak Convergence|Convergence in Distribution]].
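The $L^2$-to-measure implication is one line of Chebyshev (Markov's inequality applied to $\abs{f_n - f}^2$):
$
\mu\left(\left\{\abs{f_n - f} \geq \eps\right\}\right) \leq \frac{1}{\eps^2}\int \abs{f_n - f}^2 \der\mu = \frac{\norm{f_n - f}_2^2}{\eps^2} \to 0.
$
Nothing in this bound uses $\mu(\Omega) = 1$, so it holds on arbitrary measure spaces.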
# Random Problems
>[!problem]
> Show $f_n\to f$ a.e. if and only if $
> \forall \delta>0 \qquad \PP\left(\forall N \ \exists m > N: \abs{f_m(\omega) - f(\omega)} > \delta\right) = 0
> $
>[!problem] Ky Fan metric
>Convergence in probability [[defines a metric]]; it is metrized by the Ky Fan metric:$d(X,Y) = \inf\{\eps > 0: \PP(\abs{X - Y} > \eps)\leq \eps \}.$
>
>Prove this:
> - This is a metric.
> - Convergence in probability $\iff$ convergence in this Ky Fan metric.
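For intuition, the Ky Fan distance can be computed exactly for an empirical distribution (the helper below is a hypothetical illustration, not a standard library function): $\eps\mapsto \PP(\abs{X-Y}>\eps)$ is a decreasing right-continuous step function, so the infimum is attained at $\max(a_k, (n-k)/n)$ for some $k$, where $a_1\leq\dots\leq a_n$ are the sorted values of $\abs{X - Y}$.

```python
import numpy as np

# Empirical Ky Fan distance between paired samples x, y:
# d = inf{eps : P(|X - Y| > eps) <= eps} for the empirical law.
# Candidates: eps = a_k (sorted |x - y|, with a_0 = 0) or the tail
# fraction (n - k)/n; every max(a_k, (n-k)/n) is feasible, and the
# infimum equals the smallest such candidate.
def ky_fan_empirical(x, y):
    a = np.sort(np.abs(np.asarray(x, float) - np.asarray(y, float)))
    n = len(a)
    a = np.concatenate(([0.0], a))       # candidate thresholds a_0..a_n
    tail = (n - np.arange(n + 1)) / n    # tail fraction just above a_k
    return float(np.min(np.maximum(a, tail)))
```

Sanity checks: a constant shift by $c < 1$ gives distance exactly $c$, and identical samples give $0$, matching $d(X, X) = 0$.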
>[!problem] $\PP$ to AS Bootstrap
>Suppose $X_1, X_2,\cdots$ converge in probability to $X$. Show they also converge almost surely along some subsequence.
>
>Conversely, suppose every subsequence $X_{i_k}$ of $(X_i)_{i\in \NN}$ itself admits a subsequence which converges almost surely to $X$. Then, show that $X_i$ converges in probability.
> [!idea]
> This converse is not as contrived as you might think! Often, at the beginning of a problem, one writes something like "consider any sequence of points satisfying so-and-so" or "consider any (other complicated structure)," where the conditions to be met are closed under taking subsequences. One then claims that some $X_n$ generated by these objects converges in probability. It suffices to exhibit *any* subsequence of $X_n$ which converges almost surely, because the same proof then applies to *every* subsequence of $X_n$!
> [!solution]- Boring
> WLOG $X = 0$. Suppose $X_1, X_2, \dots \to 0$ in probability. Then, there is an increasing sequence $a_n$ such that for all $n$ and all $m \geq a_n$,
> $
> \mu\left(\{\omega: \abs{X_m(\omega)}\geq \frac1n\}\right)\leq \frac1{2^n}
> $
> Extract this subsequence $Y_n \equiv X_{a_n}$. Then, $\limsup_{m\to \infty} \abs{Y_m(\omega)}$ is a measurable function. Note that for $m\geq n$ we have $\left\{\abs{Y_m} > \frac1n\right\}\subseteq \left\{\abs{Y_m} \geq \frac1m\right\}$, and the latter has measure at most $\frac1{2^m}$ by the choice of $a_m$. So, for all $n$,
> $\begin{align*}
> \mu\left(\left\{\omega: \limsup_{m\to \infty} \abs{Y_m(\omega)} > \frac1n\right\}\right) &\leq \mu\left(\bigcap_{M\geq n} \bigcup_{m\geq M} \left\{\abs{Y_m(\omega)} \geq \frac1m\right\}\right)\\
> &\leq \inf_{M\geq n} \mu\left(\bigcup_{m\geq M} \left\{\abs{Y_m(\omega)} \geq \frac1m\right\}\right)\\
> &\leq \inf_{M\geq n} \sum_{m\geq M} \frac{1}{2^m}\\
> &= \inf_{M\geq n} \frac{1}{2^{M-1}} = 0.
> \end{align*}
> $
> Taking the countable union across $n$ gives $\limsup_m \abs{Y_m} = 0$, i.e. $Y_m\to 0$, almost everywhere.
>
> On the other hand, suppose $X_n$ does not converge to $X$ in probability. Then there exist $\eps, \delta > 0$ such that $\PP(\abs{X_n - X} > \eps) > \delta$ infinitely often. Pass to the subsequence along which $\PP(\abs{X_n - X} > \eps) > \delta$ for all $n$. No further subsequence can converge almost surely: almost sure convergence implies convergence in probability (dominated convergence applied to the indicators $\Id_{\abs{X_n - X} > \eps}$), contradicting the uniform lower bound $\delta$.
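The extraction step in this proof is concrete enough to compute. A sketch with a stand-in sequence (a hypothetical choice, picked only because its tails are exact): $X_m \sim N(0, 1/m)$ converges to $0$ in probability, and we can find the indices $a_n$ from the proof numerically.

```python
import math

# Stand-in example: X_m ~ N(0, 1/m) -> 0 in probability. Find the
# smallest a_n with P(|X_m| >= 1/n) <= 2^{-n} for all m >= a_n.
def tail(m, t):
    # exact Gaussian tail: P(|N(0, 1/m)| >= t) = erfc(t * sqrt(m/2))
    return math.erfc(t * math.sqrt(m / 2))

def a(n):
    # tail(m, t) is decreasing in m, so checking the smallest m suffices:
    # the bound then holds for every m >= a(n) automatically
    m = 1
    while tail(m, 1 / n) > 2 ** (-n):
        m += 1
    return m

# both the threshold 1/n and the bound 2^{-n} tighten with n,
# so the indices a_n increase
indices = [a(n) for n in range(1, 6)]
```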
>[!problem] Cauchy in Probability
>Suppose $X_1, X_2, \dots$ are random variables such that
>$\forall \eps > 0 \qquad \lim_{m,n\to \infty} \PP\left(\abs{X_m - X_n} \geq \eps\right) = 0$
>Then, we say $X_i$ are **==Cauchy in Probability==**. Show that there exists a random variable $X$ such that $X_i\to X$ in probability.
> [!solution]-
> Ummm, first let's unpack some quantifiers.
>$\forall \eps,\delta > 0 \quad \exists M \forall m,n\geq M\qquad \PP\left(\abs{X_m - X_n} \geq \eps\right) < \delta$
>And we want to show that there exists an $X$ such that
>$\forall \eps,\delta > 0 \quad \exists N \forall n\geq N\qquad \PP\left(\abs{X_n - X}\geq \eps\right) < \delta$
>
>> [!proof]- There almost surely exists a convergent subsequence.
>>Cool. For all $k$, I construct an $M_k$ (WLOG increasing in $k$) such that for all $m,n\geq M_k$, $\PP\left(\abs{X_m - X_n}\geq 0.01 \cdot 2^{-k}\right) < 0.01 \cdot 2^{-k}$
>>
>>The event that $\abs{X_{M_k} - X_{M_{k+1}}} \geq 0.01 \cdot 2^{-k}$ (which is an event, because it is the preimage of a Borel set under a measurable function) occurs with probability less than $0.01 \cdot 2^{-k}$. Let $E$ be the event that only finitely many such discrepancies occur (this is clearly a countable union of intersections, and is thus an event). Since these probabilities are summable, [[Borel-Cantelli]] gives $\PP(E) = 1$.
>>
>>On $E$, the sequence $X_{M_k}$ is Cauchy (its increments are eventually dominated by the summable tail $0.01\sum_{j\geq k} 2^{-j}$), hence convergent. Thus let $X$ equal the limit of $X_{M_k}$ on $E$, and set $X = 0$ elsewhere. This is a random variable because it is an almost-everywhere limit of random variables.
>
>>[!proof]- Verification
>>
>> Now, given $\eps, \delta > 0$, pick $k$ with $2^{-k} < \min(\eps, \delta)$ and select $N = M_k$. For all $n\geq M_k$,
>> - $\abs{X_n - X_{M_k}} < 0.01\cdot 2^{-k}$ with probability at least $1 - 0.01\cdot 2^{-k}$.
>> - $\abs{X_{M_m} - X_{M_{m+1}}} < 0.01 \cdot 2^{-m}$ for all $m\geq k$ with probability at least $1 - 0.01\cdot \left(2^{-k} + 2^{-k-1} + \cdots\right)$.
>>
>>Thus, with probability at least
>>$1 - 0.01\cdot 2^{-k} - 0.01\cdot \left(2^{-k} + 2^{-k-1} + \cdots\right) > 1 - 2^{-k} > 1 - \delta,$
>>we have
>>$\abs{X_n - X} < 0.01\cdot \left(2^{-k} + 2^{-k} + 2^{-k-1} + \dots\right) < 2^{-k} < \eps$
>>as desired.
>[!problem] Convergence in Distribution is pretty good
>Suppose $X_n\to X$ in distribution. Then, there are random variables $\tilde{X}_n$ and $\tilde{X}$ defined on a common probability space $(\Omega, \FFF, \PP)$ such that $\tilde{X}$ has the same distribution as $X$, each $\tilde{X}_n$ has the same distribution as $X_n$, and $\tilde{X}_n\to \tilde{X}$ almost surely. (This is the Skorokhod representation theorem.)
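The standard construction on $\RR$ takes $\Omega = [0,1]$ with Lebesgue measure and sets $\tilde{X}_n = F_n^{-1}(U)$ for a *single* uniform $U$; convergence in distribution makes the quantile functions converge at a.e. $u$. A numeric sketch with a hypothetical choice of laws, $X_n = (1 + \frac1n)\cdot\mathrm{Exp}(1)\to \mathrm{Exp}(1)$ in distribution:

```python
import numpy as np

# Quantile coupling: all X~_n live on ([0,1], Lebesgue) via one uniform U.
def q(u):
    # quantile (inverse cdf) of Exp(1): F^{-1}(u) = -log(1 - u)
    return -np.log1p(-u)

def q_n(u, n):
    # quantile of the hypothetical X_n = (1 + 1/n) * Exp(1)
    return (1 + 1 / n) * q(u)

u = np.linspace(0.01, 0.99, 99)   # convergence at a.e. u is all we need
gap = lambda n: float(np.max(np.abs(q_n(u, n) - q(u))))
# gap(n) = q(0.99)/n, shrinking to 0: the coupled variables converge a.s.
```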
See [[continuous mapping theorem]] for more problems!