Let $X$ be a [[random variable]]. Its moment generating function (mgf) is denoted and defined as

$M_X(t) = E[e^{tX}] = \int_{-\infty}^\infty e^{tx}f_X(x) \ dx$

For discrete random variables, replace the integral with a sum over all $x$ in the support:

$M_X(t) = E[e^{tX}] = \sum_{x} e^{tx} \ P(X=x)$

Moment generating functions can be used to produce [[moments]] for $X$: the $k$th moment is obtained by differentiating $k$ times and evaluating at $t=0$, so $E[X^k] = M_X^{(k)}(0)$.

When you have *iid* samples, the mgf of their sum is the mgf of a single one raised to the power $n$. Suppose that $X_1, X_2, \dots, X_n$ is a random sample from some distribution with mgf $M_X(t)$, and let $Y = \sum_{i=1}^n X_i$. Then

$M_Y(t) = E[e^{tY}] = E \Big [ e^{t \sum X_i} \Big] = E \Big [ \prod_{i=1}^n e^{tX_i} \Big] \overset {iid}{=} \Big ( E \Big [ e^{tX_1} \Big ] \Big)^n = \Big (M_{X_1}(t) \Big)^n$

Moment generating functions uniquely determine the distribution. In other words, if you can show that two random variables have the same mgf, you have shown that they have the same distribution.

We can use these properties of moment generating functions to determine the distributions of [[sums of random variables]]. For example, given iid random variables $X_1, X_2, \dots, X_n$ from the [[Bernoulli distribution]], $X_i \sim Bern(p)$, the distribution of the sum $Y = \sum_{i=1}^n X_i$ can be found by raising the mgf of the Bernoulli distribution to the power $n$:

$M_Y(t) \overset{iid}{=} [M_X(t)]^n = [1 - p + pe^t]^n$

Looking up this result in a table of moment generating functions, we can see that this is the mgf of the [[Binomial distribution]] with parameters $n$ and $p$. We can therefore say that the sum of $n$ iid Bernoulli random variables is $\sim bin(n,p)$.

## example: derive the moment generating function for the Bernoulli distribution

Let $X$ be a random variable from the [[Bernoulli distribution]], $X \sim Bern(p)$. The moment generating function is defined as

$M_X(t) = \sum_{x=0}^1 e^{tx}P(X=x)$

Substituting the pmf of the Bernoulli distribution, we get

$M_X(t) = \sum_{x=0}^1 e^{tx}p^x(1-p)^{1-x}$

The support of the Bernoulli distribution only includes $x=0$ and $x=1$, so we can expand the sum to get

$\displaylines{ \begin{align} M_X(t) &= e^{t \cdot 0} \ p^0 (1 - p)^{1-0} + e^{t \cdot 1} \ p^1 (1 - p)^0 \\ &= e^{0} \ p^0 (1 - p)^{1} + e^{t} \ p^1 (1 - p)^0 \\ &= 1 \cdot 1 \cdot (1 - p) + e^{t} \ p \cdot 1 \\ &= 1 - p + pe^{t} \end{align}}$

## example: derive the moment generating function for the binomial distribution

Let $X$ be a random variable from the [[Binomial distribution]], $X \sim bin(n, p)$. The moment generating function is

$\displaylines{ \begin{align} M_X(t) &= \sum_{x=0}^n e^{tx}\binom{n}{x}p^x(1-p)^{n-x} \\ &= \sum_{x=0}^n \binom{n}{x} (pe^t)^x(1-p)^{n-x} \end{align}}$

Using the [[binomial theorem]], we can see that $pe^t$ corresponds to the $a$ term and $(1-p)$ corresponds to the $b$ term. Thus, we can simplify to

$M_X(t) = (pe^t + 1 - p)^n$

> [!NOTE] Binomial Theorem
> ![[binomial theorem]]
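
## example: recover the mean of the binomial distribution from its mgf

As a brief illustration of the claim above that mgfs produce moments, differentiating the binomial mgf just derived and evaluating at $t = 0$ recovers the familiar mean $np$:

$\displaylines{ \begin{align} M_X'(t) &= \frac{d}{dt}(pe^t + 1 - p)^n = n(pe^t + 1 - p)^{n-1} \, pe^t \\ M_X'(0) &= n(p + 1 - p)^{n-1} \, p = np = E[X] \end{align}}$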
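
The identity $M_Y(t) = [M_X(t)]^n$ can also be sanity-checked numerically. Below is a minimal sketch using numpy; the values $n = 10$, $p = 0.3$, and $t = 0.5$ are arbitrary example choices, not part of the derivations above. It compares $(1 - p + pe^t)^n$ against a Monte Carlo estimate of $E[e^{tY}]$ with $Y \sim bin(n, p)$.

```python
import numpy as np

# arbitrary example parameters (assumptions for illustration only)
n, p, t = 10, 0.3, 0.5

# mgf of a single Bernoulli(p): 1 - p + p*e^t
mgf_bernoulli = 1 - p + p * np.exp(t)

# mgf of the sum of n iid Bernoullis, via the product property: (M_X(t))^n
mgf_sum = mgf_bernoulli ** n

# Monte Carlo estimate of E[e^{tY}] where Y ~ bin(n, p)
rng = np.random.default_rng(seed=42)
y = rng.binomial(n, p, size=200_000)
mc_estimate = np.exp(t * y).mean()

print(f"(1 - p + p e^t)^n     = {mgf_sum:.4f}")
print(f"Monte Carlo E[e^(tY)] = {mc_estimate:.4f}")  # should agree closely
```

The two printed values should match to within Monte Carlo error, consistent with the sum of iid Bernoullis being $bin(n, p)$.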