Suppose $(\Omega, \mathcal H, \mathbb P)$ is a probability space and let $X$ be a random variable. Then, we may define a measure on $(\mathbb R, \mathcal B(\mathbb R))$ by setting $$\mu_X(A) = \mathbb P\{X \in A\} \qquad A \in \mathcal B(\mathbb R).$$ We call $\mu_X$ the *distribution* of $X$.
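For a finite sample space the pushforward definition of $\mu_X$ can be computed directly. The following sketch (a toy example, not from the text: a fair six-sided die with $X(\omega) = \omega^2$) evaluates $\mu_X(A) = \mathbb P\{X \in A\}$ by summing over outcomes:

```python
from fractions import Fraction

# A finite sample space with uniform P: a fair six-sided die.
# (Hypothetical example; the names below are illustrative only.)
omega = [1, 2, 3, 4, 5, 6]
P = {w: Fraction(1, 6) for w in omega}

# X maps each outcome to a real number; here X(w) = w^2.
def X(w):
    return w * w

# The distribution mu_X assigns to a set A the probability P{X in A}.
def mu_X(A):
    return sum(P[w] for w in omega if X(w) in A)

print(mu_X({1, 4, 9}))  # mass of the outcomes 1, 2, 3 -> 1/2
```

Note that $\mu_X$ lives on the real line, not on $\Omega$: it only sees the values $X$ takes and how much probability each value carries.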

Associated to any measure $\mu$ on $(\mathbb R, \mathcal B(\mathbb R))$, we define the *cumulative distribution function* of $\mu$ by $F(x) = \mu\big( (-\infty, x] \big)$. When $\mu = \mu_X$ is the distribution of a random variable $X$, the cumulative distribution function is given by $F_X(x) = \mathbb P\{X \leq x\}$.
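The identity $F_X(x) = \mathbb P\{X \leq x\}$ can be checked empirically by sampling. A minimal Monte Carlo sketch, assuming $X \sim \mathrm{Exp}(1)$ (a convenient example, not from the text), whose exact CDF is $F(x) = 1 - e^{-x}$:

```python
import math
import random

# Estimate F_X(x) = P{X <= x} by sampling X ~ Exp(1) and counting.
# (Exp(1) is an assumed example distribution with known CDF.)
random.seed(0)
n = 200_000
samples = [random.expovariate(1.0) for _ in range(n)]

def F_empirical(x):
    # Fraction of samples landing in (-inf, x].
    return sum(1 for s in samples if s <= x) / n

for x in (0.5, 1.0, 2.0):
    exact = 1 - math.exp(-x)
    print(f"F({x}) ~ {F_empirical(x):.4f}  (exact {exact:.4f})")
```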

**Properties of cumulative distribution functions:**

- A cumulative distribution function $F : \mathbb R \rightarrow [0, 1]$ is non-decreasing and right-continuous;
- ${\displaystyle \lim_{x \rightarrow -\infty} F(x) = 0}$;
- If $F_X$ is the cumulative distribution function of random variable $X$, then ${\displaystyle \lim_{x \rightarrow \infty} F_X(x) = 1}$.
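These properties can be verified numerically for a concrete CDF. A sketch using the logistic function $F(x) = 1/(1 + e^{-x})$ (an assumed example; any CDF would do):

```python
import math

# The logistic CDF, chosen as a concrete example of a cumulative
# distribution function.
def F(x):
    return 1.0 / (1.0 + math.exp(-x))

# Non-decreasing on a grid of points:
xs = [x / 10 for x in range(-100, 101)]
assert all(F(a) <= F(b) for a, b in zip(xs, xs[1:]))

# Behavior far to the left and right, approximating the two limits:
print(F(-50.0))  # very close to 0
print(F(50.0))   # very close to 1
```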

If $F$ is differentiable, then we write $f(x) = F'(x)$; in this situation $f$ is the Radon-Nikodym derivative of $\mu$ with respect to Lebesgue measure, and for any measurable function $g$ (whenever the integral is defined), $\int g \, d\mu = \int_{\mathbb R} g(x) f(x) \, dx$.
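The identity $\int g \, d\mu = \int_{\mathbb R} g(x) f(x) \, dx$ can be illustrated numerically. A sketch assuming $\mu = \mathrm{Exp}(1)$ (density $f(x) = e^{-x}$ for $x \geq 0$) and $g(x) = x^2$, choices made here purely for illustration; the exact value of $\int_0^\infty x^2 e^{-x} \, dx$ is $2$:

```python
import math

# Density of the Exp(1) distribution (an assumed example measure).
def f(x):
    return math.exp(-x) if x >= 0 else 0.0

# An example integrand g.
def g(x):
    return x * x

# Crude midpoint-rule approximation of the density-weighted integral
# over [0, 40]; the tail beyond 40 is negligible for this density.
h = 1e-3
integral = sum(g(k * h + h / 2) * f(k * h + h / 2)
               for k in range(int(40 / h))) * h
print(integral)  # ~ 2.0
```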

When $F_X$ is the cumulative distribution function of $X$, we write $f_X = F_X'$ when the derivative exists, and call $f_X$ the *probability density function* of $X$.

**Theorem:** Let $g : \mathbb R \rightarrow \mathbb R$ be Borel measurable, and suppose $X$ has probability density function $f_X$. Then, whenever the integral is well-defined, $$\mathbb E[g(X)] = \int_{\mathbb R} g(x) f_X(x) \, dx.$$ In particular $$\mathbb E[X] = \int_{\mathbb R} x f_X(x) \, dx.$$
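The theorem can be seen in action by computing $\mathbb E[g(X)]$ two ways: by Monte Carlo averaging of $g(X)$ and by integrating $g(x) f_X(x)$. A sketch assuming $X \sim \mathrm{Exp}(1)$ and $g(x) = \sin x$ (both assumed for illustration); exactly, $\int_0^\infty \sin(x)\, e^{-x} \, dx = \tfrac12$:

```python
import math
import random

# Monte Carlo estimate of E[g(X)] for X ~ Exp(1), g = sin.
random.seed(1)
n = 500_000
mc = sum(math.sin(random.expovariate(1.0)) for _ in range(n)) / n

# Midpoint-rule approximation of the density-weighted integral
# over [0, 40]; the tail of e^{-x} beyond 40 is negligible.
h = 1e-3
quad = sum(math.sin(t) * math.exp(-t)
           for t in (k * h + h / 2 for k in range(int(40 / h)))) * h

print(mc, quad)  # both ~ 0.5
```

The agreement of the two numbers is exactly the content of the theorem: averaging $g$ over samples of $X$ and integrating $g$ against the density compute the same quantity.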