Some notes on foundational ideas and definitions in modern probability theory.

- Sample Spaces, Events & $\sigma$-Algebras
- Measure & Probability
- Random Variables & Measurable Functions
- Expectation & Integral
- Almost Surely, Almost Never & Almost Everywhere
- Haar & Lebesgue Measure
- Distributions & Densities
- Joint Distributions
- Independence & Conditioning
- Conditional Expectation
- Random Processes & Random Fields
- Point Processes
- Math 672/673
- Some Inner Product Calculations

### Sample Spaces, Events & $\sigma$-Algebras

The set of possible outcomes of a random experiment is called the sample space. Subsets of the sample space are called events. We will eventually axiomatize probability measures, but for the moment we view probability as a number $\mathbb P(E) \in [0,1]$ we associate to certain events which represents the likelihood that an outcome of…
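
A toy sketch of these definitions (my own example, not from the notes): for one roll of a fair die, the sample space is $\{1,\ldots,6\}$, events are subsets, and the uniform probability of an event is its size divided by six.

```python
from fractions import Fraction

# Hypothetical finite example: one roll of a fair die.
omega = frozenset({1, 2, 3, 4, 5, 6})   # sample space: possible outcomes
event_even = frozenset({2, 4, 6})       # an event is a subset of omega

def prob(event, sample_space=omega):
    """P(E) for the uniform measure on a finite sample space."""
    return Fraction(len(event), len(sample_space))

print(prob(event_even))  # 1/2
```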

### Measure & Probability

We begin with the sample space $\Omega$ of a random experiment, and a $\sigma$-algebra $\mathcal H$ on $\Omega$ consisting of subsets of $\Omega$ to which we want to assign probabilities. Definition: A probability measure on $(\Omega, \mathcal H)$ is a function $\mathbb P: \mathcal H \rightarrow [0,1]$ such that $\mathbb P(\emptyset) = 0$. If $E_1,…
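
On a finite sample space with the power set as $\sigma$-algebra, the axioms can be checked directly. A minimal sketch, using my own toy example of the uniform measure:

```python
from fractions import Fraction

# Uniform measure on a finite Omega; the sigma-algebra is the full power set.
omega = frozenset(range(4))

def P(E):
    return Fraction(len(E), len(omega))

assert P(frozenset()) == 0             # P(empty set) = 0
assert P(omega) == 1                   # total mass 1
A, B = frozenset({0}), frozenset({1, 2})
assert P(A | B) == P(A) + P(B)         # additivity on disjoint events
print("axioms hold on this example")
```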

### Random Variables & Measurable Functions

A measurable space consists of a set and a $\sigma$-algebra on that set. If $(E, \mathcal E)$ and $(F, \mathcal F)$ are measurable spaces, then we say $f: E \rightarrow F$ is measurable if for each $B \in \mathcal F$, we have $f^{-1}(B) \in \mathcal E$. If $\Omega$ is the sample space of a random…
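
The preimage condition is easy to compute on a finite set. A sketch (my own notation and example): measurability of $f$ means $f^{-1}(B)$ lands in the domain's $\sigma$-algebra for every measurable $B$; on a finite set with the power set, every function is measurable, but the preimage itself is still instructive.

```python
# Preimage f^{-1}(B) = {x in E : f(x) in B}, computed by brute force.
def preimage(f, B, domain):
    return frozenset(x for x in domain if f(x) in B)

E = frozenset(range(6))
X = lambda x: x % 2                  # a "random variable": parity of the outcome
B = frozenset({0})                   # a measurable set in the target space
print(preimage(X, B, E))             # the even outcomes 0, 2, 4
```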

### Expectation & Integral

Given a probability space $(\Omega, \mathcal H, \mathbb P)$ and positive simple random variable $X \in \mathcal H_+$, $$X = \sum_n a_n \boldsymbol 1_{A_n},$$ the expectation of $X$ is given by $$\mathbb E[X] = \sum_n a_n \mathbb P(A_n).$$ By the Monotone Class Theorem, every positive random variable is the increasing limit of positive simple random…
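
The displayed formula for simple random variables can be transcribed directly. A sketch on a toy example of mine (uniform measure on $\{1,\ldots,6\}$, with $X = 1$ on $\{1,2,3\}$ and $X = 5$ on $\{4,5,6\}$):

```python
from fractions import Fraction

# E[X] = sum_n a_n P(A_n) for a positive simple random variable.
omega = frozenset(range(1, 7))
P = lambda A: Fraction(len(A), len(omega))

terms = [(1, frozenset({1, 2, 3})),   # X = 1 on A_1
         (5, frozenset({4, 5, 6}))]   # X = 5 on A_2

EX = sum(a * P(A) for a, A in terms)
print(EX)  # 1*(1/2) + 5*(1/2) = 3
```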

### Almost Surely, Almost Never & Almost Everywhere

Let $(\Omega, \mathcal H, \mathbb P)$ be a probability space and let $(E, \mathcal E, \mu)$ be a measure space. Sets $A \in \mathcal H$ and $B \in \mathcal E$ are called null sets if $\mathbb P(A) = 0$ or $\mu(B) = 0$. Null sets are not detectable by measures, but introduce complications in their…
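
A small sketch of "null sets are not detectable" on a finite space of my own devising: if one outcome carries mass zero, two random variables that differ only there agree almost surely and have the same expectation.

```python
from fractions import Fraction

# The outcome 0 gets mass 0, so {0} is a null set.
omega = [0, 1, 2]
P = {0: Fraction(0), 1: Fraction(1, 2), 2: Fraction(1, 2)}

X = {0: 99, 1: 1, 2: 2}
Y = {0: -7, 1: 1, 2: 2}   # differs from X only on the null set {0}

E = lambda Z: sum(Z[w] * P[w] for w in omega)
print(E(X) == E(Y))  # True: the measure cannot see the difference
```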

### Haar & Lebesgue Measure

Our expectations and integrals are examples of Lebesgue integrals. How do these relate to the Riemann integrals we learned about in calculus? The group of real numbers under addition $(\mathbb R, +)$ is an example of a locally compact group. Any locally compact group $(G,+)$ can be topologized and the $\sigma$-algebra generated by the $\pi$-system…
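
The defining feature of Haar measure is translation invariance. A discrete toy analogue of my own: on the finite group $\mathbb Z/6$ under addition, normalized counting measure is translation invariant, so it plays the role Lebesgue measure plays on $(\mathbb R, +)$.

```python
from fractions import Fraction

# Normalized counting measure on the group Z/6 under addition mod 6.
n = 6
mu = lambda A: Fraction(len(A), n)

A = frozenset({0, 1, 4})
for g in range(n):
    shifted = frozenset((g + a) % n for a in A)
    assert mu(shifted) == mu(A)     # invariance under translation by g
print(mu(A))  # 1/2
```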

### Distributions & Densities

Suppose $(\Omega, \mathcal H, \mathbb P)$ is a probability space and let $X$ be a random variable. Then, we may define a measure on $(\mathbb R, \mathcal B(\mathbb R))$ by setting $$\mu_X(A) = \mathbb P\{X \in A\} \qquad A \in \mathcal B(\mathbb R).$$ We call $\mu_X$ the distribution of $X$. Associated to any measure $\mu$…
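
The distribution $\mu_X$ is the pushforward of $\mathbb P$ under $X$, and on a finite space it can be tabulated directly. A sketch with my own example (two fair coin flips, $X$ = number of heads):

```python
from fractions import Fraction
from collections import Counter

# mu_X(A) = P{X in A}, computed pointwise by pushing P forward through X.
omega = [(a, b) for a in (0, 1) for b in (0, 1)]   # two fair coins
p = Fraction(1, 4)                                  # each outcome equally likely
X = lambda w: w[0] + w[1]                           # number of heads

mu_X = Counter()
for w in omega:
    mu_X[X(w)] += p
print(sorted(mu_X.items()))  # mu_X: 0 -> 1/4, 1 -> 1/2, 2 -> 1/4
```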

### Joint Distributions

Let $(\Omega, \mathcal H, \mathbb P)$ be a probability space and consider a random vector $\mathbf X: \Omega \rightarrow \mathbb R^N$. By calling $\mathbf X$ a random vector we assume that each coordinate $\mathbf X_n : \Omega \rightarrow \mathbb R$ is a random variable (measurable with respect to the Borel $\sigma$-algebra on $\mathbb R$). Equivalently $\mathbf X$ is…
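
On a finite space the joint distribution of a random vector can be tabulated, and the marginals recovered by summing out the other coordinates. A sketch using a toy example of mine (two fair dice, $\mathbf X$ = the pair of parities):

```python
from fractions import Fraction
from collections import Counter

# Joint distribution of X = (X_1, X_2) and the marginal of X_1.
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]
p = Fraction(1, 36)
X = lambda w: (w[0] % 2, w[1] % 2)   # X_1 = parity of die 1, X_2 = parity of die 2

joint = Counter()
for w in omega:
    joint[X(w)] += p

marginal_1 = Counter()               # sum out the second coordinate
for (x1, x2), q in joint.items():
    marginal_1[x1] += q
print(marginal_1[0])  # 1/2
```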

### Independence & Conditioning

Here we introduce a purely probabilistic concept, independence. Let $(\Omega, \mathcal H, \mathbb P)$ be a probability space. Definition: Given $A, B \in \mathcal H$ with $\mathbb P(B) \neq 0$, we define the conditional probability of $A$ given $B$ to be $$\mathbb P(A | B) = \frac{\mathbb P(A \cap B)}{\mathbb P(B)}.$$ Definition: Two events $A,…
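
A sketch of conditional probability and independence on a toy example of my own (two fair dice): conditioning an event about the first die on an event about the second die leaves its probability unchanged.

```python
from fractions import Fraction

omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]
P = lambda E: Fraction(len(E), len(omega))

A = {w for w in omega if w[0] == 6}        # first die shows 6
B = {w for w in omega if w[1] % 2 == 0}    # second die is even

def cond(A, B):
    """P(A | B) = P(A intersect B) / P(B)."""
    return P(A & B) / P(B)

print(cond(A, B) == P(A))  # True: A and B are independent
```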

### Conditional Expectation

Let $(\Omega, \mathcal H, \mathbb P)$ be a probability space, and suppose $\mathcal F \subseteq \mathcal H$ is a sub-$\sigma$-algebra of $\mathcal H$. If $X$ is a random variable measurable with respect to $\mathcal F$, then it is also measurable with respect to $\mathcal H$, but the converse is not true. The central question that…
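
When $\mathcal F$ is generated by a finite partition, the conditional expectation has a concrete form: on each atom of the partition it equals the average of $X$ over that atom. A sketch with my own example (one fair die, partitioned into low and high outcomes):

```python
from fractions import Fraction

omega = [1, 2, 3, 4, 5, 6]                 # fair die, P uniform
partition = [{1, 2, 3}, {4, 5, 6}]         # atoms generating the sub-sigma-algebra F
X = lambda w: w

def cond_exp(X, partition):
    """E[X | F]: average X over each atom; the result is constant on atoms."""
    out = {}
    for atom in partition:
        avg = Fraction(sum(X(w) for w in atom), len(atom))
        for w in atom:
            out[w] = avg                   # constant on each atom: F-measurable
    return out

Y = cond_exp(X, partition)
print(Y[1], Y[5])  # 2 on the low atom, 5 on the high atom
```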

### Random Processes & Random Fields

Let $\mathbb T$ be an index set, usually a subset of $\mathbb R^N$ for some $N$. Then a collection of random variables $\{X_t\}_{t \in \mathbb T}$ on the same probability space $(\Omega, \mathcal H, \mathbb P)$ is a random process. From another perspective, for each $\omega \in \Omega$ the map $t \mapsto X_t(\omega)$ is a…
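
The two perspectives can be seen in a simulation. A sketch using a standard example (not specifically from the notes), the simple random walk with $\mathbb T = \{0, 1, 2, \ldots\}$: fixing $\omega$, here via the random seed, produces one sample path $t \mapsto X_t(\omega)$.

```python
import random

def sample_path(steps, seed):
    """One realization t -> X_t(omega) of a simple random walk."""
    rng = random.Random(seed)          # fixing the seed plays the role of fixing omega
    x, path = 0, [0]
    for _ in range(steps):
        x += rng.choice([-1, 1])       # i.i.d. +/-1 increments
        path.append(x)
    return path

path = sample_path(10, seed=0)
print(len(path))  # 11 positions: X_0, ..., X_10
```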

### Point Processes

Let $(E, \mathcal E)$ be a measurable space, and let $(E^N, \mathcal E^{\otimes N})$ be the product space equipped with the product $\sigma$-algebra. We will view $\mathbf x = (x_1, \ldots, x_N)$ as the position of $N$ points, or particles, in $E$. If the position of the particles is random, then we think of $X_1,…
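
A sketch with my own toy example: $N$ particles placed uniformly at random in $E = [0, 1)$, with the associated counting function $A \mapsto \#\{n : X_n \in A\}$ that point process theory studies.

```python
import random

rng = random.Random(1)
N = 100
points = [rng.random() for _ in range(N)]    # N particles, uniform in [0, 1)

def count(lo, hi):
    """Number of particles landing in the interval [lo, hi)."""
    return sum(lo <= x < hi for x in points)

print(count(0.0, 1.0))  # all N particles
```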

### Math 672/673

Theory of Probability Winter/Spring 2023. MWF 10-11am. Catalog description: Measure and integration, probability spaces, laws of large numbers, central-limit theory, conditioning, martingales, random walks. We will cover most of Probability and Stochastics by Erhan Çinlar. I anticipate we will cover the first four chapters in Winter quarter, and the remaining chapters (perhaps focusing on particular…

### Some Inner Product Calculations

In Nathan Hunter’s thesis, the inner product of two monomials $x^M$ and $x^L$, with $M \leq L$, is given as $$\begin{eqnarray}\langle x^M , x^L \rangle &=& 2 \pi \sum_{n=-NM}^M {M \choose \frac{NM + n}{N+1}} {L \choose \frac{NL + n}{N+1}} \\ && \quad N^{(2n - M - L)/(N+1)} \left( \frac{N M + n}{N+1} + 1 \right)^{-1} \cdots \end{eqnarray}$$