Random Variables

- A random variable is a function that maps an outcome in the sample space to a numerical value.
- A discrete random variable operates on a countable sample space. It may be quantified using a probability mass function $p(x) = P(X = x)$, where $0 \le p(x) \le 1$ and $\sum_x p(x) = 1$.
- A continuous random variable operates on an uncountable sample space. It can be quantified using a probability density function $p(x)$, wherein the probability is given by $P(a \le X \le b) = \int_a^b p(x)\,dx$. A corresponding function called the cumulative distribution function $F(x)$ is defined such that
  $$F(x) = P(X \le x) = \int_{-\infty}^{x} p(t)\,dt, \qquad p(x) = \frac{d}{dx} F(x)$$
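The mass/density distinction above can be checked numerically. A minimal sketch, assuming NumPy; the fair die and the standard normal are illustrative choices, not from the text:

```python
import numpy as np

# Discrete: a fair-die PMF has nonnegative masses summing to 1.
pmf = {x: 1 / 6 for x in range(1, 7)}
pmf_total = sum(pmf.values())

# Continuous: standard normal PDF. Probabilities are areas under the curve,
# approximated here with a simple Riemann sum.
p = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
xs = np.linspace(-10, 10, 400_001)
dx = xs[1] - xs[0]
pdf_total = np.sum(p(xs)) * dx                 # integrates to ~1
prob_0_1 = np.sum(p(xs[(xs >= 0) & (xs <= 1)])) * dx   # P(0 <= X <= 1) ~ 0.3413
cdf_at_1 = np.sum(p(xs[xs <= 1])) * dx         # F(1) = P(X <= 1) ~ 0.8413
print(round(pdf_total, 3), round(prob_0_1, 3), round(cdf_at_1, 3))
```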
- Change of variables extends to probability distributions. Given a continuous random variable $X$ with probability distribution $p_X$ and $Y = f(X)$, assuming $f$ is an invertible function, we have that
  $$p_Y(y) = p_X\!\left(f^{-1}(y)\right) \left| \det \frac{\partial f^{-1}(y)}{\partial y} \right|$$
  where $\frac{\partial f^{-1}(y)}{\partial y}$ denotes the Jacobian for multidimensional $X$. If $X$ were discrete, then we simply have $p_Y(y) = p_X\!\left(f^{-1}(y)\right)$.
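A hedged sketch of the change-of-variables formula, assuming NumPy. The map $Y = \exp(X)$ with $X \sim N(0,1)$ is an illustrative invertible choice; the formula yields the log-normal density, which we cross-check by Monte Carlo:

```python
import numpy as np

# X ~ N(0, 1), Y = f(X) = exp(X), so f^{-1}(y) = log y and |d f^{-1}/dy| = 1/y:
# p_Y(y) = p_X(log y) / y  (the log-normal density).
p_x = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
p_y = lambda y: p_x(np.log(y)) / y

ys = np.linspace(1e-6, 60, 1_200_001)
mass = np.sum(p_y(ys)) * (ys[1] - ys[0])       # total probability ~ 1

# Monte Carlo cross-check: empirical density of exp(X) near y = 1 matches p_y(1).
rng = np.random.default_rng(0)
samples = np.exp(rng.standard_normal(400_000))
est = np.mean((samples > 0.9) & (samples < 1.1)) / 0.2
print(round(mass, 3), round(est, 2), round(p_y(1.0), 2))
```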
Properties of Probability Distributions
- The quantile, denoted $x_\alpha$, is defined as the value such that $F(x_\alpha) = \alpha$; equivalently, $x_\alpha = F^{-1}(\alpha)$.
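Quantiles can be read off empirically by inverting the sample CDF. A minimal sketch, assuming NumPy; for $N(0,1)$ the median ($\alpha = 0.5$) is $0$ and $x_{0.975} \approx 1.96$:

```python
import numpy as np

# Empirical quantiles of a large standard normal sample: np.quantile
# inverts the empirical CDF, so F(median) ~ 0.5 and F(q975) ~ 0.975.
rng = np.random.default_rng(1)
samples = rng.standard_normal(1_000_000)
median = np.quantile(samples, 0.5)
q975 = np.quantile(samples, 0.975)
print(round(median, 2), round(q975, 2))
```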
- The mean, or expectation, or expected value, is defined as the average value of the distribution:
  $$\mathbb{E}[X] = \sum_x x\,p(x) \;\;\text{(discrete)}, \qquad \mathbb{E}[X] = \int x\,p(x)\,dx \;\;\text{(continuous)}$$
- An alternative notation is $\mu$, which is obtained as $\mu = \mathbb{E}[X]$.
- Linearity of Expectation:
  $$\mathbb{E}[aX + b] = a\,\mathbb{E}[X] + b, \qquad \mathbb{E}[X + Y] = \mathbb{E}[X] + \mathbb{E}[Y]$$
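Linearity holds even when the variables are dependent, which a quick sketch makes concrete (assuming NumPy; the exponential/normal choices are illustrative):

```python
import numpy as np

# E[aX + bY] = a E[X] + b E[Y] requires no independence: here Y is built
# from X, so the two are strongly correlated, yet the identity is exact.
rng = np.random.default_rng(2)
x = rng.exponential(2.0, 500_000)
y = x + rng.normal(0, 1, 500_000)     # deliberately correlated with x
a, b = 3.0, -1.5
lhs = np.mean(a * x + b * y)
rhs = a * np.mean(x) + b * np.mean(y)
print(round(lhs, 3), round(rhs, 3))   # identical up to float rounding
```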
- The variance is defined as the average squared spread of the distribution about its mean:
  $$\mathrm{Var}(X) = \mathbb{E}\!\left[(X - \mu)^2\right]$$
- An important formulation of the variance is as follows:
  $$\mathrm{Var}(X) = \mathbb{E}[X^2] - \mathbb{E}[X]^2$$
- The standard deviation is expressed in the same units as $X$. It is simply $\sigma = \sqrt{\mathrm{Var}(X)}$.
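Both formulations of the variance agree on any sample. A minimal sketch, assuming NumPy; Uniform(0, 6), whose variance is $(6-0)^2/12 = 3$, is an illustrative choice:

```python
import numpy as np

# Var(X) = E[(X - mu)^2] = E[X^2] - E[X]^2; sigma = sqrt(Var) is in the units of X.
rng = np.random.default_rng(3)
x = rng.uniform(0, 6, 400_000)            # true variance = 36 / 12 = 3
mu = x.mean()
var_def = np.mean((x - mu) ** 2)          # definition
var_alt = np.mean(x ** 2) - mu ** 2       # alternative formulation
sigma = np.sqrt(var_def)
print(round(var_def, 2), round(var_alt, 2), round(sigma, 2))
```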
- The coefficient of variation, denoted $c_v$, is defined as $c_v = \frac{\sigma}{\mu}$. The squared coefficient of variation is simply $c_v^2 = \frac{\sigma^2}{\mu^2}$. It serves as a scale-free measure of variability: the CV denotes how concentrated the probability distribution is around the mean (higher = more spread out).
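The scale-free nature of the CV shows up when two distributions share the same standard deviation but different means. A minimal sketch, assuming NumPy, with illustrative normal components:

```python
import numpy as np

# Same sigma = 5 in both cases, but relative to the mean the second
# distribution is far more spread out: c_v = sigma / mu.
rng = np.random.default_rng(4)
tight = rng.normal(100, 5, 200_000)   # mean 100, sd 5 -> c_v ~ 0.05
wide = rng.normal(10, 5, 200_000)     # mean 10,  sd 5 -> c_v ~ 0.5
cv = lambda x: x.std() / x.mean()
print(round(cv(tight), 2), round(cv(wide), 2))
```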
- The covariance between $X$ and $Y$ measures the degree to which they are linearly related. It is defined as
  $$\mathrm{Cov}(X, Y) = \mathbb{E}\!\left[(X - \mathbb{E}[X])(Y - \mathbb{E}[Y])\right]$$
  We may define a covariance matrix $\Sigma$ whose entries contain the pairwise covariances between variables. That is, $\Sigma_{ij} = \mathrm{Cov}(X_i, X_j)$.
- The covariance satisfies the following: $\mathrm{Cov}(X, X) = \mathrm{Var}(X)$, $\mathrm{Cov}(X, Y) = \mathrm{Cov}(Y, X)$, and $\mathrm{Cov}(aX + b, Y) = a\,\mathrm{Cov}(X, Y)$.
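A sketch of the covariance and its matrix form, assuming NumPy; the linear construction $Y = 2X + \varepsilon$ is an illustrative choice, giving $\mathrm{Cov}(X, Y) \approx 2$:

```python
import numpy as np

# Cov(X, Y) = E[(X - E[X])(Y - E[Y])]; np.cov stacks all pairwise covariances.
rng = np.random.default_rng(5)
x = rng.normal(0, 1, 300_000)
y = 2 * x + rng.normal(0, 1, 300_000)        # linearly related: Cov ~ 2
cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
Sigma = np.cov(np.stack([x, y]), bias=True)  # 2x2 covariance matrix (1/N norm)
# Properties: symmetry, and the diagonal holds the variances (Cov(X, X) = Var(X)).
print(round(cov_xy, 2), round(Sigma[0, 0], 2), round(Sigma[0, 1], 2))
```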
- The precision is the inverse of the variance. It is given by $\lambda = \frac{1}{\sigma^2}$. The precision matrix is given as the inverse of the covariance matrix: $\Lambda = \Sigma^{-1}$.
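A minimal sketch of the precision and the precision matrix, assuming NumPy; the particular $2\times 2$ covariance is an illustrative choice:

```python
import numpy as np

# The precision matrix is the matrix inverse of the covariance matrix,
# so Sigma @ Lambda is the identity. In the scalar case it is just 1/sigma^2.
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
Lambda = np.linalg.inv(Sigma)
sigma2 = 4.0
precision = 1 / sigma2
print(Lambda.round(3), precision)
```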
- The Pearson correlation between $X$ and $Y$ defines a normalized measure of the covariance. It is defined as
  $$\rho(X, Y) = \frac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y}$$
  We may define a correlation matrix $R$ where $R_{ij} = \rho(X_i, X_j)$. In fact, $|\rho(X, Y)| = 1$ indicates that $Y = aX + b$ for some constants $a \neq 0$ and $b$. Correlation is a measure of the degree to which two variables are linearly related.
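The normalization is easy to see empirically. A minimal sketch, assuming NumPy; the noisy and exactly-linear constructions are illustrative:

```python
import numpy as np

# rho in [-1, 1]; an exact affine relation Y = aX + b gives |rho| = 1,
# while added noise pulls |rho| below 1.
rng = np.random.default_rng(6)
x = rng.normal(0, 3, 200_000)
noisy = x + rng.normal(0, 3, 200_000)   # Cov = 9, sd_x = 3, sd_noisy ~ sqrt(18)
exact = -2 * x + 7                      # perfectly (negatively) linear
rho = lambda a, b: np.corrcoef(a, b)[0, 1]
print(round(rho(x, noisy), 2), round(rho(x, exact), 2))  # ~0.71, -1.0
```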
Theorems
- Law of Total Expectation, also called the Law of Iterated Expectations, states that for random variables $X$ and $Y$, we have
  $$\mathbb{E}[X] = \mathbb{E}_Y\!\left[\mathbb{E}[X \mid Y]\right]$$
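A toy check of the law, assuming NumPy; the Bernoulli mixture below is a hypothetical example, not from the text:

```python
import numpy as np

# Y ~ Bernoulli(0.3) selects a component: X | Y=1 ~ N(10, 1), X | Y=0 ~ N(2, 1).
# E[X] equals the Y-weighted average of the conditional means E[X | Y].
rng = np.random.default_rng(7)
n = 500_000
y = rng.random(n) < 0.3
x = np.where(y, rng.normal(10, 1, n), rng.normal(2, 1, n))

lhs = x.mean()                                             # E[X] directly
rhs = y.mean() * x[y].mean() + (~y).mean() * x[~y].mean()  # E_Y[ E[X | Y] ]
print(round(lhs, 2), round(rhs, 2))   # both near 0.3*10 + 0.7*2 = 4.4
```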
- The following property applies to expectations. Let $p(x_1, \dots, x_n)$ be a joint distribution. We can write the expectation of $f(x_1, \dots, x_n)$ sampled from this distribution as
  $$\mathbb{E}[f] = \sum_{x_1} p(x_1) \sum_{x_2} p(x_2 \mid x_1) \cdots \sum_{x_n} p(x_n \mid x_1, \dots, x_{n-1})\, f(x_1, \dots, x_n)$$
  Observe we expanded the joint distribution according to the chain rule of probability. Then, we manipulated the internal sums, pushing each sum as far inward as its dependencies allow. Remarkably, for a Markov process each conditional depends only on the previous variable, so the expectation collapses simply as
  $$\mathbb{E}[f] = \sum_{x_1} p(x_1) \sum_{x_2} p(x_2 \mid x_1) \cdots \sum_{x_n} p(x_n \mid x_{n-1})\, f(x_1, \dots, x_n)$$
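The collapse for a Markov process can be verified on a tiny chain. A minimal sketch, assuming NumPy; the transition matrix, start distribution, and terminal function $f$ are all hypothetical:

```python
import numpy as np
from itertools import product

# For a 2-state Markov chain, E[f(x_T)] computed by the full chain-rule sum
# over every trajectory equals the collapsed form, where the marginal over
# x_t is propagated one step at a time with matrix-vector products.
P = np.array([[0.9, 0.1],     # hypothetical transition matrix p(x_t | x_{t-1})
              [0.4, 0.6]])
p0 = np.array([1.0, 0.0])     # start in state 0
f = np.array([0.0, 5.0])      # f scores the terminal state
T = 6

# Brute force: sum p(x_0..x_T) f(x_T) over all 2^(T+1) trajectories.
brute = 0.0
for traj in product(range(2), repeat=T + 1):
    prob = p0[traj[0]]
    for a, b in zip(traj, traj[1:]):
        prob *= P[a, b]
    brute += prob * f[traj[-1]]

# Collapsed: push each sum inward, i.e. T matrix-vector products.
p = p0.copy()
for _ in range(T):
    p = p @ P
collapsed = p @ f
print(round(brute, 6), round(collapsed, 6))
```

The brute-force sum costs $O(2^{T})$ while the collapsed form costs $O(T)$ matrix-vector products, which is exactly the saving the Markov property buys.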
- Markov's Inequality states the following: if $X$ is a nonnegative random variable and $a > 0$, then
  $$P(X \ge a) \le \frac{\mathbb{E}[X]}{a}$$
  When $X = (Y - \mu)^2$, we take $a = (k\sigma)^2$ for $k > 0$ to get Chebyshev's inequality:
  $$P\left(|Y - \mu| \ge k\sigma\right) \le \frac{1}{k^2}$$
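Both bounds can be checked against samples. A minimal sketch, assuming NumPy; the exponential and normal examples are illustrative choices:

```python
import numpy as np

# Markov: P(X >= a) <= E[X] / a for nonnegative X.
rng = np.random.default_rng(8)
x = rng.exponential(1.0, 1_000_000)        # nonnegative, E[X] = 1
a = 3.0
markov_lhs = np.mean(x >= a)               # true tail ~ e^{-3} ~ 0.050
markov_rhs = x.mean() / a                  # bound ~ 1/3

# Chebyshev via Markov applied to (Y - mu)^2 with a = (k * sigma)^2.
y = rng.normal(5, 2, 1_000_000)            # mu = 5, sigma = 2
k = 2.0
cheb_lhs = np.mean(np.abs(y - 5) >= k * 2) # true tail ~ 0.046
cheb_rhs = 1 / k ** 2                      # bound = 0.25
print(round(markov_lhs, 3), round(markov_rhs, 3), round(cheb_lhs, 3), cheb_rhs)
```

Note how loose both bounds are here; they hold for any distribution with the stated moments, which is the price of generality.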