Definition 3.10
The cumulative distribution function (CDF) of a random variable X is the function given by $$F_X(x) = P(X \leq x), \quad \text{for all } x \in \mathbb{R}.$$
Example 3.9: Say we toss a coin twice. Let X be the number of heads we observe. Find the CDF of X.
Solution: Notice that X has range $R_X = \{0, 1, 2\}$ and a probability mass function given by: $$P_X(0) = \frac{1}{4}, \quad P_X(1) = \frac{1}{2}, \quad P_X(2) = \frac{1}{4}.$$ To find the CDF, we first note that if $x < 0$, then $F_X(x) = P(X \leq x) = 0$. We can also see that for $x \geq 2$, $F_X(x) = P(X \leq x) = 1$. Now we need to consider the values of $x$ that lie between 0 and 2. If $0 \leq x < 1$, then $X \leq x$ implies that X can only take the value 0. Therefore: $F_X(x) = P_X(0) = \frac{1}{4}$ for $0 \leq x < 1$.
Similarly, for $1 \leq x < 2$, $F_X(x) = P_X(0) + P_X(1) = \frac{1}{4} + \frac{1}{2} = \frac{3}{4}$. Putting all of these together we get $$F_X(x) = \begin{cases} 0 & x < 0 \\ \frac{1}{4} & 0 \leq x < 1 \\ \frac{3}{4} & 1 \leq x < 2 \\ 1 & x \geq 2. \end{cases}$$
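As a quick sanity check, the CDF in this example can be computed directly from the PMF. A minimal Python sketch (the `cdf` helper name is ours, and the PMF values are those derived above):

```python
# PMF of X = number of heads in two fair coin tosses (values from the example above)
pmf = {0: 0.25, 1: 0.5, 2: 0.25}

def cdf(x):
    """F_X(x) = P(X <= x): sum the PMF over all values of X not exceeding x."""
    return sum(p for k, p in pmf.items() if k <= x)

print(cdf(-1), cdf(0.5), cdf(1.5), cdf(2))  # → 0 0.25 0.75 1.0
```

Evaluating the CDF at points between the jumps (0.5, 1.5) shows the step behavior described above.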
Notice that we’ll always have $\lim_{x \to -\infty} F_X(x) = 0$ and $\lim_{x \to \infty} F_X(x) = 1$.
Theorem: Given that X is a random variable with probability mass function $P_X$ and cumulative distribution function $F_X$,
(a) For all $a \leq b$, we have $P(a < X \leq b) = F_X(b) - F_X(a)$. (b) For any $x$, we have $P(X = x) = F_X(x) - F_X(x^-)$, where $F_X(x^-) = \lim_{\epsilon \to 0^+} F_X(x - \epsilon)$.
Example 3.10: Let X be a discrete random variable with range $R_X = \{1, 2, 3, \ldots\}$. Suppose the probability mass function of X is given by $P_X(k) = \frac{1}{2^k}$ for $k = 1, 2, 3, \ldots$ (a) Find and plot the cumulative distribution function of X.
(b) Find $P(2 < X \leq 5)$.
(c) Find $P(X > 4)$.
Solution: First let’s verify that $P_X(k) = \frac{1}{2^k}$ is indeed a probability mass function: $\sum_{k=1}^{\infty} \frac{1}{2^k} = \frac{1/2}{1 - 1/2} = 1$. Now onto the problem:
(a) Since the range begins at 1, we have $F_X(x) = 0$ for $x < 1$.
Now, let $k \leq x < k+1$ for any integer $k \geq 1$ in $R_X$. Then $$F_X(x) = \sum_{j=1}^{k} \frac{1}{2^j} = 1 - \frac{1}{2^k}.$$
So we have: $$F_X(x) = \begin{cases} 0 & x < 1 \\ 1 - \frac{1}{2^k} & k \leq x < k+1, \; k = 1, 2, 3, \ldots \end{cases}$$ (b) Using the above theorem, $P(2 < X \leq 5) = F_X(5) - F_X(2) = \left(1 - \frac{1}{32}\right) - \left(1 - \frac{1}{4}\right) = \frac{7}{32}$. (c) $P(X > 4) = 1 - F_X(4) = 1 - \left(1 - \frac{1}{16}\right) = \frac{1}{16}$.
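Parts (b) and (c) can be checked numerically. A sketch in Python, assuming the PMF $P_X(k) = (1/2)^k$ used above (the `cdf` helper is our own name):

```python
from math import floor

# PMF assumed from the example: P_X(k) = (1/2)**k for k = 1, 2, 3, ...
def cdf(x):
    """F_X(x) = sum of P_X(k) for k = 1, ..., floor(x); zero below the range."""
    if x < 1:
        return 0.0
    return sum(0.5 ** k for k in range(1, floor(x) + 1))

print(cdf(5) - cdf(2))  # (b) P(2 < X <= 5) = 7/32 = 0.21875
print(1 - cdf(4))       # (c) P(X > 4) = 1/16 = 0.0625
```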
Definition 3.11: Let X be a discrete random variable with range $R_X = \{x_1, x_2, x_3, \ldots\}$. The expected value of X, denoted EX, is defined as $$EX = \sum_{x_k \in R_X} x_k \, P_X(x_k).$$ The expected value may be written using several different notations which are all equivalent: $$EX = E[X] = E(X) = \mu_X.$$
Example 3.11: Let $X \sim Bernoulli(p)$. Find EX.
Solution: $EX = 0 \cdot P_X(0) + 1 \cdot P_X(1) = 0 \cdot (1 - p) + 1 \cdot p = p$. Hence, $EX = p$.
Example 3.12: Let $X \sim Geometric(p)$. Find EX.
Solution: Writing $q = 1 - p$, $$EX = \sum_{k=1}^{\infty} k \, p \, q^{k-1} = p \sum_{k=1}^{\infty} k q^{k-1} = \frac{p}{(1-q)^2} = \frac{p}{p^2} = \frac{1}{p}.$$ Hence, $EX = \frac{1}{p}$.
We know that $\sum_{k=0}^{\infty} x^k = \frac{1}{1-x}$ if $|x| < 1$. Differentiating both sides we get $$\sum_{k=1}^{\infty} k x^{k-1} = \frac{1}{(1-x)^2},$$ which justifies the sum evaluated in the geometric case.
Example 3.13: Let $X \sim Poisson(\lambda)$. Find EX.
Solution: $$EX = \sum_{k=0}^{\infty} k \, \frac{e^{-\lambda} \lambda^k}{k!} = \lambda e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^{k-1}}{(k-1)!} = \lambda e^{-\lambda} e^{\lambda} = \lambda.$$ Hence, $EX = \lambda$.
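The expected values computed in the last two examples can be verified by truncating the defining sums numerically. A Python sketch (the helper names are ours; the Poisson PMF is built iteratively to avoid overflowing `factorial`):

```python
from math import exp

def geometric_mean(p, terms=10_000):
    """Truncated sum of k * p * (1-p)**(k-1): should approach 1/p."""
    return sum(k * p * (1 - p) ** (k - 1) for k in range(1, terms))

def poisson_mean(lam, terms=200):
    """Truncated sum of k * P_X(k) for X ~ Poisson(lam); should approach lam."""
    pk = exp(-lam)  # P_X(0)
    total = 0.0
    for k in range(1, terms):
        pk *= lam / k          # P_X(k) = P_X(k-1) * lam / k
        total += k * pk
    return total

print(geometric_mean(0.25))  # ≈ 1/0.25 = 4
print(poisson_mean(3.5))     # ≈ 3.5
```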
Theorem 3.2: The expected value has the following properties:
(a) For any random variable X and $a, b \in \mathbb{R}$, $E[aX + b] = aEX + b$. (b) Given any number of random variables $X_1, X_2, \ldots, X_n$, which may or may not be independent, $$E[X_1 + X_2 + \cdots + X_n] = EX_1 + EX_2 + \cdots + EX_n.$$
Example 3.14: Let $X \sim Binomial(n, p)$. Find EX.
Solution: We know that for a binomial distribution, $X = X_1 + X_2 + \cdots + X_n$, where the $X_i \sim Bernoulli(p)$ are independent random variables. So we can write: $$EX = EX_1 + EX_2 + \cdots + EX_n = np,$$ since $EX_i = p$ for each $i$. Hence, $EX = np$.
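The same answer falls out of the definition directly: summing $k \, P_X(k)$ over the binomial PMF gives $np$. A quick check in Python (the function name is our own):

```python
from math import comb

def binomial_mean(n, p):
    """E[X] summed directly from the Binomial(n, p) PMF."""
    return sum(k * comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1))

print(binomial_mean(10, 0.3))  # ≈ 3.0 (= n * p)
```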
Example 3.15: Let $X \sim Pascal(m, p)$. Find EX.
Solution: In this case, we also have a sum: $X = X_1 + X_2 + \cdots + X_m$, where $X_i \sim Geometric(p)$. So we have: $$EX = EX_1 + EX_2 + \cdots + EX_m = \frac{m}{p}.$$ Hence $EX = \frac{m}{p}$.
Let X be a random variable and define $Y = g(X)$ to be a function of that random variable. Now, Y itself is also a random variable. So it makes sense to discuss things like the probability mass function, cumulative distribution function, and expected value of this function.
To start off, the range of $Y = g(X)$ will be $R_Y = \{g(x) : x \in R_X\}$, and we can write $$P_Y(y) = P(g(X) = y) = \sum_{x : g(x) = y} P_X(x).$$
Example 3.16: Let X be a discrete random variable with $P_X(k) = \frac{1}{5}$ for $k = -1, 0, 1, 2, 3$. Let $Y = 2|X|$. Find $R_Y$ and the probability mass function of Y.
Solution: The range is $R_Y = \{2|x| : x \in R_X\} = \{0, 2, 4, 6\}$. Now, to find the PMF: $$P_Y(0) = P(X = 0) = \tfrac{1}{5}, \quad P_Y(2) = P(X = -1) + P(X = 1) = \tfrac{2}{5}, \quad P_Y(4) = P(X = 2) = \tfrac{1}{5}, \quad P_Y(6) = P(X = 3) = \tfrac{1}{5}.$$
So we have $$P_Y(y) = \begin{cases} \frac{1}{5} & y = 0, 4, 6 \\ \frac{2}{5} & y = 2 \\ 0 & \text{otherwise.} \end{cases}$$
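The "sum $P_X(x)$ over all $x$ with $g(x) = y$" recipe is mechanical, so it is easy to automate. A Python sketch (the helper name `pmf_of_function` is ours; the PMF and $g(x) = 2|x|$ are the values assumed in the example above):

```python
from collections import defaultdict

def pmf_of_function(pmf_x, g):
    """P_Y(y) = sum of P_X(x) over all x in the range of X with g(x) = y."""
    pmf_y = defaultdict(float)
    for x, p in pmf_x.items():
        pmf_y[g(x)] += p
    return dict(pmf_y)

# X uniform on {-1, 0, 1, 2, 3}, Y = 2|X| (values assumed from the example above)
pmf_x = {k: 1 / 5 for k in (-1, 0, 1, 2, 3)}
pmf_y = pmf_of_function(pmf_x, lambda x: 2 * abs(x))
print(pmf_y)  # y = 2 gets probability 2/5; y = 0, 4, 6 each get 1/5
```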
The Expected Value of a Function of a Random Variable
The law of the unconscious statistician (or LOTUS) is a theorem which states that the expected value of a function of a random variable, $E[g(X)]$, can be expressed using the probability mass function of X (without needing to find the PMF of $Y = g(X)$!). For a discrete random variable, this is expressed as $$E[g(X)] = \sum_{x_k \in R_X} g(x_k) P_X(x_k).$$
Example 3.17: Let X be a discrete random variable with $P_X(k) = \frac{1}{5}$ where $R_X = \{0, \frac{\pi}{4}, \frac{\pi}{2}, \frac{3\pi}{4}, \pi\}$. Find $E[\sin(X)]$.
Solution: Using LOTUS, we have $$E[\sin(X)] = \frac{1}{5}\left(\sin 0 + \sin\frac{\pi}{4} + \sin\frac{\pi}{2} + \sin\frac{3\pi}{4} + \sin\pi\right) = \frac{1}{5}\left(0 + \frac{\sqrt{2}}{2} + 1 + \frac{\sqrt{2}}{2} + 0\right) = \frac{\sqrt{2} + 1}{5}.$$
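The LOTUS sum is easy to evaluate numerically and compare with the closed form. A Python sketch, assuming the uniform PMF on $\{0, \pi/4, \pi/2, 3\pi/4, \pi\}$ used above:

```python
from math import pi, sin, sqrt

# LOTUS: E[g(X)] = sum of g(x_k) * P_X(x_k); range and PMF assumed from the example
support = [0, pi / 4, pi / 2, 3 * pi / 4, pi]
e_sin = sum(sin(x) / 5 for x in support)  # each outcome has probability 1/5

print(e_sin, (1 + sqrt(2)) / 5)  # the two values agree
```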
Example 3.18: Prove $E[aX + b] = aEX + b$.
Solution: Here, $g(X) = aX + b$, so we can use LOTUS to get: $$E[aX + b] = \sum_{x_k \in R_X} (a x_k + b) P_X(x_k) = a \sum_{x_k \in R_X} x_k P_X(x_k) + b \sum_{x_k \in R_X} P_X(x_k) = aEX + b,$$ since $\sum_{x_k \in R_X} P_X(x_k) = 1$.
The variance of a random variable X with mean $\mu_X = EX$ is defined as $$\mathrm{Var}(X) = E[(X - \mu_X)^2].$$ The standard deviation, in turn, is defined as $$\mathrm{SD}(X) = \sigma_X = \sqrt{\mathrm{Var}(X)}.$$
Theorem: Given a random variable X, $$\mathrm{Var}(X) = E[X^2] - (EX)^2.$$
Proof: We know, by the previous definition, that $$\mathrm{Var}(X) = E[(X - \mu_X)^2] = E[X^2 - 2\mu_X X + \mu_X^2].$$ Expanding this expression using Theorem 3.2, we get $$\mathrm{Var}(X) = E[X^2] - 2\mu_X EX + \mu_X^2 = E[X^2] - 2\mu_X^2 + \mu_X^2 = E[X^2] - (EX)^2.$$
Example 3.19: Say we roll a fair, 6-sided die, and let X be the resulting number. Find EX, $\mathrm{Var}(X)$, and $\mathrm{SD}(X)$.
Solution: First of all, we know that $R_X = \{1, 2, 3, 4, 5, 6\}$ with $P_X(k) = \frac{1}{6}$ for each $k$. Therefore, we have $$EX = \frac{1 + 2 + 3 + 4 + 5 + 6}{6} = \frac{7}{2}.$$ Now we can calculate variance using $\mathrm{Var}(X) = E[X^2] - (EX)^2$. First, we need to find $E[X^2]$: $$E[X^2] = \frac{1^2 + 2^2 + 3^2 + 4^2 + 5^2 + 6^2}{6} = \frac{91}{6}.$$ Now we can calculate $$\mathrm{Var}(X) = \frac{91}{6} - \left(\frac{7}{2}\right)^2 = \frac{91}{6} - \frac{49}{4} = \frac{35}{12}.$$ Finally, we have $\mathrm{SD}(X) = \sqrt{\frac{35}{12}} \approx 1.71$.
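The die calculation is short enough to reproduce end to end in a few lines of Python:

```python
# Fair six-sided die: compute EX, Var(X) = E[X^2] - (EX)^2, and SD(X)
faces = range(1, 7)
ex = sum(k / 6 for k in faces)        # 7/2
ex2 = sum(k * k / 6 for k in faces)   # 91/6
var = ex2 - ex ** 2                   # 35/12
print(ex, var, var ** 0.5)
```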
Theorem 3.3: Given a random variable X and $a, b \in \mathbb{R}$, $$\mathrm{Var}(aX + b) = a^2 \, \mathrm{Var}(X).$$
Proof: Let $Y = aX + b$. From the previous section, we know that $EY = aEX + b$. Therefore, using our original definition of variance, $$\mathrm{Var}(Y) = E[(Y - EY)^2] = E[(aX + b - aEX - b)^2] = E[a^2(X - EX)^2] = a^2 \, \mathrm{Var}(X).$$
Theorem 3.4: If $X_1, X_2, \ldots, X_n$ are independent random variables and $X = X_1 + X_2 + \cdots + X_n$, then $$\mathrm{Var}(X) = \mathrm{Var}(X_1) + \mathrm{Var}(X_2) + \cdots + \mathrm{Var}(X_n).$$
Proof: It suffices to prove the claim for two independent random variables $X_1$ and $X_2$; the general case then follows by induction. Using $\mathrm{Var}(X) = E[X^2] - (EX)^2$, $$\mathrm{Var}(X_1 + X_2) = E[(X_1 + X_2)^2] - (EX_1 + EX_2)^2 = E[X_1^2] + 2E[X_1 X_2] + E[X_2^2] - (EX_1)^2 - 2\,EX_1\,EX_2 - (EX_2)^2.$$ Since $X_1$ and $X_2$ are independent, $E[X_1 X_2] = EX_1 \, EX_2$, so the cross terms cancel and we are left with $\mathrm{Var}(X_1) + \mathrm{Var}(X_2)$.
Example 3.20: Let $X \sim Binomial(n, p)$. Find $\mathrm{Var}(X)$.
Solution: Once again, we know that $X = X_1 + X_2 + \cdots + X_n$, where the $X_i \sim Bernoulli(p)$ are independent.
For each $X_i$, $$\mathrm{Var}(X_i) = E[X_i^2] - (EX_i)^2 = p - p^2 = p(1 - p).$$ And so we have that $$\mathrm{Var}(X) = \mathrm{Var}(X_1) + \mathrm{Var}(X_2) + \cdots + \mathrm{Var}(X_n) = np(1 - p).$$ Hence, $\mathrm{Var}(X) = np(1 - p)$.
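As with the mean, the binomial variance can also be computed directly from the PMF, bypassing the indicator decomposition. A Python sketch (the function name is our own):

```python
from math import comb

def binomial_var(n, p):
    """Var(X) = E[X^2] - (EX)^2, summed directly from the Binomial(n, p) PMF."""
    pmf = [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]
    ex = sum(k * pk for k, pk in enumerate(pmf))
    ex2 = sum(k * k * pk for k, pk in enumerate(pmf))
    return ex2 - ex ** 2

print(binomial_var(10, 0.3))  # ≈ 2.1 (= n * p * (1 - p))
```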
Problem 1: Let X be a discrete random variable with the following probability mass function:
Find the cumulative distribution function of X.
Solution: The cumulative distribution function is defined by $F_X(x) = P(X \leq x)$. So we have:
Problem 2: Let X be a discrete random variable with the following probability mass function:
(a) Find EX.
(b) Find Var(X).
(c) Let and find EY.
Solution:
(a)
(b) First we need to find $E[X^2]$. Now we have $\mathrm{Var}(X) = E[X^2] - (EX)^2$. (c) Using LOTUS, we know that $EY = E[g(X)] = \sum_{x_k \in R_X} g(x_k) P_X(x_k)$.
Problem 3: Let X be a discrete random variable with the following probability mass function:
Let . Find the probability mass function of Y.
Solution: First, note that Thus, So our probability mass function is
Problem 4: Let . Find .
Solution: The probability mass function of X is given by
where . So we have
Problem 5: Let . Find EX.
Solution: The probability mass function of X is given by
where .
Define the indicator random variables $X_1, X_2, \ldots$, where $X_i = 1$ if the $i$th draw is of the type being counted and $X_i = 0$ otherwise.
So we can write $X = X_1 + X_2 + \cdots$, which implies, by linearity of expectation, that $EX = EX_1 + EX_2 + \cdots$. Now, we have that $EX_i = P(X_i = 1)$ for each $i$, so we can deduce EX by summing these probabilities. Finally we have the desired expected value.
Problem 6: Show that if , then .
Solution:
Problem 7: Let X be a discrete random variable with $R_X \subseteq \{0, 1, 2, \ldots\}$. Prove that $$EX = \sum_{k=0}^{\infty} P(X > k).$$
Solution: First, note that $P(X > 0) = P_X(1) + P_X(2) + P_X(3) + \cdots$, $\;P(X > 1) = P_X(2) + P_X(3) + \cdots$, $\;P(X > 2) = P_X(3) + P_X(4) + \cdots$, and so on.
Thus, $$\sum_{k=0}^{\infty} P(X > k) = 1 \cdot P_X(1) + 2 \cdot P_X(2) + 3 \cdot P_X(3) + \cdots = \sum_{j=1}^{\infty} j \, P_X(j) = EX,$$ since $P_X(j)$ appears in exactly $j$ of the tail probabilities above.
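The tail-sum identity is easy to spot-check numerically. A Python sketch using a small hypothetical PMF chosen only for illustration:

```python
# Tail-sum check: EX should equal the sum over k >= 0 of P(X > k),
# using a hypothetical PMF on {0, 1, 2, 3} chosen only for illustration.
pmf = {0: 0.1, 1: 0.4, 2: 0.3, 3: 0.2}

ex = sum(k * p for k, p in pmf.items())
tail = sum(
    sum(p for j, p in pmf.items() if j > k)  # P(X > k)
    for k in range(max(pmf))                 # P(X > k) = 0 once k >= max value
)
print(ex, tail)  # both ≈ 1.6
```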
Problem 8: Let $X \sim Poisson(\lambda)$. Find $\mathrm{Var}(X)$.
Solution: In Example 3.13, we showed that $EX = \lambda$. Therefore $(EX)^2 = \lambda^2$. Standard wisdom would tell us to next find $E[X^2]$, but let’s instead find $E[X(X-1)]$ for reasons that will be clear in a moment: $$E[X(X-1)] = \sum_{k=0}^{\infty} k(k-1) \frac{e^{-\lambda}\lambda^k}{k!} = \lambda^2 e^{-\lambda} \sum_{k=2}^{\infty} \frac{\lambda^{k-2}}{(k-2)!} = \lambda^2 e^{-\lambda} e^{\lambda} = \lambda^2.$$ So now we have $$E[X^2] = E[X(X-1)] + EX = \lambda^2 + \lambda.$$ Finally, we can plug everything into our formula to get $$\mathrm{Var}(X) = E[X^2] - (EX)^2 = \lambda^2 + \lambda - \lambda^2 = \lambda.$$
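The $E[X(X-1)]$ route can be verified numerically by truncating the Poisson sums. A Python sketch (the function name is ours; the PMF is built iteratively to avoid overflowing `factorial`):

```python
from math import exp

def poisson_var(lam, terms=200):
    """Var(X) for X ~ Poisson(lam) via E[X(X-1)]; should approach lam."""
    pk = exp(-lam)      # P_X(0), then updated iteratively
    ex = 0.0
    ex_falling = 0.0    # accumulates E[X(X-1)]
    for k in range(1, terms):
        pk *= lam / k   # P_X(k) = P_X(k-1) * lam / k
        ex += k * pk
        ex_falling += k * (k - 1) * pk
    return (ex_falling + ex) - ex ** 2  # E[X^2] - (EX)^2

print(poisson_var(2.5))  # ≈ 2.5
```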
Problem 9: Let X and Y be two independent random variables. Suppose that we know and . Find Var(X) and Var(Y).
Solution: Since X and Y are independent, Theorems 3.3 and 3.4 give $\mathrm{Var}(aX + bY) = a^2\,\mathrm{Var}(X) + b^2\,\mathrm{Var}(Y)$ for any constants $a$ and $b$. Applying this to the two known variances yields a system of two linear equations, and we can solve for $\mathrm{Var}(X)$ and $\mathrm{Var}(Y)$.