It's probable
Shhh, let's be discrete
You continuously surprise me
Random stuff
We all have limits
100
This is the definition of independent events A and B.
What is P(A and B) = P(A)P(B)? Intuitively, knowledge about A gives no new information about the likelihood of B.
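[Sanity check: a minimal Python sketch; the fair die and the two events are our own illustration, not from the text.]

    from fractions import Fraction

    outcomes = range(1, 7)                     # one roll of a fair six-sided die
    A = {o for o in outcomes if o % 2 == 0}    # event A: "the roll is even"
    B = {o for o in outcomes if o <= 2}        # event B: "the roll is at most 2"

    def p(E):                                  # uniform probability on outcomes
        return Fraction(len(E), 6)

    print(p(A & B), p(A) * p(B))               # both 1/6, so A and B are independent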
100
This is the definition of the PMF of X conditioned on Y.
What is p(x|y) = P(X=x | Y=y)? Equivalently, this equals pXY(x,y)/pY(y), where pXY is the joint PMF and pY is the marginal PMF of Y. [See pages 98 and 100 in the text]
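[Worked sketch in Python; the joint PMF below is made up purely for illustration.]

    # Hypothetical joint PMF p(x, y) on a tiny grid.
    joint = {(0, 0): 0.10, (0, 1): 0.20,
             (1, 0): 0.30, (1, 1): 0.40}

    def p_Y(y):                                # marginal PMF of Y
        return sum(p for (x, yy), p in joint.items() if yy == y)

    def p_X_given_Y(x, y):                     # conditional PMF of X given Y = y
        return joint[(x, y)] / p_Y(y)

    print(p_X_given_Y(0, 1))                   # 0.20 / 0.60 = 1/3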
100
This is the probability P(X=a) for any continuous random variable X.
What is zero?
100
This is the moment generating function of a random variable X.
What is the function M(t) = E(e^(tX))?
100
This is the Central Limit Theorem.
What is: The CDF of the standardized sum of n iid random variables (subtract n*mu, then divide by sigma*sqrt(n)) converges to the standard normal CDF as n increases.
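[Simulation sketch in Python; the Uniform(0,1) summands and sample sizes are our own choices.]

    import random, math

    # Standardized sum of n iid Uniform(0,1) draws: mu = 1/2, sigma^2 = 1/12.
    def standardized_sum(n):
        s = sum(random.random() for _ in range(n))
        return (s - n * 0.5) / math.sqrt(n / 12)

    n, trials = 30, 100_000
    frac = sum(standardized_sum(n) <= 1.0 for _ in range(trials)) / trials
    phi = 0.5 * (1 + math.erf(1 / math.sqrt(2)))   # standard normal CDF at 1
    print(frac, phi)                               # both approximately 0.841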
200
These are the law of total probability and Bayes' Rule.
What is: [Law of TP:] P(A) = P(A|B1)P(B1) + P(A|B2)P(B2) + ... + P(A|Bn)P(Bn), where B1, ..., Bn form a partition of the sample space. [Bayes:] P(A|B) = P(B|A)P(A)/P(B).
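[Numeric sketch in Python; all the probabilities below are invented for illustration.]

    # Partition {B, not B}; hypothetical numbers.
    P_B = 0.01                      # P(B)
    P_A_given_B = 0.95              # P(A | B)
    P_A_given_notB = 0.05           # P(A | not B)

    # Law of total probability:
    P_A = P_A_given_B * P_B + P_A_given_notB * (1 - P_B)

    # Bayes' rule:
    P_B_given_A = P_A_given_B * P_B / P_A
    print(P_A, P_B_given_A)         # 0.059 and about 0.161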
200
In a two-step procedure, this law is often useful when computing the expectation of the end result.
What is the law of total expectation? E(X) = E(X|A1)P(A1) + E(X|A2)P(A2) + ... where A1, A2, ... form a partition of the sample space.
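[Two-step sketch in Python; the coin-then-die procedure is our own example.]

    import random

    # Step 1: flip a fair coin.  Step 2: on heads, X = one die roll; on tails, X = 0.
    # Law of total expectation: E(X) = E(X|heads)P(heads) + E(X|tails)P(tails).
    exact = 3.5 * 0.5 + 0.0 * 0.5              # = 1.75

    trials = 100_000
    total = sum(random.randint(1, 6) if random.random() < 0.5 else 0
                for _ in range(trials))
    print(exact, total / trials)               # both approximately 1.75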
200
These are the limits of a CDF as x goes to -infinity and infinity, respectively.
What are 0 and 1?
200
This is the procedure to find the PDF of a random variable Z from the known PDFs of its related random variables X and Y.
What is: find the CDF of Z using the PDFs of X and Y, then take the derivative to get the PDF? [See Section 4.1]
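[Sketch of the two-step procedure, assuming SymPy is available; Z = max(X, Y) is our own toy example.]

    import sympy as sp

    z = sp.symbols('z', positive=True)

    # Z = max(X, Y) with X, Y iid Uniform(0, 1).
    # Step 1 (CDF): P(Z <= z) = P(X <= z) * P(Y <= z) = z * z for 0 <= z <= 1.
    F_Z = z * z

    # Step 2: differentiate the CDF to obtain the PDF.
    f_Z = sp.diff(F_Z, z)
    print(f_Z)                                 # 2*z, valid on [0, 1]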
200
This is the definition of convergence in probability.
What is: For any e > 0, P(|Xn - X| > e) -> 0 as n goes to infinity? Then we say Xn converges to X in probability.
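[Empirical sketch in Python; Xn = the average of n Uniform(0,1) draws (our own choice), with X = 1/2.]

    import random

    # Estimate P(|Xn - 1/2| > e) for growing n; it should shrink toward 0.
    e, trials = 0.05, 2_000
    for n in (10, 100, 1_000):
        bad = sum(abs(sum(random.random() for _ in range(n)) / n - 0.5) > e
                  for _ in range(trials))
        print(n, bad / trials)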
300
This is the law of additivity (aka the union law).
What is P(A or B) = P(A) + P(B) - P(A and B)?
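[Sanity check in Python, reusing the fair-die setup from the earlier sketch.]

    from fractions import Fraction

    outcomes = set(range(1, 7))
    A = {2, 4, 6}                              # "the roll is even"
    B = {1, 2, 3}                              # "the roll is at most 3"

    def p(E):
        return Fraction(len(E), 6)

    print(p(A | B), p(A) + p(B) - p(A & B))    # both 5/6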
300
I roll a fair 6-sided die and square the number. This is the expected value of the number I end up with.
What is 91/6 ≈ 15.17? [Moral: E( g(X) ) = g(x1)P(X=x1) + g(x2)P(X=x2) + ...]
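[Arithmetic check in Python of the exact value behind 15.17.]

    from fractions import Fraction

    # E(X^2) for one roll of a fair die, via E(g(X)) = sum of g(x) P(X = x).
    exact = sum(Fraction(x * x, 6) for x in range(1, 7))
    print(exact, float(exact))                 # 91/6, about 15.1667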
300
This is how to find a CDF of X from a PDF of X.
What is integrate the PDF from -infinity to x to obtain F(x). [See Section 3.2 for details]
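[Sketch assuming SymPy is available; the Exponential(1) PDF is our own example.]

    import sympy as sp

    x, t = sp.symbols('x t', positive=True)

    # Exponential(1) PDF: f(t) = exp(-t) for t >= 0 (and 0 below 0).
    f = sp.exp(-t)
    F = sp.integrate(f, (t, 0, x))             # integrate the PDF up to x
    print(sp.simplify(F))                      # 1 - exp(-x), the Exponential(1) CDF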
300
DOUBLE JEOPARDY!! You may wager anywhere from $0 to $1200, in increments of $300.
The correlation coefficient of two random variables is always between these two numbers.
What is -1 and 1? [Follow up: What do negative/positive values mean? What does it mean when the coefficient is very close to 1 or very close to -1? What does it mean when the coefficient is 0?]
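[Simulation sketch in Python; the linear-plus-noise model is our own toy example.]

    import random, math

    # Sample correlation of Y = 2X + noise; it always lands in [-1, 1].
    n = 10_000
    xs = [random.gauss(0, 1) for _ in range(n)]
    ys = [2 * x + random.gauss(0, 1) for x in xs]

    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in xs) / n)
    sy = math.sqrt(sum((b - my) ** 2 for b in ys) / n)
    print(cov / (sx * sy))                     # near 2/sqrt(5), about 0.894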
300
These are the two laws of large numbers.
What is: The average of n iid random variables with mean mu converges to mu in probability (weak law) and almost surely (strong law) as n increases.
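[Weak-law sketch in Python with die rolls; the sample sizes are arbitrary.]

    import random

    # The running average of iid die rolls drifts toward mu = 3.5 as n grows.
    for n in (10, 1_000, 100_000):
        rolls = [random.randint(1, 6) for _ in range(n)]
        print(n, sum(rolls) / n)               # approaches 3.5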
400
This is the probability of being dealt a Royal Flush in Hearts (Ten through Ace of hearts - in any order).
What is 1 / (52 choose 5) = 1/2,598,960, or about 0.000000385? [See practice problems, exams, homework, etc. for more complicated problems using these values]
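[One-line check in Python; math.comb needs Python 3.8+.]

    import math

    # One specific five-card hand out of C(52, 5) equally likely hands.
    print(1 / math.comb(52, 5))                # about 3.85e-07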
400
DOUBLE JEOPARDY!! You may wager anywhere from $0 to $1200, in increments of $400.
This is how you get a marginal distribution for X from a joint distribution of X and Y.
What is px(x) = p(x,y1) + p(x,y2) + ... over all values yi that Y takes (for a joint PMF), or integrate the joint PDF over all y (for a joint PDF)? [See p. 93 of the text]
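[Worked sketch in Python, reusing the made-up joint PMF from the earlier sketch.]

    joint = {(0, 0): 0.10, (0, 1): 0.20,
             (1, 0): 0.30, (1, 1): 0.40}

    def p_X(x):                                # marginal PMF of X: sum over all y
        return sum(p for (xx, y), p in joint.items() if xx == x)

    print(p_X(0), p_X(1))                      # 0.3 and 0.7 (up to float rounding)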
400
When X is normally distributed with mean m and variance s^2, this random variable is a standard normal random variable.
What is (X-m)/s ?
400
This is a random variable which, when Y=y, is equal to the expected value of X given Y=y.
What is E(X|Y)?
400
These are the advantages and disadvantages of using Markov's, Chebyshev's, and Chernoff's inequalities.
What are: 1) Markov's is simple but only works for non-negative random variables. 2) Chebyshev's works for negative rvs as well, but the variance must be known. 3) Chernoff's requires computation of the MGF, but once this is known, you can optimize t to get a tighter bound.
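[Comparison sketch in Python; X ~ Exponential(1) and the threshold a = 10 are our own choices.]

    import math

    # For X ~ Exponential(1): E(X) = 1, Var(X) = 1, MGF M(t) = 1/(1-t) for t < 1.
    a = 10
    markov = 1 / a                                    # E(X)/a
    chebyshev = 1 / (a - 1) ** 2                      # Var(X) / (a - E(X))^2
    t = 1 - 1 / a                                     # t that minimizes exp(-t*a)*M(t)
    chernoff = math.exp(-t * a) / (1 - t)             # = a * exp(1 - a)
    print(markov, chebyshev, chernoff, math.exp(-a))  # last entry is the true tail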
500
This is the validity (true or false?) of the statement that P(cX
What is FALSE? Note: A few of you made this mistake on HW 14 when proving that if Yn converges to y then cYn converges to cy [in probability].
500
This is the joint PDF of X and Y when X and Y are independent.
What is the product of the PDFs of X and Y?
500
These are the continuous versions of Bayes' rule.
What are: fX|Y(x|y) = fX(x) * fY|X(y|x) / fY(y) and fX|A(x) = P(A|X=x) * fX(x) / P(A)? [See p. 178-179 in your text. Also see problem 2 on midterm 2, problem 3.34 in the text, and classroom examples]
500
This is why a moment generating function is named as such. [Be specific]
What is: because you can take the nth derivative of the MGF and evaluate it at t=0 to obtain the nth moment E(X^n)?
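[Sketch assuming SymPy is available; the standard normal MGF M(t) = exp(t^2/2) is our example.]

    import sympy as sp

    t = sp.symbols('t')

    M = sp.exp(t ** 2 / 2)                     # MGF of a standard normal
    moments = [sp.diff(M, t, n).subs(t, 0) for n in (1, 2, 3, 4)]
    print(moments)                             # [0, 1, 0, 3] = E(X), ..., E(X^4)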
500
These are the four modes of convergence for sequences of random variables, along with which modes imply the others.
What are convergence: 1) in probability [implies convergence in distribution] 2) in distribution 3) in mean [implies convergence in probability, and thus in distribution] 4) almost surely [implies convergence in probability, and thus in distribution]? **Some of these implications are harder to show than others. For the simpler proofs, see the practice problems.