
AMS 311, Probability Theory

Catalog Description: Probability spaces, random variables, moment generating functions, algebra of expectations, conditional and marginal distributions, multivariate distributions, order statistics, law of large numbers.

Prerequisites: AMS 301 and AMS 310

Co-requisite: AMS 261 or MAT 203 or MAT 205

3 credits

Required Textbook for Summer 2023:
"Introduction to Probability" by Mark Daniel Ward and Ellen Gundlach, second printing, W.H. Freeman and Company; 2016; ISBN: 978-0-7167-7109-8

Required Textbook for Fall 2023:
"A First Course in Probability" by Sheldon Ross, 10th Edition, 2018, Pearson Publishing; ISBN 978-0134753119

Actuarial Exam: The material in this course is the basis of the first actuarial exam, Exam P, of the Society of Actuaries. AMS 410 is a review course for Exam P, in which the material of AMS 311 is revisited in the form of practice questions for Exam P. For more details about actuarial preparation at Stony Brook, see Actuarial Program.

Course Outline:

1.  Probability Spaces – 3 class hours.
2.  Conditional Probability and Independence – 4 class hours.
3.  Random Variables; Special Distributions – 6 class hours.
4.  Expectation – 4 class hours.
5.  Joint Distributions – 4 class hours.
6.  Conditional Distributions – 3 class hours.
7.  Covariance and Correlation – 2 class hours.
8.  Moment Generating Functions – 3 class hours.
9.  Transformation of Variables – 4 class hours.
10. Order Statistics – 2 class hours.
11. Law of Large Numbers – 3 class hours.

Learning Outcomes for AMS 311, Probability Theory

1.) Demonstrate an understanding of core concepts of probability theory and their use in applications:
      * experiments, outcomes, sample spaces, events, and the role of set theory in probability;
      * the axioms of probability, the theorems that follow from them, and their consequences;
      * using counting principles to calculate probabilities of events in sample spaces of equally likely outcomes;
      * independence and disjointness;
      * conditional probability;
      * the law of total probability and Bayes' law;
      * the method of conditioning to solve problems;
      * Markov chains and associated conditioning arguments.
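
To illustrate the conditioning topics above, here is a minimal sketch of Bayes' law combined with the law of total probability, applied to a hypothetical diagnostic-test scenario (the function name and all numbers are illustrative, not taken from the course materials):

```python
# Hedged sketch: Bayes' law via the law of total probability.
# bayes_posterior is a hypothetical helper; the rates below are made up.
def bayes_posterior(prior, sensitivity, false_positive_rate):
    """P(D | +) from P(D), P(+ | D), and P(+ | not D)."""
    # Law of total probability: P(+) = P(+|D)P(D) + P(+|~D)P(~D)
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    # Bayes' law: P(D|+) = P(+|D)P(D) / P(+)
    return sensitivity * prior / p_positive

posterior = bayes_posterior(prior=0.01, sensitivity=0.95, false_positive_rate=0.05)
print(round(posterior, 4))  # prints 0.161
```

Even with a 95% sensitive test, the posterior probability of disease is only about 16%, a classic consequence of a small prior.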

2.) Demonstrate an understanding of the theory of random variables and their applications: 
      * the difference between discrete random variables, continuous random variables, and random variables with hybrid distributions;
      * cumulative distribution functions and their properties;
      * probability mass functions for discrete random variables and computations to evaluate probabilities;
      * properties of commonly used discrete distributions, such as binomial, geometric, Poisson, and hypergeometric distributions;
      * probability density functions, computing them from cumulative distribution functions, and vice versa;
      * properties of commonly used density functions, such as uniform, exponential, gamma, beta, and normal densities;
      * means, variances, and higher moments of random variables, and their properties;
      * connections and differences between different distribution functions, e.g., normal approximation to binomial, Poisson approximation to binomial, and the difference between binomial and hypergeometric;
      * Markov and Chebyshev inequalities and utilizing them to give bounds and estimates of probabilities.
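
As a small numerical companion to the distribution-approximation topic above, here is a sketch comparing an exact Binomial(n, p) pmf with its Poisson(np) approximation for large n and small p (the helper names and the choice n = 1000, p = 0.003 are illustrative assumptions, not course requirements):

```python
from math import comb, exp, factorial

# Hedged sketch: Poisson approximation to the binomial distribution.
def binomial_pmf(n, p, k):
    # Exact Binomial(n, p) probability of k successes.
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    # Poisson(lam) probability of k events.
    return exp(-lam) * lam**k / factorial(k)

n, p = 1000, 0.003  # np = 3, the Poisson rate
for k in range(5):
    print(k, round(binomial_pmf(n, p, k), 5), round(poisson_pmf(n * p, k), 5))
```

The two columns agree to several decimal places, which is the sense in which the Poisson distribution approximates the binomial when n is large and p is small.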

3.) Demonstrate an understanding of the theory of jointly distributed random variables and their applications:
      * computations with joint distributions, both for discrete and continuous random variables;
      * computations with joint density functions and conditional density functions;
      * conditional expectation and conditioning arguments in computations involving two or more random variables;
      * computations with the bivariate normal distribution, the t-distribution, chi-squared distributions, and order statistics;
      * applying indicator random variables to compute expectations;
      * using moment generating functions in solving problems with sums of independent random variables;
      * the weak and strong laws of large numbers;
      * applying the central limit theorem in estimating probabilities.
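
The law of large numbers in the list above can be seen in a short simulation: the sample mean of i.i.d. Uniform(0, 1) draws settles near the true mean 1/2 as the sample size grows. This is a sketch using Python's standard random module; the seed and sample sizes are arbitrary choices for reproducibility:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Hedged sketch: the (weak) law of large numbers for Uniform(0, 1) draws.
def sample_mean(n):
    # Average of n independent Uniform(0, 1) samples; E[X] = 1/2.
    return sum(random.random() for _ in range(n)) / n

for n in (10, 1000, 100_000):
    print(n, round(sample_mean(n), 4))
```

With 100,000 draws the printed mean is within a fraction of a percent of 0.5, while the 10-draw mean can wander noticeably, which is exactly the convergence the law describes.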