MAT 330: Probability.
4 hours, 4 credits. Basic probability theory. Combinatorial problems, distributions, expectation, law of large numbers and central limit theorem, Bernoulli processes, and Markov chains. Other topics from probability and statistics. PREREQ: MAT 176.
MAT 681: Probability.
4 hours, 4 credits. Probability models, combinatorial problems, random variables, expectation and variance; binomial, normal, and Poisson variables; law of large numbers, central limit theorem, Markov chains, and selected additional topics. PREREQ: Two semesters of calculus.
Location: Gi 225, TTh 9:00-10:40 AM
Instructor: Robert Schneider
Contact Info:
Grading Policy:
- Homework tests (20 minutes): 20%
- Midterm: 30%
- Final: 40%
- Homework: 10%
- Extra credit will be given for various projects and for exceptional class comments
Course Objectives:
- understand fundamental theorems and assumptions underlying probability
- prove some of the fundamental theorems
- apply appropriate theorems from probability
Materials, Resources and Accommodating Disabilities:
- textbook: A First Course in Probability; Sheldon Ross; 8th or 9th edition; Pearson
- Classic Texts
- An Introduction to Probability Theory and Its Applications, Vol. 1, 3rd edition; William Feller; Wiley. One of the great texts; very complete and compact; good for graduate students to look at.
- Probability: A Survey of the Mathematical Theory (Wiley Series in Probability and Statistics); John W. Lamperti; 2nd edition (1996); Wiley.
- Schaum's Outline of Probability and Statistics, 4th edition; McGraw-Hill. Lots of problems; concise; cheap.
- Accommodating Disabilities: Lehman College is committed to providing access to all programs and curricula to all students. Students with disabilities who may need classroom accommodations are encouraged to register with the Office of Student Disability Services. For more information, please contact the Office of Student Disability Services, Shuster Hall, Room 238, phone 718-960-8441.
Tentative Course Calendar:
We will try to cover the material in Chapters 1-5 of the book at a speed appropriate for maximal understanding and retention. Continuous distributions will be introduced early, and I may deviate from or skip around in the book to cover them. I may introduce some topics from Chapters 6 and 7 if time permits. Graduate students will be required to read these chapters and contact me for help. I will not accept late homework. Last year's syllabus is in red. I will do things differently than last fall, but I leave last fall's detailed syllabus as a guide.
- 8/27: Lesson 1; 9/2: Lessons 2 and 3. Intro to probability concepts and start of Ch 1. The intro may take several lessons. These topics will be repeated in depth during the course, but we should have an overview and questions in mind. Please challenge any assertions I make. A good challenge is worth extra credit; I never penalize a bad challenge of course concepts.
- Counting concepts
- Coin tosses and urns: how do we count? What is fair/equiprobable?
- Bernoulli trials, relating to the concepts of random numbers and Monte Carlo methods
- Problems (the editions differ in page numbers, but the end-of-chapter problems are the same) - due Wed Sep 3
- Problems section: 1, 2, 4, 7
- Theoretical section: 2
- Do one of the following, or something related to these questions that you find interesting.
- an experiment with red balls and black balls in an urn, to see if the long-term number of red balls drawn divided by the number of balls drawn (with replacement) approaches the number of red balls in the urn divided by the number of balls in the urn
- do a similar experiment with flips of some coin you think is fair (or not fair)
- find some experiment in the literature to report on
- Grad students add: Theoretical 3, 4, 8, 13
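The urn experiment above can also be run as a quick simulation rather than with a physical urn. Here is a minimal Python sketch; the composition (7 red balls out of 10), the draw counts, and the fixed seed are arbitrary assumptions for illustration:

```python
import random

def urn_experiment(red, total, draws, seed=0):
    """Draw `draws` balls with replacement from an urn containing
    `red` red balls out of `total`; return the fraction of red draws."""
    rng = random.Random(seed)  # fixed seed so the run is reproducible
    red_drawn = sum(1 for _ in range(draws) if rng.randrange(total) < red)
    return red_drawn / draws

# The long-run fraction of red draws should approach red/total = 0.7.
for draws in (10, 100, 10_000):
    print(draws, urn_experiment(7, 10, draws))
```

The same loop, with the urn replaced by a head probability, covers the coin-flip version of the exercise.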
- Ch 1, start of Ch 2; Lesson 4
- Problems Ch 1: 8, 9, 10, 24, 26; due 9/17
- Note that Section 1.6 treats how many multinomial coefficients there are (Proposition 6.2). You may prepare this section for extra credit and present the reasoning for the result during one of my office hours.
- Ch2 Lesson 5
- Problems Ch 2: 1, 2, 3, 4, 8, 9, 10, 13, 15, 16, 19, 29; Theoretical exercises 1-8 (we may have to take a while on this); due 10/8
- Cheat sheet of Maple commands for sets and probability (right-click to download the Maple file)
- Dice & Cards starter file for Maple (right-click to download)
- Grad students should pick a topic below and make an appointment to discuss the concepts with me. I will give more topics, and you will only need 3 discussions.
- Philosophy of probability
- Finish Ch 2 and begin conditional probability Ch 3
- HW TEST Oct 22
- Due Thur Oct 29: Problems 3.1, 3.2, 3.3, 3.4, 3.5, 3.16, 3.17, 3.18
- HW Test: Tue Nov 3
- Due Tue Nov 3: Problems 3.21, 3.33, 3.59, 3.62; Theoretical exercises 3.2, 3.9
- Show that if 3 events are independent, then any choice of the 3 events or their complements is also independent
- i.e., triples like E, ~F, ~G or ~E, F, G, etc. (do not repeat a letter, as in E, E, F)
- look at the proof for two events
- Graduate students: Be able to present Examples 3b and 4m (the second is a Markov-chain-type result and will take you some time; you may have another week to do it)
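Before writing the proof of the independence exercise above, the claim can be sanity-checked by brute force on a concrete product space. Here is a Python sketch; the three probabilities 0.3, 0.5, 0.8 are arbitrary assumptions, and only the triple-product condition of mutual independence is checked:

```python
from itertools import product

# Model three independent events E, F, G as the coordinates of a product
# space: an outcome is a triple (a, b, c) with each coordinate 0 or 1,
# where coordinate i equals 1 with probability p[i].
p = (0.3, 0.5, 0.8)
omegas = list(product((0, 1), repeat=3))

def weight(w):
    """Probability of a single outcome in the product space."""
    pr = 1.0
    for coord, pi in zip(w, p):
        pr *= pi if coord else 1 - pi
    return pr

def prob(event):
    """Probability of a set of outcomes."""
    return sum(weight(w) for w in event)

# For every sign pattern (1 = the event itself, 0 = its complement),
# check P(A1 and A2 and A3) = P(A1) * P(A2) * P(A3).
for signs in product((0, 1), repeat=3):
    joint = [w for w in omegas if all(w[i] == s for i, s in enumerate(signs))]
    marginals = [prob([w for w in omegas if w[i] == s])
                 for i, s in enumerate(signs)]
    rhs = marginals[0] * marginals[1] * marginals[2]
    assert abs(prob(joint) - rhs) < 1e-12, signs
print("all 8 event/complement patterns satisfy the product rule")
```

A full verification of mutual independence would also check the pairwise products; the proof for two events (replacing F by ~F) is the pattern to imitate.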
Midterm: Tuesday, November 17. Answer sheet.
- Ch 4
- Ch 4 problems 4.1, 4.2, 4.5, 4.7, 4.8, 4.10; due 11/24
- Ch 4 problems 4.13, 4.17, 4.19, 4.21, 4.25, 4.35; due 12/1
- HW quiz
- HW
- Be able to prove that Var(c*X) = c^2 * Var(X)
- Let X1 be the value of one roll of a fair die. What are its mean, variance, and standard deviation?
- Let Y = X1 + X2 + ... + Xn be the sum of n independent throws of the fair die. What are the mean, variance, and standard deviation of Y? Of Y/n? Let Z = (Y - n*mean(X1))/(sqrt(n)*stdev(X1)). What are its mean, variance, and standard deviation?
- Prove that if Y = X1 + X2 + X3, where the Xi are pairwise independent, then Var(Y) = Var(X1) + Var(X2) + Var(X3)
- Grad students: Prove that if we have n pairwise independent random variables, then the variance of their sum is the sum of their variances.
- Let X1 and X2 be two independent rolls of a fair die. Let S = X1*X2. Follow our proof in class to show that E(S) = E(X1)*E(X2).
- Create two random variables X,Y that you can prove are not independent.
- If Var(X) = 10^(-6) (ten to the -6) and E(X) = 0, what does Chebyshev say about the probability that |X| > 10^(-2)? Could there possibly be a value of X > 200? Explain.
- Let S be the number of heads in 3 Bernoulli trials (3 independent flips of a biased coin with probabilities (p, q)). Show directly that E(S) = 3p and Var(S) = 3pq. How do we get this result from our theorems about the expected value and variance of a sum of random variables?
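Several of the exercises above have exact answers that can be double-checked by enumerating all outcomes. Here is a Python sketch using exact rational arithmetic; the bias p = 1/3 is an arbitrary assumption:

```python
from fractions import Fraction
from itertools import product

# Exact mean and variance of one roll of a fair die.
faces = [Fraction(k) for k in range(1, 7)]
mean = sum(faces) / 6                            # should be 7/2
var = sum((x - mean) ** 2 for x in faces) / 6    # should be 35/12
print("die:", mean, var)

# S = number of heads in 3 independent flips of a biased coin, P(head) = p.
p = Fraction(1, 3)   # arbitrary assumed bias
q = 1 - p
outcomes = list(product((0, 1), repeat=3))

def pr(w):
    """Probability of a head/tail pattern w, using independence."""
    k = sum(w)
    return p ** k * q ** (3 - k)

E_S = sum(sum(w) * pr(w) for w in outcomes)
Var_S = sum((sum(w) - E_S) ** 2 * pr(w) for w in outcomes)
assert E_S == 3 * p and Var_S == 3 * p * q   # matches E(S)=3p, Var(S)=3pq
print("coin:", E_S, Var_S)
```

Because `Fraction` arithmetic is exact, the checks E(S) = 3p and Var(S) = 3pq hold with equality rather than to floating-point tolerance.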
Department of Mathematics and Computer Science, Lehman College, City University of New York