MAT 330: Probability.

4 hours, 4 credits. Basic probability theory. Combinatorial problems, distributions, expectation, law of large numbers and central limit theorem, Bernoulli processes, and Markov chains. Other topics from probability and statistics. PREREQ: MAT 176.

MAT 681: Probability.

4 hours, 4 credits. Probability models, combinatorial problems, random variables, expectation and variance, binomial, normal and Poisson variables, law of large numbers, central limit theorem, Markov chains, and selected additional topics. PREREQ: Two semesters of calculus.

Location: Gillet 225, MW 11-12:40

Instructor: Robert Schneider

Contact Info:

**email:** robert.schneider@lehman.cuny.edu

**web page:** comet.lehman.cuny.edu/schneider/

**office hours:** MW 10-11, Gillet 200, and by appointment

Grading Policy:

- Homework tests (half- to three-quarter-hour): 20%
- Midterm: 30%
- Final: 40%
- Homework: 10%
- Extra credit will be given for various projects and for exceptional class comments.

Course Objectives:

- understand fundamental theorems and assumptions underlying probability
- prove some of the fundamental theorems
- apply appropriate theorems from probability

Materials, Resources and Accommodating Disabilities:

Textbook: A First Course in Probability; Sheldon Ross; 8th or 9th edition; Pearson

- Classic Texts

- An Introduction to Probability Theory and Its Applications; William Feller; Vol. 1, 3rd edition; Wiley -- one of the great texts; very complete and compact; good for graduate students to look at
- Probability: A Survey of the Mathematical Theory (Wiley Series in Probability and Statistics); John W. Lamperti; 2nd edition (August 23, 1996)
- Schaum's Outline of Probability and Statistics, 4th edition; McGraw-Hill -- lots of problems; concise; cheap
Accommodating Disabilities: Lehman College is committed to providing access to all programs and curricula to all students. Students with disabilities who may need classroom accommodations are encouraged to register with the Office of Student Disability Services. For more information, please contact the Office of Student Disability Services, Shuster Hall, Room 238, phone 718-960-8441.

## Course Calendar:

We will try to cover the material in Chapters 1-5 of the book at a speed appropriate for maximal understanding and retention. Continuous distributions will be introduced early, and I may deviate from or skip around in the book to cover them. I may introduce some topics from Chapters 6 and 7 if time permits; graduate students will be required to read these chapters and contact me for help. I will not accept late homework. Last year's syllabus is in red: I will do things differently than last fall, but I leave last fall's detailed syllabus as a guide.

- 9/3 -- Intro to probability concepts and the start of Ch 1. The intro may take several lessons. These topics will be repeated in depth during the course, but we should have an overview and questions in mind. Please challenge any assertions I make. A good challenge is worth extra credit, and I never penalize a bad challenge of course concepts.

- Counting concepts
- Coin tosses, urns -- how do we count -- what is fair, equiprobable?
- Bernoulli trials, relating to the concepts of random numbers and Monte Carlo methods
- Problems (the two editions have different page numbers but the same problems at the end of each chapter) -- due Wed, Sep 10

- Problems section: 1,2,4,7
- Theoretical section: 2
- Do one of the following, or something you think is interesting related to these questions.

- an experiment with red balls and black balls in an urn, to see if the long-run number of red balls drawn divided by the number of balls drawn (with replacement) approaches the number of red balls in the urn divided by the total number of balls in the urn
- a similar experiment with flips of some coin you think is fair (or not fair)
- a report on some experiment you find in the literature
- Grad students add: Theoretical 3,4,8,13
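As an illustration of the urn experiment (not a substitute for doing it yourself), here is a minimal Python sketch; the function name and parameters are my own, and the fraction it returns should approach num_red / (num_red + num_black):

```python
import random

def urn_experiment(num_red, num_black, num_draws, seed=0):
    """Draw from the urn with replacement and return the fraction of red draws.

    By the long-run-frequency idea (the law of large numbers), this fraction
    should approach num_red / (num_red + num_black).
    """
    rng = random.Random(seed)  # fixed seed so the experiment is repeatable
    total = num_red + num_black
    red_count = 0
    for _ in range(num_draws):
        # each draw is red with probability num_red / total
        if rng.random() < num_red / total:
            red_count += 1
    return red_count / num_draws

# 3 red and 7 black balls: the long-run fraction should settle near 3/10.
print(urn_experiment(3, 7, 100_000))
```

Rerunning with a much smaller num_draws (say 10) shows how noisy the short-run fraction is, which is part of the point of the experiment.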
- 9/17 Ch 1, start of Ch 2

- Problems: Ch 1: 8,9,10,24,26; Ch 2: 1,2,3,4 (hand in second set on Wed 9/17)

- 9/22 Ch2

- Problems 8,9,10,13,15,16,19,29 ; Theoretical exercises 1-8 (due 9/29)(we may have to take a while on this)

- Grad students should pick a topic below and make an appointment to discuss concepts with me. I will give more topics, and you will only need 3 discussions.
- Philosophy of probability
- 10/6 Finish Ch 2 and begin conditional probability Ch 3

HW TEST Wed 10/15

- 10/20 Do problems 3.1,3.2,3.3,3.4,3.5,3.16,3.17,3.18
- 11/3 Do 3.21,3.33,3.59,3.62, and on the theoretical parts do 3.2,3.9

- Show that if 3 sets are independent, then any choice of the 3 different sets or their negations is also independent

- i.e., sets like E,~F,~G or ~E,F,G, etc. (do not repeat a letter, as in E,E,F)
- look at the proof for the two-set case first
- Graduate students: be able to present Examples 3b and 4m (this second one is a Markov-chain-type result and will take you some time -- you can have another week to do it)
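The independence claim can be sanity-checked by brute force on a small sample space before you attempt the proof; a Python sketch (the three events are my own arbitrary choices, and a numeric check is of course not a proof):

```python
from itertools import product

# Sample space: three fair coin flips; each of the 8 outcomes has probability 1/8.
outcomes = set(product("HT", repeat=3))

def P(event):
    """Probability of an event (a subset of outcomes) under the uniform measure."""
    return len(event) / len(outcomes)

def complement(A):
    return outcomes - A

E = {o for o in outcomes if o[0] == "H"}   # first flip is heads
F = {o for o in outcomes if o[1] == "H"}   # second flip is heads
G = {o for o in outcomes if o[2] == "H"}   # third flip is heads

# Check the product rule for every choice of each set or its negation.
ok = all(
    abs(P(A & B & C) - P(A) * P(B) * P(C)) < 1e-12
    for A in (E, complement(E))
    for B in (F, complement(F))
    for C in (G, complement(G))
)
print(ok)  # True
```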

- HW test Wed, November 5
- Midterm Monday, November 10 -- below are sample problems and tests, which may change depending on how far we get. These samples are redundant.
- 11/17 Ch 4

- Ch4 problems 4.1,4.2,4.5,4.7,4.8,4.10 due 11/24
- Ch 4 problems 4.13,4.17,4.19,4.21,4.25,4.35 due 12/1
## Last Year's Schedule Below --- Will Adjust as We Proceed

- Due Oct 8: 3.1,3.2,3.4,3.5, plus: explain how we get a probability for a length between 0 and 1 when we are making boxes of equiprobable volumes between 0 and 1.
- Create a tree of depth three for three flips of a biased coin with bias (p,q), where the first flip is 1/2 heads and 1/2 tails and the probabilities then become p and q.

- Show that in this situation the probability of the event of getting a head on the 3rd flip is p.
- Prove (typically by induction) that the probability of getting a head on the kth flip, after flipping n times (k>1), is p. How about tails?
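Before writing the induction proof, you can confirm the claim by enumerating every branch of the tree; a Python sketch (the function name and the choice p = 1/3 are mine, with exact arithmetic via fractions):

```python
from fractions import Fraction
from itertools import product

def prob_head_at(k, n, p):
    """P(head on flip k) when flip 1 is fair and flips 2..n land heads
    with probability p, computed by summing over all 2**n branches."""
    q = 1 - p
    total = Fraction(0)
    for seq in product("HT", repeat=n):
        pr = Fraction(1, 2)      # flip 1 is heads or tails with probability 1/2
        for flip in seq[1:]:     # flips 2..n have bias (p, q)
            pr *= p if flip == "H" else q
        if seq[k - 1] == "H":
            total += pr
    return total

p = Fraction(1, 3)
print(prob_head_at(3, 3, p) == p)  # True: a head on the 3rd flip has probability p
```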
- Due Oct 29

- 3.16,3.17,3.18,3.21,3.33 and on theoretical parts do

- Consider the Markov model of a coin flip where P(H|H) = a, P(H|T) = b, P(T|H) = 1-a, and P(T|T) = 1-b. Suppose the probability of a head on the first flip equals the probability of a tail, namely 1/2. Take the model to 3 flips.

- (graduate prob) Now "reverse" the model: start with the third flip, using the probabilities of getting a head (and a tail) on the third flip as calculated. Then calculate the conditional probability of getting a head on the second flip given a head on the third (and likewise for all the combinations), and draw the reverse tree. What is similar and dissimilar about this reverse tree?
- If a = 1/4 and b = 1/2, find a probability for heads and tails on the first flip that will stay the same for succeeding flips (it is not 1/2)

- (graduate) What can you say about this tree?
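For the a = 1/4, b = 1/2 part, "stays the same for succeeding flips" amounts to solving a one-line fixed-point equation; a Python sketch of that computation (exact arithmetic via fractions -- this is one route, not the only one):

```python
from fractions import Fraction

# Markov coin: P(H | previous flip H) = a, P(H | previous flip T) = b.
a, b = Fraction(1, 4), Fraction(1, 2)

# A head-probability h on one flip is unchanged on the next flip exactly when
# h = a*h + b*(1 - h); solving this linear equation gives h = b / (1 - a + b).
h = b / (1 - a + b)
print(h)
print(a * h + b * (1 - h) == h)  # True: this h persists for all later flips
```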
- Due Nov 5

- quiz hw
- problems: 3.59, 3.62
- theoretical ex: 3.2, 3.9
- Show that A and B are independent given G iff P(A|BG) = P(A|~BG).
- Test Nov 12
- Due Nov 20

- Ch4 problems 4.1,4.2,4.5,4.7,4.8,4.10
- Find a grapher that will graph the Poisson distribution, and graph it for lambda = 0.1, 0.5, 10 (we will discuss what you see)
- Due Nov 26

- Ch 5 problems 5.1,5.2,5.4
- Ch 4 problems 4.13,4.17,4.19,4.21,4.25,4.35
- Due Dec 10

- 4.37,4.38
- 5.6,5.7
- Due by Sun at 5 -- I will post solutions after that

- problems in pdf notes
- Create two random variables X,Y that you can prove are not independent.
- If Var(X) = 10^(-6) (ten to the minus 6) and E(X) = 0, what does Chebyshev say about the probability that |X| > 10^(-2)? Could there possibly be a value of X greater than 200? Explain.
- Explain in 5 lines or less why the Weak Law of Large Numbers tells you that E(X) is an important quantity.
- Let S be the number of heads in 3 Bernoulli trials (3 independent flips of a biased coin with bias (p,q)). Show directly that E(S) = 3p and Var(S) = 3pq. How do we get this result from our theorems about the expected value and the variance of a sum of random variables?
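The direct computation of E(S) and Var(S) can be checked by enumerating all 8 outcomes; a Python sketch using exact arithmetic (the bias p = 1/3 is an arbitrary choice of mine -- the identities hold for any p):

```python
from fractions import Fraction
from itertools import product

p = Fraction(1, 3)   # arbitrary bias for the check; any p in (0, 1) works
q = 1 - p

# Enumerate all 8 outcomes of 3 independent flips; S = number of heads.
E_S = Fraction(0)    # running sum for E(S)
E_S2 = Fraction(0)   # running sum for E(S^2)
for seq in product("HT", repeat=3):
    pr = Fraction(1)
    for flip in seq:
        pr *= p if flip == "H" else q
    s = seq.count("H")
    E_S += pr * s
    E_S2 += pr * s * s

var_S = E_S2 - E_S ** 2
print(E_S == 3 * p, var_S == 3 * p * q)  # True True
```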

Department of Mathematics and Computer Science, Lehman College, City University of New York