6.041/6.431 Recitations, Spring 2006




Massachusetts Institute of Technology
Department of Electrical Engineering & Computer Science
6.041/6.431: Probabilistic Systems Analysis (Spring 2006)

Recitation 1
February 9, 2006

1. Problem 1.2, page 52 of text. Let A and B be two sets.
(a) Show the following two equalities:
    A^c = (A^c ∩ B) ∪ (A^c ∩ B^c),    B^c = (A ∩ B^c) ∪ (A^c ∩ B^c).
(b) Show that
    (A ∩ B)^c = (A^c ∩ B) ∪ (A^c ∩ B^c) ∪ (A ∩ B^c).
(c) Consider rolling a six-sided die. Let A be the set of outcomes where the roll is an odd number. Let B be the set of outcomes where the roll is less than 4. Calculate the sets on both sides of the equality in part (b), and verify that the equality holds.

2. Problem 1.5, page 53 of text. Out of the students in a class, 60% are geniuses, 70% love chocolate, and 40% fall into both categories. Determine the probability that a randomly selected student is neither a genius nor a chocolate lover.

3. Example 1.5, page 13 of text. Romeo and Juliet have a date at a given time, and each will arrive at the meeting place with a delay between 0 and 1 hour, with all pairs of delays being equally likely. The first to arrive will wait for 15 minutes and will leave if the other has not yet arrived. What is the probability that they will meet?

Recitation 1 Solutions
February 9, 2006

1. Problem 1.2, page 52 of text. See online solutions.
2. Problem 1.5, page 53 of text. See online solutions.
3. Example 1.5, page 13 of text. See solutions in text.
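The answer to problem 3 in the text comes from comparing areas in the unit square: the two meet iff their delays differ by at most 1/4 hour, giving 1 − (3/4)^2 = 7/16. A minimal Monte Carlo sketch in Python, assuming only the independent uniform delays stated in the problem, can confirm this:

```python
import random

def meet_prob(trials=100_000):
    # Each delay is uniform on [0, 1] hour; they meet iff the
    # difference in delays is at most 15 minutes (1/4 hour).
    hits = sum(abs(random.random() - random.random()) <= 0.25
               for _ in range(trials))
    return hits / trials

print(meet_prob())  # should be close to the exact answer 7/16 = 0.4375
```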
Recitation 2
February 14, 2006

1. A coin is tossed twice. Alice claims that the event of two heads is at least as likely given that the first toss is a head as it is given that at least one of the tosses is a head. Is she right? Does it make a difference if the coin is fair or unfair? How can we generalize Alice's reasoning?

2. We are given three coins: one has heads on both faces, the second has tails on both faces, and the third has a head on one face and a tail on the other. We choose a coin at random, toss it, and it comes up heads. What is the probability that the opposite face is tails?

3. Fischer and Spassky play a sudden-death chess match. Each game ends with a win by Fischer (this happens with probability p), a win by Spassky (this happens with probability q), or a draw (this happens with probability 1 − p − q). The match continues until one of the players wins a game (and the match).
(a) What is the probability that Fischer will win the last game of the match?
(b) Given that the match lasted no more than 5 games, what is the probability that Fischer won in the first game?
(c) Given that the match lasted no more than 5 games, what is the probability that Fischer won the match?
(d) Given that Fischer won the match, what is the probability that he won at or before the 5th game?

Recitation 2 Solutions
February 14, 2006

1. Problem 1.12, page 55 of text. See online solutions.

2. Problem 1.13, page 55 of text. See online solutions.

3. (a) Summing over the possibilities that Fischer wins after 0, 1, 2, ... initial draws,

    P(Fischer wins) = p + p(1 − p − q) + p(1 − p − q)^2 + ···
                    = p / (1 − (1 − p − q))
                    = p / (p + q).

We may also find the solution through a simpler method:

    P(Fischer wins | someone wins) = P(Fischer wins) / P(someone wins) = p / (p + q).

(The original solution illustrates this with a tree diagram whose branches at each game are "Fischer wins" (p), "Spassky wins" (q), and "draw" (1 − p − q), repeated game after game.)

(b) P(the match lasted no more than 5 games)
        = (p + q) + (p + q)(1 − p − q) + (p + q)(1 − p − q)^2 + (p + q)(1 − p − q)^3 + (p + q)(1 − p − q)^4
        = (p + q)[1 − (1 − p − q)^5] / (1 − (1 − p − q))
        = 1 − (1 − p − q)^5.

Also, P(Fischer wins in the first game ∩ the match lasted no more than 5 games) = p. Therefore,

    P(Fischer won in the first game | the match lasted no more than 5 games)
        = P(Fischer wins in the first game ∩ the match lasted no more than 5 games) / P(the match lasted no more than 5 games)
        = p / (1 − (1 − p − q)^5).

(c) P(the match lasted no more than 5 games) = 1 − (1 − p − q)^5, and

    P(Fischer wins ∩ the match lasted no more than 5 games)
        = p + p(1 − p − q) + p(1 − p − q)^2 + p(1 − p − q)^3 + p(1 − p − q)^4
        = p[1 − (1 − p − q)^5] / (1 − (1 − p − q))
        = p[1 − (1 − p − q)^5] / (p + q).

Therefore,

    P(Fischer wins | the match lasted no more than 5 games)
        = P(Fischer wins ∩ the match lasted no more than 5 games) / P(the match lasted no more than 5 games)
        = p / (p + q).

(d) P(Fischer wins at or before the 5th game | Fischer wins)
        = P(Fischer wins at or before the 5th game ∩ Fischer wins) / P(Fischer wins)
        = ( p[1 − (1 − p − q)^5] / (p + q) ) / ( p / (p + q) )
        = 1 − (1 − p − q)^5.

This part may also be solved by observing that the events {Fischer wins} and {the match lasted no more than 5 games} are independent (we know this from parts (a) and (c)):

    P(the match lasted no more than 5 games | Fischer wins) = P(the match lasted no more than 5 games) = 1 − (1 − p − q)^5.
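The closed forms above are easy to sanity-check numerically. The sketch below uses arbitrary test values p = 0.3, q = 0.4 (not from the handout) and compares truncated series against the closed forms:

```python
# Numerical check of the Fischer-Spassky formulas for sample values.
p, q = 0.3, 0.4
d = 1 - p - q  # probability that a single game is drawn

# Series for P(Fischer wins), truncated far past convergence.
series = sum(p * d**n for n in range(1000))
assert abs(series - p / (p + q)) < 1e-12

# P(match lasts no more than 5 games).
short = sum((p + q) * d**n for n in range(5))
assert abs(short - (1 - d**5)) < 1e-12
print(series, short)
```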
Recitation 3
February 16, 2006

1. For each one of the following statements, indicate whether it is true or false, and provide a brief explanation.
(a) If P(A | B) = P(A), then P(B | A^c) = P(B).
(b) If 5 out of 10 independent fair coin tosses resulted in tails, the events "first toss was tails" and "10th toss was tails" are independent.
(c) If 10 out of 10 independent fair coin tosses resulted in tails, the events "first toss was tails" and "10th toss was tails" are independent.
(d) If the events A1, ..., An form a partition of the sample space, and B, C are some other events, then
    P(B | C) = Σ_{i=1}^{n} P(Ai | C) P(B | Ai).

2. A particular class has had a history of low attendance. The annoyed professor decides that she will not lecture unless at least k of the n students enrolled in the class are present. Each student will independently show up with probability pg if the weather is good, and with probability pb if the weather is bad. Given the probability of bad weather on a given day, calculate the probability that the professor will teach her class on that day.

3. Consider two coins, a blue and a red one. We choose one of the two coins at random, each being chosen with probability 1/2, and proceed with two independent tosses. The coins are biased: with the blue coin, the probability of heads in any given toss is 0.99, whereas for the red coin it is 0.01. Let H1 be the event that the first toss results in heads, and H2 be the event that the second toss results in heads.
(a) Are the events H1 and H2 (unconditionally) independent?
(b) Given that the blue coin was selected, are the events H1 and H2 (conditionally) independent?

Recitation 3 Solutions
February 16, 2006

1. (a) True. If P(A | B) = P(A), then A and B are independent. And if B is independent of A, then B is also independent of A^c. This implies, by the definition of independence, P(B | A^c) = P(B).

(b) False. Since there are only 5 tails out of ten, the knowledge that the first coin toss was a tails influences the probability that the tenth coin toss is a tails. In other words, knowledge of one coin toss provides knowledge about the other coin tosses, which means the two events are not independent.

(c) True. Here, all tosses are tails, so knowledge of one coin toss provides no additional knowledge about the tenth coin toss. Therefore the two events are independent.

(d) False. On the left-hand side of the expression, since the Ai's are disjoint,

    P(B | C) = P(B ∩ C) / P(C)
             = Σ_{i=1}^{n} P(Ai) P(B ∩ C | Ai) / P(C)
             = Σ_{i=1}^{n} P(Ai ∩ B ∩ C) / P(C).

However, the right-hand side of the given expression is

    Σ_{i=1}^{n} P(Ai | C) P(B | Ai) = Σ_{i=1}^{n} [P(Ai ∩ C) / P(C)] [P(B ∩ Ai) / P(Ai)],

which agrees with the left-hand side only if, for each i, the events Ai ∩ C and B ∩ Ai are independent of each other. Note also that for the expression to hold in general we would need n = 1 and A1 to be the entire sample space, i.e., P(A1) = 1. Therefore, the given expression only holds if the Ai ∩ C and B ∩ Ai are independent and n = 1.

2. Problem 1.21, page 61. See online solutions.

3. Example on page 37 of text. See solutions in text.
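For problem 3, the dependence claims can be checked exactly with a few lines of Python (a sketch; the biases 0.99 and 0.01 are from the problem statement, everything else is computed here):

```python
# Blue/red coin: H1 and H2 are unconditionally dependent but
# conditionally independent given the choice of coin.
pb, pr = 0.99, 0.01                      # P(heads) for blue, red

p_h1 = 0.5 * pb + 0.5 * pr               # = 0.5, by symmetry also P(H2)
p_h1h2 = 0.5 * pb**2 + 0.5 * pr**2       # = 0.4901
print(p_h1 * p_h1, p_h1h2)               # 0.25 vs 0.4901 -> dependent

# Given the blue coin: P(H1 ∩ H2 | blue) = P(H1 | blue) P(H2 | blue).
print(pb * pb)                           # conditionally independent
```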
Recitation 4
February 23, 2006

1. The birthday problem. Consider n people who are attending a party. What is the probability that each person has a distinct birthday? Assume that each person has an equal probability of being born on each day during the year, and ignore the additional complication presented by leap years (i.e., nobody is born on February 29).

2. Problem 1.45, page 66 in the text. 4 buses carrying 148 job-seeking MIT students arrive at a job fair. The buses carry, respectively, 40, 33, 25, and 50 students. One of the students is randomly selected. Let X denote the number of students that were on the bus carrying this randomly selected student. Also, one of the 4 bus drivers is randomly selected. Let Y denote the number of students on his bus.
(a) Do you think E[X] and E[Y] are equal? If not, which is larger? Give your reasoning informally.
(b) Compute E[X] and E[Y].

3. Recall from Lecture 4 the different cases that arise from the problem of selecting/sampling k balls from an urn containing n balls, numbered 1 through n:
- Sampling with replacement and ordering
- Sampling without replacement and ordering
- Sampling without replacement and without ordering
- Sampling with replacement and without ordering
The objective of this problem is to study the fourth case. A distinct sampling may be expressed in terms of the vector of nonnegative integers (N1, N2, ..., Nn), where Ni is the number of times the ball numbered i gets selected.
(a) Explain why we must have N1 + N2 + ··· + Nn = k.
(b) How many distinct solutions does the equation above have? Explain why this is the answer to the number of distinct results of sampling with replacement, without ordering.
(c) Let X1 denote the number of balls selected that are numbered 1. For any ℓ ∈ {0, 1, ..., k}, find the number of distinct samplings with replacement, without ordering, such that X1 = ℓ. Use this to state an identity for binomial coefficients.

Recitation 4 Solutions
February 23, 2006

1. The birthday problem. See online solutions.

2. Problem 1.45, page 66.
(a) Students might say they are equal (both being the average number of students per bus) or have the correct intuition: a randomly selected student is more likely to have been on a crowded bus, so E[X] should be the larger of the two.
(b) Make sure to define the PMFs of X and Y. Then

    E[X] = (40/148)·40 + (33/148)·33 + (25/148)·25 + (50/148)·50 ≈ 39.3,
    E[Y] = (1/4)·40 + (1/4)·33 + (1/4)·25 + (1/4)·50 = 37.

3. (a) The Ni's are the numbers of times each ball is selected, so the sum of the Ni's must be the total number of draws from the urn, namely k.

(b) There is a nice visualization for this. Make a dot for each drawn ball, grouped according to the ball's identity, with a separator mark between groups:

    • • • | • • | ··· | • •
    (N1 dots, then N2 dots, ..., then Nn dots, separated by n − 1 marks)

This gives a grand total of k + n − 1 dots and marks. The number of solutions is the number of ways to place the k dots in the k + n − 1 locations:

    C(k + n − 1, k).

(c) If we know that X1 = ℓ, then applying the result of the previous part to the remaining balls and remaining draws from the urn gives C((k − ℓ) + (n − 1) − 1, k − ℓ) as the desired number. Since this is just a way of breaking down the problem of the previous part, we have

    Σ_{ℓ=0}^{k} C(k + n − ℓ − 2, k − ℓ) = C(k + n − 1, k).
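The stars-and-bars count and the binomial identity of solution 3 can be verified by brute force for small cases; the following sketch uses arbitrary test values n = 5, k = 4:

```python
from itertools import combinations_with_replacement
from math import comb

n, k = 5, 4
# Count distinct samplings with replacement, without ordering, directly.
count = sum(1 for _ in combinations_with_replacement(range(n), k))
assert count == comb(k + n - 1, k)

# Identity from part (c): condition on how many times ball 1 is drawn.
assert sum(comb(k + n - l - 2, k - l) for l in range(k + 1)) == comb(k + n - 1, k)
print(count)  # 70 for n = 5, k = 4
```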
Recitation 5
February 28, 2006

Review/discussion: The following five types of discrete random variables arise frequently in applications and in the remainder of the course. Their properties are tabulated on pages 116-117 of the text. Make sure you understand how these random variables arise and how to derive their means and variances.
- Discrete uniform over [a, b] (or uniform over {a, a + 1, ..., b})
- Bernoulli with parameter p
- Binomial with parameters p and n
- Geometric with parameter p
- Poisson with parameter λ

Problems:

1. Two coins are simultaneously tossed until one of them comes up a head and the other a tail. The first coin comes up a head with probability p and the second with probability q. All tosses are assumed independent.
(a) Find the PMF, the expected value, and the variance of the number of tosses.
(b) What is the probability that the last toss of the first coin is a head?

2. Prove the following version of the Total Expectation Theorem:

    E[X] = Σ_{i=1}^{n} P(Ai) E[X | Ai]

whenever A1, A2, ..., An is a partition of the sample space.

3. Problem 2.22, page 123 in the text. Suppose a discrete random variable X can have only nonnegative integer values. Show that

    E[X] = Σ_{k=0}^{∞} P(X > k).

Recitation 5 Solutions
February 28, 2006

1. See online solutions.

2. Total expectation follows easily from total probability. This could be a good time to point out that the Total Probability Theorem and Total Expectation Theorem each have versions phrased with (a) conditioning on events forming a partition, and (b) conditioning on a discrete random variable. These are equivalent because the collection of events {Y = y} over all y is a partition. You could also point out that technically, when we write E[X] = Σ_y pY(y) E[X | Y = y], we had better only include in the summation those y such that P(Y = y) > 0.

3. Problem 2.22, page 123 in the text. The result follows by rewriting the expectation summation in the following manner:

    E[X] = Σ_{k=0}^{∞} k pX(k)
         = Σ_{k=1}^{∞} Σ_{ℓ=1}^{k} pX(k)
         = Σ_{ℓ=1}^{∞} Σ_{k=ℓ}^{∞} pX(k)
         = Σ_{ℓ=1}^{∞} P(X > ℓ − 1)
         = Σ_{n=0}^{∞} P(X > n).

The manipulations could look unmotivated, but if you sketch the k-ℓ plane, the interchange of summations is clear.
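The identity of problem 3 can be spot-checked numerically; the sketch below does so for a Binomial(10, 0.3) random variable, an arbitrary test case chosen because its PMF has finite support:

```python
# Check E[X] = sum over k >= 0 of P(X > k) for X ~ Binomial(10, 0.3).
from math import comb

n, p = 10, 0.3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
mean = sum(k * pk for k, pk in enumerate(pmf))
tail_sum = sum(sum(pmf[k + 1:]) for k in range(n + 1))  # P(X > k) summed
print(mean, tail_sum)  # both equal n*p = 3.0 up to rounding
```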
Recitation 6
March 2, 2006

1. Problem 2.32, page 127 in the text. Alice passes through four traffic lights on her way to work, and each light is equally likely to be green or red, independently of the others.
(a) What are the PMF, the mean, and the variance of the number of red lights that Alice encounters?
(b) Suppose that each red light delays Alice by exactly two minutes. What is the variance of Alice's commuting time?

2. Problem 2.38, page 132 in the text. Bernoulli's problem of joint lives. Consider 2m persons forming m couples who live together at a given time. Suppose that at some later time, the probability of each person being alive is p, independently of other persons. At that later time, let A be the number of persons that are alive and let S be the number of couples in which both partners are alive. For any number of total surviving persons a, find E[S | A = a].

3. Problem 2.40, page 132 in the text. A particular professor is known for his arbitrary grading policies. Each paper receives a grade from the set {A, A−, B+, B, B−, C+}, with equal probability, independently of other papers. How many papers do you expect to hand in before you receive each possible grade at least once?

Recitation 6 Answers
March 2, 2006

1. See online solutions.
2. See online solutions.
3. See online solutions.
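Problem 3 is a coupon-collector question: with 6 equally likely grades, the expected number of papers is 6(1/1 + 1/2 + ··· + 1/6) = 14.7. That figure is computed here, not quoted from the online solutions, and the simulation sketch below should agree with it:

```python
import random

def papers_until_all_grades(n_grades=6):
    # Draw equally likely grades until every one has appeared once.
    seen, count = set(), 0
    while len(seen) < n_grades:
        seen.add(random.randrange(n_grades))
        count += 1
    return count

trials = 50_000
sim = sum(papers_until_all_grades() for _ in range(trials)) / trials
exact = 6 * sum(1 / i for i in range(1, 7))   # = 14.7
print(sim, exact)
```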
Recitation 07
March 7, 2006

1. The random variable X is exponentially distributed with parameter λ:

    fX(x) = λ e^{−λx} for x ≥ 0, and 0 otherwise.

(a) Calculate E[X], var(X), and find P(X ≥ E[X]). Hint: P(X ≥ k) = ∫_k^∞ fX(x) dx.
(b) Find P(X ≥ t + k | X > t).

2. You are allowed to take a certain test three times, and your final score will be the maximum of the test scores. Your score in test i, where i = 1, 2, 3, takes one of the values from i to 10 with equal probability 1/(11 − i), independently of the scores in the other tests. What is the PMF of the final score?

3. Wanting to browse the net, Oscar uses his high-speed 300-baud modem to connect through his Internet Service Provider. The modem transmits bits in such a fashion that −1 is sent if a given bit is zero and +1 is sent if a given bit is one. We assume that the probability of the modem sending −1 is p and the probability of sending +1 is 1 − p. The telephone line has additive zero-mean Gaussian (normal) noise with variance σ^2 (so the receiver on the other end gets a signal which is the sum of the transmitted signal and the channel noise). The value of the noise is assumed to be independent of the encoded signal value.

(The handout shows a block diagram: Oscar's PC produces a binary signal, the modem encodes 0 as −1 and 1 as +1, the channel adds noise ~ N(0, σ^2), and a receiver/decoder decides whether 0 or 1 was sent.)

(a) Suppose we conclude that an encoded signal of −1 was sent when the value received on the other end of the line is less than a (where −1 < a < +1), and conclude that +1 was sent when the value is more than a. What is the probability of making an error?
(b) Answer part (a) assuming that p = 2/5, a = 1/2 and σ^2 = 1/4.

Recitation 07 Answers
March 7, 2006

1. (a) E[X] = 1/λ, var(X) = 1/λ^2, and P(X ≥ E[X]) = 1/e.
(b) P(X > t + k | X > t) = e^{−λk}. Note: the exponential random variable is memoryless.

2. We first compute the CDF FX(x) and then obtain the PMF as pX(k) = FX(k) − FX(k − 1). We have

    FX(k) = 0                             for k < 3,
    FX(k) = (k/10)((k − 1)/9)((k − 2)/8)  for 3 ≤ k ≤ 10,
    FX(k) = 1                             for 10 ≤ k.

3. (a) Writing S0, S1 for the events that −1 or +1 was sent, R0, R1 for the events that the decoder decides 0 or 1, and Z for the noise,

    P(error) = P(R1 | S0) P(S0) + P(R0 | S1) P(S1)
             = P(Z − 1 > a)·p + P(Z + 1 < a)·(1 − p)
             = p·[1 − Φ((a + 1)/σ)] + (1 − p)·Φ((a − 1)/σ)
             = p − p·Φ((a + 1)/σ) + (1 − p)·[1 − Φ((1 − a)/σ)]
             = 1 − p·Φ((a + 1)/σ) − (1 − p)·Φ((1 − a)/σ).

(b) P(error) = 1 − 0.4·Φ((3/2)/(1/2)) − 0.6·Φ((1/2)/(1/2)) = 1 − 0.4·Φ(3) − 0.6·Φ(1).
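Plugging the part (b) numbers into the formula, with Φ computed from the error function, gives roughly 0.096; this numeric value is computed here, not quoted from the handout:

```python
# Numeric evaluation of part (b): Φ(x) = (1 + erf(x/√2))/2.
from math import erf, sqrt

def Phi(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

p, a, sigma = 0.4, 0.5, 0.5
p_err = 1 - p * Phi((a + 1) / sigma) - (1 - p) * Phi((1 - a) / sigma)
print(p_err)   # ≈ 0.096
```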
By convolution.5.Massachusetts Institute of Technology Department of Electrical Engineering & Computer Science 6.0 ≤ w ≤ 1.1 + (1.          5(0. Thus. Let X and Y be the number of flips until Alice and Bob stop.  2   0.0 fW (w) =        0.9)).5.1 1. 2006 1. respectively.1 ≤ w ≤ 1.       0 ≤ w ≤ 0. 2 4j Page 1 of 2 .1 ≤ w ≤ 0. The random variables X and Y are independent geometric random variables with parameters 1/4 and 3/4.431: Probabilistic Systems Analysis (Spring 2006) Recitation 09 Answers March 21. (a) P [A] = 7 8 10 7 (b) P [Al wins 7 out of 10 races] = (c) fw (w0 ) = 2.         0. we have ∞ pX+Y (j) = k=−∞ j−1 pX (k)pY (j − k) (1/4)(3/4)k−1 (3/4)(1/4)j−k−1 k=1 j−1 = = = = 1 3k 4j k=1 1 4j 3j − 1 −1 3−1 3 (3j−1 − 1) . and 0 otherwise.041/6.Massachusetts Institute of Technology Department of Electrical Engineering & Computer Science 6.431: Probabilistic Systems Analysis (Spring 2006) if j ≥ 2.) Page 2 of 2 . it roughly behaves like one with parameter 3/4. (Even though X + Y is not geometric. (b) Find P (X = 0). c) Check your answers in (b) by computing the moments directly. 2. E[X 2 ]. E[X]. and indicate which one is the true transform. P (X = 2) = . P (X = 3) = . b) Use the transform in (a) to find the mean and the variance of X.431: Probabilistic Systems Analysis (Spring 2006) Recitation 10 March 23. 2 4 4 a) Find the transform of the above random variable. a) Find the transform of X.Massachusetts Institute of Technology Department of Electrical Engineering & Computer Science 6. 2006 1. A three sided die is described by the following probabilities: 1 1 1 P (X = 1) = . 3. E[X 3 ].041/6. Suppose a nonnegative discrete random variable has one of the following two expressions as its transform: (i) MX (s) = e2(e (ii) MX (s) = e2(e s−1 −1) s −1) (a) Explain why one of the two could not possibly be its transform. Suppose X is uniformly distributed between a and b. Page 1 of 1 . b) Use the transform to find the first three moments. r(b − a) b b) To find the mean and the variance we use the moment generating properties of the transform. E[X 2 ] = � d2 � E[e−rx ]� 2 r=0 dr �� �� � � −ra � � −ra � 2 −ra 2 1 e − e−rb 2 ae − be−rb a e − b2 e−rb � � = + + � � b−a r3 b−a r2 b−a r r=0 (L′ H opital) = ˆ = 1 b3 − a3 a3 − b3 b3 − a3 + + 3 b−a b−a b−a 1 2 (b + ab + a2 ) 3 and therefore we have: 1 V ar[X] = E[X ] − E[X] = (b2 + ab + a2 ) − 3 2 2 � b+a 2 �2 2. E[e −rx ] = = � e−rx a b−a e−ra − e−rb . 2006 1. namely: � d � E[X n ] = (−1)n E[e−rx ]� r=0 dr Thus we have: E[X] = − � d � E[e−rx ]� r=0 dr �� �� � � −rb � −rb � e − e−ra 1 be − ae−ra � 1 + = − � � b−a r2 b−a r r=0 (L′ H opital) = − ˆ = b2 − a2 a2 − b2 − b−a b−a b+a . 2 To find the Variance we need to find E[X 2 ] and thus we need to take the second derivative of the transform and evaluate at r = 0.041/6. against an exponential. we integrate the density function over its full domain.Massachusetts Institute of Technology Department of Electrical Engineering & Computer Science 6. The transform for nonegative integer valued random variables is defined as: pT (z) = x ∞ � i=1 z xi P (X = xi ) = E[z X ] Page 1 of 2 .431: Probabilistic Systems Analysis (Spring 2006) Recitation 10 Solutions March 23. a) To find the transform. This is often expressed as finding the expected value of the function e−rx . 
Recitation 09
March 21, 2006

1. Al and Bo are in a race. Denote Al's and Bo's elapsed times with random variables X and Y, respectively. These independent random variables are described by the following PDFs:

    fX(x) = 0.5 for 1 < x < 3, 0 elsewhere,
    fY(y) = 0.5 for 2 < y < 4, 0 elsewhere.

(a) Determine P(A), the probability that Al wins the race.
(b) Determine the probability that Al wins a total of exactly 7 of the next 10 races. Assume all races are independent. You may use P(A) symbolically in your answer; as long as your answer is explicit, it need not be simplified.
(c) Determine, carefully sketch, and label the PDF of the elapsed time for the winner of the race. Fully explain each step of your work.

2. Random variables X and Y are independent and have PDFs as shown in the handout's figure (piecewise-constant PDFs supported inside [0, 1], with heights up to 5). Let W = X + Y, and find fW(w) using a graphical argument.

3. Alice and Bob flip biased coins independently. Alice's coin comes up heads with probability 1/4, while Bob's coin comes up heads with probability 3/4. Each stops as soon as they get a head; that is, Alice stops flipping when she gets a head, and Bob stops flipping when he gets a head. What is the PMF of the combined total number of flips for both Alice and Bob until they stop?

Recitation 09 Answers
March 21, 2006

1. (a) P(A) = 7/8.
(b) P(Al wins 7 out of 10 races) = C(10, 7) (7/8)^7 (1/8)^3.
(c) With W denoting the winner's elapsed time,

    fW(w0) = 1/2             for 1 < w0 ≤ 2,
    fW(w0) = 7/4 − w0/2      for 2 < w0 ≤ 3,
    fW(w0) = 0               otherwise.

2. For w = x + y with x, y independent,

    fW(w) = ∫_{−∞}^{∞} fX(x) fY(w − x) dx.

This operation is called the convolution of fX(x) and fY(y). (The handout evaluates it graphically; the resulting fW(w) is piecewise linear, with pieces including 5w, a constant 0.5 over the middle range, 5(0.1 + (w − 0.9)), and 5(2.0 − w) over successive ranges of w. See the original figure for the full sketch.)

3. Let X and Y be the number of flips until Alice and Bob stop, respectively. The random variables X and Y are independent geometric random variables with parameters 1/4 and 3/4, respectively, and X + Y is the total number of flips until both stop. By convolution, we have

    pX+Y(j) = Σ_{k=−∞}^{∞} pX(k) pY(j − k)
            = Σ_{k=1}^{j−1} (1/4)(3/4)^{k−1} (3/4)(1/4)^{j−k−1}
            = Σ_{k=1}^{j−1} 3^k / 4^j
            = (1/4^j) (3^j − 3)/(3 − 1)
            = 3 (3^{j−1} − 1) / (2 · 4^j)

for j ≥ 2, and 0 otherwise. (Even though X + Y is not geometric, its tail decays by a factor of 3/4 per added flip, so it roughly behaves like a geometric with parameter 1/4.)
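The closed form for pX+Y(j) can be checked directly against the defining convolution:

```python
# Verify p_{X+Y}(j) = 3(3^{j-1} - 1)/(2 * 4^j) for j = 2..12.
p, q = 0.25, 0.75            # Alice's and Bob's head probabilities

def geom(pp, k):             # P(first head on flip k)
    return pp * (1 - pp) ** (k - 1)

for j in range(2, 13):
    conv = sum(geom(p, k) * geom(q, j - k) for k in range(1, j))
    closed = 3 * (3 ** (j - 1) - 1) / (2 * 4 ** j)
    assert abs(conv - closed) < 1e-12
print("closed form verified")
```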
Recitation 10
March 23, 2006

1. Suppose X is uniformly distributed between a and b.
a) Find the transform of X.
b) Use the transform in (a) to find the mean and the variance of X.

2. A three-sided die is described by the following probabilities:

    P(X = 1) = 1/2,  P(X = 2) = 1/4,  P(X = 3) = 1/4.

a) Find the transform of the above random variable.
b) Use the transform to find the first three moments, E[X], E[X^2], E[X^3].
c) Check your answers in (b) by computing the moments directly.

3. Suppose a nonnegative discrete random variable has one of the following two expressions as its transform:

    (i)  MX(s) = e^{2(e^{s−1} − 1)}
    (ii) MX(s) = e^{2(e^s − 1)}

(a) Explain why one of the two could not possibly be its transform, and indicate which one is the true transform.
(b) Find P(X = 0).

Recitation 10 Solutions
March 23, 2006

1. a) To find the transform, we integrate the density function over its full domain. This is often expressed as finding the expected value of e^{−rX}:

    E[e^{−rX}] = ∫_a^b e^{−rx}/(b − a) dx = (e^{−ra} − e^{−rb}) / (r(b − a)).

b) To find the mean and the variance we use the moment-generating properties of the transform, namely E[X^n] = (−1)^n (d^n/dr^n) E[e^{−rX}] evaluated at r = 0. Evaluating the limits at r = 0 with L'Hôpital's rule,

    E[X] = −(d/dr) E[e^{−rX}] at r = 0 = (b + a)/2,
    E[X^2] = (d^2/dr^2) E[e^{−rX}] at r = 0 = (b^2 + ab + a^2)/3,

and therefore

    var(X) = E[X^2] − E[X]^2 = (b^2 + ab + a^2)/3 − ((b + a)/2)^2.

2. a) The transform for nonnegative integer valued random variables is defined as

    p^T_X(z) = Σ_i z^{x_i} P(X = x_i) = E[z^X],

so here E[z^X] = z/2 + z^2/4 + z^3/4.

b) We observe that if we take n derivatives of the transform and evaluate at z = 1, we obtain a linear combination of the first n moments:

    (d/dz) E[z^X] at z = 1 = E[X],
    (d^2/dz^2) E[z^X] at z = 1 = E[X^2] − E[X],
    (d^3/dz^3) E[z^X] at z = 1 = E[X^3] − 3E[X^2] + 2E[X],

and therefore

    E[X] = 1/2 + 1/2 + 3/4 = 7/4,
    E[X^2] = (1/2 + 3/2) + 7/4 = 15/4,
    E[X^3] = 6/4 + 3(15/4) − 2(7/4) = 6/4 + 45/4 − 14/4 = 37/4.

c) Direct computation thankfully produces the same results.

3. a) Note that by the definition of the transform, MX(s) = Σ_x e^{sx} pX(x), and therefore when evaluated at s = 0 the transform should equal 1. Only the second option satisfies this requirement, so (ii) is the true transform.

b) The transform (ii) is that of a Poisson random variable with parameter λ = 2. Hence the PMF is

    pX(k) = e^{−λ} λ^k / k!,

and P(X = 0) = pX(0) = e^{−2}.
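The moment computations in solution 2 can be reproduced symbolically; a short sketch assuming sympy is available:

```python
# Moments of the three-sided die via derivatives of E[z^X] at z = 1.
import sympy as sp

z = sp.symbols('z')
M = z/2 + z**2/4 + z**3/4                           # E[z^X]
EX  = sp.diff(M, z).subs(z, 1)                      # 7/4
EX2 = sp.diff(M, z, 2).subs(z, 1) + EX              # 15/4
EX3 = sp.diff(M, z, 3).subs(z, 1) + 3*EX2 - 2*EX    # 37/4
print(EX, EX2, EX3)
```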
Recitation 11
April 4, 2006

1. A number p is drawn from the interval [0, 1] according to the uniform distribution, and then a sequence of independent Bernoulli trials is performed, each with success probability p. What is the mean and the variance of the number of successes in k trials?

2. Imagine that the number of people that enter a bar in a period of 15 minutes has a Poisson distribution with rate λ. Each person who comes in buys a drink, and each person is equally likely to choose any type of drink, independently of what anyone else chooses. If there are N types of drinks, find the expected number of different types of drinks the bartender will have to make.

Recitation 11 Answers
April 4, 2006

1. E[X] = k/2, var(X) = k/6 + k^2/12.

2. E[D] = N − N e^{−λ/N}.

Recitation 12
April 6, 2006

1. Widgets are packed into cartons which are packed into crates. The weight (in pounds) of a widget, X, is a continuous random variable with PDF

    fX(x) = λe^{−λx}, x ≥ 0.

The number of widgets in any carton, N, is a random variable with PMF

    pN(n) = p^{n−1}(1 − p), n = 1, 2, ....

The number of cartons in a crate, K, is a random variable with PMF

    pK(k) = μ^k e^{−μ} / k!, k = 0, 1, 2, ....

Random variables X, K, and N are mutually independent. Determine:
(a) The probability that a randomly selected crate contains exactly one widget.
(b) The expected value and variance of the number of widgets in a crate.
(c) The transform or the PDF for the total weight of the widgets in a crate.
(d) The expected value and variance of the total weight of the widgets in a crate.

2. Using a fair three-sided die (construct one, if you dare), we will decide how many times to spin a fair wheel of fortune. The die has the numbers 1, 2 and 3 on its faces, and the wheel of fortune is calibrated infinitely finely and has numbers between 0 and 1. Whichever number results from our throw of the die, we will spin the wheel of fortune that many times and add the results to obtain random variable Y.
(a) Determine the expected value of Y.
(b) Determine the variance of Y.

Recitation 12 Solutions
April 6, 2006

1. (a) The minimum mean squared error estimator g(Y) is known to be g(Y) = E[X | Y]. First note that the estimator X̂ should be a random variable, not a number. Since Y = X + W, take conditional expectations to get

    Y = E[Y | Y] = E[X | Y] + E[W | Y].

Since fX,W(x, w) is symmetric in its two arguments (i.e., fX,W(x, w) = fX,W(w, x)), there is complete symmetry between X and W, so we also have E[X | Y] = E[W | Y], which finally yields E[X | Y] = Y/2. Note that this argument does not require X and W to be independent, only symmetry of the joint distribution.

(b) In the dependent case, though, we really need the joint distribution in order to compute the conditional expectations; we are to minimize over all random variables X̂ that can be expressed as functions of Y. If fX,W(x, w) is not symmetric, we cannot simply conclude that E[X | Y] = E[W | Y] in general, so one cannot really solve the problem with the available information. (When the symmetry does hold, the solution given in the independent case still works.)

2. Let us first find fX,Y(x, y). Since the joint distribution is uniform in the defined region, we can write

    fY|X(y|x) = 1/2 if x − 1 ≤ y ≤ x + 1, and 0 otherwise,

and therefore

    fX,Y(x, y) = fY|X(y|x) · fX(x) = 1/10 if x − 1 ≤ y ≤ x + 1 and 5 ≤ x ≤ 10, and 0 otherwise.

(The handout plots this diagonal strip.)

(a) We now compute E[X | Y] by first determining fX|Y(x|y). This can be done by looking at the horizontal line crossing the compound PDF: since fX,Y is uniform, fX|Y(x|y) is uniformly distributed as well, and E[X | Y = y] is the midpoint of the corresponding horizontal segment:

    g(y) = E[X | Y = y] = (5 + (y + 1))/2   if 4 ≤ y < 6,
    g(y) = y                                 if 6 ≤ y ≤ 9,
    g(y) = (10 + (y − 1))/2                  if 9 < y ≤ 11.

Note that g(Y) is piecewise linear in this problem. (The handout plots g(y).)

(b) The linear least squares estimator has the form

    gL(Y) = E[X] + (cov(X, Y)/σY^2)(Y − E[Y]),

where cov(X, Y) = E[(X − E[X])(Y − E[Y])]. We compute E[X] = 7.5 and, using the fact that X and W are independent, E[Y] = E[X] + E[W] = 7.5, σX^2 = (10 − 5)^2/12 = 25/12, σW^2 = (1 − (−1))^2/12 = 4/12, and σY^2 = σX^2 + σW^2 = 29/12. Furthermore,

    cov(X, Y) = E[(X − E[X])(X − E[X] + W − E[W])]
              = E[(X − E[X])^2] + E[X − E[X]] E[W − E[W]] = σX^2 = 25/12,

where we use the fact that (X − E[X]) and (W − E[W]) are independent and E[X − E[X]] = 0 = E[W − E[W]]. Therefore,

    gL(Y) = 7.5 + (25/29)(Y − 7.5).

(The handout's figure compares the linear predictor gL(y) with g(y).)

3. The problem asks us to find P(X1^2 + X2^2 ≤ α). The information given completely determines the joint density function, so we need only perform the integration, switching to polar coordinates:

    P(X1^2 + X2^2 ≤ α) = ∫∫_{x1^2 + x2^2 ≤ α} (1/2π) e^{−(x1^2 + x2^2)/2} dx1 dx2
                       = ∫_0^{2π} ∫_0^{√α} (1/2π) e^{−r^2/2} r dr dθ
                       = 1 − e^{−α/2}.
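Solution 3's closed form is easy to sanity-check by simulation, assuming (as the density used in the solution indicates) that X1 and X2 are independent standard normals; α = 2 below is an arbitrary test value:

```python
import math, random

a, trials = 2.0, 200_000
hits = sum(random.gauss(0, 1)**2 + random.gauss(0, 1)**2 <= a
           for _ in range(trials))
print(hits / trials, 1 - math.exp(-a / 2))   # both ≈ 0.632
```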
Recitation 13 Solutions
April 11, 2006

1. E[AB] = E[(W + X)(X + Y)] = E[WX + X^2 + XY + WY] = E[X^2] = var(X) + E[X]^2 = 1.
   E[AC] = E[(W + X)(Y + Z)] = E[WY + XY + XZ + WZ] = 0.

2. Solution is in the text, pp. 264-265.

3. Solution is in the text, pp. 267-268.

Recitation 14
April 11, 2006

1. Suppose four random variables, W, X, Y and Z, are known to be pairwise uncorrelated and to satisfy E[W] = E[X] = E[Y] = E[Z] = 0 and var(W) = var(X) = var(Y) = var(Z) = 1. Let A = W + X, B = X + Y and C = Y + Z. Compute E[AB] and E[AC], and the correlations between A and B and between A and C.

2. (Problem 4.25) Correlation Coefficient. Consider the correlation coefficient

    ρ(X, Y) = cov(X, Y) / sqrt(var(X) var(Y))

of two random variables X and Y that have positive variances. Show that:
(a) |ρ(X, Y)| ≤ 1. Hint: Use the Schwarz inequality: (E[XY])^2 ≤ E[X^2] E[Y^2].
(b) If Y − E[Y] is a positive multiple of X − E[X], then ρ(X, Y) = 1.
(c) If Y − E[Y] is a negative multiple of X − E[X], then ρ(X, Y) = −1.
(d) If ρ(X, Y) = 1, then, with probability 1, Y − E[Y] is a positive multiple of X − E[X].
(e) If ρ(X, Y) = −1, then, with probability 1, Y − E[Y] is a negative multiple of X − E[X].

3. (Problem 4.29) Let X and Y be two random variables with positive variances.
(a) Let X̂L be the linear least mean squares estimator of X based on Y. Show that E[(X − X̂L)Y] = 0. Use this property to show that the correlation of the estimation error X − X̂L with Y is zero.
(b) Let X̂ = E[X | Y] be the least mean squares estimator of X given Y. Show that E[(X − X̂)h(Y)] = 0 for any function h.
(c) Is it true that the estimation error X − E[X | Y] is independent of Y?

Recitation 14 Solutions
April 11, 2006

1. We know that ρ(X1, X2) = Cov(X1, X2)/(σX1 σX2). Therefore we first find the covariance:

    Cov(A, B) = E[AB] − E[A]E[B] = E[WX + WY + X^2 + XY] = E[X^2] = 1,

and σA = σB = sqrt(Var(A)) = sqrt(2), and therefore ρ(A, B) = 1/2. We proceed as above to find the correlation of A and C:

    Cov(A, C) = E[AC] − E[A]E[C] = E[WY + WZ + XY + XZ] = 0,

and therefore ρ(A, C) = 0.

2. Solution is in the text, pp. 264-265.

3. Solution is in the text, pp. 267-268.

Recitation 15
April 20, 2006

1. Let X1, X2, ... be a sequence of independent random variables that are uniformly distributed between 0 and 1. For every n, we let Yn be the median of the values of X1, X2, ..., X2n+1. [That is, we order X1, ..., X2n+1 in increasing order and let Yn be the (n + 1)st element in this ordered sequence.] Show that the sequence Yn converges to 1/2, in probability.

2. Joe wishes to estimate the true fraction f of smokers in a large population without asking each and every person. He plans to select n people at random and then employ the estimator F = S/n, where S denotes the number of people in a size-n sample who are smokers. Joe would like to sample the minimum number of people, but also guarantee an upper bound p on the probability that the estimator F differs from the true value f by a value greater than or equal to d; i.e., Joe wishes to select the minimum n such that

    P(|F − f| ≥ d) ≤ p.

For p = 0.05 and a particular value of d, Joe uses the Chebyshev inequality to conclude that n must be at least 50,000. Determine the new minimum value for n if:
(a) the value of d is reduced to half of its original value;
(b) the probability p is reduced to half of its original value, i.e., p = 0.025.

3. Let X and Y be random variables, where X takes nonnegative values, and let a and b be scalars.
(a) Use the Markov inequality on the random variable e^{sY} to show that

    P(Y ≥ b) ≤ e^{−sb} MY(s), for every s > 0,

where MY(s) is the transform of Y.

Recitation 15 Solutions
April 20, 2006

1. Let us fix some ε > 0. We will show that P(Yn ≥ 0.5 + ε) converges to 0; by symmetry, this will imply that P(Yn ≤ 0.5 − ε) also converges to zero, and it will follow that Yn converges to 0.5 in probability. Let Zi be a Bernoulli random variable which is equal to 1 if and only if Xi ≥ 0.5 + ε, and 0 otherwise. Then Z1, Z2, ... are i.i.d. random variables with

    E[Zi] = P(Zi = 1) = P(Xi ≥ 0.5 + ε) = 0.5 − ε.

For the event {Yn ≥ 0.5 + ε} to occur, we must have at least n + 1 of the random variables X1, ..., X2n+1 take a value of 0.5 + ε or larger, i.e., at least n + 1 of the {Zi} must take value 1. Hence

    P(Yn ≥ 0.5 + ε) = P( Σ_{i=1}^{2n+1} Zi ≥ n + 1 )
                    = P( (Σ_{i=1}^{2n+1} Zi)/(2n + 1) ≥ 0.5 + 1/(2(2n + 1)) )
                    ≤ P( (Σ_{i=1}^{2n+1} Zi)/(2n + 1) ≥ 0.5 ).

By the weak law of large numbers, the sequence (Z1 + ··· + Z2n+1)/(2n + 1) converges, in probability, to E[Zi] = 0.5 − ε. Since 0.5 = (0.5 − ε) + ε, the probability on the right-hand side converges to zero: the fact that the sequence converges to 0.5 − ε ensures the existence of an N such that the probability is arbitrarily small for all n > N.

2. (a) n = 200,000. (b) n = 100,000.

3. See textbook, pg. 399.
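The Chebyshev bound behind solution 2 is P(|F − f| ≥ d) ≤ var(F)/d^2 ≤ 1/(4nd^2), since var(F) = f(1 − f)/n ≤ 1/(4n). The handout does not state d, but d = 0.01 is the value consistent with n = 50,000 at p = 0.05; a sketch:

```python
# Minimum n from Chebyshev: 1/(4 n d^2) <= p  =>  n >= 1/(4 p d^2).
from math import ceil

def min_n(p, d):
    return ceil(1 / (4 * p * d * d))

p, d = 0.05, 0.01          # d inferred from the handout's n = 50,000
print(min_n(p, d))         # 50000
print(min_n(p, d / 2))     # (a) 200000: quartering comes from d^2
print(min_n(p / 2, d))     # (b) 100000: n scales like 1/p
```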
Recitation 16
April 25, 2006

1. (Problem 5.4) Consider a Bernoulli process with probability of success in each trial equal to p.
(a) Relate the number of failures before the rth success (sometimes called a negative binomial random variable) to a Pascal random variable and derive its PMF.
(b) Find the expected value and variance of the number of failures before the rth success.
(c) Obtain an expression for the probability that the ith failure occurs before the rth success.

2. (Example 5.6) Sum of a geometric number of independent geometric random variables. Let Y = X1 + X2 + ··· + XN, where the random variables Xi are geometric with parameter p, and N is geometric with parameter q. Assume that the random variables N, X1, X2, ... are independent. Show, without using transforms, that Y is geometric with parameter pq. Hint: Interpret the various random variables in terms of a split Bernoulli process.

3. (Problem 5.3) A computer executes two types of tasks, priority and non-priority, and operates in discrete time units (slots). A priority task arises with probability p at the beginning of each slot, independently of other slots, and requires one full slot to complete. A non-priority task is always available and is executed at a given slot if no priority task is available. In this context, it may be important to know the probabilistic properties of the time intervals available for non-priority tasks. With this in mind, let us call a slot busy if within this slot the computer executes a priority task, and otherwise let us call it idle. We call a string of idle (or busy) slots, flanked by busy (or idle, respectively) slots, an idle period (or busy period, respectively). Derive the PMF, mean, and variance of the following random variables:
(a) T = the time index of the first idle slot;
(b) B = the length (number of slots) of the first busy period;
(c) I = the length of the first idle period;
(d) Z = the number of slots after the first slot of the first busy period up to and including the first subsequent idle slot.

Recitation 16 Solutions
April 25, 2006

1. (Problem 5.4) See textbook, page 303.
2. (Example 5.6) See textbook, page 276.
3. (Problem 5.3) See textbook, page 302.
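For problem 1, the number of failures before the rth success has PMF C(m + r − 1, m) p^r (1 − p)^m for m = 0, 1, .... A quick numerical sanity check, with arbitrary test values r = 3 and p = 0.4, and with the mean r(1 − p)/p computed here rather than quoted from the text:

```python
from math import comb

r, p = 3, 0.4
# PMF of the number of failures m before the r-th success.
pmf = [comb(m + r - 1, m) * p**r * (1 - p)**m for m in range(2000)]
print(sum(pmf))                               # ~ 1.0
print(sum(m * q for m, q in enumerate(pmf)))  # ~ r(1-p)/p = 4.5
```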
Recitation 17
April 27, 2006

1. (Problem 5.12) Beginning at time t = 0 we begin using bulbs, one at a time, to illuminate a room. Bulbs are replaced immediately upon failure. Each new bulb is selected independently by an equally likely choice between a Type-A bulb and a Type-B bulb. The lifetime, X, of any particular bulb of a particular type is an independent random variable with the following PDF:

    For Type-A bulbs: fX(x) = e^{−x} for x ≥ 0, 0 elsewhere.
    For Type-B bulbs: fX(x) = 3e^{−3x} for x ≥ 0, 0 elsewhere.

(a) Find the expected time until the first failure.
(b) Find the probability that there are no bulb failures before time t.
(c) Given that there are no failures until time t, determine the conditional probability that the first bulb used is a Type-A bulb.
(d) Find the variance of the time until the first bulb failure.
(e) Find the probability that the 12th bulb failure is also the 4th Type-A bulb failure.
(f) Up to and including the 12th bulb failure, what is the probability that a total of exactly 4 Type-A bulbs have failed?
(g) Determine either the PDF or the transform associated with the time until the 12th bulb failure.
(h) Determine the probability that the total period of illumination provided by the first two Type-B bulbs is longer than that provided by the first Type-A bulb.

2. An amateur criminal is contemplating shoplifting from a store. Police officers walk by the store according to a Poisson process of rate λ per minute. If an officer walks by while the crime is in progress, the criminal will be caught.
(a) If it takes the criminal t seconds to commit the crime, find the probability that the criminal will be caught.
(b) Repeat part (a) under the new assumption that the criminal will only be caught if two police officers happen to walk by while the crime is in progress.

3. (Problem 5.16) Consider a Poisson process. Given that a single arrival occurred in a given interval [0, t], show that the PDF of the arrival time is uniform over [0, t].

Recitation 17 Solutions
April 27, 2006

1. (a) The arrival time of each of the three calls is uniformly distributed in the interval of 90 minutes (see Problem 16 in Chapter 5 of the text). Furthermore, the three arrival times are independent of each other. This follows intuitively from the definition of the Poisson process: given that there was an arrival at some particular time, this gives us no information on what may have happened at other times. Therefore the probability that all three occur within the first 30 minutes is (1/3)^3 = 1/27.
(b) The probability that at least one occurs in the first 30 minutes is, by the same reasoning as above, 1 − (8/27) = 19/27.

2. (a) The criminal will be caught if the first officer comes by in fewer than t time units. Since the time until the first arrival is exponentially distributed, the desired probability is 1 − e^{−λt}.
(b) We are interested in the probability that the second arrival occurs before time t. The Erlang PDF of order 2 is fY2(y) = λ^2 y e^{−λy}. The desired probability is obtained by integrating by parts and is 1 − e^{−λt}(λt + 1).

3. (Problem 5.16) See textbook, page 307.
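The part (b) answer for problem 2 can be confirmed by simulating the Poisson process directly; λ = 0.1 and t = 20 below are arbitrary test values:

```python
import math, random

lam, t, trials = 0.1, 20.0, 200_000
caught = 0
for _ in range(trials):
    # Generate Poisson arrivals in [0, t] from exponential gaps.
    arrivals, s = 0, random.expovariate(lam)
    while s <= t:
        arrivals += 1
        s += random.expovariate(lam)
    caught += arrivals >= 2          # caught iff two officers walk by
print(caught / trials, 1 - math.exp(-lam * t) * (1 + lam * t))
```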
Recitation 18
May 2, 2006

1. (Example 5.15) Competing Exponentials. Two light bulbs have independent and exponentially distributed lifetimes Ta and Tb, with parameters λa and λb, respectively. What is the distribution of Z = min{Ta, Tb}, the first time when a bulb burns out?

2. (Example 5.16) More on Competing Exponentials. Three light bulbs have independent exponentially distributed lifetimes with a common parameter λ. What is the expected value of the time until the last bulb burns out?

3. (Problem 5.17a) Let X1 and X2 be independent and exponentially distributed, with parameters λ1 and λ2, respectively. Find the expected value of max{X1, X2}.

4. (Problem 5.21) The number of Poisson arrivals during an exponentially distributed interval. Consider a Poisson process with parameter λ, and an independent random variable T, which is exponential with parameter ν. Find the PMF of the number of Poisson arrivals during the time interval [0, T].

Recitation 18 Solutions
May 2, 2006

1. Example 5.15 in text, page 296.
2. Example 5.16 in text, page 296.
3. Problem 5.17a in text, page 307.
4. Problem 5.21 in text, page 310.

Recitation 19
May 4, 2006

1. (Example 6.3) A machine can be either working or broken down on a given day. If it is working, it will break down in the next day with probability b, and will continue working with probability 1 − b. If it breaks down on a given day, it will be repaired and be working in the next day with probability r, and will continue to be broken down with probability 1 − r. Suppose that whenever the machine remains broken for a given number of l days, despite the repair efforts, it is replaced by a new working machine. Model this machine as a Markov chain.

2. (Problem 6.4) Existence of a recurrent state. Show that in a Markov chain at least one recurrent state must be accessible from any given state, i.e., for any i, there is at least one recurrent j in the set A(i) of accessible states from i.

3. (Problem 6.3) A spider and a fly move along a straight line in unit increments. The spider always moves towards the fly by one unit. The fly moves towards the spider by one unit with probability 0.3, moves away from the spider by one unit with probability 0.3, and stays in place with probability 0.4. The initial distance between the spider and the fly is integer. When the spider and the fly land in the same position, the spider captures the fly.
(a) Construct a Markov chain that describes the relative location of the spider and fly.
(b) Identify the transient and recurrent states.
(c) Assume that the initial distance between the spider and the fly is 2 units. Provide a recursive formula to evaluate the n-step transition probabilities rij(n), i, j ∈ {0, 1, 2}. Compute r2i(3), i ∈ {0, 1, 2}. Can you infer the limiting behavior of the n-step transition probabilities?

Recitation 19 Solutions
May 4, 2006

1. Example 6.3 in text, page 316.

2. Problem 6.4 in text, page 354.

3. Parts (a) and (b): Problem 6.3 in text, page 354.
(c) The n-step transition probabilities can be generated by the recursive formula

    rij(n) = Σ_{k=0}^{2} rik(n − 1) pkj   for n > 1 and all i, j,

starting with rij(1) = pij, where (with states 0, 1, 2 denoting the distance between spider and fly)

    [pij] = | 1    0    0   |
            | 0.4  0.6  0   |
            | 0.3  0.4  0.3 |

Plugging into the above formula gives

    [rij(2)] = | 1     0     0    |      [rij(3)] = | 1      0      0     |
               | 0.64  0.36  0    |                 | 0.784  0.216  0     |
               | 0.55  0.36  0.09 |                 | 0.721  0.252  0.027 |

and, similarly,

    [rij(5)] ≈ | 1      0      0     |    [rij(10)] ≈ | 1      0      0 |
               | 0.922  0.078  0     |                | 0.994  0.006  0 |
               | 0.897  0.100  0.002 |                | 0.992  0.008  0 |

Eventually the spider will catch the fly, thus

    lim_{n→∞} [rij(n)] = | 1  0  0 |
                         | 1  0  0 |
                         | 1  0  0 |
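The matrix powers in solution 3(c) can be reproduced mechanically with numpy:

```python
# n-step transition probabilities for the spider-and-fly chain.
import numpy as np

P = np.array([[1.0, 0.0, 0.0],
              [0.4, 0.6, 0.0],
              [0.3, 0.4, 0.3]])
for n in (2, 3, 5, 10):
    print(n)
    print(np.linalg.matrix_power(P, n).round(3))
```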
Recitation 20: Markov Chains, Steady State Behavior
May 9, 2006

1. (Problem 6.9) A professor gives tests that are hard, medium or easy. If she gives a hard test, her next test will be either medium or easy, with equal probability. However, if she gives a medium or easy test, there is a 0.5 probability that her next test will be of the same difficulty, and a 0.25 probability of each of the other two levels of difficulty. Construct an appropriate Markov chain and find the steady-state probabilities.

2. (Problem 6.10) Alvin likes to sail each Saturday to his cottage on a nearby island off the coast. Alvin is an avid fisherman, and enjoys fishing off his boat on the way to and from the island, as long as the weather is good. Unfortunately, the weather is good on the way to or from the island with probability p, independently of what the weather was on any past trip (so the weather could be nice on the way to the island, but poor on the way back). If the weather is nice, Alvin will take one of his n fishing rods for the trip, but if the weather is bad, he will not bring a fishing rod with him.
(a) Formulate an appropriate Markov chain model with n + 1 states and find the steady-state probabilities.
(b) What is the steady-state probability that on a given trip, Alvin sails with nice weather but without a fishing rod? That is, we want the probability that on a given leg of the trip to or from the island the weather will be nice, but Alvin will not fish because all his fishing rods are at his other home.

3. (Problem 6.13) Ehrenfest model of diffusion. We have a total of n balls, some of them black, some white. At each time step, we either do nothing, which happens with probability ε, where 0 < ε < 1, or we select a ball at random, so that each ball has probability (1 − ε)/n > 0 of being selected. In the latter case, we change the color of the selected ball (if white it becomes black, and vice versa), and the process is repeated indefinitely. What is the steady-state distribution of the number of white balls?

Solutions for Recitation 20: Markov Chains, Steady State Behavior
May 9, 2006

1. See online solutions.
2. See online solutions.
3. See online solutions.
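For problem 1, the steady-state probabilities can also be computed numerically. The transition matrix below encodes the problem statement; the resulting distribution (0.2, 0.4, 0.4) for (hard, medium, easy) is this sketch's own output, not a value quoted from the online solutions:

```python
# Solve pi P = pi with sum(pi) = 1 for the hard/medium/easy test chain.
import numpy as np

P = np.array([[0.0,  0.5,  0.5 ],   # hard  -> medium or easy, equally likely
              [0.25, 0.5,  0.25],   # medium-> same 0.5, others 0.25 each
              [0.25, 0.25, 0.5 ]])  # easy  -> same 0.5, others 0.25 each

# Two balance equations plus the normalization constraint.
A = np.vstack([(P.T - np.eye(3))[:2], np.ones(3)])
pi = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))
print(pi)   # [0.2, 0.4, 0.4]
```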
Recitation 21: Markov Chains, Absorption Probabilities and Expected Time to Absorption
May 11, 2006

1. Josephina is currently a 6-1 student. On each day that she is a 6-1 student, she has a probability of 1/2 of being a course 6-1 student the next day, and otherwise she has an equally likely chance of becoming a 6-2 student, a 6-3 student, a course 9 student or a course 15 student the next day. On any day she is a 6-2 student, she has a probability of 1/2 of switching to course 15, a probability of 3/8 of switching to 6-1 and a probability of 1/8 of switching to 6-3 the next day. On any day she is a 6-3 student, she has a probability of 1/4 of switching to course 9, a probability of 3/8 of switching to 6-1 and a probability of 3/8 of switching to 6-2 the next day. In answering the questions below, assume Josephina will be a student forever. Also assume, for parts (a)-(f), that if Josephina switches to course 9 or course 15, she will stay there and will not change her course again.

(a) What is the probability that she eventually will leave course 6?

(b) What is the probability that she will eventually be in course 15?

(c) What is the expected number of days until she leaves course 6?

(d) Every time she switches into 6-1 from 6-2 or 6-3, she buys herself an ice cream cone at Tosci's. She can only afford so much ice cream, so after she's eaten 2 ice cream cones, she stops buying herself ice cream. What is the expected number of ice cream cones she buys herself before she leaves course 6?

(e) Her friend Oscar started out just like Josephina. He is now in course 15. You don't know how long it took him to switch. What is the expected number of days it took him to switch to course 15? [Hint: He had no particular aversion to course 9.]

(f) Josephina decides that course 15 is not in her future. When she is a 6-1 student, she still stays 6-1 the next day with probability 1/2, and otherwise she has an equally likely chance of becoming any of the other options (6-2, 6-3, or course 9). When she is 6-2, her probabilities of entering 6-1 or 6-3 are in the same proportion as before. What is the expected number of days until she is in course 9?

(g) Suppose instead that if she is in course 9 or course 15, she has probability 1/8 of returning to 6-1, and otherwise she remains in her current course. What is the expected number of days until she is 6-1 again? (Notice that we know today she is 6-1, so if tomorrow she is still 6-1, then the number of days until she is 6-1 again is 1.)
(e) We want to find the expected time to absorption conditioned on the event that the student eventually ends up in state 15; call this event A. The transient states have different absorption probabilities a_i, which means that we may not simply renormalize the transition probabilities in a uniform fashion after conditioning on this event. Instead,

    P_{i,j|A} = P(X_{n+1} = j | X_n = i, X_∞ = 15)
              = P(X_∞ = 15 | X_{n+1} = j) P(X_{n+1} = j | X_n = i) / P(X_∞ = 15 | X_n = i)
              = a_j P_{i,j} / a_i,

where a_k is the absorption probability of eventually ending up in state 15 conditioned on being in state k, which we found in part (b). Note that P_{i,i|A} = P_{i,i}, since a_i/a_i = 1. So we may modify our chain with these new conditional probabilities and calculate the expected time to absorption on the new chain. Let us denote the new expected time to absorption, conditioned on being in state i, as µ̃_i. Our system of equations now becomes

    µ̃_15  = 0
    µ̃_6-1 = 1 + (1/2)(a_6-1/a_6-1) µ̃_6-1 + (1/8)(a_6-2/a_6-1) µ̃_6-2 + (1/8)(a_6-3/a_6-1) µ̃_6-3 + 0 + 0
    µ̃_6-2 = 1 + (3/8)(a_6-1/a_6-2) µ̃_6-1 + (1/8)(a_6-3/a_6-2) µ̃_6-3 + 0
    µ̃_6-3 = 1 + (3/8)(a_6-1/a_6-3) µ̃_6-1 + (3/8)(a_6-2/a_6-3) µ̃_6-2

(the zero terms correspond to absorption into state 15, which adds no further time, and to transitions into state 9, which are impossible given A). Solving this system of equations yields

    µ̃_6-1 = 1763/483 ≈ 3.65.

(f) The new Markov chain is shown below; note that state 15 now disappears.

[Figure: modified transition diagram. From 6-1: 1/2 to itself, and 1/6 each to 6-2, 6-3, and 9. From 6-2: 3/4 to 6-1 and 1/4 to 6-3. From 6-3: 3/8 to 6-1, 3/8 to 6-2, and 1/4 to 9. State 9 is absorbing.]

This is another expected time to absorption question on the new chain. Let us define µ_k to be the expected number of days it takes the student to go from state k to state 9 in this new Markov chain:

    µ_6-1 = 1 + (1/2) µ_6-1 + (1/6) µ_6-2 + (1/6) µ_6-3 + (1/6)(0)
    µ_6-2 = 1 + (3/4) µ_6-1 + (1/4) µ_6-3
    µ_6-3 = 1 + (3/8) µ_6-1 + (3/8) µ_6-2 + (1/4)(0)

Solving this system of equations yields

    µ_6-1 = 86/13 ≈ 6.615.

(g) The corresponding Markov chain is the same as the one in part (a), except that p_{9,6-1} = 1/8 and p_{9,9} = 7/8 instead of p_{9,9} = 1, and p_{15,6-1} = 1/8 and p_{15,15} = 7/8 instead of p_{15,15} = 1. We can consider state 6-1 as an absorbing state. Let µ_k be the expected number of transitions until absorption (that is, until she is 6-1 again) if we start at state k:

    µ_9   = (1/8)(1) + (7/8)(1 + µ_9)    ⇒  µ_9 = 8
    µ_15  = (1/8)(1) + (7/8)(1 + µ_15)   ⇒  µ_15 = 8
    µ_6-3 = (3/8)(1) + (3/8)(1 + µ_6-2) + (1/4)(1 + µ_9)
    µ_6-2 = (3/8)(1) + (1/8)(1 + µ_6-3) + (1/2)(1 + µ_15)

    ⇒  µ_6-2 = 344/61,  µ_6-3 = 312/61.

Let R be the number of days until she is 6-1 again. (Notice that we know today she is 6-1.) We find E[R] by using the total expectation theorem, conditioned on what happens on the first transition:

    E[R] = E[E[R | X_2]]
         = (1/2)(1) + (1/8)(1 + µ_9) + (1/8)(1 + µ_15) + (1/8)(1 + µ_6-2) + (1/8)(1 + µ_6-3)
         = 265/61.

Notice that this chain consists of a single recurrent aperiodic class. Another approach to solving this problem therefore uses the steady-state probabilities of the chain, which are

    π_6-1 = 61/265,  π_6-2 = 11/265,  π_6-3 = 9/265,  π_9 = 79/265,  π_15 = 105/265.

The expected frequency of visits to 6-1 is π_6-1, so the expected number of days between visits to 6-1 is 1/π_6-1. Since she is currently 6-1, the expected number of days until she is 6-1 again is 1/π_6-1 = 265/61.
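The conditioned chain of part (e) can be checked numerically the same way as parts (b) and (c): build the conditional transition probabilities a_j P_{i,j} / a_i over the transient states and solve the resulting linear system. A small sketch of ours, assuming the same state ordering as in the earlier check:

```python
import numpy as np

# Transient states ordered [6-1, 6-2, 6-3], as before.
Q = np.array([[1/2, 1/8, 1/8],
              [3/8, 0.0, 1/8],
              [3/8, 3/8, 0.0]])
# Absorption probabilities into 15 from part (b).
a = np.linalg.solve(np.eye(3) - Q, np.array([1/8, 1/2, 0.0]))

# Conditional one-step probabilities among the transient states:
# Q_tilde[i, j] = Q[i, j] * a[j] / a[i]
Q_tilde = Q * a[None, :] / a[:, None]

# Expected time to absorption on the conditioned chain.
mu_tilde = np.linalg.solve(np.eye(3) - Q_tilde, np.ones(3))
print(mu_tilde[0])   # approx 3.650, i.e. 1763/483
```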
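Likewise, the steady-state probabilities quoted in part (g) satisfy the balance equations πP = π for the modified five-state chain. The sketch below (ours, with our own state ordering) solves the balance equations together with the normalization constraint and recovers E[R] = 1/π_6-1:

```python
import numpy as np

# State order [6-1, 6-2, 6-3, 9, 15]; each row is a one-step distribution.
P = np.array([[1/2, 1/8, 1/8, 1/8, 1/8],
              [3/8, 0.0, 1/8, 0.0, 1/2],
              [3/8, 3/8, 0.0, 1/4, 0.0],
              [1/8, 0.0, 0.0, 7/8, 0.0],
              [1/8, 0.0, 0.0, 0.0, 7/8]])

# Solve pi P = pi with sum(pi) = 1 by replacing one balance equation
# with the normalization row.
A = np.vstack([(P.T - np.eye(5))[:-1], np.ones(5)])
pi = np.linalg.solve(A, np.array([0.0, 0.0, 0.0, 0.0, 1.0]))
print(pi)         # approx [61, 11, 9, 79, 105] / 265
print(1 / pi[0])  # approx 4.344, i.e. E[R] = 265/61
```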
Massachusetts Institute of Technology
Department of Electrical Engineering & Computer Science
6.041/6.431: Probabilistic Systems Analysis (Spring 2006)
Recitation 22: Central Limit Theorem
May 16, 2006

1. (Problem 7.7) During each day, the probability that your computer's operating system crashes at least once is 5%, independent of every other day. You are interested in the probability of at least 45 crash-free days out of the next 50 days.

(a) Find the probability of interest by using the normal approximation to the binomial.

(b) Repeat part (a), this time using the Poisson approximation to the binomial.

2. (Problem 7.8) We load on a plane 100 packages whose weights are independent random variables that are uniformly distributed between 5 and 50 pounds. What is the probability that the total weight will exceed 3000 pounds? Find an approximate answer using the Central Limit Theorem.

3. (Problem 7.6) Before starting to play the roulette in a casino, you want to look for biases that you can exploit. You therefore watch 100 rounds that result in a number between 1 and 36, and count the number of rounds for which the result is odd. If the count exceeds 55, you decide that the roulette is not fair. Assuming that the roulette is fair, find an approximation for the probability that you will make the wrong decision.

Massachusetts Institute of Technology
Department of Electrical Engineering & Computer Science
6.041/6.431: Probabilistic Systems Analysis (Spring 2006)
Solutions for Recitation 22: Central Limit Theorem
May 16, 2006

1. See online solutions.

2. Example 7.8, page 390 of text. See solution in text.

3. See online solutions.
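Since the solutions are deferred to the online notes and the text, here is a short numerical sketch of ours (not the official solution; the half-unit continuity corrections are our own choice) that evaluates the approximations the three problems ask for:

```python
from math import erf, exp, sqrt, factorial

def Phi(z):                      # standard normal CDF
    return 0.5 * (1 + erf(z / sqrt(2)))

# 1(a) Crash-free days: X ~ Binomial(50, 0.95); want P(X >= 45).
n, p = 50, 0.95
m, s = n * p, sqrt(n * p * (1 - p))
print(1 - Phi((44.5 - m) / s))   # ~0.974 with the 1/2 continuity correction

# 1(b) Poisson approximation: crash days ~ Poisson(50 * 0.05) = Poisson(2.5);
# "at least 45 crash-free days" means "at most 5 crash days".
lam = 2.5
print(sum(exp(-lam) * lam**k / factorial(k) for k in range(6)))  # ~0.958

# 2 Total weight: S = sum of 100 iid Uniform(5, 50) package weights.
mS = 100 * 27.5                      # mean of Uniform(5, 50) is 27.5
sS = sqrt(100 * (50 - 5)**2 / 12)    # variance of Uniform(a, b) is (b-a)^2/12
print(1 - Phi((3000 - mS) / sS))     # ~0.027

# 3 Fair roulette: odd count ~ Binomial(100, 1/2); wrong decision if count > 55.
m, s = 50.0, sqrt(100 * 0.25)
print(1 - Phi((55.5 - m) / s))   # ~0.136 with the 1/2 continuity correction
```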