ECE_313_SP2023_HW8sol

University of Illinois, Spring 2023
ECE 313: Problem Set 8: Problems and Solutions
Due: Thursday, March 23 at 5:00:00 p.m.
Reading: ECE 313 Course Notes, Sections 3.6.2 and 3.6.3

Note on reading: For most sections of the course notes there are short-answer questions at the end of the chapter. We recommend that after reading each section you try answering the short-answer questions. Do not hand them in; answers to the short-answer questions are provided in the appendix of the notes.

Note on turning in homework: Homework is assigned on a weekly basis on Thursdays, and is due by 5:00 p.m. on the following Thursday. You must upload handwritten homework to Gradescope. No typeset homework will be accepted. No late homework will be accepted. Please write on the top right corner of the first page:

NAME AS IT APPEARS ON Canvas
NETID
SECTION
PROBLEM SET #

Page numbers are encouraged but not required. Five points will be deducted for improper headings.

1. [Gaussian Distribution] Suppose $X$ is a $N(-2, 4)$ random variable. Compute the following quantities.

(a) $P\{X \ge -2\}$.

Solution: We are given that $X$ has a Gaussian distribution with mean $-2$ and variance $4$, so $\frac{X+2}{2}$ is a standard normal random variable. Then
$$P\{X \ge -2\} = P\left(\frac{X-(-2)}{2} \ge \frac{-2-(-2)}{2}\right) = P\left(\frac{X+2}{2} \ge 0\right) = \frac{1}{2}.$$

(b) $P(X \ge -2 \mid X \ge -4)$.

Solution:
$$P(X \ge -2 \mid X \ge -4) = \frac{P(X \ge -2,\, X \ge -4)}{P(X \ge -4)} = \frac{P(X \ge -2)}{P(X \ge -4)} = \frac{P\left(\frac{X+2}{2} \ge 0\right)}{P\left(\frac{X+2}{2} \ge -1\right)} = \frac{1/2}{Q(-1)} = \frac{1}{2\Phi(1)}.$$

(c) $P\{X^2 < X + 2\}$.

Solution:
$$P\{X^2 < X + 2\} = P\{X^2 - X - 2 < 0\} = P\{(X-2)(X+1) < 0\} = P\{-1 < X < 2\} = P\left\{0.5 < \frac{X+2}{2} < 2\right\} = \Phi(2) - \Phi(0.5).$$

(d) $E[(X+2)^2]$.

Solution: Recall that $\mathrm{Var}(X) = E[(X - E[X])^2]$. Hence,
$$E[(X+2)^2] = E[(X-(-2))^2] = \mathrm{Var}(X) = 4.$$
Alternate solution:
$$E[(X+2)^2] = \int_{-\infty}^{\infty} (x+2)^2 f_X(x)\,dx = \int_{-\infty}^{\infty} (x+2)^2 \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) dx = \int_{-\infty}^{\infty} (x+2)^2 \frac{1}{\sqrt{8\pi}} \exp\left(-\frac{(x+2)^2}{8}\right) dx.$$
Using the change of variable $u = x + 2$ and integration by parts,
$$E[(X+2)^2] = \int_{-\infty}^{\infty} u^2 \frac{1}{\sqrt{8\pi}} \exp\left(-\frac{u^2}{8}\right) du = \frac{4}{\sqrt{8\pi}} \int_{-\infty}^{\infty} u \cdot \frac{u}{4} \exp\left(-\frac{u^2}{8}\right) du = \frac{4}{\sqrt{8\pi}} \left( \left[-u \exp\left(-\frac{u^2}{8}\right)\right]_{-\infty}^{\infty} + \int_{-\infty}^{\infty} \exp\left(-\frac{u^2}{8}\right) du \right) = 0 + 4 \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi \cdot 4}} \exp\left(-\frac{u^2}{2 \cdot 4}\right) du = 4,$$
where the last integral equals $1$ because the integrand is the $N(0, 4)$ pdf.
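These closed-form answers are easy to sanity-check numerically. The following is a minimal sketch using scipy.stats (illustrative only, not part of the assigned solution):

```python
# Numerical check of Problem 1 for X ~ N(-2, 4); illustrative only, not part of the graded solution.
from scipy.stats import norm

X = norm(loc=-2.0, scale=2.0)          # mean -2, standard deviation 2 (variance 4)

# (a) P{X >= -2} = 1/2
print(X.sf(-2))                        # 0.5

# (b) P(X >= -2 | X >= -4) = 1 / (2*Phi(1))
print(X.sf(-2) / X.sf(-4))             # ~0.594
print(1 / (2 * norm.cdf(1)))           # same value

# (c) P{X^2 < X + 2} = P{-1 < X < 2} = Phi(2) - Phi(0.5)
print(X.cdf(2) - X.cdf(-1))            # ~0.286
print(norm.cdf(2) - norm.cdf(0.5))     # same value

# (d) E[(X + 2)^2] = Var(X) = 4
print(X.var())                         # 4.0
```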
2. [Communication in Gaussian Noise] A wireless communication system consists of a transmitter and a receiver. The transmitter sends a signal $x$, and the receiver observes $Y = x + Z$, where $Z$ is a noise term, modeled as a Gaussian random variable with mean $\mu_Z = 0$ and variance $\sigma_Z^2 = 1$.

(a) Suppose the transmitted signal is $x = 1$. What is the pdf of the received signal $Y$?

Solution: The received signal $Y$ follows a Gaussian distribution with $\mu_Y = 1$ and $\sigma_Y^2 = 1$. The pdf is
$$f_Y(y) = \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{(y-1)^2}{2}\right).$$

(b) Now suppose the transmitted signal can be either $x = -1$ or $x = 1$. The receiver uses the following decoding rule: if $Y > 0$, it declares that $x = 1$; if $Y \le 0$, it declares that $x = -1$. Assuming that the transmitter sends $-1$ or $+1$ with probability $1/2$ each, what is the receiver's error probability?

Solution: Let $Y_1$ and $Y_{-1}$ be the received signal when $x = 1$ and $x = -1$ are transmitted, respectively. Their distributions are
$$Y_1 \sim N(1, 1), \qquad Y_{-1} \sim N(-1, 1).$$
Writing $\hat{x}$ for the declared symbol, the individual probabilities of error for the two transmitted symbols are
$$P(\hat{x} = -1 \mid x = 1) = P(Y_1 \le 0) = P(Y_1 - 1 \le -1) = \Phi(-1) = Q(1),$$
$$P(\hat{x} = 1 \mid x = -1) = P(Y_{-1} > 0) = P(Y_{-1} + 1 > 1) = Q(1).$$
The receiver's probability of error is
$$p_{\text{error}} = P(x = 1)\,P(\hat{x} = -1 \mid x = 1) + P(x = -1)\,P(\hat{x} = 1 \mid x = -1) = Q(1)\left(\tfrac{1}{2} + \tfrac{1}{2}\right) = Q(1).$$

(c) Now suppose the transmitted signal $x$ can be chosen from three possible values: $x = -2$, $x = 0$ and $x = 2$. The receiver now uses the following decoding rule: if $Y < -1$, it declares $x = -2$; if $Y > 1$, it declares $x = 2$; and otherwise it declares $x = 0$. Assuming the transmitter sends each possible symbol with probability $1/3$, what is the receiver's error probability?

Solution: Similarly to the previous part, we define $Y_2$, $Y_0$ and $Y_{-2}$ with
$$Y_2 \sim N(2, 1), \qquad Y_0 \sim N(0, 1), \qquad Y_{-2} \sim N(-2, 1).$$
The individual probabilities of error are
$$P(\hat{x} \ne 2 \mid x = 2) = P(Y_2 < 1) = P(Y_2 - 2 < -1) = \Phi(-1) = Q(1),$$
$$P(\hat{x} \ne 0 \mid x = 0) = P(Y_0 > 1 \text{ or } Y_0 < -1) = P(Y_0 > 1) + P(Y_0 < -1) = Q(1) + \Phi(-1) = 2Q(1),$$
$$P(\hat{x} \ne -2 \mid x = -2) = P(Y_{-2} > -1) = P(Y_{-2} + 2 > 1) = Q(1).$$
The receiver's probability of error is
$$p_{\text{error}} = P(x = 2)\,P(\hat{x} \ne 2 \mid x = 2) + P(x = 0)\,P(\hat{x} \ne 0 \mid x = 0) + P(x = -2)\,P(\hat{x} \ne -2 \mid x = -2) = Q(1)\left(\tfrac{1}{3} + \tfrac{2}{3} + \tfrac{1}{3}\right) = \tfrac{4}{3}\,Q(1).$$
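Both error probabilities can also be confirmed by a quick Monte Carlo simulation. The sketch below (NumPy/SciPy, purely illustrative and not required by the problem set) should reproduce $Q(1) \approx 0.159$ and $\tfrac{4}{3}Q(1) \approx 0.212$ up to sampling noise:

```python
# Monte Carlo check of Problem 2; symbol values, priors, and decision rules follow the problem statement.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 10**6
Q = norm.sf                                   # Q(x) = 1 - Phi(x)

# (b) Binary signaling: x in {-1, +1}, declare +1 iff Y > 0.
x = rng.choice([-1.0, 1.0], size=n)
y = x + rng.standard_normal(n)
xhat = np.where(y > 0, 1.0, -1.0)
print(np.mean(xhat != x), Q(1))               # both ~0.159

# (c) Ternary signaling: x in {-2, 0, +2}, thresholds at -1 and +1.
x = rng.choice([-2.0, 0.0, 2.0], size=n)
y = x + rng.standard_normal(n)
xhat = np.where(y < -1, -2.0, np.where(y > 1, 2.0, 0.0))
print(np.mean(xhat != x), 4 / 3 * Q(1))       # both ~0.212
```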
3. [Gaussian versus Poisson Approximation for Binomial Distribution] A communication receiver recovers a block of $n = 10^5$ bits. It is known that each bit in the block can be in error with probability $10^{-4}$, independently of whether other bits are in error. Let $X$ be the number of bit errors.

(a) Write down an exact expression for $P\{X = 15\}$. You do not need to compute a numerical value for this probability.

Solution: $X$ has a Binomial distribution with parameters $(n = 10^5, p = 10^{-4})$. Thus,
$$P\{X = 15\} = \binom{10^5}{15} (10^{-4})^{15} (1 - 10^{-4})^{10^5 - 15} \approx 0.03472.$$
Note that this value is not easy to compute numerically.

(b) Determine an approximate value for $P\{X = 15\}$ via the Gaussian approximation with continuity correction.

Solution: Let $\tilde{X}$ be a Gaussian random variable with $\mu_X = np = 10$ and $\sigma_X^2 = np(1-p) = 9.999$, so $\sigma_X \approx 3.162$. Therefore,
$$P\{X = 15\} = P\{14.5 < X < 15.5\} \approx P\{14.5 < \tilde{X} < 15.5\} = P\left(\frac{14.5 - 10}{3.162} < \frac{\tilde{X} - 10}{3.162} < \frac{15.5 - 10}{3.162}\right) = P\left(1.42 < \frac{\tilde{X} - 10}{3.162} < 1.74\right) \approx \Phi(1.74) - \Phi(1.42) \approx 0.9591 - 0.9222 = 0.0369.$$

(c) Solve part (b) using the Poisson approximation of a binomial distribution.

Solution: For the Poisson approximation, we set $\lambda = np = 10$. Then
$$P\{X = 15\} \approx \frac{e^{-10}\, 10^{15}}{15!} \approx 0.0347.$$
Note that the Poisson approximation is more accurate than the Gaussian approximation for this example. This is to be expected since $n$ is large and $p$ is small.
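All three numbers in this problem can be computed directly with scipy.stats; the sketch below (illustrative only) compares the exact binomial pmf with the two approximations:

```python
# Problem 3: exact binomial vs. Gaussian (continuity-corrected) vs. Poisson values of P{X = 15}.
from math import sqrt
from scipy.stats import binom, norm, poisson

n, p, k = 10**5, 1e-4, 15
mu, sigma = n * p, sqrt(n * p * (1 - p))              # 10, ~3.162

exact    = binom.pmf(k, n, p)                                            # ~0.0347
gaussian = norm.cdf(k + 0.5, mu, sigma) - norm.cdf(k - 0.5, mu, sigma)   # ~0.036 (hand-rounded z values give 0.0369)
pois     = poisson.pmf(k, mu)                                            # ~0.0347

print(exact, gaussian, pois)
```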
4. [Gaussian Approximation] You go to a casino and decide to play a game in which, with probability 0.4, you win 1 dollar, and with probability 0.6, you lose 1 dollar. You decide to play this same game repeatedly 100 times. Let $X_i \in \{-1, 1\}$ represent your earnings at the $i$-th game, for $i = 1, \ldots, 100$. Assume that $X_1, X_2, \ldots, X_{100}$ are all independent. Let $X = \sum_{i=1}^{100} X_i$ be your total earnings (which may be negative).

(a) Let $Z_i = (X_i + 1)/2$, for $i = 1, \ldots, 100$, and $Z = \sum_{i=1}^{100} Z_i$. Notice that $Z_i$ is a binary indicator of whether the $i$-th game was won. What is the distribution of $Z$?

Solution: Notice that each $Z_i$ is an independent Bernoulli random variable with probability of success $p = 0.4$. Then $Z$ is a Binomial random variable with $n = 100$ and $p = 0.4$, i.e. Binomial$(100, 0.4)$.

(b) Express the event $\{X \ge 10\}$ in terms of $Z$, and use the Gaussian approximation with continuity correction to compute $P\{X \ge 10\}$.

Solution: We can write $Z$ as
$$Z = \frac{X}{2} + 50.$$
The event $\{X \ge 10\}$ is equivalent to winning at least 10 more games than are lost in the 100 trials, i.e. to $\{Z \ge 55\}$. Let $\tilde{Z}$ be a Gaussian random variable with $\mu_Z = np = 40$ and $\sigma_Z^2 = np(1-p) = 24$. Using the Gaussian approximation with continuity correction,
$$P(Z \ge 55) \approx P(\tilde{Z} \ge 54.5) = P\left(\frac{\tilde{Z} - 40}{\sqrt{24}} \ge \frac{54.5 - 40}{\sqrt{24}}\right) = P\left(\frac{\tilde{Z} - 40}{\sqrt{24}} \ge 2.9598\right) = Q(2.96) = 0.0015.$$

(c) Express the event $\{X = 0\}$ in terms of $Z$ and use the Gaussian approximation with continuity correction to compute $P\{X = 0\}$.

Solution: The event $\{X = 0\}$ is equivalent to $\{Z = 50\}$. Following a similar procedure as in part (b),
$$P(Z = 50) \approx P(49.5 \le \tilde{Z} \le 50.5) = P\left(\frac{49.5 - 40}{\sqrt{24}} \le \frac{\tilde{Z} - 40}{\sqrt{24}} \le \frac{50.5 - 40}{\sqrt{24}}\right) = P\left(1.9392 \le \frac{\tilde{Z} - 40}{\sqrt{24}} \le 2.1433\right) = \Phi(2.143) - \Phi(1.939) = Q(1.939) - Q(2.143) = 0.02625 - 0.01606 = 0.0102.$$

(d) Solve part (c) using the Poisson approximation of a binomial distribution instead.

Solution: A Poisson distribution with $\lambda = np = 40$ approximates the Binomial distribution, so
$$P(Z = 50) \approx \frac{\lambda^{50} e^{-\lambda}}{50!} = 0.0177.$$
The exact probability of $\{Z = 50\}$ using the Binomial distribution is $0.0103$. Hence, we see that the Gaussian approximation is a better estimate than the Poisson approximation for this case.
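As in Problem 3, these values are easy to verify with scipy.stats; the sketch below (illustrative, not part of the solution) computes the exact binomial probabilities next to the Gaussian and Poisson approximations used above:

```python
# Problem 4: Z ~ Binomial(100, 0.4); exact values vs. Gaussian and Poisson approximations.
from math import sqrt
from scipy.stats import binom, norm, poisson

n, p = 100, 0.4
mu, sigma = n * p, sqrt(n * p * (1 - p))        # 40, sqrt(24)

# (b) P{X >= 10} = P{Z >= 55}
print(binom.sf(54, n, p))                       # exact value, for comparison
print(norm.sf(54.5, mu, sigma))                 # Gaussian with continuity correction, ~0.0015

# (c), (d) P{X = 0} = P{Z = 50}
print(binom.pmf(50, n, p))                      # exact, ~0.0103
print(norm.cdf(50.5, mu, sigma) - norm.cdf(49.5, mu, sigma))   # Gaussian, ~0.0102
print(poisson.pmf(50, mu))                      # Poisson, ~0.0177
```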