Mock Exam 1 solutions
School: University of Illinois, Urbana-Champaign
Course: IE 300, Industrial Engineering
Date: Dec 6, 2023
Pages: 9
Mock Exam 1
IE 300: Analysis of Data
Spring 2023
02/16/2022
Name:
Solution Key
This exam has 5 questions for a total of 100 points. The points and expected time for each question are shown next to it. There are 7 pages in the exam.
You have 75 minutes to complete all questions. You will be given an additional 5 minutes at the end to double-check your answers.
The exam is open books, open notes. You are allowed to use a scientific calculator. Page 7 contains the "z-table" (values for the standard normal cumulative distribution function).
You are not allowed to discuss the questions with anyone other than me.
Answer every question to the best of your knowledge. Make sure to show your work for partial credit.
Do awesome!
Question 1: Basic probability questions (13 minutes, 20 points)
(a) Select all that apply. Consider two dice: a real one with six sides marked with the integer numbers 1, 2, 3, 4, 5, 6, and a virtual one which simply produces real numbers in (0, 6). Let X be the random variable representing the outcome of a die.
a) The probability of getting a 3 (P(X = 3)) is the same for both dice.
b) The probability of getting more than a 3 (P(X > 3)) is the same for both dice.
c) The probability of getting less than a 3 (P(X < 3)) is the same for both dice.
d) The first die has probability mass function equal to 1/6 and the second die has probability density function equal to 1/6.
(4 Points)
(b) Select all that apply. If P(A ∩ B) = P(A), then:
a) P(A ∪ B) = P(B)
b) A ⊆ B
c) P(A \ B) = 0
(4 Points)
(c) Pick the (one) correct answer. Assume that events A, B, and C are collectively exhaustive. Then, we must have that P(A ∪ B ∪ C):
a) = 1
b) ≥ 1
c) ≤ 1
(3 Points)
(d) A company is considering 12 candidate cities for 3 new facilities. These 3 facilities will serve different purposes. If a city may be selected to host more than one facility, how many different setups can we come up with?
(3 Points)
(e) What if every city can be picked for at most one facility: that is, if a city is picked for either of the 3 facilities, then it cannot be picked for another one. How many different setups can you come up with in this case?
(3 Points)
(f) The company has asked its employees to rank the 12 candidate cities in order of preference based on their quality of life. How many different rankings can you come up with?
(3 Points)
Answer:
(a)
a) False. The probability for the discrete die is P(X = 3) = 1/6; for the continuous die the probability is P(X = 3) = 0.
b) True. The probability for the discrete die is P(X > 3) = P(X = 4) + P(X = 5) + P(X = 6) = 1/2, the same as for the continuous die: P(X > 3) = 1 - F(3) = 1 - 3/6 = 1/2.
c) False. The probability for the discrete die is P(X < 3) = P(X = 1) + P(X = 2) = 1/3, whereas for the continuous die it is P(X < 3) = F(3) = 3/6 = 1/2.
d) True. The pmf of a discrete uniform distribution is 1/(b - a + 1) = 1/6; the pdf of a continuous uniform distribution is 1/(b - a) = 1/6.
(b) If P(A ∩ B) = P(A), then all three statements apply!
(c) For collectively exhaustive events, we have A ∪ B ∪ C = S, where S is the whole sample space. Hence, P(A ∪ B ∪ C) = 1.
(d) This can be answered using the multiplication rule, much like what you'd do to count all possible outcomes of throwing two dice. If each city can host more than one facility, then for each facility we have 12 options, leading to 12 · 12 · 12 = 12³ = 1728 different setups.
(e) This can be viewed as a permutation. If each city is allowed at most one facility, then for the first facility we have 12 options, for the second we have 11, and for the third we have 10, for a total of 12 · 11 · 10 = 1320 different setups. The same answer is given by the formula for permutations: P(12, 3) = 12!/9! = 1320.
(f) In a question like this, where any city can be placed in any position/rank, we may calculate this as n!. Similar to having n people pick n gifts, or n employees being assigned to n positions, here n = 12 cities can each be found in any of n = 12 positions. The answer is n! = 12! = 479001600 possible rankings.
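The three counting results can be sanity-checked with Python's math module (an illustrative sketch, not part of the original exam; variable names are ours):

```python
from math import factorial, perm

# (d) Ordered selection WITH repetition: 12 choices for each of the
# 3 distinct facilities.
setups_with_repetition = 12 ** 3         # 1728

# (e) Ordered selection WITHOUT repetition: P(12, 3) = 12!/9!.
setups_without_repetition = perm(12, 3)  # 1320

# (f) A full ranking of all 12 cities is a permutation of 12 items.
rankings = factorial(12)                 # 479001600
```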
Question 2: Pilot season (12 minutes, 18 points)
A TV series production company asks people to watch a pilot episode of a TV series and rate it. The rating is normally distributed with a mean of μ = 7.5 and a standard deviation of σ = 0.9 if the TV show is good, and with a mean of μ = 4 and a standard deviation of σ = 2 if the show is not good. The company checks the rating of the pilot and, if the rating is above a score of 6.6, picks the show up for production. Answer the following questions.
(a) What is the probability that a good TV show is picked up for production? What is the probability that a bad TV show is picked up for production?
(6 points)
(b) What is the probability that a random show is picked up for production? You may assume that historically only 5% of shows have been good.
(6 points)
(c) It was just announced that a show has been picked up for production. What is the probability that it is a good TV show? You may still assume that historically only 5% of shows have been good.
(6 points)
Answer:
(a) Let G/B denote good/bad shows, and let X be the audience rating for the pilot. Then we are looking for P(X > 6.6 | G). A good TV show's rating is normally distributed with known μ and σ. We have:

z = (6.6 - 7.5)/0.9 = -1.

We then use it to calculate P(X > 6.6 | G) = 1 - P(X ≤ 6.6 | G) = 1 - Φ(-1) = Φ(1) = 0.8413.
We are also interested in P(X > 6.6 | B). A bad TV show's rating is also normally distributed, but with different μ and σ. We have:

z = (6.6 - 4)/2 = 1.3.

We again use it to calculate P(X > 6.6 | B) = 1 - P(X ≤ 6.6 | B) = 1 - Φ(1.3) = 1 - 0.9032 = 0.0968.
(b) We are looking for P(produced). From the law of total probability:

P(produced) = P(produced | G) · P(G) + P(produced | B) · P(B) = 0.8413 · 0.05 + 0.0968 · 0.95 = 0.134025 ≈ 13.4%.
(c) We are looking for P(G | produced). We have:
• P(G) = 0.05, P(B) = 0.95.
• P(produced | G) = 0.8413 and P(produced | B) = 0.0968 (both from part (a)).
Combining all of them in Bayes' theorem and using the result from part (b) for the denominator:

P(G | produced) = [P(produced | G) · P(G)] / [P(produced | G) · P(G) + P(produced | B) · P(B)] = (0.8413 · 0.05) / 0.134025 = 0.3139.
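The z-table lookups and the Bayes step above can be reproduced numerically; this is an illustrative sketch using math.erf to evaluate Φ (the variable names are ours):

```python
from math import erf, sqrt

def phi(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

# (a) Pickup probabilities for good and bad shows.
p_pick_good = 1 - phi((6.6 - 7.5) / 0.9)  # = Phi(1) ≈ 0.8413
p_pick_bad = 1 - phi((6.6 - 4) / 2)       # = 1 - Phi(1.3) ≈ 0.0968

# (b) Law of total probability with a 5% prior on good shows.
p_produced = p_pick_good * 0.05 + p_pick_bad * 0.95   # ≈ 0.134

# (c) Bayes' theorem: posterior probability a produced show is good
# (the complement is the posterior probability it is bad).
p_good_given_produced = p_pick_good * 0.05 / p_produced
```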
Question 3: Weird transmissions (15 minutes, 22 points)
A binary sequence consists of two digits: "0" or "1". Consider a sequence that is generated randomly. The digit "1" appears with probability p = 0.75 and the digit "0" appears with probability 1 - p = 0.25. Answer the following questions.
(a) What is the probability we get exactly 2 of each digit in a sequence of 4 digits?
(6 Points)
(b) We call a digit the winner of a sequence if it appears in more places in that sequence. For example, the sequence "1110010110" has "1" as its winner, as it appears in 6 places (versus the 4 places having a "0"). What is the probability "1" is the winner in a sequence of 5 digits?
(6 Points)
(c) Now consider that we send multiple messages, each having exactly 5 digits. The transmission ends when a message has "0" as the winner. We have already sent 4 messages (with "1" as the winner). How many more messages should we expect to send before "0" wins and we stop the transmission?
(5 Points)
(d) * We say that we have a "run" when multiple identical digits show up one after the other. For example, if 3 "0"s appear one after the other, then we have a "run" of length 3. What is the expected length of the first run? What is the expected length of the second run?
(5 Points)
Answer:
(a) This is a binomial distribution:

P(X = 2) = C(4, 2) · 0.75² · 0.25² = 0.2109.
(b) In order for "1" to be the winner, we need it to appear X = 3, 4, or 5 times:

P(winner 1) = P(X = 3) + P(X = 4) + P(X = 5) = C(5, 3) · 0.75³ · 0.25² + C(5, 4) · 0.75⁴ · 0.25¹ + C(5, 5) · 0.75⁵ · 0.25⁰ = 0.2637 + 0.3955 + 0.2373 = 0.8965.
(c) This is a geometric distribution, as we are looking for the number of messages we send until the first "failed" message (when "0" wins). Recall that the geometric distribution is memoryless, so the 4 messages already sent do not matter. Finally, we have p = P(winner 0) = 1 - P(winner 1) = 0.1035. Hence, E[X] = 1/p = 9.66 messages.
(d) The length of a run is geometric, with expectation 1/p: a run of "1"s ends when a "0" appears (probability 0.25), so its expected length is 1/0.25 = 4; a run of "0"s ends when a "1" appears, so its expected length is 1/0.75 = 4/3. The first run can be either a run of "0"s or a run of "1"s, so let us condition on the first digit:

E[1st run] = E[1st run | 1st digit is a 1] · P(1st digit is a 1) + E[1st run | 1st digit is a 0] · P(1st digit is a 0) = (1/0.25) · 0.75 + (1/0.75) · 0.25 = 3 + 1/3 = 10/3.

Similarly, for the second run (whose digit is the opposite of the first digit) we have:

E[2nd run] = E[2nd run | 1st digit is a 1] · P(1st digit is a 1) + E[2nd run | 1st digit is a 0] · P(1st digit is a 0) = (1/0.75) · 0.75 + (1/0.25) · 0.25 = 1 + 1 = 2.
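Parts (a)-(c) can be checked numerically with math.comb for the binomial coefficients (an illustrative sketch; names are ours):

```python
from math import comb

p = 0.75  # probability of a "1"

# (a) Exactly two of each digit in 4 digits: binomial pmf at k = 2.
p_two_each = comb(4, 2) * p**2 * (1 - p)**2  # ≈ 0.2109

# (b) "1" wins a 5-digit message: it appears at least 3 times.
p_one_wins = sum(comb(5, k) * p**k * (1 - p)**(5 - k)
                 for k in (3, 4, 5))         # ≈ 0.8965

# (c) Geometric (memoryless): expected messages until "0" first wins.
expected_messages = 1 / (1 - p_one_wins)     # ≈ 9.66
```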
Question 4: A supercomputing center (12 minutes, 20 points)
The number of jobs that a supercomputing center receives for processing follows a Poisson distribution. The rates change depending on the type of job. The supercomputing center sees jobs arrive at a rate of:
• 1 arrival every 5 minutes for computing jobs,
• 1 arrival every 2 minutes for visualization jobs, and
• 1 arrival every 15 minutes for data wrangling jobs.
Answer the following questions.
(a) Assume it is 8 am now. What is the probability that the next data wrangling job arrives after 9 am?
(4 Points)
(b) What is the probability of getting the 2nd computing job in the next 10 minutes?
(4 Points)
(c) What is the expected time until the 5th computing job arrives?
(4 Points)
(d) What is the probability that the next job that arrives is a computing job?
(4 Points)
(e) * We are looking at the log of jobs, but due to a software error all times have been deleted, and we cannot tell whether the log is from the morning (8 am to noon), from the afternoon (noon to 6 pm), or from the rest of the day (6 pm to 8 am). We noticed that we had 20 jobs in an hour. What is the probability the log is from the morning hours (8 am to noon)?
(4 Points)
Answer:
(a) Let T be the time until the next data wrangling job arrives after 8 am. Then we are looking for P(T > 60). We have:

P(T > 60) = 1 - P(T ≤ 60) = e^(-(1/15) · 60) = e⁻⁴ = 0.0183.
(b) Let X be the number of computing jobs we get in the next 10 minutes. This is a Poisson-distributed quantity with rate λ = 2 (computing jobs per 10 minutes). Hence:

P(X ≥ 2) = 1 - P(X = 0) - P(X = 1) = 1 - e⁻² - 2 · e⁻² = 0.5940.
(c) Let T be the time until the 5th computing job arrives. This is an Erlang-distributed quantity with rate λ = 1/5 per minute (1 arrival every 5 minutes) and k = 5:

E[T] = k/λ = 25 minutes.
(d) Note that we are interested in one exponentially distributed random variable being smaller in value than two other exponentially distributed random variables. To answer this, we calculate λ₁/(λ₁ + λ₂ + λ₃). We then have:

λ₁/(λ₁ + λ₂ + λ₃) = (1/5) / (1/5 + 1/2 + 1/15) = 0.2609.
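Parts (a)-(d) can be verified with a few lines of Python (an illustrative sketch; names are ours):

```python
from math import exp

# (a) Exponential inter-arrival time at rate 1/15 per minute: P(T > 60).
p_after_9am = exp(-60 / 15)                  # e^-4 ≈ 0.0183

# (b) Poisson count, lambda = 2 computing jobs per 10 min: P(X >= 2).
p_at_least_2 = 1 - exp(-2) - 2 * exp(-2)     # ≈ 0.5940

# (c) Erlang mean: k = 5 arrivals at rate 1/5 per minute.
expected_wait = 5 / (1 / 5)                  # 25 minutes

# (d) Probability the computing-job exponential is the minimum of three.
p_computing_first = (1 / 5) / (1 / 5 + 1 / 2 + 1 / 15)  # ≈ 0.2609
```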
(e) First, calculate what P(X = 20) is for each of the three periods:
• morning, with λ = 12: P(X = 20 | morning) = e⁻¹² · 12²⁰/20! = 0.00968.
• afternoon, with λ = 30: P(X = 20 | afternoon) = e⁻³⁰ · 30²⁰/20! = 0.0134.
• rest of the day, with λ = 4: P(X = 20 | rest) = e⁻⁴ · 4²⁰/20! = 8.277 · 10⁻⁹.
We now calculate P(morning | X = 20) using Bayes' theorem:

P(morning | X = 20) = [P(X = 20 | morning) · P(morning)] / P(X = 20) = (0.00968 · 4/24) / (0.00968 · 4/24 + 0.0134 · 6/24 + 8.277 · 10⁻⁹ · 14/24) = 0.325.
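The Bayes computation in part (e) can be checked numerically (an illustrative sketch; the Poisson pmf is written out directly and the names are ours):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    # P(X = k) for a Poisson random variable with rate lam.
    return exp(-lam) * lam ** k / factorial(k)

# Likelihood of seeing 20 jobs in one hour under each period's hourly rate.
like = {"morning": poisson_pmf(20, 12),    # ≈ 0.00968
        "afternoon": poisson_pmf(20, 30),  # ≈ 0.0134
        "rest": poisson_pmf(20, 4)}        # ≈ 8.28e-9

# Priors: the fraction of the 24-hour day each period covers.
prior = {"morning": 4 / 24, "afternoon": 6 / 24, "rest": 14 / 24}

total = sum(like[k] * prior[k] for k in prior)
p_morning = like["morning"] * prior["morning"] / total  # ≈ 0.325
```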
Question 5: Continuous random variables (13 minutes, 20 points)
Assume a continuous random variable X such that 0 ≤ X ≤ 4, distributed with f(x) = cx². Answer the following questions.
(a) What should c be in order for f(x) to be a valid probability density function?
(5 Points)
(b) What is P(2 ≤ X ≤ 3)?
(5 Points)
(c) What are the expectation and the variance of X?
(5 Points)
(d) Assume that random variable Y is a function of X. Specifically, Y = √X when X ≤ 1 and Y = 1/X when 1 < X ≤ 4. What is the expectation of Y?
(5 Points)
Answer:
(a) We know that ∫₋∞^∞ f(x) dx = 1:

∫₀⁴ cx² dx = 1 ⟹ [cx³/3]₀⁴ = 1 ⟹ 64c/3 = 1 ⟹ c = 3/64.
(b) P(2 ≤ X ≤ 3) = ∫₂³ f(x) dx = ∫₂³ (3/64)x² dx = [x³/64]₂³ = (27 - 8)/64 = 19/64 = 0.2969.
(c) We have, by definition:

E[X] = ∫₀⁴ x f(x) dx = ∫₀⁴ (3/64)x³ dx = [3x⁴/256]₀⁴ = 3.

For the variance, we use the following identity:

Var[X] = E[X²] - (E[X])² = ∫₀⁴ x² f(x) dx - 3² = ∫₀⁴ (3/64)x⁴ dx - 9 = [3x⁵/320]₀⁴ - 9 = 48/5 - 9 = 3/5 = 0.6.
(d) We know that E[g(X)] = ∫₋∞^∞ g(x) f(x) dx for any function g of a random variable X. Here,

g(x) = √x for x ≤ 1, and g(x) = 1/x for x > 1.

Hence:

E[g(X)] = ∫₀⁴ g(x) · f(x) dx = ∫₀¹ √x · (3/64)x² dx + ∫₁⁴ (1/x) · (3/64)x² dx = [3x^(7/2)/224]₀¹ + [3x²/128]₁⁴ = 3/224 + 45/128 = 0.365.
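As a sanity check on the integrals above, a midpoint-rule numerical integration of f(x) = (3/64)x² on [0, 4] reproduces all four results (an illustrative sketch; names are ours):

```python
# Midpoint-rule numerical integration over [0, 4] on a fine grid.
N = 200_000
dx = 4 / N
xs = [(i + 0.5) * dx for i in range(N)]

def f(x):
    return (3 / 64) * x * x  # the pdf, with c = 3/64

def g(x):
    return x ** 0.5 if x <= 1 else 1 / x  # Y = g(X) from part (d)

total = sum(f(x) for x in xs) * dx                    # ≈ 1, so f is a valid pdf
prob_2_3 = sum(f(x) for x in xs if 2 <= x <= 3) * dx  # ≈ 19/64 ≈ 0.2969
mean = sum(x * f(x) for x in xs) * dx                 # ≈ 3
var = sum(x * x * f(x) for x in xs) * dx - mean ** 2  # ≈ 0.6
e_g = sum(g(x) * f(x) for x in xs) * dx               # ≈ 0.365
```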
Good luck!
STANDARD NORMAL CUMULATIVE DISTRIBUTION FUNCTION (Φ(z))

z    0.00   0.01   0.02   0.03   0.04   0.05   0.06   0.07   0.08   0.09
0.0  0.5000 0.5040 0.5080 0.5120 0.5160 0.5199 0.5239 0.5279 0.5319 0.5359
0.1  0.5398 0.5438 0.5478 0.5517 0.5557 0.5596 0.5636 0.5675 0.5714 0.5753
0.2  0.5793 0.5832 0.5871 0.5910 0.5948 0.5987 0.6026 0.6064 0.6103 0.6141
0.3  0.6179 0.6217 0.6255 0.6293 0.6331 0.6368 0.6406 0.6443 0.6480 0.6517
0.4  0.6554 0.6591 0.6628 0.6664 0.6700 0.6736 0.6772 0.6808 0.6844 0.6879
0.5  0.6915 0.6950 0.6985 0.7019 0.7054 0.7088 0.7123 0.7157 0.7190 0.7224
0.6  0.7257 0.7291 0.7324 0.7357 0.7389 0.7422 0.7454 0.7486 0.7517 0.7549
0.7  0.7580 0.7611 0.7642 0.7673 0.7703 0.7734 0.7764 0.7794 0.7823 0.7852
0.8  0.7881 0.7910 0.7939 0.7967 0.7995 0.8023 0.8051 0.8078 0.8106 0.8133
0.9  0.8159 0.8186 0.8212 0.8238 0.8264 0.8289 0.8315 0.8340 0.8365 0.8389
1.0  0.8413 0.8438 0.8461 0.8485 0.8508 0.8531 0.8554 0.8577 0.8599 0.8621
1.1  0.8643 0.8665 0.8686 0.8708 0.8729 0.8749 0.8770 0.8790 0.8810 0.8830
1.2  0.8849 0.8869 0.8888 0.8907 0.8925 0.8944 0.8962 0.8980 0.8997 0.9015
1.3  0.9032 0.9049 0.9066 0.9082 0.9099 0.9115 0.9131 0.9147 0.9162 0.9177
1.4  0.9192 0.9207 0.9222 0.9236 0.9251 0.9265 0.9279 0.9292 0.9306 0.9319
1.5  0.9332 0.9345 0.9357 0.9370 0.9382 0.9394 0.9406 0.9418 0.9429 0.9441
1.6  0.9452 0.9463 0.9474 0.9484 0.9495 0.9505 0.9515 0.9525 0.9535 0.9545
1.7  0.9554 0.9564 0.9573 0.9582 0.9591 0.9599 0.9608 0.9616 0.9625 0.9633
1.8  0.9641 0.9649 0.9656 0.9664 0.9671 0.9678 0.9686 0.9693 0.9699 0.9706
1.9  0.9713 0.9719 0.9726 0.9732 0.9738 0.9744 0.9750 0.9756 0.9761 0.9767
2.0  0.9772 0.9778 0.9783 0.9788 0.9793 0.9798 0.9803 0.9808 0.9812 0.9817
2.1  0.9821 0.9826 0.9830 0.9834 0.9838 0.9842 0.9846 0.9850 0.9854 0.9857
2.2  0.9861 0.9864 0.9868 0.9871 0.9875 0.9878 0.9881 0.9884 0.9887 0.9890
2.3  0.9893 0.9896 0.9898 0.9901 0.9904 0.9906 0.9909 0.9911 0.9913 0.9916
2.4  0.9918 0.9920 0.9922 0.9925 0.9927 0.9929 0.9931 0.9932 0.9934 0.9936
2.5  0.9938 0.9940 0.9941 0.9943 0.9945 0.9946 0.9948 0.9949 0.9951 0.9952
2.6  0.9953 0.9955 0.9956 0.9957 0.9959 0.9960 0.9961 0.9962 0.9963 0.9964
2.7  0.9965 0.9966 0.9967 0.9968 0.9969 0.9970 0.9971 0.9972 0.9973 0.9974
2.8  0.9974 0.9975 0.9976 0.9977 0.9977 0.9978 0.9979 0.9979 0.9980 0.9981
2.9  0.9981 0.9982 0.9982 0.9983 0.9984 0.9984 0.9985 0.9985 0.9986 0.9986
3.0  0.9987 0.9987 0.9987 0.9988 0.9988 0.9989 0.9989 0.9989 0.9990 0.9990
3.1  0.9990 0.9991 0.9991 0.9991 0.9992 0.9992 0.9992 0.9992 0.9993 0.9993
3.2  0.9993 0.9993 0.9994 0.9994 0.9994 0.9994 0.9994 0.9995 0.9995 0.9995
3.3  0.9995 0.9995 0.9995 0.9996 0.9996 0.9996 0.9996 0.9996 0.9996 0.9997
3.4  0.9997 0.9997 0.9997 0.9997 0.9997 0.9997 0.9997 0.9997 0.9997 0.9998
3.5  0.9998 0.9998 0.9998 0.9998 0.9998 0.9998 0.9998 0.9998 0.9998 0.9998
3.6  0.9998 0.9998 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999
3.7  0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999
3.8  0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999 0.9999
3.9  1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000 1.0000