Assume that all of a company’s job applicants must take a test, and that the scores on this test are normally distributed. The selection ratio is the cutoff point used by the company in its hiring process. For example, a selection ratio of 25% means that the company will accept the applicants who rank in the top 25% of all applicants. If the company chooses a selection ratio of 25%, the average test score of those selected will be 1.27 standard deviations above average. Use simulation to verify this fact, proceeding as follows.
a. Show that if the company wants to accept only the top 25% of all applicants, it should accept applicants whose test scores are at least 0.674 standard deviation above average. (No simulation
is required here. Just use the appropriate Excel normal function.)
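For part a, here is a minimal sketch, assuming a version of Excel with the NORM.S.INV function (older versions use NORMSINV). The cutoff for the top 25% is the 75th percentile of the standard normal distribution, so entering

=NORM.S.INV(0.75)

in any cell returns approximately 0.6745. As a check on the 1.27 figure itself, the mean of a standard normal truncated below at c is φ(c)/(1 − Φ(c)), where φ and Φ are the standard normal density and cumulative distribution function; with c = 0.674 this is about 0.3183/0.25 ≈ 1.27.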
b. Now generate 1000 test scores from a normal distribution with mean 0 and standard deviation 1. The average test score of those selected is the average of the scores that are at least 0.674. To determine this, use Excel’s DAVERAGE function. To do so, put the heading Score in cell A3, generate the 1000 test scores in the range A4:A1003, and name the range A3:A1003 Data.
In cells C3 and C4, enter the labels Score and >0.674. (The range C3:C4 is called the criterion range.) Then calculate the average of all applicants who will be hired by entering the formula
=DAVERAGE(Data, "Score", C3:C4) in any cell. This average should be close to the theoretical average, 1.27. This formula works as follows.
Excel finds all observations in the Data range that satisfy the criterion described in the range C3:C4 (Score>0.674). Then it averages the values in the Score column (the second argument of
DAVERAGE) corresponding to these entries. See online help for more about Excel’s database “D” functions.
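As a sketch of the layout for part b, assuming a recent Excel (NORM.S.INV(RAND()) generates a standard normal draw via the inverse-CDF method; a simulation add-in’s normal generator would work just as well, and the output cell E3 below is an arbitrary choice):

A3:   Score
A4:   =NORM.S.INV(RAND())      (copy down through A1003)
C3:   Score
C4:   >0.674
E3:   =DAVERAGE(Data, "Score", C3:C4)

Each recalculation (F9) draws a fresh set of 1000 scores. Roughly 250 of them will exceed the cutoff, so the DAVERAGE result should typically fall within a few hundredths of the theoretical 1.27.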
c. What information would the company need to determine an optimal selection ratio? How could it determine the optimal selection ratio?