Solutions to practice exam for Exam I, CS241, Fall 2019

1. Answer the following questions (circle True or False and fill in the blank). Explain your answers.

(a) True or False: The average-case behavior of an algorithm is always asymptotically better than its worst-case behavior.
False. The worst- and average-case behaviors are usually asymptotically the same.

(b) True or False: The amount of work done by an algorithm usually depends on the size of the input.
True. Consider, for example, any algorithm that has to process every element in an array.

(c) True or False: The size of the input to the problem always corresponds to the size of an array.
False. It may correspond to, e.g., the number of digits in a binary number.

(d) Suppose the running time of a divide-and-conquer algorithm is given by the following recurrence:

    T(n) = 3T(n/4) + Θ(n)

This algorithm partitions the original problem into 3 subproblems, each of size n/4. It spends Θ(n) time to divide and/or combine the output.

Note: In each of questions 1(e), 1(f), and 1(g), the highest-degree polynomial of n in f(n) is n^4.

(e) True or False: f(n) = 1000 n log^9 n + n^3 + n^2 log^5 n + (1/1000) n^4 log^2 n is Ω(n^3).
True, because n^4 ≥ n^3.

(f) True or False: f(n) = 1000 n log^9 n + n^3 + n^2 log^5 n + (1/1000) n^4 log^2 n is O(n^3).
False, because n^4 grows strictly faster than n^3.

(g) True or False: f(n) = 1000 n log^9 n + n^3 + n^2 log^5 n + (1/1000) n^4 log^2 n is O(3^n).
True, because n^4 ≤ 3^n asymptotically.

(h) True or False: Every comparison-based sorting algorithm must take at least Ω(n log n) time in the best case.
False. It is true for the worst case, but the best case may have a lower running time.

(i) True or False: If the input is a sorted array of n elements, the selection problem can be solved in O(log n) time. (We did not cover this chapter.)
True, because in a sorted array we can select the item at any position (the selection problem) in constant time.

2. For each part, circle the asymptotic bounds that apply.

(a) What is the worst-case time for Max-Heapify on an array of n distinct elements?
O(log n)

(b) What is the worst-case time for Counting sort on n elements, each in the range 1 to log n? (We did not cover this chapter.)
O(n)

(c) What is the best-case time for Counting sort on n distinct elements, each in the range 1 to n^2? (We did not cover this chapter.)
O(n^2)

(d) What is the amount of extra storage used by Heapsort on n distinct elements?
O(1)

(e) What is the worst-case time for Build-Max-Heap on an unordered array of n distinct elements?
O(n log n). Actually, this algorithm can be shown to run in O(n) time, but we did not cover that proof.
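The bounds in 2(a), 2(d), and 2(e) follow from the structure of the heap routines: Max-Heapify walks a single root-to-leaf path, swaps in place, and Build-Max-Heap calls it on roughly n/2 nodes. The following Python sketch is only illustrative (0-indexed arrays and function names of my choosing, not the pseudocode used in class); the comments mark where each bound comes from.

def max_heapify(a, i, heap_size):
    # Sift a[i] down until the subtree rooted at i satisfies the max-heap
    # property. Each pass of the loop moves one level deeper, so it runs at
    # most about log2(heap_size) times: the O(log n) worst case in 2(a).
    while True:
        left, right = 2 * i + 1, 2 * i + 2   # 0-indexed children
        largest = i
        if left < heap_size and a[left] > a[largest]:
            largest = left
        if right < heap_size and a[right] > a[largest]:
            largest = right
        if largest == i:                     # heap property restored
            return
        a[i], a[largest] = a[largest], a[i]  # swap in place: O(1) extra space
        i = largest                          # descend one level

def build_max_heap(a):
    # Heapify the roughly n/2 internal nodes, bottom up. Charging O(log n)
    # to each call gives the coarse O(n log n) bound accepted in 2(e); the
    # tighter O(n) analysis was not covered.
    for i in range(len(a) // 2 - 1, -1, -1):
        max_heapify(a, i, len(a))

data = [4, 1, 3, 2, 16, 9, 10, 14, 8, 7]
build_max_heap(data)
print(data[0])   # 16: the maximum ends up at the root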
3. Consider the following comparison-based sorting algorithm.

Fast-Sort(array L of integers)
    if (L has fewer than 3 elements)
        sort L with Insertion-Sort and return L
    else
        L_1 = Fast-Sort(1st quarter of L)
        L_2 = Fast-Sort(2nd quarter of L)
        L_3 = Fast-Sort(3rd quarter of L)
        L_4 = Fast-Sort(4th quarter of L)
        L = 4-way-Merge(L_1, L_2, L_3, L_4)
        return L

Assume the complexity of 4-way-Merge(L_1, L_2, L_3, L_4) is O(n_1 + n_2 + n_3 + n_4), where n_i is the number of elements in L_i, i = 1, 2, 3, 4. Write down the recurrence relation for the asymptotic running time of Fast-Sort and solve the recurrence using the Master Theorem.

    T(n) = 4T(n/4) + n

Here a = 4, b = 4, f(n) = n, and n^(log_4 4) = n. Case 2 of the Master Theorem holds, so T(n) = Θ(n lg n).
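A minimal runnable sketch of Fast-Sort follows (not the exam's pseudocode verbatim): the exact quarter boundaries and the use of the standard library's heapq.merge as the 4-way merge are implementation choices of mine. heapq.merge runs in time linear in the total number of merged elements, which matches the assumed O(n_1 + n_2 + n_3 + n_4) cost, so the recurrence T(n) = 4T(n/4) + n above describes this code as well.

import heapq

def insertion_sort(lst):
    # Base case of Fast-Sort: sort lists of fewer than 3 elements.
    for j in range(1, len(lst)):
        key, i = lst[j], j - 1
        while i >= 0 and lst[i] > key:
            lst[i + 1] = lst[i]
            i -= 1
        lst[i + 1] = key
    return lst

def fast_sort(lst):
    if len(lst) < 3:
        return insertion_sort(list(lst))
    # Split into (roughly) four quarters; max(1, ...) keeps every piece
    # strictly smaller than lst, so the recursion bottoms out even at length 3.
    q = max(1, len(lst) // 4)
    parts = [lst[:q], lst[q:2 * q], lst[2 * q:3 * q], lst[3 * q:]]
    sorted_parts = [fast_sort(p) for p in parts]
    # heapq.merge performs the 4-way merge in O(n_1 + n_2 + n_3 + n_4) time.
    return list(heapq.merge(*sorted_parts))

print(fast_sort([5, 2, 9, 1, 7, 3, 8, 6, 4, 0]))   # [0, 1, 2, ..., 9]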
4. Consider the following comparison-based algorithm for the selection problem.

New-Select(A, p, r, i)
    if p == r
        then return A[p]
    m = Magic-Median(A, p, r)        // returns the median m of A[p..r]
    q = Magic-Partition(A, p, r, m)  // partitions A[p..r] around m
    k = q - p + 1
    if i ≤ k
        then return New-Select(A, p, q, i)
        else return New-Select(A, q + 1, r, i - k)

(a) Assume the running time of Magic-Median(L) is O(1) and the running time of Magic-Partition(L) is O(lg n). State the recurrence relation for the asymptotic running time of New-Select.

    T(n) = T(n/2) + Θ(lg n)

(b) Solve the recurrence you derived in part (a); show the tightest bound you can. Show your work and clearly state any assumptions you make.
The Master Theorem cannot be used for this recurrence because lg n is not a polynomial. By the Alternate Master Theorem, we have a = 1, b = 2, k = 0, and p = 1. Since a = 1 = b^k, case 2 holds and

    T(n) = Θ(n^0 log^(p+1) n) = Θ(lg^2 n).

(c) Is it possible for the algorithm to run in the amount of time you determined in (b)? Why or why not?
No. If partitioning could be done in O(lg^2 n) time, then quicksort could run in time

    T(n) = 2T(n/2) + lg^2 n.

By the Alternate Master Theorem, for this recurrence, a = 2, b = 2, k = 0, p = 2, and n^(log_b a) = n^(log_2 2) = n. Since a = 2 > b^k = 1, case 1 holds and T(n) = Θ(n^(log_2 2)) = Θ(n^1) = Θ(n). This is impossible by the proof of the lower bound on comparison-based sorting, which says that any comparison-based sorting algorithm has a worst-case input that causes it to run in Ω(n lg n) time.

5. Consider inserting the following keys, in this order, into a hash table of length m = 11. For each part, state the number of collisions.

Keys to insert (in this order): 3, 4, 14, 15, 25, 26

(a) Suppose you use chaining with the hash function h(k) = k mod 11. Illustrate the result of inserting the keys above using chaining.
Every slot of the hash table would be empty except slot 3, which would contain the list 25 -> 14 -> 3, and slot 4, which would contain the list 26 -> 15 -> 4.

(b) Suppose you use open addressing with the primary hash function h1(k) = k mod 11. Illustrate the result of inserting the keys above using linear probing, i.e., h(k, i) = (h1(k) + i) mod 11.

index | content
0     |
1     |
2     |
3     | 3
4     | 4
5     | 14
6     | 15
7     | 25
8     | 26
9     |
10    |

Trace of the insertions:
3 % 11 = 3
4 % 11 = 4
14 % 11 = 3: collision 1, rehash to 4; collision 2, rehash to 5
15 % 11 = 4: collision 3, rehash to 5; collision 4, rehash to 6
25 % 11 = 3: collision 5, rehash to 4; collision 6, rehash to 5; collision 7, rehash to 6; collision 8, rehash to 7
26 % 11 = 4: collision 9, rehash to 5; collision 10, rehash to 6; collision 11, rehash to 7; collision 12, rehash to 8. DONE

There would be 12 collisions.
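The count of 12 in part (b) can be checked mechanically. Below is a small Python sketch (the helper names are mine, not from the exam) that inserts the keys with linear probing and counts one collision for every probe that lands on an occupied slot.

M = 11
KEYS = [3, 4, 14, 15, 25, 26]

def linear_probe_insert(table, key):
    # Probe h(k, i) = (k mod 11 + i) mod 11 for i = 0, 1, 2, ...
    collisions = 0
    for i in range(M):
        slot = (key % M + i) % M
        if table[slot] is None:
            table[slot] = key
            return collisions
        collisions += 1          # occupied slot: rehash to the next index
    raise RuntimeError("hash table is full")

table = [None] * M
total = sum(linear_probe_insert(table, k) for k in KEYS)
print(table)   # [None, None, None, 3, 4, 14, 15, 25, 26, None, None]
print(total)   # 12, matching the count above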
(c) Suppose you use open addressing with hash functions h1(k) = k mod 11 and h2(k) = k mod 10. Illustrate the result of inserting the keys above using double hashing, i.e., h(k, i) = (h1(k) + h2(k) · i) mod 11.

index | content
0     |
1     |
2     |
3     | 3
4     | 4
5     |
6     |
7     | 14
8     | 25
9     | 15
10    | 26

Trace of the insertions:
3 % 11 = 3
4 % 11 = 4
14 % 11 = 3: collision 1, rehash to (3 + (14 % 10) · 1) % 11 = 7 % 11 = 7
15 % 11 = 4: collision 2, rehash to (4 + (15 % 10) · 1) % 11 = 9 % 11 = 9
25 % 11 = 3: collision 3, rehash to (3 + (25 % 10) · 1) % 11 = 8 % 11 = 8
26 % 11 = 4: collision 4, rehash to (4 + (26 % 10) · 1) % 11 = 10 % 11 = 10. DONE

There would be 4 collisions.
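The same check works for part (c), now probing with h(k, i) = (h1(k) + i · h2(k)) mod 11 where h1(k) = k mod 11 and h2(k) = k mod 10. Again an illustrative sketch rather than code from the exam:

M = 11
KEYS = [3, 4, 14, 15, 25, 26]

def double_hash_insert(table, key):
    # Probe h(k, i) = (h1(k) + i * h2(k)) mod 11; each occupied slot probed
    # counts as one collision.
    h1, h2 = key % M, key % 10
    collisions = 0
    for i in range(M):
        slot = (h1 + i * h2) % M
        if table[slot] is None:
            table[slot] = key
            return collisions
        collisions += 1
    raise RuntimeError("probe sequence exhausted")

table = [None] * M
total = sum(double_hash_insert(table, k) for k in KEYS)
print(table)   # [None, None, None, 3, 4, None, None, 14, 25, 15, 26]
print(total)   # 4, matching the count above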