Suppose that someone has developed an algorithm to solve a certain problem, which runs in time T(n, k) ∈ Θ(f(n, k)), where n is the size of the input and k is a parameter we are free to choose (we can choose it to depend on n). In each case, determine the value of the parameter k(n) that achieves the (asymptotically) best running time. Justify your answer. I recommend not trying much fancy math. Think of n as being some big fixed number. Try some value of k, say k = 1, k = n^a, or k = 2^(an) for some constant a. Then note whether increasing or decreasing k increases or decreases f. Recall that "asymptotically" means that we only need the minimum to within a multiplicative constant.
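As an illustration of this advice, here is a minimal sketch in Python (not part of the exercise) that fixes a large n, uses the f from item 3 below as a stand-in, and prints f at a few candidate values of k so you can see in which direction f moves; the choice n = 10^6 and the particular candidate k's are arbitrary.

```python
# Minimal numerical sketch of the advice above: fix a big n, sweep k, and
# watch which direction makes f smaller. Uses item 3's f as an example;
# swap in any of the other f's to explore them the same way.

def f(n, k):
    # Example cost function: f(n, k) = n^3 / k + k * n  (item 3 below)
    return n**3 / k + k * n

n = 10**6  # think of n as some big fixed number

# A few candidate k's of the form k = n^a, plus k = 1 for reference.
candidates = [1, round(n**0.5), round(n**0.75), n, round(n**1.5), n**2]
for k in candidates:
    print(f"k = {k:>15d}  ->  f(n, k) = {f(n, k):.3e}")
```

For this particular f, the printed values shrink until k is around n and then grow again, which is exactly the kind of observation the exercise asks you to turn into an asymptotic argument.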
1. You might want to first prove that g + h = Θ(max(g, h)).
2. f(n, k) = (n + k)/log k. This is needed for the radix–counting sort in Section 5.4.
3. f(n, k) = n^3/k + k·n. (See the sketch after this list.)
4. f(n, k) = log^3 k + 2^n/k.
5. f(n, k) = 8^n·n^2/k + k·2^n + k^2.
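A sketch of the balancing idea, using item 3 as an example (one possible way to justify an answer, relying on the hint in item 1; not necessarily the intended write-up): by the hint, n^3/k + k·n = Θ(max(n^3/k, k·n)). The first term decreases as k grows and the second increases, so the max is smallest where the two terms cross:

n^3/k = k·n  ⇒  k^2 = n^2  ⇒  k = n,

which gives T(n, k) = Θ(n^2); any asymptotically larger or smaller choice of k lets one of the two terms grow beyond n^2.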