Suppose that you implement an algorithm to solve a problem. It works correctly and quickly for small inputs, but the program runs much slower as the input size grows, to the point of becoming unusable. In fact, each time you increase the input size by one, the execution time of your program roughly doubles! Which of the following options is the most effective way to address this issue? You may assume that program outputs which are close to optimal are acceptable.
Purchase a faster computer; this is a worthwhile investment as a programmer and will resolve this kind of issue.
Design a new approximate algorithm with lower time complexity.
Remove comments and any other non-essential elements in your source code that may be slowing things down.
Switch to a more efficient programming language.
Invoke Python with the --linear-time option to force your program to run in linear time, O(n).
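If adding one element to the input roughly doubles the running time, then T(n+1) ≈ 2·T(n), which means T(n) grows like 2^n: exponential time. No hardware upgrade, source-code cosmetics, or interpreter flag changes that growth rate (Python has no --linear-time option), but a different algorithm can. Since near-optimal outputs are acceptable, trading exactness for a lower-complexity approximation is the effective fix. Below is a minimal Python sketch of that trade-off; the subset-sum setting, function names, and input sizes are illustrative assumptions, not part of the question.

import itertools
import random
import time

def best_subset_exact(weights, capacity):
    # Exhaustive search over all 2^n subsets: adding one item to the
    # input roughly doubles the work, the behavior described above.
    best = 0
    for r in range(len(weights) + 1):
        for combo in itertools.combinations(weights, r):
            total = sum(combo)
            if total <= capacity:
                best = max(best, total)
    return best

def best_subset_greedy(weights, capacity):
    # O(n log n) greedy approximation: take the largest items that
    # still fit. Not always optimal, but usually close, and it scales.
    total = 0
    for w in sorted(weights, reverse=True):
        if total + w <= capacity:
            total += w
    return total

if __name__ == "__main__":
    random.seed(0)
    weights = [random.randint(1, 100) for _ in range(18)]
    capacity = sum(weights) // 2

    t0 = time.perf_counter()
    exact = best_subset_exact(weights, capacity)
    t1 = time.perf_counter()
    approx = best_subset_greedy(weights, capacity)
    t2 = time.perf_counter()

    print(f"exact:  {exact} (took {t1 - t0:.3f}s)")
    print(f"greedy: {approx} (took {t2 - t1:.6f}s)")

On 18 items the exhaustive search already examines 2^18 = 262,144 subsets, and each extra item doubles that count, while the greedy pass stays essentially instantaneous and typically lands close to the optimum.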