I have a question about understanding Big O notation. I know that it describes the time and space complexity of an algorithm and is written as, for example, O(n), but I am still confused: how does this apply to executing code? And how do I use Big O notation to tell whether my code is executing more efficiently?
Big O notation lets us talk, in broad terms, about how the execution time of an algorithm grows with the amount of data fed into it.
In other words, if the amount of input data increases, does the execution time increase by the same proportion, or does it change in some other manner? That is what Big O notation tells us.
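As a minimal sketch of this idea (the function names here are illustrative, not from the question): we can count basic operations instead of measuring wall-clock time, then double the input size and see how the count grows. A linear, O(n) loop doubles its work; a nested, O(n^2) loop quadruples it.

```python
def linear_ops(n):
    """O(n): a single pass over n items; the operation count grows linearly."""
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

def quadratic_ops(n):
    """O(n^2): a nested pass for every item; the count grows quadratically."""
    ops = 0
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops

# Doubling the input doubles the linear work but quadruples the quadratic work.
print(linear_ops(100), linear_ops(200))        # 100 200
print(quadratic_ops(100), quadratic_ops(200))  # 10000 40000
```

This is why, for large inputs, an O(n) algorithm is considered more efficient than an O(n^2) one even if both feel fast on small data: the gap between them widens as the input grows.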