Algorithm efficiency is analyzed along two dimensions: time efficiency and space efficiency. Time efficiency is measured by time complexity, and space efficiency by space complexity. Time complexity mainly measures how fast an algorithm runs, while space complexity mainly measures the extra space an algorithm requires. In the early days of computing, machines had very little storage, so space complexity was a major concern. After the rapid development of the computer industry, storage capacity has grown enormously, so today we rarely need to pay special attention to the space complexity of an algorithm.
The time an algorithm takes is proportional to the number of times its statements are executed. The number of times the basic operations in an algorithm are executed is the algorithm's time complexity. In other words, when we are given a piece of code and want its time complexity, we mainly find the statement that is executed the most times and count how many times it runs.
Consider the analysis of the following code:
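This is a minimal Java sketch, assuming the code being analyzed has a basic operation (count++) that executes F(N) = N^2 + 2N + 10 times; the method name func1 and the exact loop structure are assumptions for illustration:

public class TimeComplexityDemo {
    // The basic operation count++ executes N*N + 2*N + 10 times in total.
    static void func1(int N) {
        int count = 0;
        // nested loops: executed N * N times
        for (int i = 0; i < N; i++) {
            for (int j = 0; j < N; j++) {
                count++;
            }
        }
        // single loop: executed 2 * N times
        for (int k = 0; k < 2 * N; k++) {
            count++;
        }
        // while loop: executed 10 times
        int M = 10;
        while (M-- > 0) {
            count++;
        }
        System.out.println(count);
    }

    public static void main(String[] args) {
        func1(10); // prints 130 = 10*10 + 2*10 + 10
    }
}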
As the value of N grows larger and larger, the 2N term and the constant 10 contribute less and less and can be ignored.
In fact, when we calculate time complexity we do not need the exact number of executions, only the approximate order of magnitude, so we use Big O asymptotic notation.
Big O notation: a mathematical notation used to describe the asymptotic behavior of a function.
How to derive the Big O order:
1. Replace all additive constants in the running time with the constant 1.
2. In the modified running-time function, keep only the highest-order term.
3. If the highest-order term exists and its coefficient is not 1, remove that coefficient. The result is the Big O order.
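For example, taking the assumed F(N) = N^2 + 2N + 10 from the sketch above: rule 1 turns the constant 10 into 1, rule 2 keeps only the highest-order term N^2, and rule 3 does not apply because its coefficient is already 1, so the Big O order is O(N^2).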
From the above we can see that Big O asymptotic notation removes the terms that have little effect on the result, and expresses the number of executions concisely and clearly.
In addition, the time complexity of some algorithms has best, average and worst cases:
Worst case: the maximum number of runs (upper bound) for any input size
Average case: the expected number of runs for any input size
Best case: the minimum number of runs (lower bound) for any input size
For example: searching for a value x in an array of length N.
Best case: found after 1 comparison
Worst case: found after N comparisons
Average case: found after N/2 comparisons
In practice we usually focus on the worst-case behavior of an algorithm, so the time complexity of searching for a value in an array is O(N).
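As an illustration, here is a minimal Java sketch of such a linear search; the method name find and its return convention are assumptions for illustration:

public class SearchDemo {
    // Linear search: returns the index of x, or -1 if not found.
    // Best case: 1 comparison; worst case: N comparisons; so the complexity is O(N).
    static int find(int[] arr, int x) {
        for (int i = 0; i < arr.length; i++) {
            if (arr[i] == x) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] arr = {3, 1, 4, 1, 5, 9};
        System.out.println(find(arr, 5)); // prints 4
    }
}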
Calculating time complexity, Example 1:
Through calculation and analysis, we find that the basic operation recurs 2^N times, so the time complexity is O(2^N).
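A minimal sketch of a recursive function with this behavior, assuming the example is the classic naive recursive Fibonacci (the method name Fib is an assumption):

public class FibDemo {
    // Naive recursive Fibonacci: each call spawns two more calls,
    // so the number of calls grows on the order of 2^N.
    static long Fib(int N) {
        if (N < 3) {
            return 1;
        }
        return Fib(N - 1) + Fib(N - 2);
    }

    public static void main(String[] args) {
        System.out.println(Fib(10)); // prints 55
    }
}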
Rule: the time complexity of a recursive algorithm = the number of recursive calls × the number of basic operations performed in each call.