Whether you're learning to code, preparing for interviews, or brushing up on algorithm fundamentals, understanding time complexity is crucial. It describes how an algorithm's running time grows as the input size grows. In this post, we'll walk through the key concepts with simple examples and practical tips.
1. Constant Time: O(1)
Operations that take the same amount of time regardless of input size.
x = x + 1;
Examples:
Accessing an array element by index
Simple arithmetic operations
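For instance, indexing into an array is a single memory access, no matter how large the array is (here assuming an int array arr of length n):
int first = arr[0];      // first element: O(1)
int middle = arr[n / 2]; // middle element: still O(1)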
2. Linear Time: O(n)
The running time grows directly with the input size.
for (i = 1; i <= n; i++) {
m = m + 2; // constant time
}
Use Case:
Scanning an array
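As a quick sketch, summing an array visits each element exactly once (assuming an int array a of length n):
int sum = 0;
for (int i = 0; i < n; i++) {
    sum += a[i]; // n iterations, constant work each → O(n)
}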
3. Nested Loops: O(n^2)
The inner loop runs in full on every iteration of the outer loop, so the iteration counts multiply.
for (i = 1; i <= n; i++) {
for (j = 1; j <= n; j++) {
k = k + 1;
}
}
Use Case:
Comparing every element with every other (e.g., bubble sort)
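Here's a minimal bubble sort sketch showing the pattern. The inner loop shrinks each pass, but the total is still on the order of n^2 comparisons:
void bubbleSort(int a[], int n) {
    for (int i = 0; i < n - 1; i++) {         // n - 1 passes
        for (int j = 0; j < n - 1 - i; j++) { // compare adjacent pairs
            if (a[j] > a[j + 1]) {            // swap if out of order
                int tmp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = tmp;
            }
        }
    }
}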
4. Consecutive Statements: Add Complexities
You sum up the complexities of each separate block.
x = x + 1;                   // O(1)
for (i = 1; i <= n; i++) {   // this loop: O(n)
    m = m + 2;
}
for (i = 1; i <= n; i++) {   // nested loops: O(n^2)
    for (j = 1; j <= n; j++) {
        k = k + 1;
    }
}
Total Time: O(1 + n + n^2) = O(n^2)
5. Logarithmic Time: O(log n)
The remaining work shrinks by a constant factor (here, half) on every step.
for (i = 1; i <= n;) {
i = i * 2;
}
Here, i doubles each step: 1, 2, 4, 8, ..., until i > n, so the loop runs about log2(n) times.
Also applies to:
for (i = n; i >= 1;) {
i = i / 2;
}
Use Case:
Binary search
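As a minimal sketch (assuming a sorted int array a of length n):
int binarySearch(int a[], int n, int target) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;      // midpoint without overflow
        if (a[mid] == target) return mid;  // found
        if (a[mid] < target) lo = mid + 1; // discard left half
        else hi = mid - 1;                 // discard right half
    }
    return -1; // not present
}
Each iteration discards half of the remaining range, so at most about log2(n) iterations run.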
6. Multiple Independent Loops: O(n + m)
When two loops run independently on different data sets:
for (i = 0; i < n; i++) {
// O(n)
}
for (j = 0; j < m; j++) {
// O(m)
}
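For instance, a (hypothetical) function that totals two unrelated arrays does O(n) work followed by O(m) work, for O(n + m) overall:
int sumBoth(int a[], int n, int b[], int m) {
    int total = 0;
    for (int i = 0; i < n; i++) total += a[i]; // O(n)
    for (int j = 0; j < m; j++) total += b[j]; // O(m)
    return total;
}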
7. Conditional Statements: O(1) (Usually)
if (x > 0) {
x = x - 1;
}
The condition check and both branch bodies here take constant time. If a branch contained a loop, you would count that branch's complexity instead.
8. Recursive Time: Exponential and Beyond
A recursive algorithm's running time can grow very fast, depending on how many recursive calls each invocation makes.
void recur(int n) {
if (n <= 1) return;
recur(n - 1);
recur(n - 1);
}
Each call spawns two more calls, so the call tree has on the order of 2^n nodes: this runs in O(2^n) time.
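The textbook instance of this pattern is naive recursive Fibonacci, where each call branches into two more (the tight bound is actually closer to O(1.6^n), but the doubling intuition is the same):
int fib(int n) {
    if (n <= 1) return n;           // base cases: O(1)
    return fib(n - 1) + fib(n - 2); // two branches per call → exponential
}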
9. Best, Worst, and Average Case
Different inputs can lead to different run times:
Example (Linear Search):
Best Case: O(1) (target is the first element)
Average Case: O(n) (about half the elements are checked on average)
Worst Case: O(n) (target is last or absent)
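A linear search sketch makes these cases concrete:
int linearSearch(int a[], int n, int target) {
    for (int i = 0; i < n; i++) {
        if (a[i] == target) return i; // best case: target at index 0 → O(1)
    }
    return -1; // worst case: scanned all n elements → O(n)
}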
10. Amortized Time
Occasionally expensive operations are averaged over many cheap ones.
Example: Dynamic array resizing
Most push calls: O(1)
Occasional resize: O(n)
Amortized: O(1) per operation
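Here's a minimal sketch of the idea with a hand-rolled growable array in C (error handling omitted; real implementations such as C++'s std::vector differ in details):
#include <stdlib.h>

typedef struct {
    int *data;
    int size;     // elements in use
    int capacity; // allocated slots
} Vec;

void push(Vec *v, int value) {
    if (v->size == v->capacity) {
        // Occasional O(n) step: double the capacity and copy everything over.
        v->capacity = v->capacity ? 2 * v->capacity : 1;
        v->data = realloc(v->data, v->capacity * sizeof(int));
    }
    v->data[v->size++] = value; // the common O(1) path
}
Because capacity doubles, a resize that copies n elements happens only after about n cheap pushes, which is why the cost averages out to O(1) per push.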
Final Thoughts
Understanding time complexity is key to writing efficient code. Start simple, practice with real problems, and build your intuition. Whether it's for coding interviews or real-world projects, mastering these basics gives you a major edge.
In future posts, we'll explore complexity analysis in even more depth. So stay tuned!
Happy coding!