Going Deeper with Time and Space Complexity
In the last post, we learned the basics of time complexity, from constant time O(1) to exponential O(2^n). Now, let's take a step further. We'll explore how to analyze more realistic code, simplify complex expressions, and understand space complexity, another crucial piece of the puzzle.
1. How to Analyze Real-World Code Complexity
Let’s look at an example function:
function printPairs(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      console.log(arr[i], arr[j]);
    }
  }
}
Here’s how you analyze it:
The outer loop runs n times.
The inner loop runs n - i - 1 times for each i, which is about n/2 on average.
That adds up to roughly n(n-1)/2 operations → O(n^2).
Key tip: Always try to estimate the number of times a line of code executes with respect to input size.
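One way to check that estimate empirically (a small sketch of my own, not from the original post) is to swap the console.log for a counter and compare the result against n(n-1)/2:

function countPairOperations(arr) {
  let operations = 0;
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      operations++; // one unit of work per pair, same shape as printPairs
    }
  }
  return operations;
}

const n = 10;
const sample = Array.from({ length: n }, (_, k) => k);
console.log(countPairOperations(sample)); // 45
console.log((n * (n - 1)) / 2);           // 45, matches the formula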
2. Big-O Simplification Rules
When you see an expression like:
O(n^2 + n + 10 + log n)
You don’t keep everything.
🧠 Only the fastest-growing term matters.
So it simplifies to: O(n^2)
General rules:
Drop constants: O(2n) → O(n)
Drop less significant terms: O(n + log n) → O(n)
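To see the "drop constants" rule in action, here's a minimal sketch (my example, not from the post): a function that makes two separate passes over the array does roughly 2n operations, yet it's still O(n), because doubling the input still roughly doubles the work.

function minAndMax(arr) {
  let min = Infinity;
  let max = -Infinity;
  // First pass: roughly n operations
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] < min) min = arr[i];
  }
  // Second pass: another n operations
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] > max) max = arr[i];
  }
  return { min, max }; // about 2n total → O(2n) → O(n)
}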
3. What About Space Complexity?
Time complexity tells us how long an algorithm runs.
Space complexity tells us how much memory it needs.
Take this example:
function sumArray(arr) {
  let sum = 0;
  for (let i = 0; i < arr.length; i++) {
    sum += arr[i];
  }
  return sum;
}
Time: O(n) (we scan each item)
Space: O(1) (we only use a single variable)
But what about this?
function doubleArray(arr) {
  let result = [];
  for (let i = 0; i < arr.length; i++) {
    result.push(arr[i] * 2);
  }
  return result;
}
Time: O(n)
Space: O(n) (we allocate a new array of the same size)
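For contrast, here's a hypothetical in-place variant (my addition, not from the original post). It skips the extra array, so the additional space is O(1), at the cost of mutating the caller's array:

function doubleArrayInPlace(arr) {
  for (let i = 0; i < arr.length; i++) {
    arr[i] *= 2; // overwrite each element instead of allocating a new array
  }
  return arr; // Time: O(n), extra space: O(1), but the input is modified
}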
4. Pitfalls to Watch Out For
a. Hidden Loops
Methods like .sort() often carry hidden complexity; sorting is typically O(n log n).
arr.sort(); // Not O(1)! It's usually O(n log n)
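The same thing happens with methods like .includes() or .indexOf(), which scan the array under the hood. As a sketch (my example, not from the post), this looks like a single loop but is actually O(n^2):

function hasDuplicate(arr) {
  for (let i = 0; i < arr.length; i++) {
    // .includes() scans up to i elements each time: a hidden inner loop
    if (arr.slice(0, i).includes(arr[i])) {
      return true;
    }
  }
  return false; // roughly n^2 / 2 comparisons in the worst case → O(n^2)
}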
b. Nested Function Calls
Sometimes complexity comes from function calls inside loops:
function outer(arr) {
  for (let i = 0; i < arr.length; i++) {
    inner(arr[i]);
  }
}

function inner(x) {
  for (let j = 0; j < x; j++) {
    console.log(j);
  }
}
If x grows with n, the total time becomes O(n^2).
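A quick way to reason about it (a sketch of mine, not from the post): the total work of outer is the sum of all the x values passed to inner. If each element is roughly as large as the array itself, that sum is about n * n:

function totalInnerIterations(arr) {
  let total = 0;
  for (const x of arr) {
    total += x; // inner(x) would loop x times
  }
  return total;
}

const size = 5;
console.log(totalInnerIterations(new Array(size).fill(size))); // 25, i.e. n * n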
5. Practice Tip: Trace It!
When in doubt, manually trace the number of operations on paper or in your mind. That’s the best way to build your Big-O intuition.
Final Thoughts
Time and space complexity isn’t just for interviews—it helps you write better, faster, and more scalable code.
In the next post, we’ll look at:
Comparing algorithms with different complexities
Trade-offs between time and space
Complexity of popular sorting and searching algorithms
Until then—keep practicing, keep tracing, and keep building!
🧠 Happy Coding!