5. Introduction to Big O Notation (The Language of Efficiency)
Big O notation (written O(...)) is the mathematical language we use to describe an upper bound on an algorithm's performance, most often its worst-case scenario.
It focuses on how the execution time or space requirement grows as the input size (N) increases.
What Big O Ignores
Big O simplifies the analysis by ignoring:
- Constants: O(2N) is simplified to O(N). If an algorithm takes twice as long, its fundamental growth pattern doesn't change.
- Lower-Order Terms: In O(N^2 + N), we focus on the dominant term (N^2), so the complexity is O(N^2). As N gets very large, N^2 completely overwhelms N.
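Both simplification rules can be seen by counting operations directly. The sketch below uses two hypothetical functions (the names and step-counting are illustrative, not from any library): one does 2N steps, the other does N^2 + N steps.

```python
def double_pass(items):
    """Touches every item twice: 2N steps, which simplifies to O(N)."""
    steps = 0
    for _ in items:
        steps += 1      # first pass over the input
    for _ in items:
        steps += 1      # second pass: doubles the constant, not the growth
    return steps

def pairs_plus_scan(items):
    """N*N pairwise steps plus N scan steps: O(N^2 + N) = O(N^2)."""
    steps = 0
    for _ in items:
        for _ in items:
            steps += 1  # N^2 pairwise comparisons (the dominant term)
    for _ in items:
        steps += 1      # one extra linear scan (the lower-order term)
    return steps

data = range(1000)
print(double_pass(data))      # 2000 steps: still linear growth
print(pairs_plus_scan(data))  # 1001000 steps: the N^2 term dominates the N
```

For N = 1000, the lower-order N term contributes only 1000 of the 1,001,000 steps, which is why Big O discards it.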
The Analogy
Imagine driving a car: Big O doesn't care about your starting speed or the occasional bump in the road; it describes how the travel time changes when the distance (N) triples.
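The analogy can be made concrete: tripling N triples the work for a linear algorithm but multiplies it ninefold for a quadratic one. A minimal sketch (the function names are illustrative):

```python
def linear_steps(n):
    return n          # O(N): work grows in direct proportion to N

def quadratic_steps(n):
    return n * n      # O(N^2): work grows with the square of N

for n in (10, 30):    # triple the input size
    print(f"N={n}: linear={linear_steps(n)}, quadratic={quadratic_steps(n)}")
# Linear work triples (10 -> 30); quadratic work grows ninefold (100 -> 900).
```

This is exactly what the notation captures: not the absolute numbers, but the shape of the growth as N increases.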