Intuitive Introduction to Big O

This post is an intuitive introduction to Big O notation. I am not going to be rigorous or mathematical in my approach.

Big O indicates how an algorithm behaves when the size of the input changes. The behavior might be either the computation time or the storage required for the computation. What Big O tells us is how the storage or computation time scales as the input changes.

The above is an intuitive definition of Big O.


A pivotal point to keep in mind is that Big O talks of change; it does not talk about specifics. Big O does not tell you that an algorithm will execute in 1 ms or that it requires 1 MB of space.

Let us understand Big O with a non-programmatic example.

There are lots of tennis balls piled up in a court. You have to devise an algorithm to move them to the clubhouse.

One approach is to pick a ball. Take it to the clubhouse and drop it there. Do the same with another ball. Repeat until there are no more balls left.

If there are n balls in the court, you have to go back and forth between the court and the clubhouse n times; the Big O of this approach is O(n). As the number of balls increases, the work required to clean the court increases. If there are too many balls, you might end up spending the whole day on the task with this algorithm.

If you boil it down, what O(n) tells us is that the work increases linearly with the input, in this case, the number of balls in the court.
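If it helps to see the analogy as code, here is a minimal Python sketch. The function names are made up for illustration, and carry_to_clubhouse is just a stand-in for one round trip; the point is that the number of trips grows in step with the number of balls.

```python
def carry_to_clubhouse(item):
    """Stand-in for one round trip between the court and the clubhouse."""
    pass


def clean_court_one_by_one(balls):
    """Move the balls one at a time: one round trip per ball, so O(n) trips."""
    trips = 0
    for ball in balls:
        carry_to_clubhouse(ball)  # each ball costs one full trip
        trips += 1
    return trips  # grows linearly with the number of balls


print(clean_court_one_by_one(range(10)))      # 10 trips
print(clean_court_one_by_one(range(10_000)))  # 10000 trips
```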

Another option is to collect all the balls in a bag in one go[1] and take the bag to the clubhouse; this is a constant-time algorithm. Since you are not repeatedly going back and forth for each ball, the task does not scale with the number of balls.
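A sketch of the bag approach, reusing the hypothetical carry_to_clubhouse stub from above. However many balls there are, there is exactly one trip.

```python
def clean_court_with_bag(balls):
    """Bag everything and make a single trip: O(1) trips."""
    bag = list(balls)        # gather the balls into one bag (the footnote treats this step as constant)
    carry_to_clubhouse(bag)  # one round trip, regardless of the number of balls
    return 1                 # always a single trip
```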

When trying to deduce the Big O of an algorithm, figure out the effect of the input size on the storage space or the computation time.


Why do we need Big O?

Big O has little significance when the input is small. For small input sizes, the efficiency of an algorithm does not make much of a difference; efficiency matters when the input size is large.

Coming back to our example, let us assume we are dim-witted and have institutionalized the back-and-forth algorithm. We have not bothered to figure out the Big O of the approach. Days pass by; there are at most a couple of balls in the court, and cleaning goes on nicely. Then, one beautiful morning, there are ten thousand balls in the court, and we are in for an unpleasant surprise: we spend hours cleaning the court.

If we had bothered to figure out the Big O of our cleaning algorithm, we could have avoided this surprise.

When someone talks of Big O, they are usually talking about the worst-case complexity.

If the majority of the members of our tennis club are well behaved and do not leave balls behind in the court, then we will not have a tough time cleaning it. But while calculating the Big O of the algorithm, we assume the worst case, i.e., no one cleans up after their game and everyone leaves all the balls behind.

While calculating Big O, we do not take into account any of the constant-time work done as part of the algorithm. In the first approach to cleaning up the balls, if the person takes a break when she is midway through, the Big O of the algorithm does not change; it is still O(n).

Why?

Reinforcing the critical point: Big O is the relationship between the algorithm and the input; Big O cares only about how the efficiency of the algorithm varies as the input changes. The break the person takes does not vary with the input; whether there are two balls in the court or a thousand, the person takes a break when she is halfway through. Neither the break time nor the number of breaks depends on the number of balls in the court.

While calculating Big O, ignore any constant-time work, i.e., work that does not scale with the input.
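A rough sketch of the break scenario, again with made-up helper names and reusing carry_to_clubhouse from the first sketch. The break is a fixed amount of work, so the trip count still grows linearly with the number of balls.

```python
def take_a_break():
    """Stand-in for a fixed-length break; its length does not depend on the input."""
    pass


def clean_court_with_break(balls):
    """One ball per trip plus one break halfway through: still O(n)."""
    balls = list(balls)
    trips = 0
    for i, ball in enumerate(balls):
        if i == len(balls) // 2:
            take_a_break()        # constant-time work: does not scale with the input
        carry_to_clubhouse(ball)  # each ball still costs one full trip
        trips += 1
    return trips                  # still proportional to the number of balls
```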

In recent times, technical interviews have put the spotlight on Big O. Big O is one of several notations used to represent the efficiency of an algorithm; others include Big Omega and Big Theta.

[1] The astute reader might observe that the time required to bag the balls increases with the number of balls in the court. We ignore this and assume that it is constant.

