We are interested in how a program's running time grows with the amount of input data.

If a program has a for loop such as:

for(int i=0; i<n; i++){...}

We designate this as O(n), read "order n" effort.

If we have nested for loops such as:

for(int i=0; i<n; i++)
    for(int j=0; j<n; j++){...}

We designate this as O(n^{2}), read "order n squared" effort.

In asymptotic analysis, we ignore constant factors and lower-order terms. So if the amount of effort is:

7n^{3}+2n^{2}+n+5, we say that this is O(n^{3}) effort, since the n^{3} term dominates as n grows.

Any time we repeatedly divide the amount of data by 2, we can do so at most log_{2}(n) times before only one item remains. A binary search divides the amount of data considered by 2 on each comparison, so we say it expends O(log(n)) effort.

The practical limit for algorithms that must handle large data sets is O(n*log(n)). An algorithm of this type is the quick sort, which on average divides the data in half about log_{2}(n) times and does O(n) work at each level. The bubble sort, which takes O(n^{2}) effort, is not good for large data sets.