# Asymptotic comparison – III

This is the third post regarding the notion of asymptotic comparison. The first one dealt with equivalents, the rigorous way of saying “these two things look alike”, while the second post was about little o, meaning “this thing is much smaller than this other one”. Naturally, where there is a little o, there should be a big O, and this is what we shall discuss. The informal definition of big O is “this thing is at most as large as this other thing”.

I would very much recommend reading the first two posts in the series. Big O works in the same way as little o computationally, so this post will be shorter. As before, we will focus on sequences, and introduce the analogous notions for functions in a later post.

## Formal definition

Unlike for equivalents and little o, there is no division involved, so we do not need to assume that the sequences never take the value zero.

Consider two sequences $(u_n)$ and $(v_n)$. We say that $(u_n)$ is a big O of $(v_n)$ if for some $C > 0$,

$\displaystyle |u_n| \leq C |v_n|$

for all $n$ (sufficiently large). We then write $u_n = O(v_n).$

If $(v_n)$ never takes the value $0$, this means that the sequence $(u_n/v_n)$ is bounded.
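To make the definition concrete, here is a quick numerical sanity check. The sequences $u_n = 3n + 10\sin n$ and $v_n = n$ are illustrative choices, not part of the theory: if $u_n = O(v_n)$, the ratios $|u_n|/|v_n|$ should stay below some constant $C$.

```python
import math

# Illustrative (made-up) sequences: u_n = 3n + 10 sin(n), v_n = n.
# If u_n = O(v_n), the ratio |u_n| / |v_n| must stay below some constant C.
ratios = [abs(3 * n + 10 * math.sin(n)) / n for n in range(1, 10001)]

# Here |u_n| <= 3n + 10 <= 13n, so C = 13 works.
print(max(ratios))
```

Of course, a finite computation can only suggest boundedness; the inequality $|u_n| \leq 3n + 10 \leq 13n$ is what actually proves it.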

This can also be read “$u_n$ equals big O of $v_n$“. As for little o, this is not an equality in the usual sense of the term: things go from left to right, and we should not write $O(v_n) = u_n$ 😦 It is usually written simply with the letter O in LaTeX, but you can also see other options here.

An important difference with the notions of little o and equivalents is the fact that there is no limit here. It is quite possible that the limit of $u_n / v_n$ does not exist. For instance, we have

$\displaystyle \sin n = O(1)$,

which is just a short way to state that the sequence $(\sin n)$ is bounded. However, this sequence has no limit.

You probably know that the sequence $(u_n) = (\sin n)$ has no limit, but can you give a simple argument showing this? There is no need to use equidistribution or rational approximations of $\pi$; a trig formula or two might do the trick.
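While the proof is left to you, a quick numerical look (an illustration, not a proof) shows both facts at once: the sequence is bounded by $1$, yet it keeps visiting values near both $-1$ and $1$, so it cannot converge.

```python
import math

# (sin n) is bounded by 1, so sin n = O(1) ...
vals = [math.sin(n) for n in range(1, 10001)]
print(max(abs(v) for v in vals))  # never exceeds 1

# ... yet it keeps oscillating between values near -1 and near 1.
print(min(vals), max(vals))
```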

## Big O and little o

It is always good to remember that a big O is less restrictive than a little o, in the following sense.

Take two sequences $(u_n)$ and $(v_n)$. Then

$\displaystyle u_n = o(v_n) \Rightarrow u_n = O(v_n).$

Naturally, the converse is not true. For instance, if $u_n = v_n = 1$ for all $n$, then $u_n = O(v_n)$, but $u_n$ is not a little o of $v_n$. There is some sort of converse though.

Take three sequences $(u_n), (v_n), (w_n)$. If $u_n = O(v_n)$ and $v_n = o(w_n)$, then $u_n = o(w_n)$.

For instance, if we know that $u_n = n + \sqrt{n} + O(1/n)$, then $u_n = n + \sqrt{n} + o(1)$, and in particular $u_n - n \sim \sqrt{n}$.
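We can watch this deduction happen numerically with a hypothetical sequence matching the expansion, where the concrete term $1/n$ stands in for the $O(1/n)$ remainder:

```python
import math

# A hypothetical sequence matching u_n = n + sqrt(n) + O(1/n);
# the concrete remainder 1/n stands in for the O(1/n) term.
def u(n):
    return n + math.sqrt(n) + 1.0 / n

# Since O(1/n) is in particular o(1), (u_n - n) / sqrt(n) should tend to 1,
# i.e. u_n - n ~ sqrt(n).
for n in (10, 1000, 100000):
    print(n, (u(n) - n) / math.sqrt(n))
```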

Prove Propositions II.1 and II.2.

## Properties

As expected, big O enjoys properties similar to those of little o, which we summarize below.

### Basic properties

• If $\alpha, \beta \in \mathbb{R}$ and $\alpha \leq \beta$, then

$\displaystyle n^{\alpha} = O (n^{\beta}).$

• If $r_1, r_2 \in \mathbb{R}$ and $|r_1| \leq |r_2|$, then

$\displaystyle r_1^n = O (r_2^n).$
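For the geometric case, the constant $C = 1$ already works, since $|r_1^n| = |r_1|^n \leq |r_2|^n$. A quick check with the illustrative values $r_1 = -0.5$ and $r_2 = 0.7$:

```python
# If |r1| <= |r2|, then |r1^n| = |r1|^n <= |r2|^n, so r1^n = O(r2^n) with C = 1.
r1, r2 = -0.5, 0.7  # illustrative values with |r1| <= |r2|
ratios = [abs(r1**n) / abs(r2**n) for n in range(1, 60)]
print(max(ratios))  # stays below 1
```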

Naturally, other comparative growths mentioned in the little o post are still valid after replacing little o by big O.

Consider sequences $(u_n), (v_n), (x_n), (y_n)$.

• If $u_n = O(x_n)$ and $x_n = O(y_n)$, then $u_n = O(y_n)$.
• If $u_n = O(x_n)$ and $v_n = O(y_n)$, then $u_n v_n = O(x_n y_n).$
• If $u_n = O(x_n)$, then for any $\alpha \geq 0$, we have $u_n^{\alpha} = O( x_n^{\alpha}).$
• If $u_n = O(x_n)$ and $v_n = O(x_n)$, then $u_n + v_n = O (x_n).$
• If $u_n = O(x_n)$, and $v_n$ is never $0$, then $u_n / v_n = O (x_n / v_n)$.
• If $u_n = O(x_n)$, then for any $c \in \mathbb{R}$, $c u_n = O(x_n)$.
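As an illustration of the sum rule, take the made-up sequences $u_n = n\cos n = O(n)$ and $v_n = 2n + 5 = O(n)$; the rule predicts $u_n + v_n = O(n)$.

```python
import math

# u_n = n cos(n) = O(n) and v_n = 2n + 5 = O(n) (illustrative choices);
# the sum rule predicts u_n + v_n = O(n): the ratio below stays bounded.
ratios = [abs(n * math.cos(n) + 2 * n + 5) / n for n in range(1, 10001)]
print(max(ratios))  # indeed |u_n + v_n| <= n + 2n + 5 <= 8n
```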

It is worth noticing that most of these properties behave like inequalities, which is why it may be convenient, though slightly informal, to write $u_n \lesssim v_n$ rather than $u_n = O(v_n)$. However, the big O notation has the advantage that it essentially behaves like an equality and allows us to simplify algebraic computations, as we discuss below.

Another useful property is that $n$ can essentially be replaced by any sequence tending to $+ \infty$.

Take sequences $(u_n), (x_n)$, and a sequence of integers $(v_n)$ that tends to $+ \infty$. If $u_n = O(x_n)$, then $u_{v_n} = O(x_{v_n})$.

### Big O and limits

Limits are often computed using little o or big O. Recall that $\lim_{n \to + \infty} u_n = \ell$ if and only if $u_n - \ell = o(1)$. With big O, we often use the following.

If $u_n - \ell = O(\epsilon_n)$ where $(\epsilon_n)$ is a sequence tending to $0$, then

$\displaystyle \lim_{n \to + \infty} u_n = \ell.$
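Here is a sketch of this squeeze in action, with the hypothetical sequence $u_n = 2 + \sin(n)/n$, for which $u_n - 2 = O(1/n)$ with $C = 1$:

```python
import math

# Hypothetical sequence with u_n - 2 = O(1/n): since |u_n - 2| <= 1/n -> 0,
# the sequence must converge to 2.
def u(n):
    return 2 + math.sin(n) / n

for n in (10, 1000, 100000):
    print(n, u(n))  # approaches 2
```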

### Words of caution

The same warnings as for little o apply to big O.

• keep things as simple as possible in the big O;
• do not cancel big O;
• do not take functions of big O.

## Asymptotic expansion

### Definition

We can write asymptotic expansions using little o or big O. With little o, we want to say: this sequence looks like this much simpler sequence, plus something that is smaller. With big O, we want to say: this sequence looks like this much simpler sequence, plus something that is at most as large as a given quantity, where that quantity is much smaller than the rest. Therefore, we could write

$\displaystyle u_n = n^2 + 2n + O \left ( \sqrt{n} \right )$

but not

$\displaystyle u_n = n^2 + 2n + O(n)$ 😦

which is no better than saying

$\displaystyle u_n = n^2 + O(n).$

We saw that derivatives allow us to obtain asymptotic expansions with a little o, such as

$\displaystyle \sin(1/n) = 1/n + o(1/n)$.

Replacing this little o with a big O would be possible, but it would only say $\sin(1/n) = 1/n + O(1/n) = O(1/n)$, which is less precise. We can actually go further and write

$\displaystyle \sin(1/n) = 1/n + O(1/n^2)$.

Note that this is better than the previous expansion. For instance, we can obtain

$\displaystyle n^{3/2} \left ( \sin(1/n) - 1/n \right ) = O(n^{-1/2})$

which says that $n^{3/2} \left ( \sin(1/n) - 1/n \right ) \to 0$. The expansion with the little o would only tell us that $n^{3/2} \left ( \sin(1/n) - 1/n \right ) = o(n^{1/2})$, which is not as good.
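A quick numerical check (an illustration of the claim, not a proof) confirms that $n^{3/2}(\sin(1/n) - 1/n)$ indeed heads to $0$:

```python
import math

# The expansion sin(1/n) = 1/n + O(1/n^2) predicts that
# n^{3/2} (sin(1/n) - 1/n) = O(n^{-1/2}), which tends to 0.
for n in (10, 1000, 100000):
    print(n, n**1.5 * (math.sin(1 / n) - 1 / n))
```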

But where does the expansion $\sin(1/n) = 1/n + O(1/n^2)$ come from? For this, we need to know more, namely Taylor expansions, which we will cover in another post.

### Algebra of big O

Naturally, the algebra of big O works just like that of little o. For instance, if we have $u_n = n^2 - n + 2 \ln n + O(1)$ and $v_n = 3 n + 2 + O(1/n)$, then

\begin{aligned} u_n + v_n & = n^2 - n + 2 \ln n + O(1) + 3 n + 2 + O(1/n) \\ & = n^2 + 2n + 2 \ln n + O(1). \end{aligned}

Note how the $2$ is absorbed into the $O(1)$, while it would not be absorbed into a $o(1)$. Similarly, we could get an asymptotic expansion of the product as follows:

\begin{aligned} u_n v_n & = \left ( n^2 - n + 2 \ln n + O(1) \right ) \left ( 3 n + 2 + O(1/n) \right ) \\ & = 3 n^3 + 2n^2 - 3n^2 + 6 n \ln n + O(n) \\ & = 3 n^3 - n^2 + 6 n \ln n + O(n). \end{aligned}

Note that the worst $O$ we get is $O(n)$, coming from either $n^2 \times O(1/n)$ or $O(1) \times n$, so all terms of order $n$ or lower are absorbed into it. Similarly to what happens for little o, we see (using e.g. Proposition II.2) that in such an asymptotic expansion, the sequence is equivalent to the first term of the expansion. For instance, we get here that

$\displaystyle u_n v_n \sim 3n^3$

and that

$\displaystyle u_n v_n - 3n^3 \sim - n^2.$
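We can sanity-check this expansion numerically by picking concrete representatives for the remainders; the choices $1$ for the $O(1)$ and $1/n$ for the $O(1/n)$ below are hypothetical stand-ins.

```python
import math

# Concrete representatives of u_n = n^2 - n + 2 ln n + O(1)
# and v_n = 3n + 2 + O(1/n), with remainders chosen as 1 and 1/n.
def u(n):
    return n**2 - n + 2 * math.log(n) + 1

def v(n):
    return 3 * n + 2 + 1.0 / n

# u_n v_n = 3n^3 - n^2 + 6 n ln n + O(n) means this ratio stays bounded:
ratios = [abs(u(n) * v(n) - (3 * n**3 - n**2 + 6 * n * math.log(n))) / n
          for n in range(1, 10001)]
print(max(ratios))
```

Any other choice of remainders within the stated $O$'s would give a (possibly different) bounded ratio as well.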

Finally, just like we do not want two little o or two big O in an equality, we do not want a little o and a big O together either. For instance, $O(1/n) + o(1/n)$ simplifies to $O(1/n)$, while $O(n) + o(n^2)$ simplifies to $o(n^2)$. Note that $O(1/n) + o(1)$ simplifies to $o(1)$: the finer information carried by the $O(1/n)$ is simply swallowed, so the precision of that term was wasted. In practice, such a mismatch should never occur: one should aim to carry out computations with the least precision needed (yes, be as lazy as possible!). Whether that calls for big O or little o can, with a bit of experience, usually be inferred from the get-go.

Assume that $u_n = n^3 - n^2 + 3n + 1 + O(1/n)$ and $v_n = -2 n^2 + n + O(1)$. Compute an asymptotic expansion of $u_n + v_n$ and of $u_n v_n$.