Θ, O, o, Ω, ω, don't be confused!

Time: 2021-4-1


Preface

This article is included in the album: http://dwz.win/HjK. Click it to unlock more knowledge of data structures and algorithms.

Hello, I'm Brother Tong, a hard-core guy who climbs 26 floors every day and never forgets to read the source code.

In the previous sections, we learned how to analyze the complexity of an algorithm, covering the worst case, the average case, and the best case from every angle. When we express complexity, we usually use big O notation.

However, in other books, you may have seen symbols such as Θ, Ω, o, and ω.

So what do these symbols mean?

In this section, we will solve this problem.

Pronunciation

Let's first get the pronunciation right:

  • O, /əʊ/, big oh
  • o, /əʊ/, little oh
  • Θ, /ˈθiːtə/, theta
  • Ω, /əʊˈmeɡə/, big omega
  • ω, /əʊˈmeɡə/, little omega

Is it different from what your teacher taught you? ^^

Mathematical explanation

Θ

Θ defines an exact asymptotic behavior (a tight bound).

Expressed as a formula:

For f(n), if there exist positive numbers n0, c1, c2 such that for all n >= n0 we always have 0 <= c1 * g(n) <= f(n) <= c2 * g(n), then we can write f(n) = Θ(g(n)).

Represented as a graph:

[Figure: f(n) bounded between c1*g(n) and c2*g(n) for n >= n0]

Θ defines both an upper bound and a lower bound: f(n) lies between them, and both inequalities include the "equal to" case.

For example, f(n) = 2n^2 + 3n + 1 = Θ(n^2). Here g(n) = n^2 is obtained from f(n) by dropping the lower-order term and the constant term, because there must exist positive numbers n0, c1, c2 such that 0 <= c1 * n^2 <= 2n^2 + 3n + 1 <= c2 * n^2. Of course, you could also take g(n) to be 2n^2; g(n) here really stands for a whole set of functions that satisfy this condition.
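
To make the definition concrete, here is a minimal Python sketch that numerically checks the sandwich inequality for f(n) = 2n^2 + 3n + 1 and g(n) = n^2. The witnesses c1 = 2, c2 = 3, n0 = 4 are my own choice for illustration; any constants that make the inequality hold would do.

```python
# Numerically check that 2n^2 + 3n + 1 = Θ(n^2) using the
# (illustrative) witnesses c1 = 2, c2 = 3, n0 = 4.

def f(n):
    return 2 * n * n + 3 * n + 1

def g(n):
    return n * n

c1, c2, n0 = 2, 3, 4

# 0 <= c1*g(n) <= f(n) <= c2*g(n) must hold for every n >= n0
# (checked here up to 10_000 as a sanity check, not a proof).
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("2n^2 + 3n + 1 = Θ(n^2) holds with c1=2, c2=3, n0=4 (spot-checked)")
```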

Well, if you can understand this one, the remaining four are easy to understand.

O

O defines the upper bound of the algorithm.

Expressed as a formula:

For f(n), if there exist positive numbers n0 and c such that for all n >= n0 we always have 0 <= f(n) <= c * g(n), then we can write f(n) = O(g(n)).

Represented as a graph:

[Figure: f(n) bounded above by c*g(n) for n >= n0]

As long as f(n) is eventually no greater than c * g(n) for some constant c, we can say that f(n) = O(g(n)).

For example, for insertion sort we say its time complexity is O(n^2), but if we want to express it with Θ, it has to be split into two cases:

  1. In the worst case, its time complexity is Θ(n^2);
  2. In the best case, its time complexity is Θ(n).

Here n^2 is only the smallest (tightest) upper bound we can use for g(n); of course, g(n) could also be n^3.

However, when we state a complexity we generally mean the smallest upper bound. Saying that the time complexity of insertion sort is O(n^3) is theoretically correct, but it does not follow the convention.

The best case for insertion sort is that the array is already sorted.
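
As a sketch of why both cases arise, here is a plain Python insertion sort that counts element comparisons (the counting is my own addition for illustration): an already-sorted input needs about n comparisons, giving the Θ(n) best case, while a reverse-sorted input needs about n^2/2, giving the Θ(n^2) worst case.

```python
def insertion_sort(arr):
    """Sort arr in place and return the number of element comparisons."""
    comparisons = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements to the right until key's position is found.
        while j >= 0:
            comparisons += 1
            if arr[j] > key:
                arr[j + 1] = arr[j]
                j -= 1
            else:
                break
        arr[j + 1] = key
    return comparisons

n = 1000
print(insertion_sort(list(range(n))))         # already sorted: ~n comparisons   -> Θ(n)
print(insertion_sort(list(range(n, 0, -1))))  # reverse sorted: ~n^2/2 comparisons -> Θ(n^2)
```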

o

o also defines an upper bound of the algorithm, but it does not include the "equal to" case: it is a non-tight, or loose, upper bound.

Expressed as a formula:

For f(n), if for every positive constant c there exists a positive number n0 such that for all n > n0 we always have 0 <= f(n) < c * g(n), then we can write f(n) = o(g(n)). Note that this must hold for every positive c, not just some c; that is what makes the bound strict.

Represented as a graph:

[Figure: f(n) strictly below c*g(n) for large n]

In other words, little o behaves exactly like big O with the "equal to" case removed: f(n) must grow strictly more slowly than g(n); everything else is the same as big O.
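
As a quick sanity check (my own illustration, not from the original text): f(n) = o(g(n)) means the ratio f(n)/g(n) eventually drops below any positive constant c, so 3n + 5 = o(n^2), while 2n^2 is O(n^2) but not o(n^2).

```python
# For f(n) = o(g(n)), the ratio f(n)/g(n) must eventually drop below ANY c > 0.

f1 = lambda n: 3 * n + 5    # 3n + 5  = o(n^2): ratio tends to 0
f2 = lambda n: 2 * n * n    # 2n^2   != o(n^2): ratio stays at 2, never below c = 1
g = lambda n: n * n

for n in (10, 100, 10_000, 1_000_000):
    print(n, f1(n) / g(n), f2(n) / g(n))
# The first ratio keeps shrinking toward 0; the second is stuck at 2,
# so 2n^2 fails the little-o test even though 2n^2 = O(n^2).
```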

Ω

Ω defines the lower bound of the algorithm; it is the counterpart of big O.

Expressed as a formula:

For f(n), if there exist positive numbers n0 and c such that for all n >= n0 we always have 0 <= c * g(n) <= f(n), then we can write f(n) = Ω(g(n)).

Represented as a graph:

[Figure: f(n) bounded below by c*g(n) for n >= n0]

As long as f(n) is eventually no less than c * g(n) for some constant c, we can say that f(n) = Ω(g(n)).

For example, for insertion sort we can say that its time complexity is Ω(n), but this usually doesn't mean much, because the best case of insertion sort is rare; what we mostly care about is the worst case or the average case.
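
Tying this back to the insertion sort sketch above (again my own illustration): whatever the input looks like, insertion sort makes at least n - 1 comparisons, which is exactly what Ω(n) captures.

```python
import random

def insertion_sort_comparisons(arr):
    """Insertion sort that returns the number of element comparisons."""
    comparisons = 0
    for i in range(1, len(arr)):
        key, j = arr[i], i - 1
        while j >= 0:
            comparisons += 1
            if arr[j] > key:
                arr[j + 1] = arr[j]
                j -= 1
            else:
                break
        arr[j + 1] = key
    return comparisons

n = 500
# Every pass of the outer loop makes at least one comparison,
# so any input of size n costs at least n - 1 comparisons: Ω(n).
for _ in range(100):
    data = [random.randint(0, 10_000) for _ in range(n)]
    assert insertion_sort_comparisons(data) >= n - 1
print("every input of size", n, "took at least", n - 1, "comparisons")
```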

ω

ω also defines a lower bound, but it does not include the "equal to" case: it is a non-tight, or loose, lower bound.

Expressed as a formula:

For f(n), if for every positive constant c there exists a positive number n0 such that for all n > n0 we always have 0 <= c * g(n) < f(n), then we can write f(n) = ω(g(n)).

Represented as a graph:

[Figure: f(n) strictly above c*g(n) for large n]

Similarly, little ω is exactly the same as big Ω with the "equal to" case removed: f(n) must grow strictly faster than g(n).
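
One last sanity check (my own illustration): f(n) = ω(g(n)) means the ratio f(n)/g(n) eventually exceeds any constant c, so 2n^2 = ω(n), but 2n^2 is Ω(n^2) without being ω(n^2).

```python
# For f(n) = ω(g(n)), the ratio f(n)/g(n) must eventually exceed ANY c > 0.

f = lambda n: 2 * n * n
g1 = lambda n: n        # 2n^2  = ω(n):   ratio 2n grows without bound
g2 = lambda n: n * n    # 2n^2 != ω(n^2): ratio stays at 2, never exceeds c = 3

for n in (10, 100, 10_000, 1_000_000):
    print(n, f(n) / g1(n), f(n) / g2(n))
# The first ratio keeps growing; the second is stuck at 2,
# so 2n^2 is Ω(n^2) but not ω(n^2).
```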

Folk understanding

Symbol | Meaning                   | Folk understanding
Θ      | exact asymptotic behavior | equivalent to "="
O      | upper bound               | equivalent to "<="
o      | loose upper bound         | equivalent to "<"
Ω      | lower bound               | equivalent to ">="
ω      | loose lower bound         | equivalent to ">"

Summary

To help everyone look up English materials quickly, Brother Tong has compiled the English terms involved in these sections:

Chinese term (translated)        | English term
complexity                       | complexity
time complexity                  | time complexity
space complexity                 | space complexity
asymptotic analysis              | asymptotic analysis
worst case                       | the worst case
best case                        | the best case
average case                     | the average case
exact asymptotic behavior        | exact asymptotic behavior
lower-order term                 | low order terms
constant term (leading constant) | leading constants
loose upper bound                | loose upper-bound

Postscript

In this section, we explained the meaning of Θ, O, o, Ω, and ω from three angles: pronunciation, mathematics, and folk understanding. At the end, we listed the English terms involved in these sections; with them, you can also quickly look up material on this topic in English.

However, in day-to-day communication we are still used to big O notation. First, it describes the worst case, which directly characterizes the complexity of the algorithm. Second, it is easier to write.

So we only need to remember big O; for Θ, Ω, and ω, it is enough to know what they mean when others mention them.

The previous sections have covered a lot, but in fact only very simple algorithm complexities have been involved.

So, what are the common algorithm complexity?

Next, let’s talk about it.

Follow the public account "tongge read the source code" to unlock more knowledge about source code, fundamentals, and architecture.