# Complexity of Data Structures

Time: 2021-9-19

## 1. Algorithm efficiency

There are two kinds of algorithm efficiency analysis:

1. Time efficiency

Time efficiency is also called time complexity. It mainly measures how fast an algorithm runs.

2. Space efficiency

Space efficiency is also called space complexity. It mainly measures the additional space an algorithm requires. In the early days of computing, memory capacities were very small, so space was a serious constraint; with the rapid development of hardware, storage capacity has reached a high level, so we usually no longer need to pay special attention to the space complexity of an algorithm.

## 2. Time complexity

### 1. Concept

In computer science, the time complexity of an algorithm is a function that quantitatively describes its running time. In theory, the exact time an algorithm takes cannot be calculated in advance; you only know it once you run the program on a machine. But do we need to test every algorithm on a computer? We could, but the same algorithm takes different amounts of time on machines with different performance, which is why the analytical method of time complexity exists.

Because the time an algorithm takes is proportional to the number of times its statements execute, the number of executions of the basic operations in an algorithm is taken as its time complexity.

For example, how many times does the following code execute?

```c
#include <stdio.h>

void Func1(int N)
{
    int count = 0;
    for (int i = 0; i < N; ++i)
    {
        for (int j = 0; j < N; ++j)
        {
            ++count;
        }
    }
    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    }
    int M = 10;
    while (M--)
    {
        ++count;
    }
    printf("%d\n", count);
}
```

Obviously, its execution count is F(N) = N^2 + 2N + 10.

When we calculate time complexity, we usually write only an approximate count. Why?

Consider the execution-count function above: as N tends to infinity, the constant 10 has almost no effect on the result. For a polynomial, the magnitude of the function is dominated by the highest-order term in the expression, and time complexity adopts exactly this idea.

We can verify this with a quick timing test:

```c
#include <stdio.h>
#include <time.h>

void Func1(int N)
{
    int start = clock();
    int count = 0;
    for (int i = 0; i < N; ++i)
    {
        for (int j = 0; j < N; ++j)
        {
            ++count;
        }
    }
    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    }
    int M = 10;
    while (M--)
    {
        ++count;
    }
    int end = clock();
    printf("%d\n", end - start); // clock ticks (about milliseconds on many platforms)
}

int main()
{
    Func1(0);     // 0
    Func1(100);   // 0
    Func1(10000); // 386
    return 0;
}
```

We find that the difference comes mainly from the highest-order term. So next, let's introduce asymptotic big O notation.

### 2. Asymptotic big O notation

In fact, when we calculate time complexity we do not have to count the exact number of executions, only the approximate number. For this we use asymptotic big O notation.

Big O notation: a mathematical symbol used to describe the asymptotic behavior of a function.

Rules for deriving the big O order:

1. Replace all additive constants in the running time with the constant 1.

2. In the modified execution-count function, keep only the highest-order term.

3. If the highest-order term exists and its coefficient is not 1, remove the coefficient. The result is the big O order.

Therefore, using asymptotic big O notation, the time complexity of Func1 is O(N^2).

From the above we can see the point of big O notation: remove the terms that have little effect on the result and express the execution count concisely.

In addition, the time complexity of some algorithms has best, average, and worst cases:

Worst case: the maximum number of operations over all inputs of a given size (upper bound)

Average case: the expected number of operations over all inputs of a given size

Best case: the minimum number of operations over all inputs of a given size (lower bound)

For example:

```c
// Description: given an array and a number, determine whether the array contains the number
int FindNum(int* arr, int N, int search_num)
{
    for (int i = 0; i < N; i++)
    {
        if (search_num == arr[i])
            return 1; // found
    }
    return 0; // not found
}
```

Its best case is 1 comparison: the number is found on the first try.

Average: N / 2

Worst case: N, traversing the entire array.

So what is the time complexity of this algorithm? The answer: O(N)

When calculating time complexity, we generally take the worst case.

### 3. Examples

```c
// 1. Compute the time complexity of Func2
void Func2(int N)
{
    int count = 0;
    for (int k = 0; k < 2 * N; ++k)
    {
        ++count;
    }
    int M = 10;
    while (M--)
    {
        ++count;
    }
    printf("%d\n", count);
}
```

Its execution count is 2N + 10, so its time complexity is O(N).

```c
// 2. Compute the time complexity of Func3
void Func3(int N, int M)
{
    int count = 0;
    for (int k = 0; k < M; ++k)
    {
        ++count;
    }
    for (int k = 0; k < N; ++k)
    {
        ++count;
    }
    printf("%d\n", count);
}
```

The answer is O(N + M): we do not know the relative sizes of M and N, so neither term can be dropped. If N >> M, the time complexity reduces to O(N).

```c
// 3. Compute the time complexity of Func4
void Func4(int N)
{
    int count = 0;
    for (int k = 0; k < 100; ++k)
    {
        ++count;
    }
    printf("%d\n", count);
}
```

Here the number of executions is a fixed constant, 100, independent of N, so the time complexity is O(1).

```c
// 4. Compute the time complexity of BubbleSort
void BubbleSort(int* a, int n)
{
    assert(a);
    for (size_t end = n; end > 0; --end)
    {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i)
        {
            if (a[i - 1] > a[i])
            {
                Swap(&a[i - 1], &a[i]);
                exchange = 1;
            }
        }
        if (exchange == 0)
            break;
    }
}
```

In example 4, the basic operation executes N - 1 times in the best case (the array is already sorted, so the exchange flag triggers an early exit after one pass) and N(N - 1)/2 times in the worst case. Applying the big O derivation rules, and taking the worst case as usual, the time complexity is O(N^2).

```c
// 5. Compute the time complexity of BinarySearch
int BinarySearch(int* a, int n, int x)
{
    assert(a);
    int begin = 0;
    int end = n - 1;
    while (begin <= end)
    {
        int mid = begin + ((end - begin) >> 1);
        if (a[mid] < x)
            begin = mid + 1;
        else if (a[mid] > x)
            end = mid - 1;
        else
            return mid;
    }
    return -1;
}
```

In example 5, the basic operation executes once in the best case and O(log N) times in the worst case, so the time complexity is O(log N). (To see where log N comes from, think of folding a strip of paper in half repeatedly: each comparison halves the remaining search range.)

Note: in algorithm analysis, log N denotes the base-2 logarithm of N; some texts write it as lg N.

```c
// 6. Compute the time complexity of the recursive Factorial
long long Factorial(size_t N)
{
    return N < 2 ? N : Factorial(N - 1) * N;
}
```

How do we calculate the time complexity of recursion?

Time complexity of recursion = number of recursive calls × work per call

For the function above, there are N recursive calls in total and each call does a constant amount of work, so the time complexity is O(N).

```c
// 7. Compute the time complexity of the recursive Fibonacci
long long Fibonacci(size_t N)
{
    return N < 2 ? N : Fibonacci(N - 1) + Fibonacci(N - 2);
}
```

In this example the recursion tree has depth N, and the number of calls roughly doubles at each level, so the total number of calls is on the order of 2^N - 1 and the time complexity is O(2^N).

Note: if, instead, a recursion has N levels and each level does N units of work, the time complexity is O(N^2).

## 3. Space complexity

### 1. Concept

Space complexity is a measure of the amount of storage an algorithm temporarily occupies while it runs. It is not the number of bytes the program occupies, since that figure is not very meaningful; instead, space complexity counts the number of variables.

The rules for calculating space complexity are essentially the same as for time complexity, and big O asymptotic notation is used as well (it is an estimate).

### 2. Examples

```c
// 1. Compute the space complexity of BubbleSort
void BubbleSort(int* a, int n)
{
    assert(a);
    for (size_t end = n; end > 0; --end)
    {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i)
        {
            if (a[i - 1] > a[i])
            {
                Swap(&a[i - 1], &a[i]);
                exchange = 1;
            }
        }
        if (exchange == 0)
            break;
    }
}
```

Note that we count only the additional space the algorithm needs, not the space of the input array. Also note that space can be reused: if an algorithm creates one variable and repeatedly assigns to it, the space complexity is O(1).

Since example 1 uses only a constant number of extra variables, its space complexity is O(1).

```c
// 2. Compute the space complexity of generating a Fibonacci array
long long* Fibonacci(size_t n)
{
    if (n == 0)
    {
        return NULL;
    }
    long long* fibArray = (long long*)malloc((n + 1) * sizeof(long long));
    fibArray[0] = 0; // the first Fibonacci number
    fibArray[1] = 1; // the second Fibonacci number
    for (size_t i = 2; i <= n; ++i) // generate the rest with a loop
    {
        fibArray[i] = fibArray[i - 1] + fibArray[i - 2];
    }
    return fibArray;
}
```

Example 2 dynamically allocates n + 1 slots to hold the Fibonacci numbers, so the space complexity is O(N).

```c
// 3. Compute the space complexity of the recursive Factorial
long long Factorial(size_t N)
{
    return N < 2 ? N : Factorial(N - 1) * N;
}
```

The space complexity of a recursive algorithm is usually its recursion depth (the number of nested calls).

Example 3 recurses N times, creating N stack frames, each of which uses a constant amount of space, so the space complexity is O(N).

Finally, note that different time complexities can still be close when N is small; as N grows larger and larger, the difference in running time becomes greater and greater!
