The sign of the slope: a positive slope means the function increases with the independent variable; a negative slope means it decreases. The sign of the slope on either side of a stationary point can be used to test for an extremum: if the slope is positive to the left of a stationary point (where the slope equals 0) and negative to the right, then the function y = f(x) has a maximum there. When the slope is positive: the larger the slope, the faster the function grows; the smaller the slope, the slower it grows. When the slope is negative: the more negative the slope, the faster the function decreases; the closer it is to zero, the more slowly it decreases.
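As a sketch, the sign test above can be checked numerically on a toy function with a known maximum (the function, step size, and helper names below are illustrative assumptions, not from the original):

```python
# First-derivative sign test around a stationary point (toy example).
def f(x):
    return -(x - 1.0) ** 2  # assumed toy function with a maximum at x = 1

def slope(x, h=1e-6):
    # symmetric difference quotient approximates f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

x0 = 1.0                                  # stationary point: slope(x0) is ~0
left, right = slope(x0 - 0.1), slope(x0 + 0.1)
is_maximum = left > 0 > right             # positive on the left, negative on the right
print(is_maximum)                         # True
```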
Derivative: the derivative of a function at a point describes the rate of change of the function near that point. In essence, the derivative is a local linear approximation of the function, defined via the concept of a limit.
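A minimal sketch of the limit definition and of the local linear approximation, using an assumed toy function f(x) = x³:

```python
# The derivative as a limit of difference quotients, plus the
# local linear approximation f(x) ≈ f(a) + f'(a) * (x - a).
def f(x):
    return x ** 3  # toy function; f'(x) = 3x^2

a = 2.0
for h in (1e-1, 1e-3, 1e-5):
    estimate = (f(a + h) - f(a)) / h
    print(h, estimate)            # approaches f'(2) = 12 as h shrinks

fprime = 3 * a ** 2               # exact derivative at a
approx = f(a) + fprime * (2.1 - a)
print(approx, f(2.1))             # linearization is close to the true value
```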
Gradient: the gradient is a vector, with both direction and magnitude. The gradient of a multivariate function at a point is the vector of the function's partial derivatives at that point. Gradient ascent for a maximum:
Moving along the direction of the gradient vector, i.e. adding the gradient vector, makes it easy to find a maximum of the function. In logistic regression, this is how the maximum of the probability, i.e. of the likelihood function, is found. Gradient descent for a minimum:
Moving opposite to the gradient vector, i.e. subtracting the gradient vector, makes it easy to find a minimum of the function. Gradient descent is used in neural networks to minimize the error.
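The idea of stepping against the gradient can be sketched on a toy quadratic (the learning rate, iteration count, and loss function are assumptions for illustration, not the neural-network setup the text refers to):

```python
# Minimal gradient-descent sketch: minimize f(x, y) = x^2 + y^2.
def grad(x, y):
    return 2 * x, 2 * y      # partial derivatives of f

x, y = 3.0, -4.0             # arbitrary starting point
lr = 0.1                     # assumed step size
for _ in range(200):
    gx, gy = grad(x, y)
    x, y = x - lr * gx, y - lr * gy  # subtract the gradient vector

print(x, y)                  # both values end up near the minimizer (0, 0)
```

Gradient ascent is the same loop with the gradient added instead of subtracted.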
Generally speaking, to find an extremum of a function, we first take the derivative and set it equal to zero. A point where the derivative vanishes is not necessarily an extremum, but it is a necessary condition for one; that is, the derivative at an extremum must be zero (for example, y = x³ has zero derivative at x = 0 but no extremum there). Whether a stationary point is a maximum can then be judged from the sign of the slope on either side, as described above.
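The classic counterexample y = x³, whose derivative 3x² vanishes at x = 0 without an extremum there, can be checked with the sign test (toy code, names assumed):

```python
# f(x) = x^3: the derivative is zero at x = 0, but its sign does not
# change across x = 0, so x = 0 is a stationary point, not an extremum.
def fprime(x):
    return 3 * x ** 2

left, right = fprime(-0.1), fprime(0.1)
sign_changes = (left > 0) != (right > 0)
print(fprime(0.0), sign_changes)  # 0.0 False
```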
Differentiable vs. having partial derivatives
If the total differential exists, then the partial derivatives must exist. Which kinds of functions are differentiable? todo This bears on how to choose a function of theta that is differentiable in logistic regression.
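As a hedged sketch of the logistic-regression connection: the sigmoid is differentiable everywhere, so the log-likelihood in theta has a well-defined gradient that ascent can follow. The data, learning rate, and iteration count below are made up for illustration:

```python
import math

def sigmoid(z):
    # differentiable everywhere, which is what makes the gradient usable
    return 1.0 / (1.0 + math.exp(-z))

xs = [0.5, -1.0, 2.0]   # assumed toy inputs (1-D features)
ys = [1, 0, 1]          # assumed labels
theta = 0.0

# gradient of the log-likelihood w.r.t. theta: sum((y - sigmoid(theta*x)) * x)
for _ in range(500):
    g = sum((y - sigmoid(theta * x)) * x for x, y in zip(xs, ys))
    theta += 0.1 * g    # gradient ascent on the likelihood

print(theta > 0)        # True: a positive weight fits these toy labels
```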
Derivative, gradient and extremum
Gradient ascent and gradient descent