• ## PyTorch: multidimensional linear regression

Time：2020-11-22

1. Objectives: fit the function $f(x) = 5.0x_1 + 4.0x_2 + 3.0x_3 + 3$. 2. Theory: it is similar to one-dimensional linear regression. 3. Implementation. 3.0 Environment: python == 3.6, torch == 1.4. 3.1 Necessary packages: `import torch`, `import torch.nn as nn`, `import torch.optim as optim`, `import numpy as np`. 3.2 Creating data and transforming forms # […]
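The setup in this excerpt can be sketched end to end. This is a minimal sketch under the stated environment (torch 1.4, the listed imports); the sample count, learning rate, and epoch count are assumptions, not values from the post:

```python
import torch
import torch.nn as nn
import torch.optim as optim

torch.manual_seed(0)

# Target function f(x) = 5.0*x1 + 4.0*x2 + 3.0*x3 + 3
true_w = torch.tensor([[5.0], [4.0], [3.0]])
true_b = 3.0

x = torch.randn(256, 3)          # 256 random 3-feature samples (assumed size)
y = x @ true_w + true_b          # noiseless targets from the true function

model = nn.Linear(3, 1)          # three inputs, one output
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.1)

for _ in range(500):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

print(model.weight.data)         # ≈ [5.0, 4.0, 3.0]
print(model.bias.data)           # ≈ 3.0
```

After training, the learned weight and bias recover the coefficients of the target function.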

• ## Andrew Ng’s (Wu Enda) machine learning knowledge points to memorize: linear regression formulas

Time：2020-11-8

Regression equation (Octave code): `h = X * theta`. Cost function, least mean squares (LMS): `J = 0.5 * m^-1 * (X*theta-y)' * (X*theta-y);`. Gradient descent: `theta = theta - alpha * m^-1 * X' * (X*theta - y);`. Stochastic gradient descent. Normal equation: `theta = pinv(X'*X)*X'*y;`
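The Octave one-liners above can be transcribed to NumPy and checked against each other. A minimal sketch; the synthetic data, learning rate, and iteration count are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 100, 3
# Design matrix with a leading column of ones for the intercept
X = np.hstack([np.ones((m, 1)), rng.normal(size=(m, n))])
true_theta = np.array([[3.0], [5.0], [4.0], [3.0]])
y = X @ true_theta               # noiseless targets

# Gradient descent: theta = theta - alpha/m * X' * (X*theta - y)
theta = np.zeros((n + 1, 1))
alpha = 0.1
for _ in range(1000):
    # Cost J = 0.5/m * (X@theta - y).T @ (X@theta - y) decreases each step
    theta -= alpha / m * X.T @ (X @ theta - y)

# Normal equation: theta = pinv(X'*X)*X'*y  (closed form, no iteration)
theta_ne = np.linalg.pinv(X.T @ X) @ X.T @ y
```

Both routes recover `true_theta`; the normal equation does it in one step but costs a matrix (pseudo)inverse, which is why gradient descent is preferred when the feature count is large.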

• ## 09 linear regression and matrix operation

Time：2020-2-5

09 linear regression and matrix operations. Linear regression definition: regression analysis that models the relationship between one or more independent variables and a dependent variable; the model can be a linear combination of one or more independent variables. Univariate linear regression: only one variable is involved. Multiple linear regression: two or more variables. General formula: H(W) = W0 + […]
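The general formula truncated above is the usual linear combination, and the "matrix operation" in the title is that all predictions can be computed in one matrix product. A minimal sketch with hypothetical weights and samples (none of these numbers come from the post):

```python
import numpy as np

w = np.array([2.0, 3.0, 1.5])     # w1..wn: one weight per feature (assumed)
w0 = 4.0                          # intercept term W0
X = np.array([[1.0, 2.0, 0.5],    # each row is one sample's features
              [0.0, 1.0, 3.0]])

# H(W) = W0 + w1*x1 + ... + wn*xn for every row, as a single matmul
h = X @ w + w0
print(h)                          # [12.75, 11.5]
```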

• ## Intuitively, why do classification problems use cross entropy loss instead of mean square error loss?

Time：2020-1-8

Contents: cross entropy loss and mean square error loss; the loss-function perspective; the softmax back-propagation perspective; reference resources. Blog: blog.shinelee.me | cnblogs | CSDN. Cross entropy loss and mean square error loss: the final softmax layer of a conventional classification network is shown in the figure below, and the traditional machine learning method is […]
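The back-propagation intuition the excerpt points at can be shown numerically. For softmax output `p` and one-hot target `y`, the cross-entropy gradient with respect to the logits is simply `p - y`, while the MSE gradient passes through the softmax Jacobian and nearly vanishes when the network is confidently wrong. A small sketch (the logits below are an invented example, not from the post):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())      # shift for numerical stability
    return e / e.sum()

z = np.array([5.0, 0.0, 0.0])    # network confidently predicts class 0
y = np.array([0.0, 1.0, 0.0])    # but the true class is 1
p = softmax(z)

# Cross entropy + softmax: gradient w.r.t. logits is just (p - y)
grad_ce = p - y

# MSE loss 0.5*||p - y||^2 must go through the softmax Jacobian
J = np.diag(p) - np.outer(p, p)
grad_mse = J @ (p - y)

print(np.abs(grad_ce).max())     # large: learning proceeds
print(np.abs(grad_mse).max())    # far smaller: gradient nearly vanishes
```

The `p*(1-p)`-style factors in the Jacobian shrink toward zero at saturated outputs, so MSE gives almost no learning signal exactly when the prediction is most wrong; cross entropy does not have this problem.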