Tag:loss
-
Time:2021-1-19
Abstract:Deep neural networks are built on calculus and some statistics. A deep neural network (DNN) is essentially composed of many connected perceptrons, where a single perceptron is a single neuron. We can regard an artificial neural network (ANN) as a system with a group of inputs fed along weighted paths. These inputs are then processed and […]
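A single perceptron computes a weighted sum of its inputs plus a bias and passes it through an activation; a minimal sketch of that idea (illustrative, not code from the post):

```python
import numpy as np

def perceptron(x, w, b):
    """A single neuron: weighted sum of the inputs plus a bias, then a step activation."""
    z = np.dot(w, x) + b          # the weighted path: w1*x1 + w2*x2 + ... + b
    return 1.0 if z > 0 else 0.0  # step activation

# Hypothetical inputs and weights, purely for illustration
x = np.array([1.0, 2.0])
w = np.array([0.5, -0.2])
print(perceptron(x, w, b=0.1))   # 0.5 - 0.4 + 0.1 = 0.2 > 0, so prints 1.0
```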
-
Time:2021-1-15
DETR is based on the standard transformer structure, and its performance is comparable to that of Faster R-CNN. The overall idea of the paper is very simple, and I hope it can provide a general template for many subsequent studies, much as Faster R-CNN did. Source: Xiaofei's Algorithm Engineering Notes official account. Paper: End-to-End Object […]
-
Time:2020-12-30
Introduction: Training and evaluating a model is at the core of the whole machine learning workflow. Only by mastering the correct training and evaluation methods, and using them flexibly, can we carry out experimental analysis and verification more quickly and gain a deeper understanding of the model. Preface: In the last […]
-
Time:2020-12-3
What is linear regression? We may have first been exposed to it in junior high school: y = ax, where x is the independent variable, y is the dependent variable, and a is the coefficient (the slope). If we know the coefficient a, then given an x we can get a y, which can […]
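To make the excerpt concrete (a toy illustration, not part of the original post), fitting y = ax means estimating a from observed (x, y) pairs, for example by least squares:

```python
import numpy as np

# Hypothetical data generated from y = 2x plus a little noise
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + np.array([0.1, -0.05, 0.02, -0.1])

# Least-squares estimate of the slope a in y = ax (no intercept):
# a = sum(x*y) / sum(x*x)
a = np.dot(x, y) / np.dot(x, x)
print(a)  # close to 2.0
```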
-
Time:2020-11-22
1. Objectives Fit the function $f(x) = 5.0x_1 + 4.0x_2 + 3.0x_3 + 3$ 2. Theory It is similar to one-dimensional linear regression. 3. Implementation 3.0 Environment python == 3.6, torch == 1.4 3.1 Necessary packages import torch, import torch.nn as nn, import torch.optim as optim, import numpy as np 3.2 Creating data and transforming forms # […]
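Since the excerpt cuts off before the training code, here is one way the fit could look in PyTorch under the stated environment (my reconstruction, not the post's original code):

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Synthetic data from the target f(x) = 5.0*x1 + 4.0*x2 + 3.0*x3 + 3
x = torch.randn(256, 3)
y = (x @ torch.tensor([5.0, 4.0, 3.0]) + 3.0).unsqueeze(1)

model = nn.Linear(3, 1)                    # three inputs, one output
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.05)

for step in range(500):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

print(model.weight.data, model.bias.data)  # should approach [5, 4, 3] and 3
```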
-
Time:2020-11-21
1. Objectives Fit the function $f(x) = 2x_{1}^{3} + 3x_{2}^{2} + 4x_{3} + 0.5$ 2. Theory The principle is similar to one-dimensional and multidimensional linear regression, but the degree is higher. 3. Implementation 3.1 Environment python == 3.6, torch == 1.4 3.2 Constructing data # This is the target weight and bias w = torch.FloatTensor([2.0, 3.0, […]
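A common way to handle the higher degree, consistent with the excerpt's setup (my sketch, not the post's code), is to build the polynomial features x1³, x2², x3 explicitly and fit a linear layer on them:

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Target: f(x) = 2*x1^3 + 3*x2^2 + 4*x3 + 0.5
x = torch.randn(256, 3)
features = torch.stack([x[:, 0] ** 3, x[:, 1] ** 2, x[:, 2]], dim=1)
y = (features @ torch.tensor([2.0, 3.0, 4.0]) + 0.5).unsqueeze(1)

model = nn.Linear(3, 1)  # linear in the hand-built polynomial features
criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(), lr=0.05)

for step in range(1000):
    optimizer.zero_grad()
    loss = criterion(model(features), y)
    loss.backward()
    optimizer.step()

print(model.weight.data, model.bias.data)  # should approach [2, 3, 4] and 0.5
```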
-
Time:2020-11-16
Introduction:There is no doubt that neural networks are the most popular machine learning technique, so I think it is very meaningful to understand how a neural network learns. Like walking downhill, learning amounts to finding the lowest point of the loss function. […]
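The downhill analogy corresponds to gradient descent: repeatedly step against the gradient of the loss. A minimal sketch (illustrative, using a toy loss rather than anything from the post):

```python
import torch

# Minimize the toy loss L(w) = (w - 3)^2 by walking downhill
w = torch.tensor([0.0], requires_grad=True)
lr = 0.1

for step in range(100):
    loss = (w - 3.0) ** 2
    loss.backward()            # the gradient is the slope of the hill at w
    with torch.no_grad():
        w -= lr * w.grad       # step in the downhill direction
        w.grad.zero_()

print(w.item())  # approaches 3.0, the lowest point of the loss
```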
-
Time:2020-11-11
The variational autoencoder (VAE) is a kind of generative model. The trained model consists of an encoder and a decoder. The encoder maps an input sample to a low-dimensional distribution, usually a multivariate Gaussian with independent dimensions; therefore, the output of the encoder is the mean and log variance of the Gaussian […]
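A minimal sketch of the encoder output described above, with the usual reparameterization step for sampling (the dimensions are assumptions for illustration, not the post's code):

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps an input to the mean and log variance of a diagonal Gaussian."""
    def __init__(self, in_dim=784, hidden=256, z_dim=16):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, z_dim)       # mean of the Gaussian
        self.logvar = nn.Linear(hidden, z_dim)   # log variance of the Gaussian

    def forward(self, x):
        h = self.body(x)
        return self.mu(h), self.logvar(h)

# Reparameterization: z = mu + sigma * eps keeps sampling differentiable
enc = Encoder()
mu, logvar = enc(torch.randn(8, 784))
z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
print(z.shape)  # torch.Size([8, 16])
```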
-
Time:2020-11-10
This paper proposes the PIoU loss, an extension of the IoU metric, which effectively improves rotation-angle prediction and IoU in oriented (tilted) object detection. It is suitable for both anchor-based and anchor-free methods. In addition, the Retail50K dataset is provided, which can be used to evaluate […]
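For reference, the IoU metric that PIoU extends, computed here for ordinary axis-aligned boxes (a baseline illustration only; the paper's pixel-based rotated-box version is more involved):

```python
def iou(box_a, box_b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1 / 7 ≈ 0.143
```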
-
Time:2020-11-3
Machine learning is a broader category than neural networks, and classical machine learning models are strongly logical, mathematical, and explainable, so I choose the classical LR model as the starting point for machine learning! This article focuses on the construction of the model and the derivation of its mathematical principles, including the evaluation method […]
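Assuming LR here means logistic regression, a minimal sketch of the model and its loss (an illustration, not the post's derivation): the model predicts p = sigmoid(w·x + b) and is trained with the cross-entropy loss.

```python
import torch
import torch.nn.functional as F

# Toy binary data: the label is 1 when x1 + x2 > 0 (hypothetical rule)
x = torch.randn(256, 2)
y = (x.sum(dim=1) > 0).float()

w = torch.zeros(2, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

for step in range(200):
    logits = x @ w + b  # model: p = sigmoid(w.x + b)
    loss = F.binary_cross_entropy_with_logits(logits, y)  # stable cross-entropy
    loss.backward()
    with torch.no_grad():
        w -= 0.1 * w.grad
        b -= 0.1 * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w.data, b.data)  # roughly equal positive weights, matching x1 + x2 > 0
```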
-
Time:2020-10-24
Introduction: Training and evaluating a model is at the core of the whole machine learning workflow. Only by mastering the correct training and evaluation methods and using them flexibly can we carry out experimental analysis and verification more quickly and gain a deeper understanding of the model. Preface: In the last Keras article […]
-
Time:2020-9-30
Contents: Foreword; Cross-Entropy Loss (softmax loss); Contrastive Loss – CVPR 2006; Triplet Loss – CVPR 2015; Center Loss – ECCV 2016; L-Softmax Loss – ICML 2016; A-Softmax Loss – CVPR 2017; AM-Softmax Loss – CVPR 2018; ArcFace Loss – CVPR 2019; Euclidean distance or angular distance and normalization; Reference resources (Cnblogs blog, CSDN blog). Foreword […]
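The list opens with the plain softmax cross-entropy loss that the margin-based variants (L-Softmax, A-Softmax, AM-Softmax, ArcFace) all modify; as a reference point (my sketch, not from the post):

```python
import torch
import torch.nn.functional as F

# Softmax cross-entropy for a toy batch of 4 samples over 10 classes
logits = torch.randn(4, 10)
labels = torch.tensor([3, 7, 0, 2])

# Built-in form and the explicit log-softmax form give the same value
loss_builtin = F.cross_entropy(logits, labels)
log_probs = F.log_softmax(logits, dim=1)
loss_manual = -log_probs[torch.arange(4), labels].mean()

print(loss_builtin.item(), loss_manual.item())
```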