Tag:Neuron

  • Generalization of neural networks

    Time:2021-10-13

    Whenever we train our own neural network, we need to pay attention to the problem of neural network generalization. In essence, this means how well our model can learn from the given data and apply what it learns elsewhere. When training a neural network, some of the data will be used to train it, and some data will also […]
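
    The excerpt is cut off above; as a minimal sketch of the train/validation split it alludes to (the array names and the 80/20 ratio here are illustrative, not from the article):

        import numpy as np

        rng = np.random.default_rng(0)
        data = rng.normal(size=(1000, 10))   # 1000 samples, 10 features

        idx = rng.permutation(len(data))     # shuffle before splitting
        split = int(0.8 * len(data))         # e.g. 80% train, 20% validation
        train, val = data[idx[:split]], data[idx[split:]]

        # Generalization is judged by how well a model fit on `train`
        # performs on the held-out `val` data it never saw.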

  • From software development to AI engineer: model training

    Time:2021-9-6

    Preface: I wonder if you have seen Kingdom, the Korean drama that was popular in April? I have watched the whole series. When Prince Yuanzi was born, he was bitten by zombies in the palace. However, because his brain neurons had not yet formed and the parasites could not control neurons, the medical woman judged that it would not affect […]

  • These illustrated, 700-page machine learning notes are on fire! Worth studying

    Time:2021-8-20

    Recently, while studying machine learning, I came across these notes. They are introduced in great detail, so I am recording them here as study material. Author: Jim Liang, from SAP (the world’s largest business software company). Book features: clear organization, graphical representations that are easier to understand, detailed notes on formulas, etc. Content summary: it is mainly divided into basic […]

  • A primer on using TensorFlow and Keras

    Time:2021-8-9

    By Angel Das, compiled by VK. Source: Towards Data Science. Introduction: Artificial neural networks (ANNs) are an advanced form of machine learning technique and the core of deep learning. Artificial neural networks involve the following concepts: input and output layers, hidden layers, neurons in the hidden layers, forward propagation, and back propagation. Simply put, the input layer is a set of […]
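
    The excerpt ends mid-sentence; as a minimal sketch of the concepts it lists (input layer, hidden layer, output layer, with forward and back propagation handled by the framework), assuming tf.keras and illustrative layer sizes:

        import tensorflow as tf

        model = tf.keras.Sequential([
            tf.keras.Input(shape=(4,)),                      # input layer: 4 features
            tf.keras.layers.Dense(8, activation="relu"),     # hidden layer: 8 neurons
            tf.keras.layers.Dense(1, activation="sigmoid"),  # output layer
        ])

        # compile() configures training; fit() would run forward and back propagation
        model.compile(optimizer="adam", loss="binary_crossentropy")
        model.summary()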

  • Params and MAC calculation in LeNet-5

    Time:2021-6-16

    First, the code implementation of the LeNet-5 network (based on PyTorch) is given:

        class LeNet(nn.Module):
            def __init__(self):
                super(LeNet, self).__init__()
                # input_size=(1*28*28)
                self.conv1 = nn.Sequential(
                    # in_channels, out_channels, kernel_size
                    # padding=2 keeps the input and output sizes the same
                    nn.Conv2d(1, 6, 5, padding=2),
                    # input_size=(6*28*28)
                    nn.ReLU(),
                    # output_size=(6*14*14)
                    nn.MaxPool2d(kernel_size=2, stride=2),
                )
                self.conv2 = nn.Sequential(
                    nn.Conv2d(6, 16, […]
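
    The excerpt cuts off before the calculation itself; as a rough illustration of how params and MACs are usually counted for a convolution layer (using conv1 above, with its 28*28 output map):

        # conv1: in_channels=1, out_channels=6, 5x5 kernel, 28x28 output map
        in_c, out_c, k, out_h, out_w = 1, 6, 5, 28, 28

        params = (in_c * k * k + 1) * out_c           # weights + one bias per filter
        macs = in_c * k * k * out_c * out_h * out_w   # one MAC per weight per output pixel

        print(params)  # 156
        print(macs)    # 117600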

  • The principle of autoencoders

    Time:2021-6-2

    This article is based on http://ufldl.stanford.edu/wiki/index.php… There were some difficult points in the original, so I have rewritten them in my own words. Supervised neural networks require our data to be labeled. However, neural networks are not limited to labeled data; they can also handle unlabeled data, such as $\{x^{(1)}, […]
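
    The excerpt is truncated; as context, a minimal autoencoder sketch in PyTorch (the dimensions are hypothetical, not from the original article):

        import torch.nn as nn

        class AutoEncoder(nn.Module):
            # Reconstructs its input, so no labels are needed: the target is the input itself
            def __init__(self, in_dim=784, hidden_dim=64):
                super().__init__()
                self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.Sigmoid())
                self.decoder = nn.Sequential(nn.Linear(hidden_dim, in_dim), nn.Sigmoid())

            def forward(self, x):
                return self.decoder(self.encoder(x))

    Training minimizes the reconstruction error between the output and the input (e.g. with nn.MSELoss), which is why unlabeled data suffices.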

  • Using a neural network to build a creature that can avoid obstacles autonomously

    Time:2021-6-1

    This article describes a very interesting project I saw last year. I tried to imitate its code and write a similar project, but I have not finished it yet. Below is my translation of a related blog post by the original author; perhaps more people will be interested in it. Project demonstration: a demo implemented by the original author in JavaScript […]

  • Neural networks are cute

    Time:2021-5-29

    Neural networks are very cute! 0. Classification. The most important use of neural networks is classification. To give you an intuitive understanding of classification, let’s look at a few examples. Spam identification: take an e-mail, extract all the words that appear in it, and feed them to a machine. The […]
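
    As a minimal sketch of the spam-identification idea (bag-of-words features fed to a classifier; the vocabulary and function names here are illustrative, not from the article):

        from collections import Counter

        VOCAB = ["free", "winner", "meeting", "report"]   # toy vocabulary

        def bag_of_words(email: str) -> list[int]:
            # Count how often each vocabulary word appears in the e-mail
            counts = Counter(email.lower().split())
            return [counts[w] for w in VOCAB]

        # A trained classifier would map this vector to spam / not spam
        print(bag_of_words("Free gift for the winner free draw"))  # [2, 1, 0, 0]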

  • Activation functions sigmoid, tanh, relu

    Time:2021-3-18

    The goal of activation functions is to make neural networks nonlinear. An activation function should be continuous and differentiable. Continuous: when the input value changes slightly, the output value also changes slightly. Differentiable: a derivative exists everywhere in the domain. Common activation functions: sigmoid, tanh, relu. Sigmoid: sigmoid is a smooth step function […]
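
    A minimal NumPy sketch of the three activation functions named above:

        import numpy as np

        def sigmoid(x):
            # Smooth step function mapping R to (0, 1)
            return 1.0 / (1.0 + np.exp(-x))

        def tanh(x):
            # Like sigmoid but zero-centered, mapping R to (-1, 1)
            return np.tanh(x)

        def relu(x):
            # Identity for positive inputs, zero otherwise
            return np.maximum(0.0, x)

        x = np.array([-2.0, 0.0, 2.0])
        print(sigmoid(x), tanh(x), relu(x))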

  • On the influence of convolution kernel size on fully connected neurons in PyTorch

    Time:2021-3-11

    The number of neurons with a 3*3 convolution kernel versus a 2*5 convolution kernel:

        # Here kernel_size = 2*5
        class CONV_NET(torch.nn.Module):
            # The CONV_NET class inherits from the nn.Module class
            def __init__(self):
                # super() gives CONV_NET all the properties of its parent class nn.Module
                super(CONV_NET, self).__init__()
                # super() requires two arguments, […]
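
    The excerpt stops before the comparison itself; as a rough illustration of why kernel size changes the number of inputs to the first fully connected layer (assuming a 28*28 input, stride 1, and no padding; these shapes are not from the article):

        import torch
        import torch.nn as nn

        x = torch.randn(1, 1, 28, 28)   # a batch of one 28x28 image

        conv_3x3 = nn.Conv2d(1, 6, kernel_size=(3, 3))
        conv_2x5 = nn.Conv2d(1, 6, kernel_size=(2, 5))

        # With stride 1 and no padding the output size is (H - kH + 1, W - kW + 1),
        # so the flattened vector feeding the fully connected layer differs:
        print(conv_3x3(x).flatten(1).shape)   # [1, 6*26*26] = [1, 4056]
        print(conv_2x5(x).flatten(1).shape)   # [1, 6*27*24] = [1, 3888]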

  • Machine learning (7): Andrew Ng’s notes

    Time:2021-2-14

    Why study neural network algorithms? The neural network is an old algorithm, but it is also the preferred technique for much of machine learning. We already have linear regression and logistic regression, so why should we study neural networks? Consider a logistic regression problem with enough nonlinear terms. For complex machine learning problems, there […]

  • Derivation of the mathematical principles of the softmax output layer

    Time:2021-1-28

    Compared with programming, I prefer mathematical derivation. I find a derivation tangible, more three-dimensional; you can even trace it back to its roots. As the first article on neural network algorithms, I decided to start with the softmax output layer. I think this article is good and very professional. As shown in […]
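
    The excerpt cuts off before the derivation; for reference, the softmax function the title refers to maps a vector of $K$ logits $z$ to a probability distribution:

    $\mathrm{softmax}(z)_i = \dfrac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}, \quad i = 1, \dots, K$

    Each output lies in $(0, 1)$ and the outputs sum to 1, which is what makes softmax suitable for a classification output layer.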