Tag: sample

  • Application of Flink real-time computing at Weibo

    Time:2021-7-27

    Introduction: By combining the Flink real-time stream computing framework with its business scenarios, Weibo has done a great deal of work on the platform and service side, along with much optimization of development efficiency and stability. Development efficiency is improved through modular design and platform-based development. Cao Fuqiang, senior system engineer and data computing director of […]

  • Pneumonia detection using CNN and Python

    Time:2021-7-24

    By Muhammad Ardi | Compiled by Flin | Source: Analytics Vidhya. Introduction: Hello! I just finished a deep learning project a few hours ago, and now I want to share what I have done. The goal of this challenge is to determine whether a person has pneumonia and, if so, whether it is caused by bacteria or by a virus. Well, I think […]

  • What is heteroscedasticity

    Time:2021-7-23

    Today, let’s talk about heteroscedasticity. Before that, let’s introduce a related concept: homoscedasticity. What is homoscedasticity? Homoscedasticity literally means “same variance”. And what is variance? Variance reflects how much the data fluctuates. Homoscedasticity means that the fluctuation […]
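    As a minimal sketch of the idea above (toy residuals, illustrative names): under homoscedasticity the residual variance stays roughly constant across the data, while under heteroscedasticity it changes, here shown by comparing the variance of residuals at low versus high values of the predictor.

```python
# Hypothetical sketch: comparing residual variance between two groups.
# Similar variances suggest homoscedasticity; a large gap suggests
# heteroscedasticity. Data values are illustrative toy numbers.
from statistics import pvariance

group_a = [1.0, -1.2, 0.8, -0.9, 1.1]   # residuals at low x values
group_b = [5.0, -6.0, 4.5, -5.5, 6.2]   # residuals at high x values

var_a = pvariance(group_a)               # population variance of group A
var_b = pvariance(group_b)               # population variance of group B

# A large ratio between the group variances indicates heteroscedasticity.
ratio = max(var_a, var_b) / min(var_a, var_b)
print(var_a, var_b, ratio)
```

    Formal tests such as Breusch-Pagan or White refine this same comparison, but the intuition is exactly this variance ratio.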

  • tecdat: Multiple time series rolling prediction based on extended data in R: ARIMA, regression and ARIMAX model analysis

    Time:2021-7-22

    Link to the original text: http://tecdat.cn/?p=22849 When choosing the most suitable prediction model or method for the data, forecasters usually divide the available sample into two parts: the in-sample portion (also known as the “training set”) and the hold-out portion (the out-of-sample data, or “test set”). The model is then estimated on the in-sample data, and some […]
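    The in-sample/hold-out split described above can be sketched as follows (toy data, illustrative names; the article itself uses R, this is just the splitting logic). The key point for time series is that the split must respect time order, so the last observations form the test set.

```python
# Hypothetical sketch: time-ordered train/test split for forecasting.
# No shuffling: the hold-out (test) set is always the most recent data.
series = list(range(1, 25))          # 24 monthly observations (toy data)

holdout = 6                          # keep the last 6 points as the test set
train, test = series[:-holdout], series[-holdout:]

# A naive benchmark forecast: repeat the last in-sample value.
forecast = [train[-1]] * holdout
mae = sum(abs(f - t) for f, t in zip(forecast, test)) / holdout
print(len(train), len(test), mae)
```

    A fitted ARIMA or ARIMAX model would replace the naive forecast, and its hold-out error is compared against such benchmarks.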

  • On the law of large numbers

    Time:2021-7-20

    Last time we talked about the central limit theorem. In this section, we will discuss the law of large numbers. The law of large numbers and the central limit theorem are closely related concepts, and the two often appear together. Let’s take a concrete look at what the law of large numbers says. The […]
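    A minimal simulation of the statement above (illustrative, using a fair die whose expected value is 3.5): as the number of i.i.d. draws grows, the sample mean converges to the expected value.

```python
# Hypothetical sketch: the law of large numbers in simulation.
# The sample mean of fair-die rolls approaches E[X] = 3.5 as n grows.
import random

random.seed(0)                       # fixed seed for reproducibility
expected = 3.5                       # mean of a fair six-sided die

def sample_mean(n):
    return sum(random.randint(1, 6) for _ in range(n)) / n

small, large = sample_mean(100), sample_mean(100_000)
# The deviation from 3.5 shrinks (roughly like 1/sqrt(n)) as n grows.
print(abs(small - expected), abs(large - expected))
```

    The central limit theorem refines this picture by describing the *distribution* of that shrinking deviation.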

  • DenseBox: early anchor-free research with advanced ideas | CVPR 2015

    Time:2021-7-13

    The design of the DenseBox detection algorithm was very advanced; its shadow can be seen in many of today’s anchor-free methods. Had it not appeared slightly later than Fast R-CNN at the time, the field of object detection might have been moving in the anchor-free direction long ago. Source: Xiaofei’s Algorithm Engineering Notes […]

  • FSAF: embedding an anchor-free branch to guide anchor-based algorithm training | CVPR 2019

    Time:2021-7-6

    FSAF analyzes in depth the problem of selecting the FPN layer during training, and embeds itself into the original network as an extremely simple anchor-free branch with almost no impact on speed. It selects the optimal FPN layer more accurately and brings a good accuracy improvement. Source: Xiaofei’s Algorithm Engineering Notes official […]

  • How to solve the problem of intelligent annotation

    Time:2021-6-25

    Whether in traditional machine learning or in today’s popular deep learning, supervised learning based on training samples with explicit labels is still the main way to train models. In deep learning in particular, more data is needed to improve model performance. At present, there are some large-scale […]

  • Overview of generative adversarial network models

    Time:2021-6-24

    Survey of generative adversarial network models. Author: Zhang Zhenyuan. About GANs: the core idea of generative adversarial networks (GANs) comes from the zero-sum game. A GAN consists of a generator and a discriminator: the generator receives a random variable and produces “fake” samples, while the discriminator judges whether an input sample is real […]
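    The zero-sum objective behind the generator/discriminator game is V(D, G) = E[log D(x_real)] + E[log(1 − D(G(z)))], which the discriminator maximizes and the generator minimizes. A minimal sketch (toy 1-D data and a fixed toy discriminator; all names and values are illustrative, not from the article):

```python
# Hypothetical sketch: evaluating the GAN value function V(D, G) on toy
# data. Real samples cluster near 1, generator ("fake") output near 0,
# and a fixed sigmoid discriminator scores how "real" a sample looks.
import math

real_samples = [0.9, 0.8, 0.95]
fake_samples = [0.1, 0.2, 0.15]

def discriminator(x):
    # Toy discriminator: probability that x is real.
    return 1 / (1 + math.exp(-10 * (x - 0.5)))

v = (sum(math.log(discriminator(x)) for x in real_samples) / len(real_samples)
     + sum(math.log(1 - discriminator(x)) for x in fake_samples) / len(fake_samples))
print(v)
```

    When the discriminator separates real from fake well, V is close to 0; at the game's equilibrium, where the generator fools the discriminator, V drops to −2 log 2.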

  • Cross-entropy loss function nn.CrossEntropyLoss()

    Time:2021-6-22

    nn.CrossEntropyLoss() 1. Introduction: When doing multi-class classification with the PyTorch deep learning framework, we use the cross-entropy loss function nn.CrossEntropyLoss(). 2. Information content and entropy. Information content is used to measure the uncertainty of an event: the greater an event’s probability, the smaller its uncertainty and the smaller its information content […]
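    The two ideas in the teaser can be sketched without PyTorch (illustrative helper names, single-sample version): information content is I(x) = −log p(x), and nn.CrossEntropyLoss combines log-softmax over raw logits with the negative log-likelihood of the target class.

```python
# Hypothetical sketch: information content and cross-entropy from logits,
# the same quantity nn.CrossEntropyLoss computes in PyTorch.
import math

def information(p):
    # I(x) = -log p(x): the rarer the event, the more information.
    return -math.log(p)

assert information(0.1) > information(0.9)

def cross_entropy(logits, target):
    # Numerically stable -log softmax(logits)[target] for one sample.
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(z - m) for z in logits))
    return log_sum - logits[target]

loss = cross_entropy([2.0, 0.5, 0.1], target=0)  # correct class has top logit
print(loss)
```

    Note that, like nn.CrossEntropyLoss, this takes raw logits rather than probabilities, so no separate softmax layer is applied first.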

  • How to use MoCo v2 in PyTorch to reduce computational constraints

    Time:2021-6-15

    Author: guess | Compiled by VK | Source: Analytics Vidhya. Introduction: The SimCLR paper (http://cse.iitkgp.ac.in/~arastogi/papers/simclr.pdf) explains how the framework benefits from larger models and larger batch sizes and, given enough computing power, can produce results comparable to supervised models. But these requirements make the framework computationally expensive. Wouldn’t it be great if we could have […]

  • PSI formula: a model/data stability index

    Time:2021-6-14

    Since a model is developed on samples from a specific period, whether it is suitable for populations outside the development sample must be verified through stability testing. The population stability index (PSI) measures the difference between the score distributions of the test samples and of the model development samples, and is the most […]
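    The PSI comparison described above can be sketched directly from its standard formula, PSI = Σ (actual% − expected%) × ln(actual% / expected%) over score bins (bin shares below are illustrative toy values):

```python
# Hypothetical sketch: computing PSI between the development ("expected")
# and test ("actual") score distributions, binned into shares that sum to 1.
import math

expected = [0.10, 0.20, 0.40, 0.20, 0.10]   # development-sample bin shares
actual   = [0.12, 0.22, 0.38, 0.18, 0.10]   # test-sample bin shares

psi = sum((a - e) * math.log(a / e) for a, e in zip(actual, expected))
print(psi)
```

    A common rule of thumb reads PSI below 0.1 as stable, 0.1-0.25 as needing attention, and above 0.25 as a significant population shift.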