• ## Logistic regression in R with the Metropolis-Hastings sampling algorithm

Time: 2021-6-10

Link to the original text: http://tecdat.cn/?p=6761 In logistic regression, we regress a binary dependent variable y_i on a covariate x_i. The following code uses Metropolis sampling to explore the posterior of beta_1 and beta_2 given the covariate x_i. Define the logit (log-odds) link function: logit <- function(x){ log(x / […]
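
The R snippet in the excerpt is truncated; as a rough sketch of the same idea in Python rather than the article's R (random-walk Metropolis over the two coefficients of a logistic likelihood, with simulated data standing in for the article's covariates), one might write:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in data: binary y_i regressed on covariate x_i
n = 200
x = rng.normal(size=n)
true_beta = np.array([0.5, 1.5])            # intercept beta_1, slope beta_2
p_true = 1 / (1 + np.exp(-(true_beta[0] + true_beta[1] * x)))
y = rng.binomial(1, p_true)

def log_post(beta):
    """Log-posterior: Bernoulli log-likelihood plus a flat prior."""
    eta = beta[0] + beta[1] * x
    return np.sum(y * eta - np.log1p(np.exp(eta)))

# Random-walk Metropolis over (beta_1, beta_2)
beta = np.zeros(2)
samples = []
for _ in range(5000):
    prop = beta + rng.normal(scale=0.2, size=2)   # symmetric proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
        beta = prop                               # accept
    samples.append(beta)
samples = np.array(samples)
est = samples[1000:].mean(axis=0)                 # posterior means after burn-in
```

The acceptance test compares log-posteriors, so the (constant) normalizing factor never needs to be computed.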

• ## Modeling income inequality in R: distribution fitting and the Lorenz curve

Time: 2021-3-26

Link to the original text: http://tecdat.cn/?p=20613 The Lorenz curve comes from economics, where it is used to describe inequality in the distribution of income. Incomes are sorted in ascending order, and the cumulative shares of income and of population are computed. In this article we study income and inequality. Let's start with some simulated data > (income = sort(income)) […]
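
A minimal sketch of the construction described above (in Python rather than the article's R, with simple toy incomes in place of the article's data): the Lorenz curve pairs cumulative population share with cumulative income share, and the Gini coefficient summarizes the gap between the curve and the equality diagonal.

```python
import numpy as np

def lorenz(income):
    """Lorenz curve points: cumulative population share vs cumulative income share."""
    inc = np.sort(np.asarray(income, dtype=float))   # ascending order
    pop_share = np.arange(1, len(inc) + 1) / len(inc)
    income_share = np.cumsum(inc) / inc.sum()
    return pop_share, income_share

def gini(income):
    """Gini coefficient via the sorted-index formula (0 = equality, 1 = max inequality)."""
    inc = np.sort(np.asarray(income, dtype=float))
    n = len(inc)
    i = np.arange(1, n + 1)
    return 2 * np.sum(i * inc) / (n * inc.sum()) - (n + 1) / n

# Perfectly equal incomes give a diagonal Lorenz curve and a Gini of 0
pop, share = lorenz([1, 1, 1, 1])
g = gini([1, 1, 1, 1])
```

With all incomes equal, each quarter of the population holds a quarter of the income, so `share` is [0.25, 0.5, 0.75, 1.0] and `g` is 0.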

• ## Python universal functions for array computation

Time: 2021-2-19

1. Array arithmetic Arrays can be added, subtracted, multiplied, and divided, and these arithmetic operators can be combined freely. >>> x = np.arange(5) >>> x array([0, 1, 2, 3, 4]) >>> x + 5 array([5, 6, 7, 8, 9]) >>> x - 5 array([-5, -4, -3, -2, -1]) […]
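
The REPL session above can be reproduced as a short runnable sketch; the last line, combining several operators, is an added illustration rather than part of the excerpt.

```python
import numpy as np

x = np.arange(5)          # array([0, 1, 2, 3, 4])

a = x + 5                 # element-wise addition: array([5, 6, 7, 8, 9])
b = x - 5                 # element-wise subtraction: array([-5, -4, -3, -2, -1])
c = (x + 5) * 2 - x       # operators combine freely, still element-wise
```

Each operator applies element-by-element, so arbitrary arithmetic expressions over arrays work just like the scalar versions.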

• ## ISP basics (08): dynamic range compression

Time: 2021-2-13

Dynamic range compression in image processing 1. Introduction Real-world scenes span an enormous range of luminance, from a very dark night scene (10^-5 cd/m2) to bright sunlight (10^5 cd/m2), nearly 10 orders of magnitude of […]
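
A minimal sketch of one global approach, assuming a simple `log1p` mapping rather than whatever operator the article itself develops: a logarithmic curve squeezes the roughly 10 orders of magnitude of scene luminance into a displayable [0, 1] range while preserving ordering.

```python
import numpy as np

def log_compress(lum, floor=1e-6):
    """Global log compression: maps a huge luminance range into [0, 1]."""
    lum = np.asarray(lum, dtype=float)
    out = np.log1p(lum / floor)      # monotone, compresses ~10 orders of magnitude
    return out / out.max()

# Luminances from a dark night scene (1e-5 cd/m2) to bright sunlight (1e5 cd/m2)
scene = np.array([1e-5, 1e-2, 1.0, 1e2, 1e5])
display = log_compress(scene)
```

Because the mapping is monotone, brighter scene regions stay brighter on the display; only the ratios between them are compressed.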

• ## Algorithm engineering 5: the exponential family of distributions

Time: 2021-1-21

• ## Data science crash course: interpreting logistic regression

Time: 2020-10-6

By Mandy Gu, compiled by Flin, source: Towards Data Science. Logistic regression models the probability of an event by estimating the log-odds of its occurrence. If we assume a linear relationship between the log-odds and the j independent variables, we can model the probability p of the event as follows: You […]
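
The formula the excerpt cuts off can be sketched as follows, with hypothetical coefficients standing in for estimated ones: the log-odds are linear in the features, and the sigmoid maps them back to a probability.

```python
import math

def predict_proba(x, beta0, betas):
    """p = sigmoid(beta_0 + sum_j beta_j * x_j): log-odds linear in the features."""
    log_odds = beta0 + sum(b * xj for b, xj in zip(betas, x))
    return 1 / (1 + math.exp(-log_odds))

# Hypothetical coefficients for j = 2 independent variables;
# here the log-odds come out to 0, so p is exactly 0.5
p = predict_proba([1.0, 2.0], beta0=-1.0, betas=[0.5, 0.25])
```

A log-odds of 0 corresponds to even odds, which is why p lands on 0.5; positive log-odds push p above 0.5 and negative ones below.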

• ## TensorFlow learning notes (3): logistic regression

Time: 2020-3-21

Preface This article trains a logistic regression model with TensorFlow and compares it with scikit-learn. The dataset comes from Andrew Ng's open online deep learning course. Code #!/usr/bin/env python # -*- coding=utf-8 -*- # @author: Chen Zhiping # @date: 2017-01-04 # @description: compare the logistic regression of tensorflow with sklearn based on the exercise of […]
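
The excerpt's code is cut off; a framework-free sketch of the same training loop (plain NumPy gradient descent on the cross-entropy loss, with simulated data standing in for the course exercise dataset) might look like:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated stand-in dataset: label is the sign of x0 + x1
X = rng.normal(size=(300, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w, b, lr = np.zeros(2), 0.0, 0.5
for _ in range(500):
    eta = np.clip(X @ w + b, -30, 30)    # clip to keep exp() stable
    p = 1 / (1 + np.exp(-eta))           # forward pass (sigmoid)
    w -= lr * X.T @ (p - y) / len(y)     # gradient of mean cross-entropy loss
    b -= lr * np.mean(p - y)

# Training accuracy: predict 1 where the decision function is positive
acc = float(np.mean(((X @ w + b) > 0) == (y == 1)))
```

This is the same optimization a TensorFlow or scikit-learn fit performs under the hood; those libraries mainly add automatic differentiation, regularization, and better solvers.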

• ## Python machine learning: logistic regression

Time: 2020-2-28

We know that the perceptron algorithm cannot handle data that are not perfectly linearly separable. This article introduces another very effective binary classification model: logistic regression, which is widely used in classification tasks. Logistic regression is a classification model. Before implementing it, we introduce several concepts. Odds ratio: […]
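
A minimal sketch of the odds-ratio concept the excerpt introduces (the helper names are illustrative, not from the article): the odds of an event are p / (1 - p), and the odds ratio compares the odds between two groups.

```python
def odds(p):
    """Odds of an event: probability it occurs over probability it does not."""
    return p / (1 - p)

def odds_ratio(p1, p2):
    """Odds ratio between two groups' event probabilities."""
    return odds(p1) / odds(p2)

# An event with p = 0.8 has odds 4:1; against p = 0.5 (odds 1:1) the ratio is 4
r = odds_ratio(0.8, 0.5)
```

Logistic regression models the logarithm of these odds as a linear function of the features, which is why coefficients are often interpreted as log odds ratios.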