Logistic regression in R with the Metropolis–Hastings sampling algorithm

Date: 2021-06-10

Link to the original text: http://tecdat.cn/?p=6761

In logistic regression, we regress a binary dependent variable Y_i on a covariate X_i. The code below uses Metropolis sampling to explore the posterior distribution of the intercept beta_1 and the slope beta_2 given the covariates X_i.

Define the expit (inverse logit) and logit link functions.

expit <- function(x){1/(1+exp(-x))}
logit <- function(x){log(x/(1-x))}

The next function evaluates the posterior of (beta_1, beta_2). It returns the log of the posterior for numerical stability.
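As a quick sanity check (my own toy example, not from the original post), expit and logit should be inverses of each other:

```r
expit <- function(x){1/(1+exp(-x))}   # inverse-logit: maps the real line to (0,1)
logit <- function(x){log(x/(1-x))}    # logit: maps (0,1) to the real line

p <- c(0.05, 0.5, 0.95)
round(expit(logit(p)), 10)            # recovers 0.05 0.50 0.95
```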
log_post<-function(Y,X,beta){
    prob1 <- expit(beta[1] + beta[2]*X)
    like  <- sum(dbinom(Y,1,prob1,log=TRUE))   # Bernoulli log-likelihood
    prior <- sum(dnorm(beta,0,10,log=TRUE))    # Normal(0,10) log-prior (assumed; the excerpt omits it)
like+prior}
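As a sketch of how log_post behaves (assuming independent Normal(0, 10) priors, which the excerpt does not state), coefficients near the truth should receive higher posterior support than a poor guess:

```r
expit <- function(x){1/(1+exp(-x))}

# Log posterior = Bernoulli log-likelihood + Normal(0,10) log-prior (assumed prior)
log_post <- function(Y,X,beta){
    prob1 <- expit(beta[1] + beta[2]*X)
    like  <- sum(dbinom(Y,1,prob1,log=TRUE))
    prior <- sum(dnorm(beta,0,10,log=TRUE))
    like + prior
}

set.seed(1)
X <- rnorm(50)
Y <- rbinom(50,1,expit(0 + 1*X))   # data generated with beta = (0, 1)

log_post(Y,X,c(0,1))    # near the truth: relatively high
log_post(Y,X,c(5,-5))   # far from the truth: much lower
```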

This is the main MCMC function; can.sd is the standard deviation of the Gaussian candidate (proposal) distribution.

Bayes.logistic<-function(y,X,
                         n.samples=10000,
                         can.sd=0.1){

     beta          <- rep(0,2)       # initial values (assumed; not in the excerpt)
     keep.beta     <- matrix(0,n.samples,2)
     keep.beta[1,] <- beta

     acc   <- att <- rep(0,2)
     curlp <- log_post(y,X,beta)     # log posterior at the current state

    for(i in 2:n.samples){

      for(j in 1:2){

       att[j] <- att[j] + 1

      #Draw a candidate:

       canbeta    <- beta
       canbeta[j] <- rnorm(1,beta[j],can.sd)
       canlp      <- log_post(y,X,canbeta)

      #Compute the acceptance ratio:

       R <- exp(canlp-curlp)
       U <- runif(1)
       if(U<R){
          beta   <- canbeta          # accept the candidate
          curlp  <- canlp
          acc[j] <- acc[j]+1
       }
     }
     keep.beta[i,]<-beta

   }
   #Return the posterior samples of beta and the Metropolis acceptance rates


list(beta=keep.beta,acc.rate=acc/att)}
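The inner loop is a component-wise random-walk Metropolis step: because the Gaussian candidate is symmetric, the Metropolis–Hastings ratio reduces to exp(canlp - curlp). A minimal sketch of the same kernel on a one-dimensional standard-normal target (a toy example of mine, not from the original post) shows the sampler recovering the known mean and standard deviation:

```r
set.seed(42)
log_target <- function(x) dnorm(x, 0, 1, log = TRUE)  # log density of N(0,1)

n.samples <- 20000
can.sd    <- 1.0                      # candidate standard deviation
x         <- 0                        # initial state
keep      <- numeric(n.samples)
curlp     <- log_target(x)
acc       <- 0

for(i in 1:n.samples){
  can   <- rnorm(1, x, can.sd)        # symmetric random-walk candidate
  canlp <- log_target(can)
  if(runif(1) < exp(canlp - curlp)){  # accept with probability min(1, R)
    x     <- can
    curlp <- canlp
    acc   <- acc + 1
  }
  keep[i] <- x
}

c(mean = mean(keep), sd = sd(keep), acc.rate = acc/n.samples)
```

An acceptance rate far from roughly 20-50% usually signals that can.sd should be retuned, which is why Bayes.logistic reports acc/att.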

Generate simulated data

 set.seed(2008)
 n         <- 100
 X         <- rnorm(n)
 true.beta <- c(0,1)   # true coefficients (assumed; this line is missing from the excerpt)
 true.p    <- expit(true.beta[1]+true.beta[2]*X)
 Y         <- rbinom(n,1,true.p)

Fit the model

 burn      <- 10000
 n.samples <- 50000
 tick <- proc.time()[3]   # start timer (this line is missing from the excerpt)
 fit  <- Bayes.logistic(Y,X,n.samples=n.samples,can.sd=0.5)
 tock <- proc.time()[3]

 tock-tick

##    3.72

Results


 plot(fit$beta[,1],type="l",xlab="Iteration",ylab=expression(beta[1]))  # trace plot (plot call reconstructed)
 abline(true.beta[1],0,lwd=2,col=2)


 plot(fit$beta[,2],type="l",xlab="Iteration",ylab=expression(beta[2]))  # trace plot (plot call reconstructed)
 abline(true.beta[2],0,lwd=2,col=2)


 hist(fit$beta[,1],main="Intercept",xlab=expression(beta[1]),breaks=50) 


 abline(v=true.beta[1],lwd=2,col=2)
 hist(fit$beta[,2],main="Slope",xlab=expression(beta[2]),breaks=50)   # reconstructed; the excerpt jumps straight to the slope's reference line
 abline(v=true.beta[2],lwd=2,col=2)


 print("Posterior mean/sd")
## [1] "Posterior mean/sd"
 print(round(apply(fit$beta[burn:n.samples,],2,mean),3))
## [1] -0.076  0.798
 print(round(apply(fit$beta[burn:n.samples,],2,sd),3))
## [1] 0.214 0.268

For comparison, the classical maximum likelihood fit (the summary output below is from glm):

 summary(glm(Y~X,family="binomial"))

## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -1.6990  -1.1039  -0.6138   1.0955   1.8275  
## 
## Coefficients:
##             Estimate Std. Error z value Pr(>|z|)   
## (Intercept) -0.07393    0.21034  -0.352  0.72521   
## X            0.76807    0.26370   2.913  0.00358 **
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 138.47  on 99  degrees of freedom
## Residual deviance: 128.57  on 98  degrees of freedom
## AIC: 132.57
## 
## Number of Fisher Scoring iterations: 4

Thank you very much for reading this article. If you have any questions, please leave a message below!
