Logistic regression in R with the Metropolis–Hastings sampling algorithm

Date: 2021-04-20

In logistic regression, we regress a binary response Y_i on a covariate X_i. The code below uses Metropolis–Hastings sampling to explore the joint posterior of the intercept beta_1 and the slope beta_2.

Define the expit and logit link functions

```r
expit <- function(x){ 1/(1 + exp(-x)) }  # inverse logit
logit <- function(x){ log(x/(1 - x)) }
```

The next function computes the log posterior of (beta_1, beta_2). Working on the log scale gives numerical stability.

```r
log_post <- function(Y, X, beta){
  prob1 <- expit(beta[1] + beta[2]*X)
  like  <- sum(dbinom(Y, 1, prob1, log = TRUE))
  prior <- sum(dnorm(beta, 0, 10, log = TRUE))  # vague Gaussian prior (assumed; not shown in the excerpt)
  like + prior
}
```
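A quick sanity check that the two link functions are inverses of each other (redefining them here so the snippet is self-contained):

```r
expit <- function(x){ 1/(1 + exp(-x)) }  # maps the real line to (0,1)
logit <- function(x){ log(x/(1 - x)) }   # maps (0,1) back to the real line

p <- c(0.1, 0.5, 0.9)
all.equal(expit(logit(p)), p)   # TRUE
```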

This is the main MCMC function. `can.sd` is the standard deviation of the random-walk candidate (proposal) distribution.

```r
Bayes.logistic <- function(Y, X,
                           n.samples = 10000,
                           can.sd = 0.1){

  beta  <- c(0, 0)                # initial values
  curlp <- log_post(Y, X, beta)   # log posterior at the current state

  keep.beta     <- matrix(0, n.samples, 2)
  keep.beta[1,] <- beta

  acc <- att <- rep(0, 2)

  for(i in 2:n.samples){

    # Update each coefficient with a random-walk Metropolis step
    for(j in 1:2){

      att[j] <- att[j] + 1

      # Draw a candidate:
      canbeta    <- beta
      canbeta[j] <- rnorm(1, beta[j], can.sd)
      canlp      <- log_post(Y, X, canbeta)

      # Compute the acceptance probability:
      R <- exp(canlp - curlp)
      U <- runif(1)
      if(U < R){
        beta   <- canbeta
        curlp  <- canlp
        acc[j] <- acc[j] + 1
      }
    }
    keep.beta[i,] <- beta
  }

  # Return the posterior samples of beta and the Metropolis acceptance rates
  list(beta = keep.beta, acc.rate = acc/att)
}
```
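The candidate standard deviation controls the acceptance rate: too small and the chain crawls, too large and most proposals are rejected. A minimal one-parameter sketch on a standard normal target (the `rw_metropolis` name and targets here are illustrative, not part of the sampler above):

```r
set.seed(1)

# Random-walk Metropolis for a N(0,1) target; returns the acceptance rate
rw_metropolis <- function(n, can.sd){
  x   <- numeric(n)
  acc <- 0
  for(i in 2:n){
    can <- rnorm(1, x[i-1], can.sd)
    # Accept with probability min(1, target(can)/target(cur)), computed on the log scale
    if(runif(1) < exp(dnorm(can, log = TRUE) - dnorm(x[i-1], log = TRUE))){
      x[i] <- can
      acc  <- acc + 1
    } else {
      x[i] <- x[i-1]
    }
  }
  acc/(n - 1)
}

rw_metropolis(10000, 0.1)   # very high acceptance, but slow exploration
rw_metropolis(10000, 2.5)   # moderate acceptance
rw_metropolis(10000, 50)    # most proposals rejected
```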

Generate simulated data

```r
set.seed(2008)
n         <- 100
X         <- rnorm(n)
true.beta <- c(0, 1)   # true coefficients (assumed; not given in the excerpt)
true.p    <- expit(true.beta[1] + true.beta[2]*X)
Y         <- rbinom(n, 1, true.p)
```

Fit the model

```r
burn      <- 10000
n.samples <- 50000

tick <- proc.time()[3]
fit  <- Bayes.logistic(Y, X, n.samples = n.samples, can.sd = 0.5)
tock <- proc.time()[3]

tock - tick
```

``##    3.72``

Results

Trace plots and a posterior histogram, with the true values marked in red (the `plot` calls are reconstructed; only the `abline` and `hist` fragments survive in the excerpt):

```r
plot(fit$beta[,1], type = "l", ylab = expression(beta[1]), main = "Intercept")  # trace plot
abline(true.beta[1], 0, lwd = 2, col = 2)

plot(fit$beta[,2], type = "l", ylab = expression(beta[2]), main = "Slope")      # trace plot
abline(true.beta[2], 0, lwd = 2, col = 2)

hist(fit$beta[,1], main = "Intercept", xlab = expression(beta[1]), breaks = 50)
abline(v = true.beta[1], lwd = 2, col = 2)
```

```r
print("Posterior mean/sd")
## [1] "Posterior mean/sd"
print(round(apply(fit$beta[burn:n.samples,], 2, mean), 3))
## [1] -0.076  0.798
print(round(apply(fit$beta[burn:n.samples,], 2, sd), 3))
## [1] 0.214 0.268
```

For comparison, the classical maximum-likelihood fit, `summary(glm(Y ~ X, family = binomial))`, gives nearly identical estimates and standard errors:

```r
## Deviance Residuals:
##     Min       1Q   Median       3Q      Max
## -1.6990  -1.1039  -0.6138   1.0955   1.8275
##
## Coefficients:
##             Estimate Std. Error z value Pr(>|z|)
## (Intercept) -0.07393    0.21034  -0.352  0.72521
## X            0.76807    0.26370   2.913  0.00358 **
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for binomial family taken to be 1)
##
##     Null deviance: 138.47  on 99  degrees of freedom
## Residual deviance: 128.57  on 98  degrees of freedom
## AIC: 132.57
##
## Number of Fisher Scoring iterations: 4
```
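The kept samples also give credible intervals directly. A sketch with stand-in draws; with the fit above, `draws` would be `fit$beta[burn:n.samples, 2]`:

```r
set.seed(1)
draws <- rnorm(40000, 0.8, 0.27)   # stand-in for the post-burn-in slope samples

quantile(draws, c(0.025, 0.975))   # equal-tailed 95% credible interval
mean(draws > 0)                    # posterior probability that the slope is positive
```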