Sparse Bayesian Linear Regression using Generalized Normal Priors

I implemented sparse Bayesian linear regression with generalized normal priors in R. Under maximum a posteriori (MAP) estimation, these priors correspond to the non-convex L_b penalty [1], where 0 < b < 1.
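
To make the prior–penalty connection concrete, here is the standard identity, sketched with a scale parameter λ and noise variance σ² that are not named in the text above:

```latex
% Generalized normal (exponential power) prior on each coefficient:
p(\beta_j \mid \lambda, b) \propto \exp\!\left(-\lambda \lvert \beta_j \rvert^{b}\right),
\qquad 0 < b < 1.
% Its negative log density is \lambda |\beta_j|^b up to a constant, so under a
% Gaussian likelihood the MAP estimate solves the L_b-penalized least squares:
\hat{\beta} = \arg\min_{\beta}\;
\frac{1}{2\sigma^{2}}\,\lVert y - X\beta \rVert_2^2
+ \lambda \sum_{j} \lvert \beta_j \rvert^{b}.
```

For b = 1 this recovers the Laplace prior of the Bayesian lasso; for 0 < b < 1 the penalty is non-convex and induces stronger sparsity.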

Following [2], inference is carried out by reversible-jump MCMC: a trans-dimensional sampler whose birth and death moves add coefficients to, and remove them from, the active model, so the sampler explores models of varying sparsity directly.
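
To illustrate what a reversible-jump sampler does, here is a minimal toy sketch (not the sampler in rjmcmcBLasso.R): it jumps between a null model M0 and a one-predictor model M1 with a Gaussian prior on the coefficient. All names and parameter values are illustrative assumptions.

```r
# Toy reversible-jump MCMC jumping between
#   M0: y ~ N(0, s2)   and   M1: y ~ N(x*beta, s2), beta ~ N(0, tau2).
set.seed(1)
n <- 100
x <- rnorm(n)
y <- 0.8 * x + rnorm(n)   # data truly generated under M1

s2   <- 1    # known noise variance (assumed for the sketch)
tau2 <- 10   # prior variance of beta under M1
q.sd <- 1    # sd of the birth proposal for beta

loglik <- function(beta) sum(dnorm(y, mean = beta * x, sd = sqrt(s2), log = TRUE))

n.iter   <- 5000
in.model <- FALSE   # start in M0
beta     <- 0
visits.M1 <- 0

for (t in 1:n.iter) {
  if (!in.model) {
    # Attempt a birth move with probability 0.5 (matching the death-move
    # probability below, so the move-probability ratio cancels).
    if (runif(1) < 0.5) {
      beta.star <- rnorm(1, 0, q.sd)
      # RJ acceptance: likelihood ratio * prior / proposal density (Jacobian = 1)
      log.a <- loglik(beta.star) + dnorm(beta.star, 0, sqrt(tau2), log = TRUE) -
               loglik(0)         - dnorm(beta.star, 0, q.sd,       log = TRUE)
      if (log(runif(1)) < log.a) { in.model <- TRUE; beta <- beta.star }
    }
  } else {
    if (runif(1) < 0.5) {
      # Death move: exact reciprocal of the birth acceptance ratio
      log.a <- loglik(0)    + dnorm(beta, 0, q.sd,       log = TRUE) -
               loglik(beta) - dnorm(beta, 0, sqrt(tau2), log = TRUE)
      if (log(runif(1)) < log.a) { in.model <- FALSE; beta <- 0 }
    } else {
      # Within-model random-walk Metropolis update of beta
      beta.new <- beta + rnorm(1, 0, 0.2)
      log.a <- loglik(beta.new) + dnorm(beta.new, 0, sqrt(tau2), log = TRUE) -
               loglik(beta)     + -dnorm(beta,    0, sqrt(tau2), log = TRUE)
      if (log(runif(1)) < log.a) beta <- beta.new
    }
  }
  if (in.model) visits.M1 <- visits.M1 + 1
}
cat("Posterior probability of M1:", visits.M1 / n.iter, "\n")
```

Because the true coefficient is large relative to the noise, the chain spends nearly all of its time in M1. The full sampler in [2] applies the same birth/death/update scheme to subsets of many predictors.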

Usage Example:
library(lars)              # provides the diabetes data
data(diabetes)
x <- scale(diabetes$x)     # standardize the predictors
y <- scale(diabetes$y)     # standardize the response

betas <- rjmcmcBLasso(x, y, max.steps = 50000, b = 0.2)

R Code:
rjmcmcBLasso.R

References:
[1] Zongben Xu, Hai Zhang, Yao Wang and Xiangyu Chang. L1/2 Regularizer. Science in China Series F: Information Sciences, 52(1):1–9, 2009.
[2] Xiaohui Chen, Z. Jane Wang and Martin J. McKeown. A Bayesian Lasso via Reversible-Jump MCMC. Signal Processing, 91(8):1920–1932, 2011.