Python package for Bayesian sparse regression, implementing the standard (Polya-Gamma augmented) Gibbs sampler as well as the CG-accelerated sampler of Nishimura and Suchard (2018). The latter algorithm can be orders of magnitude faster for a large and sparse design matrix.


pip install bayesbridge


The Bayesian bridge is based on the following prior on the regression coefficients \beta_j:

    \pi(\beta_j) \propto \exp( - |\beta_j / \tau|^\alpha )

The Bayesian bridge recovers the Bayesian lasso when \alpha = 1, but can provide improved separation of the significant coefficients from the rest when \alpha < 1.
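As a rough numerical illustration of this point (plain Python, independent of the package; the helper function below is purely illustrative), the unnormalized log-density -|\beta_j / \tau|^\alpha penalizes a move from a small to a large coefficient much less severely when \alpha = 0.5 than in the lasso case \alpha = 1, so significant coefficients are shrunk less aggressively:

```python
import math

def bridge_log_density(beta, alpha, tau=1.0):
    """Unnormalized log-density of the bridge prior: -|beta / tau| ** alpha."""
    return -abs(beta / tau) ** alpha

# Extra log-penalty incurred by growing a coefficient from 1 to 4,
# under the lasso (alpha = 1) versus a sparser bridge prior (alpha = 0.5).
for alpha in (1.0, 0.5):
    penalty = bridge_log_density(1.0, alpha) - bridge_log_density(4.0, alpha)
    print(f"alpha = {alpha}: extra log-penalty at beta = 4 is {penalty:.3f}")
    # alpha = 1.0 -> 3.000; alpha = 0.5 -> 1.000
```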


from bayesbridge import BayesBridge, RegressionModel, RegressionCoefPrior

model = RegressionModel(y, X, family='logit')
prior = RegressionCoefPrior(bridge_exponent=.5)
bridge = BayesBridge(model, prior)
mcmc_output = bridge.gibbs(
    n_burnin=100, n_post_burnin=1000, thin=1,
    coef_sampler_type='cholesky'  # Try 'cg' for large and sparse X.
)
coef_samples = mcmc_output['samples']['coef']

where y is a 1-D numpy array and X is a 2-D numpy array or a scipy sparse matrix.
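For reference, here is a minimal sketch of preparing such inputs for the logistic model, assuming numpy and scipy are installed; the dimensions, density, and variable names are illustrative, not requirements of the package:

```python
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)
n_obs, n_pred = 100, 20

# Sparse design matrix: roughly 90% of the entries are exactly zero, stored
# in CSR format so a sampler like 'cg' can exploit the sparsity.
X = sparse.random(n_obs, n_pred, density=0.1, format='csr', random_state=0)

# Binary outcomes for the logistic model, as a 1-D numpy array.
y = (rng.random(n_obs) < 0.5).astype(float)
```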

Currently the linear and logistic models (binomial outcomes) are supported. See demo.ipynb for a demonstration of further features.


If you find this package useful, please cite

Akihiko Nishimura and Marc A. Suchard (2018). Prior-preconditioned conjugate gradient for accelerated Gibbs sampling in "large n & large p" sparse Bayesian logistic regression models. arXiv:1810.12437.