deepgp - Bayesian Deep Gaussian Processes using MCMC
Performs Bayesian posterior inference for deep Gaussian
processes following Sauer, Gramacy, and Higdon (2023,
<doi:10.48550/arXiv.2012.08015>). See Sauer (2023,
<http://hdl.handle.net/10919/114845>) for comprehensive
methodological details and
<https://bitbucket.org/gramacylab/deepgp-ex/> for a variety of
coding examples. Models are trained through MCMC, including
elliptical slice sampling of latent Gaussian layers and
Metropolis-Hastings sampling of kernel hyperparameters.
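A minimal sketch of the basic workflow (the toy data, MCMC
length, and burn-in settings below are illustrative choices,
not package defaults):

    library(deepgp)

    # Toy one-dimensional training data
    x <- matrix(seq(0, 1, length = 30), ncol = 1)
    y <- sin(4 * pi * x[, 1]) + rnorm(30, sd = 0.05)

    # Fit a two-layer DGP: MCMC alternates elliptical slice
    # sampling of the latent Gaussian layer with
    # Metropolis-Hastings updates of kernel hyperparameters
    fit <- fit_two_layer(x, y, nmcmc = 5000)

    # Discard burn-in, thin the chains, then predict
    fit <- trim(fit, burn = 1000, thin = 2)
    x_new <- matrix(seq(0, 1, length = 100), ncol = 1)
    fit <- predict(fit, x_new)  # appends posterior mean and s2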
Vecchia approximation for faster computation is implemented
following Sauer, Cooper, and Gramacy (2023,
<doi:10.48550/arXiv.2204.02904>). Optional monotonic warpings
are implemented following Barnett et al. (2024,
<doi:10.48550/arXiv.2408.01540>).
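For larger designs, the same fitting functions accept Vecchia
arguments; the conditioning-set size m = 25 below mirrors the
package default, while monowarp = TRUE is an assumption about
the monotonic-warping interface of recent versions:

    # Vecchia approximation: condition on (up to) m nearest
    # neighbors for faster likelihood evaluations
    fit_vec <- fit_two_layer(x, y, nmcmc = 5000,
                             vecchia = TRUE, m = 25)

    # Optional monotonic warping of the latent layer
    # (the monowarp argument is an assumption here)
    fit_mono <- fit_two_layer(x, y, nmcmc = 5000,
                              monowarp = TRUE)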
Downstream tasks include sequential design through active
learning Cohn/integrated mean squared error (ALC/IMSE; Sauer,
Gramacy, and Higdon, 2023), optimization through expected
improvement (EI; Gramacy, Sauer, and Wycoff, 2022,
<doi:10.48550/arXiv.2112.07457>), and contour location through
entropy (Booth, Renganathan, and Gramacy, 2024,
<doi:10.48550/arXiv.2308.04420>).
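A hedged sketch of these tasks, continuing the example above
(ALC and IMSE are exported functions whose returned value
element is assumed here; the EI and entropy_limit arguments to
predict are assumptions about its interface, and the threshold
0 is arbitrary):

    # Sequential design: maximize ALC (or minimize IMSE) over
    # a candidate grid to choose the next design point
    alc <- ALC(fit, x_new)
    imse <- IMSE(fit, x_new)
    x_next <- x_new[which.max(alc$value), , drop = FALSE]

    # Optimization and contour location via predict (EI and
    # entropy_limit arguments assumed; threshold arbitrary)
    fit <- predict(fit, x_new, EI = TRUE)
    fit <- predict(fit, x_new, entropy_limit = 0)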
Models extend up to three layers deep; a one-layer model is
equivalent to typical Gaussian process regression. Incorporates
OpenMP and SNOW parallelization and uses C/C++ under the hood.
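As a sketch of the one-layer case (the cores argument for
SNOW-parallel prediction is assumed to follow the package
interface):

    # A one-layer fit reduces to ordinary GP regression
    fit1 <- fit_one_layer(x, y, nmcmc = 5000)
    fit1 <- trim(fit1, burn = 1000, thin = 2)
    fit1 <- predict(fit1, x_new, cores = 2)  # parallel (assumed)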