arXiv:2404.02873 [stat.ML]

Gaussian Process Regression with Soft Inequality and Monotonicity Constraints

Didem Kochan, Xiu Yang

Published 2024-04-03 (Version 1)

Gaussian process (GP) regression is a non-parametric Bayesian framework for approximating complex models. Standard GP regression can yield an unbounded model in which some predictions take infeasible values. We introduce a new GP method that enforces physical constraints in a probabilistic manner. The model is trained with quantum-inspired Hamiltonian Monte Carlo (QHMC), an efficient way to sample from a broad class of distributions. Unlike the standard Hamiltonian Monte Carlo algorithm, in which a particle has a fixed mass, QHMC allows the particle to have a random mass matrix drawn from a probability distribution. By applying QHMC to inequality- and monotonicity-constrained GP regression in this probabilistic sense, our approach improves accuracy and reduces variance in the resulting GP model. Experiments on several datasets show that the proposed approach is efficient: it accelerates the sampling process while maintaining accuracy, and it is applicable to high-dimensional problems.
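The core idea the abstract describes, a particle whose mass is resampled from a distribution at each iteration rather than held fixed, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a scalar, isotropic mass drawn from a log-normal distribution (parameters `mass_mu`, `mass_sigma` are illustrative choices), a standard leapfrog integrator, and a Metropolis accept/reject step.

```python
import numpy as np

def qhmc_sample(log_prob, log_prob_grad, x0, n_samples=2000,
                step_size=0.1, n_leapfrog=20,
                mass_mu=0.0, mass_sigma=1.0, seed=0):
    """Sketch of quantum-inspired HMC: at each iteration the particle's
    (scalar, isotropic) mass is resampled, m ~ exp(N(mass_mu, mass_sigma^2)),
    instead of being fixed as in standard HMC."""
    rng = np.random.default_rng(seed)
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    samples = np.empty((n_samples, x.size))
    for i in range(n_samples):
        m = np.exp(rng.normal(mass_mu, mass_sigma))       # random mass
        p = rng.normal(0.0, np.sqrt(m), size=x.size)      # momentum ~ N(0, m)
        x_new, p_new = x.copy(), p.copy()
        # leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * step_size * log_prob_grad(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new / m
            p_new += step_size * log_prob_grad(x_new)
        x_new += step_size * p_new / m
        p_new += 0.5 * step_size * log_prob_grad(x_new)
        # Metropolis correction on total energy H = -log p(x) + |p|^2 / (2m)
        h_old = -log_prob(x) + p @ p / (2.0 * m)
        h_new = -log_prob(x_new) + p_new @ p_new / (2.0 * m)
        if rng.random() < np.exp(min(0.0, h_old - h_new)):
            x = x_new
        samples[i] = x
    return samples

# usage: draw from a 1-D standard normal as a sanity check
logp = lambda x: -0.5 * float(x @ x)
grad = lambda x: -x
draws = qhmc_sample(logp, grad, x0=[0.0])
```

In the paper's setting the target distribution would be the constrained GP posterior rather than this toy Gaussian; the random mass is what distinguishes the sampler from plain HMC.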

Related articles:
arXiv:1807.02811 [stat.ML] (Published 2018-07-08)
A Tutorial on Bayesian Optimization
arXiv:2406.12678 [stat.ML] (Published 2024-06-18)
Contraction rates for conjugate gradient and Lanczos approximate posteriors in Gaussian process regression
arXiv:2304.02641 [stat.ML] (Published 2023-04-05)
Self-Distillation for Gaussian Process Regression and Classification