Year: 2021
Authors: Helmut Harbrecht, John D. Jakeman, Peter Zaspel
Communications in Computational Physics, Vol. 29 (2021), Iss. 4 : pp. 1152–1185
Abstract
Gaussian processes and other kernel-based methods are used extensively to construct approximations of multivariate data sets. The accuracy of these approximations depends on the data used. This paper presents a computationally efficient algorithm that greedily selects training samples to minimize the weighted $L^p$ error of kernel-based approximations for a given number of data points. The method successively generates nested samples, with the goal of minimizing the error in high-probability regions of densities specified by users. The algorithm presented is extremely simple and can be implemented using existing pivoted Cholesky factorization methods. Training samples are generated in batches, which allows training data to be evaluated (labeled) in parallel. For smooth kernels, the algorithm performs comparably with the greedy integrated variance design but has significantly lower complexity. Numerical experiments demonstrate the efficacy of the approach for bounded, unbounded, multi-modal and non-tensor-product densities. We also show how to use the proposed algorithm to efficiently generate surrogates for inferring unknown model parameters from data using Bayesian inference.
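As the abstract notes, the greedy selection step can be implemented with existing pivoted Cholesky routines. The following is a minimal Python sketch of that idea, not the authors' implementation: it assumes a squared-exponential kernel and uses unnormalized density values as pivot weights, and the helper names (`gaussian_kernel`, `weighted_pivoted_cholesky`) are illustrative. At each step the pivot maximizes the weighted diagonal of the current Schur complement, i.e. a density-weighted posterior variance over the candidate points, so the resulting designs are nested by construction.

```python
import numpy as np

def gaussian_kernel(X, Y, lengthscale=1.0):
    # Squared-exponential kernel matrix between the rows of X and Y.
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def weighted_pivoted_cholesky(X_cand, weights, num_samples, lengthscale=1.0):
    # Greedily pick num_samples candidates via a weighted pivoted Cholesky
    # factorization of the kernel matrix.  'diag' holds the diagonal of the
    # current Schur complement (the residual/posterior variances).
    K = gaussian_kernel(X_cand, X_cand, lengthscale)
    diag = K.diagonal().copy()
    n = X_cand.shape[0]
    L = np.zeros((n, num_samples))
    pivots = []
    for m in range(num_samples):
        p = int(np.argmax(weights * diag))   # density-weighted pivot rule (assumed form)
        pivots.append(p)
        col = K[:, p] - L[:, :m] @ L[p, :m]  # standard pivoted Cholesky update
        L[:, m] = col / np.sqrt(max(diag[p], 1e-14))
        diag = np.maximum(diag - L[:, m] ** 2, 0.0)
    return np.array(pivots), L

# Usage: draw candidates from the target density, then select a nested design.
rng = np.random.default_rng(0)
X_cand = rng.standard_normal((2000, 2))               # candidates sampled from a standard normal
weights = np.exp(-0.5 * np.sum(X_cand ** 2, axis=1))  # unnormalized density values as weights
pivots, _ = weighted_pivoted_cholesky(X_cand, weights, num_samples=20)
X_train = X_cand[pivots]                              # points to label, e.g. in parallel batches
```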
Journal Article Details
Publisher Name: Global Science Press
Language: English
DOI: https://doi.org/10.4208/cicp.OA-2020-0060
Communications in Computational Physics, Vol. 29 (2021), Iss. 4 : pp. 1152–1185
Published online: 2021-01
Copyright: © Global Science Press
Pages: 34
Keywords: Experimental design, active learning, Gaussian process, radial basis function, uncertainty quantification, Bayesian inference.
Author Details
Helmut Harbrecht | John D. Jakeman | Peter Zaspel

Cited By
- PyApprox: A software package for sensitivity analysis, Bayesian inference, optimal experimental design, and multi-fidelity uncertainty quantification and surrogate modeling
  Jakeman, J.D.
  Environmental Modelling & Software, Vol. 170 (2023), P.105825
  https://doi.org/10.1016/j.envsoft.2023.105825 [Citations: 4]
- Fixed-budget approximation of the inverse kernel matrix for identification of nonlinear dynamic processes
  Antropov, Nikita | Agafonov, Evgeny | Tynchenko, Vadim | Bukhtoyarov, Vladimir | Kukartsev, Vladislav
  Journal of Applied Engineering Science, Vol. 20 (2022), Iss. 1 P.150
  https://doi.org/10.5937/jaes0-31772 [Citations: 0]
- Adaptive experimental design for multi‐fidelity surrogate modeling of multi‐disciplinary systems
  Jakeman, John D. | Friedman, Sam | Eldred, Michael S. | Tamellini, Lorenzo | Gorodetsky, Alex A. | Allaire, Doug
  International Journal for Numerical Methods in Engineering, Vol. 123 (2022), Iss. 12 P.2760
  https://doi.org/10.1002/nme.6958 [Citations: 6]