Numerical methods and programming, 2017, Volume 18, Issue 3, Pages 214–220
(Mi vmp874)
Application of block low-rank matrices in Gaussian processes for regression
D. A. Sushnikova
Skolkovo Institute of Science and Technology
Abstract:
Gaussian processes for regression are considered. When correlated noise is modeled with Gaussian processes, the main difficulty is the computation of the posterior mean and variance of the prediction. This computation requires the inversion of a dense covariance matrix of order $n$, where $n$ is the sample size. In addition, evaluating the likelihood requires the logarithm of the determinant of this dense covariance matrix, which is also time-consuming. A new method for the fast computation of the log-determinant of the covariance matrix is proposed. The method is based on the approximation of the covariance matrix by a sparse matrix. The proposed method proves to be time-efficient compared with the HODLR (Hierarchically Off-Diagonal Low-Rank) method and the traditional dense approach.
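The following minimal sketch (Python with NumPy/SciPy; not the method of the paper) only illustrates the quantities named in the abstract: the posterior mean and variance require solves with the dense $n \times n$ covariance matrix, the likelihood requires its log-determinant, and replacing the dense matrix by a sparse approximation lets the log-determinant be read off a sparse factorization. The kernel choice, lengthscale, and entrywise thresholding rule below are assumptions made for illustration; the paper's actual $\mathcal{H}^2$/block low-rank construction with Cholesky factorization is not reproduced here.

# Minimal GP-regression sketch (illustrative only, not the paper's algorithm).
# Assumptions: squared-exponential kernel, lengthscale 0.1, entrywise
# thresholding as a crude stand-in for a structured sparse approximation.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def sq_exp_kernel(x, y, lengthscale=0.1):
    """Squared-exponential covariance between point sets x and y."""
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
n = 500
x_train = np.sort(rng.uniform(0.0, 1.0, n))
y_train = np.sin(8.0 * x_train) + 0.1 * rng.standard_normal(n)
x_test = np.linspace(0.0, 1.0, 50)

noise_var = 1e-2
K = sq_exp_kernel(x_train, x_train) + noise_var * np.eye(n)  # dense n x n covariance
K_star = sq_exp_kernel(x_test, x_train)

# Dense approach: Cholesky factorization, O(n^3) time and O(n^2) memory.
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))      # K^{-1} y
post_mean = K_star @ alpha                                     # posterior mean
v = np.linalg.solve(L, K_star.T)
post_var = sq_exp_kernel(x_test, x_test).diagonal() - np.sum(v * v, axis=0)

# Log-determinant needed for the log-likelihood: sum of logs of the Cholesky diagonal.
logdet_dense = 2.0 * np.sum(np.log(np.diag(L)))
loglik = -0.5 * y_train @ alpha - 0.5 * logdet_dense - 0.5 * n * np.log(2.0 * np.pi)

# Sparse-approximation idea: drop small entries of K and compute the
# log-determinant from a sparse LU factorization instead (K is SPD, so det > 0).
K_sparse = sp.csc_matrix(np.where(np.abs(K) > 1e-3, K, 0.0))
lu = spla.splu(K_sparse)
logdet_sparse = np.sum(np.log(np.abs(lu.U.diagonal())))

print("dense log-det:", logdet_dense, "sparse log-det:", logdet_sparse)

With a rapidly decaying kernel most entries of $K$ fall below the threshold, so the sparse factorization touches far fewer nonzeros than the dense Cholesky; the paper replaces this ad hoc thresholding with a structured block low-rank approximation.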
Keywords:
Gaussian processes, $\mathcal{H}^2$ matrix, sparse matrix, Cholesky factorization.
Received: 17.05.2017
Citation:
D. A. Sushnikova, “Application of block low-rank matrices in Gaussian processes for regression”, Num. Meth. Prog., 18:3 (2017), 214–220
Linking options:
https://www.mathnet.ru/eng/vmp874
https://www.mathnet.ru/eng/vmp/v18/i3/p214