Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

**Gradient descent for gaussian processes variance reduction.** / Bottarelli, Lorenzo; Loog, Marco.


Bottarelli, L & Loog, M 2018, Gradient descent for gaussian processes variance reduction. in X Bai, ER Hancock, TK Ho, RC Wilson, B Biggio & A Robles-Kelly (eds), *Structural, Syntactic, and Statistical Pattern Recognition: Joint IAPR International Workshop, S+SSPR 2018, Proceedings.* Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 11004, Springer, Cham, pp. 160-169, Joint IAPR International Workshops on Structural and Syntactic Pattern Recognition, SSPR 2018 and Statistical Techniques in Pattern Recognition, SPR 2018, Beijing, China, 17/08/18. https://doi.org/10.1007/978-3-319-97785-0_16

Bottarelli, L., & Loog, M. (2018). Gradient descent for gaussian processes variance reduction. In X. Bai, E. R. Hancock, T. K. Ho, R. C. Wilson, B. Biggio, & A. Robles-Kelly (Eds.), *Structural, Syntactic, and Statistical Pattern Recognition: Joint IAPR International Workshop, S+SSPR 2018, Proceedings* (pp. 160-169). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11004). Cham: Springer. https://doi.org/10.1007/978-3-319-97785-0_16

Bottarelli L, Loog M. Gradient descent for gaussian processes variance reduction. In Bai X, Hancock ER, Ho TK, Wilson RC, Biggio B, Robles-Kelly A, editors, Structural, Syntactic, and Statistical Pattern Recognition: Joint IAPR International Workshop, S+SSPR 2018, Proceedings. Cham: Springer. 2018. p. 160-169. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). https://doi.org/10.1007/978-3-319-97785-0_16

@inproceedings{e9affc3e2f744044a630b17a24eba993,

title = "Gradient descent for gaussian processes variance reduction",

abstract = "A key issue in Gaussian Process modeling is to decide on the locations where measurements are going to be taken. A good set of observations will provide a better model. Current state of the art selects such a set so as to minimize the posterior variance of the Gaussian Process by exploiting submodularity. We propose a Gradient Descent procedure to iteratively improve an initial set of observations so as to minimize the posterior variance directly. The performance of the technique is analyzed under different conditions by varying the number of measurement points, the dimensionality of the domain and the hyperparameters of the Gaussian Process. Results show the applicability of the technique and the clear improvements that can be obtained under different settings.",

author = "Lorenzo Bottarelli and Marco Loog",

year = "2018",

doi = "10.1007/978-3-319-97785-0_16",

language = "English",

isbn = "978-3-319-97784-3",

series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",

volume = "11004",

publisher = "Springer",

pages = "160--169",

editor = "X. Bai and E.R. Hancock and T.K. Ho and R.C. Wilson and B. Biggio and A. Robles-Kelly",

booktitle = "Structural, Syntactic, and Statistical Pattern Recognition",

}

TY - GEN

T1 - Gradient descent for gaussian processes variance reduction

AU - Bottarelli, Lorenzo

AU - Loog, Marco

PY - 2018

Y1 - 2018

N2 - A key issue in Gaussian Process modeling is to decide on the locations where measurements are going to be taken. A good set of observations will provide a better model. Current state of the art selects such a set so as to minimize the posterior variance of the Gaussian Process by exploiting submodularity. We propose a Gradient Descent procedure to iteratively improve an initial set of observations so as to minimize the posterior variance directly. The performance of the technique is analyzed under different conditions by varying the number of measurement points, the dimensionality of the domain and the hyperparameters of the Gaussian Process. Results show the applicability of the technique and the clear improvements that can be obtained under different settings.

AB - A key issue in Gaussian Process modeling is to decide on the locations where measurements are going to be taken. A good set of observations will provide a better model. Current state of the art selects such a set so as to minimize the posterior variance of the Gaussian Process by exploiting submodularity. We propose a Gradient Descent procedure to iteratively improve an initial set of observations so as to minimize the posterior variance directly. The performance of the technique is analyzed under different conditions by varying the number of measurement points, the dimensionality of the domain and the hyperparameters of the Gaussian Process. Results show the applicability of the technique and the clear improvements that can be obtained under different settings.

UR - http://www.scopus.com/inward/record.url?scp=85052209166&partnerID=8YFLogxK

U2 - 10.1007/978-3-319-97785-0_16

DO - 10.1007/978-3-319-97785-0_16

M3 - Conference contribution

SN - 978-3-319-97784-3

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 160

EP - 169

BT - Structural, Syntactic, and Statistical Pattern Recognition

A2 - Bai, X.

A2 - Hancock, E.R.

A2 - Ho, T.K.

A2 - Wilson, R.C.

A2 - Biggio, B.

A2 - Robles-Kelly, A.

PB - Springer

CY - Cham

ER -
