A key issue in Gaussian Process modeling is to decide on the locations where measurements are going to be taken. A good set of observations will provide a better model. The current state of the art selects such a set so as to minimize the posterior variance of the Gaussian Process by exploiting submodularity. We propose a Gradient Descent procedure that iteratively improves an initial set of observations so as to minimize the posterior variance directly. The performance of the technique is analyzed under different conditions by varying the number of measurement points, the dimensionality of the domain and the hyperparameters of the Gaussian Process. Results show the applicability of the technique and the clear improvements that can be obtained under different settings.
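The idea in the abstract can be sketched as follows. This is not the authors' exact algorithm, only a minimal illustration under stated assumptions: an RBF kernel with fixed length-scale, a unit-interval domain discretized on a grid, the average posterior variance over that grid as the objective, and finite-difference gradients in place of analytic ones.

```python
import numpy as np

def rbf(A, B, ls=0.3):
    # Squared-exponential (RBF) kernel between two point sets.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def avg_posterior_variance(X, grid, ls=0.3, noise=1e-4):
    # Average GP posterior variance over the grid, given sensing locations X.
    K = rbf(X, X, ls) + noise * np.eye(len(X))
    Ks = rbf(grid, X, ls)
    var = 1.0 - np.einsum('ij,ij->i', Ks @ np.linalg.inv(K), Ks)
    return var.mean()

def improve_design(X0, grid, lr=0.05, steps=50, eps=1e-5):
    # Gradient descent on the sensing locations themselves,
    # using central finite differences for the gradient.
    X = X0.copy()
    for _ in range(steps):
        g = np.zeros_like(X)
        for i in range(X.shape[0]):
            for d in range(X.shape[1]):
                Xp = X.copy(); Xp[i, d] += eps
                Xm = X.copy(); Xm[i, d] -= eps
                g[i, d] = (avg_posterior_variance(Xp, grid)
                           - avg_posterior_variance(Xm, grid)) / (2 * eps)
        X = np.clip(X - lr * g, 0.0, 1.0)  # stay inside the domain
    return X
```

For example, starting from a few random locations in [0, 1] and descending for 50 steps should spread the points out and lower the average posterior variance relative to the initial design.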

Original language: English
Title of host publication: Structural, Syntactic, and Statistical Pattern Recognition
Subtitle of host publication: Joint IAPR International Workshop, S+SSPR 2018, Proceedings
Editors: X. Bai, E.R. Hancock, T.K. Ho, R.C. Wilson, B. Biggio, A. Robles-Kelly
Place of publication: Cham
Publisher: Springer Verlag
Pages: 160-169
Number of pages: 10
ISBN (Electronic): 978-3-319-97785-0
ISBN (Print): 978-3-319-97784-3
DOIs
Publication status: Published - 2018
Event: Joint IAPR International Workshops on Structural and Syntactic Pattern Recognition, SSPR 2018 and Statistical Techniques in Pattern Recognition, SPR 2018 - Beijing, China
Duration: 17 Aug 2018 – 19 Aug 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Publisher: Springer
Volume: 11004
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Joint IAPR International Workshops on Structural and Syntactic Pattern Recognition, SSPR 2018 and Statistical Techniques in Pattern Recognition, SPR 2018
Country: China
City: Beijing
Period: 17/08/18 – 19/08/18
