Carl Edward Rasmussen (1996). Bayesian Regression using Gaussian Process priors, presentation at the meeting of the American Statistical Association in Chicago, Aug. 4-8. Slides are available in postscript.
Carl Edward Rasmussen (1996). Evaluation of Gaussian Processes and other Methods for Non-Linear Regression, PhD thesis, Graduate Department of Computer Science, University of Toronto. postscript.
C. K. I. Williams (1996). Computing with infinite networks, in M. C. Mozer, M. I. Jordan and T. Petsche, eds., Advances in Neural Information Processing Systems 9, MIT Press, Cambridge, MA. postscript.
C. K. I. Williams (1998). Prediction with Gaussian Processes: From Linear Regression to Linear Prediction and Beyond, in Learning and Inference in Graphical Models, M. I. Jordan, ed., Kluwer Academic Press. Also available as technical report NCRG/97/012, Aston University. postscript.
C. K. I. Williams and Carl Edward Rasmussen (1996). Gaussian Processes for Regression, in Touretzky, Mozer and Hasselmo, eds., Advances in Neural Information Processing Systems 8, MIT Press. postscript.
Huaiyu Zhu, C. K. I. Williams, Richard Rohwer and Michal Morciniec (1997). Gaussian Regression and Optimal Finite Dimensional Linear Models, technical report, NCRG/97/011, Aston University. postscript.
Evidence framework or integration over hyperparameters?
Input-dependent Noise
- NCRG/98/002 Regression with Input-dependent Noise: A Gaussian Process Treatment
- P. W. Goldberg and C. K. I. Williams and C. M. Bishop
In Advances in Neural Information Processing Systems 10.
Editor: M. I. Jordan and M. J. Kearns and S. A. Solla.
Abstract: Gaussian processes provide natural non-parametric prior distributions over regression functions. In this paper we consider regression problems where there is noise on the output, and the variance of the noise depends on the inputs. If we assume that the noise is a smooth function of the inputs, then it is natural to model the noise variance using a second Gaussian process, in addition to the Gaussian process governing the noise-free output value. We show that prior uncertainty about the parameters controlling both processes can be handled and that the posterior distribution of the noise rate can be sampled from using Markov chain Monte Carlo methods. Our results on a synthetic data set give a posterior noise variance that well-approximates the true variance.
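The core idea of the abstract above, a second process supplying an input-dependent noise variance that enters the covariance matrix on its diagonal, can be sketched in a few lines of NumPy. This is only an illustration under assumed kernel parameters and a fixed (rather than sampled) log-noise function; it is not the MCMC treatment of the paper.

```python
import numpy as np

def rbf(X1, X2, lengthscale=0.2, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 40)

# Stand-in for the second GP: a smooth log noise variance that grows with x.
log_noise_var = -4.0 + 3.0 * X
y = np.sin(2 * np.pi * X) + rng.normal(0.0, np.exp(0.5 * log_noise_var))

# Input-dependent noise appears as a non-constant diagonal term.
K = rbf(X, X) + np.diag(np.exp(log_noise_var))

Xs = np.linspace(0.0, 1.0, 5)          # test inputs
Ks, Kss = rbf(Xs, X), rbf(Xs, Xs)
mean = Ks @ np.linalg.solve(K, y)       # posterior mean of the noise-free output
cov = Kss - Ks @ np.linalg.solve(K, Ks.T)  # posterior covariance
```

With a homoscedastic model the diagonal of `K` would be a single constant; here each data point carries its own noise level, so noisier regions pull the posterior mean less strongly.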
- NCRG/97/002 Regression with Input-Dependent Noise: A Bayesian Treatment
- Christopher M. Bishop and Cazhaow S. Qazaz
In Advances in Neural Information Processing Systems.
Abstract: In most treatments of the regression problem it is assumed that the distribution of target data can be described by a deterministic function of the inputs, together with additive Gaussian noise having constant variance. The use of maximum likelihood to train such models then corresponds to the minimization of a sum-of-squares error function. In many applications a more realistic model would allow the noise variance itself to depend on the input variables. However, the use of maximum likelihood to train such models would give highly biased results. In this paper we show how a Bayesian treatment can allow for an input-dependent variance while overcoming the bias of maximum likelihood.
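The bias the abstract refers to is easy to see even in the constant-variance case: the maximum-likelihood variance estimate divides by N rather than N-1, so its expectation is (N-1)/N times the true variance, and the shortfall is severe for the small effective sample sizes that arise when the variance is fitted locally. A minimal numerical check (illustrative values, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
true_var, N = 4.0, 5

# ML variance estimate (np.var divides by N), averaged over many replicate
# data sets to approximate its expectation.
ml_est = np.mean([np.var(rng.normal(0.0, np.sqrt(true_var), N))
                  for _ in range(20000)])

# Expected value is (N-1)/N * true_var = 0.8 * 4.0 = 3.2, well below 4.0.
```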
Page updated 2.2.1998 Jouko.Lampinen@hut.fi