School of Computing Science, University of Newcastle upon Tyne

Nonparametric inference in machine learning

Speaker: Lehel Csato

6th July 2005, 2pm, Devonshire G21/G22 Conference Room

Abstract

Machine learning brings together algorithms applicable to a wide range of data sets and problem types. A problem frequently encountered when applying these methods is choosing the number of parameters, which in effect determines the complexity of the model. Take regression as an example: generalised linear models approximate a function by a linear combination of basis functions, and in the limit of infinitely many basis functions they are universal approximators. In practice, however, the number of basis functions is fixed before the parameters are inferred, so one has to make several runs with different numbers of basis functions to obtain a good algorithm. Nonparametric methods are flexible inference procedures that sidestep the specification of model complexity by using a more flexible "parametrisation" of the approximating functions. The presentation will detail the application of Gaussian processes to various machine learning problems, such as classification, regression and robust regression. Problems with the non-parametric setup, possible solutions, applications and programs will be presented.
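
As a minimal illustration (not taken from the talk), the sketch below shows Gaussian-process regression with a squared-exponential kernel: the model's flexibility comes from the kernel rather than from a fixed number of basis functions, in contrast to a generalised linear model whose basis set must be chosen in advance. The training points, noise level and kernel settings are assumed values chosen purely for the example.

```python
# A hedged sketch of Gaussian-process regression (standard posterior equations),
# not code presented in the seminar. All numerical values are illustrative.
import numpy as np

def sq_exp_kernel(a, b, length_scale=1.0, signal_var=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2)

# Toy training data (assumed, for illustration only)
x_train = np.array([-2.0, -1.0, 0.5, 1.5])
y_train = np.sin(x_train)
noise_var = 1e-2

# Test inputs at which we want the predictive distribution
x_test = np.linspace(-3.0, 3.0, 7)

# Covariance matrices needed for the GP posterior
K = sq_exp_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
K_s = sq_exp_kernel(x_train, x_test)
K_ss = sq_exp_kernel(x_test, x_test)

# Posterior mean and variance via a Cholesky factorisation of K
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
mean = K_s.T @ alpha                          # posterior mean at x_test
v = np.linalg.solve(L, K_s)
var = np.diag(K_ss) - np.sum(v * v, axis=0)   # posterior variance at x_test

print(np.round(mean, 3))
print(np.round(var, 3))
```

Note that no number of basis functions appears anywhere: the complexity of the fit is controlled by the kernel hyperparameters (length scale, signal and noise variances), which is the sense in which the Gaussian-process approach sidesteps explicit model-complexity selection.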
