Citation

BibTeX format

@inproceedings{Chatzis:2011,
author = {Chatzis, SP and Korkinof, D and Demiris, Y},
title = {The One-Hidden Layer Non-parametric Bayesian Kernel Machine},
url = {http://hdl.handle.net/10044/1/12580},
year = {2011}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB  - In this paper, we present a nonparametric Bayesian approach towards one-hidden-layer feedforward neural networks. Our approach is based on a random selection of the weights of the synapses between the input and the hidden layer neurons, and a Bayesian marginalization over the weights of the connections between the hidden layer neurons and the output neurons, giving rise to a kernel-based nonparametric Bayesian inference procedure for feedforward neural networks. Compared to existing approaches, our method presents a number of advantages, with the most significant being: (i) it offers a significant improvement in terms of the obtained generalization capabilities; (ii) being a nonparametric Bayesian learning approach, it entails inference instead of fitting to data, thus resolving the overfitting issues of non-Bayesian approaches; and (iii) it yields a full predictive posterior distribution, thus naturally providing a measure of uncertainty on the generated predictions (expressed by means of the variance of the predictive distribution), without the need of applying computationally intensive methods, e.g., bootstrap. We exhibit the merits of our approach by investigating its application to two difficult multimedia content classification applications: semantic characterization of audio scenes based on content, and yearly song classification, as well as a set of benchmark classification and regression tasks.
AU  - Chatzis, SP
AU  - Korkinof, D
AU  - Demiris, Y
PY  - 2011///
TI  - The One-Hidden Layer Non-parametric Bayesian Kernel Machine
UR  - http://hdl.handle.net/10044/1/12580
ER  -
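
Method sketch (illustrative)

The abstract describes randomly drawn input-to-hidden weights combined with Bayesian marginalization over the hidden-to-output weights, yielding a predictive distribution whose variance quantifies uncertainty. The Python sketch below is only an illustration of that idea under simplifying assumptions, not the authors' kernel-based nonparametric formulation: it uses a fixed random tanh hidden layer followed by conjugate Bayesian linear regression on the hidden activations. The hidden-layer size, the activation function, and the alpha/beta hyperparameters are illustrative assumptions.

import numpy as np

# Minimal sketch (assumed setup, not the paper's implementation):
# input-to-hidden weights are drawn at random and kept fixed, and the
# hidden-to-output weights are marginalized under a zero-mean Gaussian
# prior, i.e. Bayesian linear regression on the random hidden features.

rng = np.random.default_rng(0)

def random_hidden_layer(n_inputs, n_hidden=200):
    """Fixed random synapses between input and hidden neurons, tanh activation."""
    W = rng.normal(scale=1.0, size=(n_inputs, n_hidden))
    b = rng.normal(scale=1.0, size=n_hidden)
    return lambda Z: np.tanh(Z @ W + b)

def fit_bayesian_output_layer(Phi, y, alpha=1.0, beta=25.0):
    """Posterior mean and covariance of the output weights (conjugate Gaussian model)."""
    n_hidden = Phi.shape[1]
    S = np.linalg.inv(alpha * np.eye(n_hidden) + beta * Phi.T @ Phi)
    m = beta * S @ Phi.T @ y
    return m, S

def predict(phi_star, m, S, beta=25.0):
    """Predictive mean and variance; the variance is the uncertainty measure."""
    mean = phi_star @ m
    var = 1.0 / beta + np.einsum('ij,jk,ik->i', phi_star, S, phi_star)
    return mean, var

# Toy regression example with synthetic data.
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X[:, 0]) + 0.2 * rng.normal(size=100)

hidden = random_hidden_layer(n_inputs=1)
m, S = fit_bayesian_output_layer(hidden(X), y)

X_new = np.linspace(-3, 3, 5).reshape(-1, 1)
mean, var = predict(hidden(X_new), m, S)
print(mean, np.sqrt(var))

Because the output weights are integrated out analytically, the predictive variance comes for free, which is the point the abstract makes about avoiding computationally intensive alternatives such as the bootstrap.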