
Search results

  • JOURNAL ARTICLE
Creswell A, Bharath AA

    Denoising Adversarial Autoencoders

Unsupervised learning is of growing interest because it unlocks the potential held in vast amounts of unlabelled data to learn useful representations for inference. Autoencoders, a form of generative model, may be trained by learning to reconstruct unlabelled input data from a latent representation space. More robust representations may be produced by an autoencoder if it learns to recover clean input samples from corrupted ones. Representations may be further improved by introducing regularisation during training to shape the distribution of the encoded data in latent space. We suggest denoising adversarial autoencoders, which combine denoising and regularisation, shaping the distribution of latent space using adversarial training. We introduce a novel analysis that shows how denoising may be incorporated into the training and sampling of adversarial autoencoders. Experiments are performed to assess the contributions that denoising makes to the learning of representations for classification and sample synthesis. Our results suggest that autoencoders trained using a denoising criterion achieve higher classification performance, and can synthesise samples that are more consistent with the input data than those trained without a corruption process.
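Setting the adversarial component aside, the denoising criterion the abstract describes can be sketched with a tiny linear autoencoder in NumPy. This is a hypothetical toy (the authors work with images and deep networks, and add adversarial shaping of the latent distribution): corrupt each input, then train the autoencoder to reconstruct the clean sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 clean points on a 1-D manifold embedded in 5-D
# (purely illustrative; not the datasets used in the paper).
t = rng.uniform(-1.0, 1.0, size=(200, 1))
X = t @ rng.normal(size=(1, 5))

# Tiny linear autoencoder: encoder (5 -> 2), decoder (2 -> 5).
W_e = rng.normal(scale=0.1, size=(5, 2))
W_d = rng.normal(scale=0.1, size=(2, 5))

def clean_mse():
    """Reconstruction error of the CLEAN data through the autoencoder."""
    return float(np.mean((X @ W_e @ W_d - X) ** 2))

mse_before = clean_mse()
lr, sigma = 0.05, 0.3
for _ in range(500):
    X_noisy = X + sigma * rng.normal(size=X.shape)  # corruption process
    Z = X_noisy @ W_e                               # latent codes
    err = Z @ W_d - X                               # target is the CLEAN input
    # Gradient steps on the mean squared denoising-reconstruction error.
    g_d = Z.T @ err / len(X)
    g_e = X_noisy.T @ (err @ W_d.T) / len(X)
    W_d -= lr * g_d
    W_e -= lr * g_e

mse_after = clean_mse()
print(mse_before, mse_after)
```

The key line is the error term: the reconstruction of the *noisy* input is compared against the *clean* sample, which is what distinguishes a denoising criterion from plain reconstruction.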

  • CONFERENCE PAPER
Eleftheriadis S, Nicholson TFW, Deisenroth MP, Hensman J, et al.

    Identification of Gaussian Process State Space Models

Advances in Neural Information Processing Systems (NIPS) 2017, Publisher: Neural Information Processing Systems Foundation, Inc.

The Gaussian process state space model (GPSSM) is a non-linear dynamical system, where unknown transition and/or measurement mappings are described by GPs. Most research in GPSSMs has focussed on the state estimation problem. However, the key challenge in GPSSMs has not been satisfactorily addressed yet: system identification. To address this challenge, we impose a structured Gaussian variational posterior distribution over the latent states, which is parameterised by a recognition model in the form of a bi-directional recurrent neural network. Inference with this structure allows us to recover a posterior smoothed over the entire sequence(s) of data. We provide a practical algorithm for efficiently computing a lower bound on the marginal likelihood using the reparameterisation trick. This additionally allows arbitrary kernels to be used within the GPSSM. We demonstrate that we can efficiently generate plausible future trajectories of the system we seek to model with the GPSSM, requiring only a small number of interactions with the true system.
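The core idea behind the GPSSM — learn a transition function with a GP, then roll it forward to generate future trajectories — can be illustrated in a few lines of NumPy. This sketch omits everything that makes the paper's contribution (the structured variational posterior, the recurrent recognition model, the marginal-likelihood bound) and uses exact GP regression on a made-up one-dimensional system with fully observed states:

```python
import numpy as np

def rbf(a, b, ls=1.0, var=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

rng = np.random.default_rng(1)

# Hypothetical 1-D dynamics: x_{t+1} = 0.8*sin(x_t), observed with noise.
xs = np.linspace(-3.0, 3.0, 30)                     # observed states x_t
ys = 0.8 * np.sin(xs) + 0.05 * rng.normal(size=30)  # noisy next states

# Exact GP regression posterior mean for the transition function f.
K = rbf(xs, xs) + 0.05 ** 2 * np.eye(30)
alpha = np.linalg.solve(K, ys)

def f_mean(x):
    """Posterior mean prediction of the next state from state x."""
    return float((rbf(np.atleast_1d(x), xs) @ alpha)[0])

# Roll the learned mean dynamics forward to generate a future trajectory.
traj = [0.5]
for _ in range(20):
    traj.append(f_mean(traj[-1]))
```

Because the true map contracts towards its fixed point at zero, the rollout of the learned mean dynamics does the same; the paper additionally propagates the GP's predictive uncertainty, which this mean-only rollout ignores.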

  • CONFERENCE PAPER
Salimbeni H, Deisenroth M

    Doubly stochastic variational inference for deep Gaussian processes

NIPS 2017, Publisher: Advances in Neural Information Processing Systems (NIPS)

Gaussian processes (GPs) are a good choice for function approximation as they are flexible, robust to over-fitting, and provide well-calibrated predictive uncertainty. Deep Gaussian processes (DGPs) are multi-layer generalisations of GPs, but inference in these models has proved challenging. Existing approaches to inference in DGP models assume approximate posteriors that force independence between the layers, and do not work well in practice. We present a doubly stochastic variational inference algorithm, which does not force independence between layers. With our method of inference we demonstrate that a DGP model can be used effectively on data ranging in size from hundreds to a billion points. We provide strong empirical evidence that our inference scheme for DGPs works well in practice in both classification and regression.
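"Doubly stochastic" here refers to two sources of randomness in the gradient estimate: minibatch subsampling of the data, and Monte Carlo samples drawn through the layers via the reparameterisation trick. The trick itself, stripped of GPs entirely, fits in a few lines; this is a toy one-dimensional illustration, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(2)

# Reparameterisation trick: estimate the gradient of E_{z ~ N(mu, s^2)}[z^2]
# with respect to mu by writing z = mu + s*eps, eps ~ N(0, 1),
# so the expectation is over a fixed base distribution.
mu, s = 1.5, 0.5
eps = rng.normal(size=100_000)
z = mu + s * eps
grad_est = float(np.mean(2.0 * z))  # d(z^2)/dmu = 2z, since dz/dmu = 1

# Analytic check: E[z^2] = mu^2 + s^2, so the true gradient is 2*mu = 3.0.
print(grad_est)
```

In the DGP setting the same move is applied at every layer, so samples (and their gradients) flow through the whole hierarchy without assuming independence between layers.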

This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
