BibTeX format
@article{Creswell:2016,
  author = {Creswell, A and Bharath, AA},
  title = {Task Specific Adversarial Cost Function},
  url = {https://arxiv.org/abs/1609.08661v1},
  year = {2016}
}
Several of our current PhD candidates and fellow researchers at the Data Science Institute have published, or are in the process of publishing, papers presenting their research.
RIS format
TY - JOUR
AB - The cost function used to train a generative model should fit the purpose of the model. If the model is intended for tasks such as generating perceptually correct samples, it is beneficial to maximise the likelihood of a sample drawn from the model, Q, coming from the same distribution as the training data, P. This is equivalent to minimising the Kullback-Leibler (KL) distance, KL[Q||P]. However, if the model is intended for tasks such as retrieval or classification it is beneficial to maximise the likelihood that a sample drawn from the training data is captured by the model, equivalent to minimising KL[P||Q]. The cost function used in adversarial training optimises the Jensen-Shannon entropy which can be seen as an even interpolation between KL[Q||P] and KL[P||Q]. Here, we propose an alternative adversarial cost function which allows easy tuning of the model for either task. Our task specific cost function is evaluated on a dataset of hand-written characters in the following tasks: Generation, retrieval and one-shot learning.
AU - Creswell,A
AU - Bharath,AA
PY - 2016///
TI - Task Specific Adversarial Cost Function
UR - https://arxiv.org/abs/1609.08661v1
ER -
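The abstract above contrasts minimising KL[Q||P] (good for generating perceptually correct samples) with minimising KL[P||Q] (good for retrieval or classification), and notes that standard adversarial training optimises the Jensen-Shannon divergence, which sits between the two. As a minimal sketch (not taken from the paper), the quantities involved can be computed for toy discrete distributions as follows; the distributions `P` and `Q` here are illustrative stand-ins for the data and model distributions:

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence KL[p||q] for discrete distributions,
    # skipping zero-probability terms (0 * log 0 is taken as 0).
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    # Jensen-Shannon divergence: the average of KL[p||m] and KL[q||m],
    # where m is the even mixture of p and q. Unlike KL, it is symmetric.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Toy stand-ins for the training-data distribution P and model distribution Q.
P = [0.7, 0.2, 0.1]
Q = [0.4, 0.4, 0.2]

print("KL[P||Q] =", kl(P, Q))  # penalises Q missing mass where P has it
print("KL[Q||P] =", kl(Q, P))  # penalises Q placing mass where P has little
print("JS(P, Q) =", js(P, Q))  # symmetric middle ground between the two
```

Note the asymmetry: KL[P||Q] and KL[Q||P] generally differ, which is exactly why the paper proposes a cost function that can be tuned toward one or the other depending on the task.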