Imperial College London

Professor Duncan Gillies

Faculty of Engineering, Department of Computing

Emeritus Professor
 
 
 

Contact

 

+44 (0)20 7594 8317
d.gillies
Website

 
 

Location

 

373 Huxley Building, South Kensington Campus


Summary

 

Publications

Citation

BibTeX format

@inproceedings{Liu:2012,
author = {Liu, R and Gillies, DF},
publisher = {CSREA Press},
title = {An eigenvalue-problem formulation to non-parametric mutual information maximisation for linear dimensionality reduction},
year = {2012}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB - Well-known dimensionality reduction (feature extraction) techniques, such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), are formulated as eigenvalue-problems, where the required features are eigenvectors of some objective matrix. Eigenvalue-problems are theoretically elegant, and have advantages over iterative algorithms. In contrast to iterative algorithms, they can discover globally optimal features in one go, thus reducing computation times and avoiding local optima. Here we propose an eigenvalue-problem formulation for linear dimensionality reduction based on maximising the mutual information between the class variable and the extracted features. Mutual information takes into account all moments of the input data while PCA and LDA only account for the first two moments. Our experiments show that our proposed method achieves better, more discriminative projections than PCA and LDA, and gives better classification results for datasets in which each class is well-represented.
AU - Liu,R
AU - Gillies,DF
PB - CSREA Press
PY - 2012///
TI - An eigenvalue-problem formulation to non-parametric mutual information maximisation for linear dimensionality reduction
ER -
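
The abstract above describes the eigenvalue-problem pattern that PCA, LDA and the proposed mutual-information criterion share: build an objective matrix from the labelled data, take its leading eigenvectors as projection directions, and obtain a globally optimal linear map in one step rather than by iterative optimisation. As a minimal sketch of that pattern only, the Python fragment below uses the LDA scatter matrices named in the abstract as the objective matrix; the function name lda_projection and the toy data are illustrative assumptions, and the paper's own non-parametric mutual-information objective is not reproduced here.

import numpy as np

def lda_projection(X, y, n_components):
    """Top eigenvectors of the LDA objective as linear projection directions.

    Illustrative sketch of the generic eigenvalue-problem formulation;
    not the paper's mutual-information objective.
    """
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))   # within-class scatter
    Sb = np.zeros((d, d))   # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        Sw += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_all).reshape(-1, 1)
        Sb += Xc.shape[0] * (diff @ diff.T)
    # Generalised eigenvalue problem Sb w = lambda Sw w, solved in one step;
    # the leading eigenvectors are the optimal projection directions,
    # avoiding iterative search and local optima.
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    W = eigvecs[:, order[:n_components]].real
    return X @ W, W

# Example: project a small labelled dataset onto one discriminative direction.
X = np.array([[1.0, 2.0], [1.2, 1.9], [3.0, 4.1], [3.1, 3.9]])
y = np.array([0, 0, 1, 1])
Z, W = lda_projection(X, y, n_components=1)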