Imperial College London

Professor Paul M. Matthews

Faculty of Medicine, Department of Brain Sciences

Edmond and Lily Safra Chair, Head of Department

Contact

 

+44 (0)20 7594 2855
p.matthews


Assistant

 

Ms Siobhan Dillon +44 (0)20 7594 2855


Location

 

E502, Burlington Danes, Hammersmith Campus



Publications

Citation

BibTeX format

@article{Kolbeinsson:2021:10.1109/JSTSP.2021.3064182,
author = {Kolbeinsson, A and Kossaifi, J and Panagakis, I and Bulat, A and Anandkumar, A and Tzoulaki, I and Matthews, P},
doi = {10.1109/JSTSP.2021.3064182},
issn = {1932-4553},
journal = {IEEE Journal of Selected Topics in Signal Processing},
pages = {630--640},
title = {Tensor dropout for robust learning},
url = {http://dx.doi.org/10.1109/JSTSP.2021.3064182},
volume = {15},
year = {2021}
}

RIS format (EndNote, RefMan)

TY  - JOUR
AB  - CNNs achieve high levels of performance by leveraging deep, over-parametrized neural architectures, trained on large datasets. However, they exhibit limited generalization abilities outside their training domain and lack robustness to corruptions such as noise and adversarial attacks. To improve robustness and obtain more computationally and memory efficient models, better inductive biases are needed. To provide such inductive biases, tensor layers have been successfully proposed to leverage multi-linear structure through higher-order computations. In this paper, we propose tensor dropout, a randomization technique that can be applied to tensor factorizations, such as those parametrizing tensor layers. In particular, we study tensor regression layers, parametrized by low-rank weight tensors and augmented with our proposed tensor dropout. We empirically show that our approach improves generalization for image classification on ImageNet and CIFAR-100. We also establish state-of-the-art accuracy for phenotypic trait prediction on the largest available dataset of brain MRI (U.K. Biobank), where multi-linear structure is paramount. In all cases, we demonstrate superior performance and significantly improved robustness, both to noisy inputs and to adversarial attacks. We establish the theoretical validity of our approach and the regularizing effect of tensor dropout by demonstrating the link between randomized tensor regression with tensor dropout and deterministic regularized tensor regression.
AU  - Kolbeinsson, A
AU  - Kossaifi, J
AU  - Panagakis, I
AU  - Bulat, A
AU  - Anandkumar, A
AU  - Tzoulaki, I
AU  - Matthews, P
DO  - 10.1109/JSTSP.2021.3064182
EP  - 640
PY  - 2021///
SN  - 1932-4553
SP  - 630
TI  - Tensor dropout for robust learning
T2  - IEEE Journal of Selected Topics in Signal Processing
UR  - http://dx.doi.org/10.1109/JSTSP.2021.3064182
UR  - http://hdl.handle.net/10044/1/87607
VL  - 15
ER  -
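The abstract above describes tensor dropout as a randomization applied to tensor factorizations. The following is a minimal illustrative sketch of the general idea for a CP (rank-1 sum) factorization, not the authors' implementation: whole rank-1 components are dropped at random during training and survivors are rescaled, analogous to standard dropout. Function names and the NumPy formulation are assumptions made for illustration.

```python
import numpy as np

def cp_reconstruct(factors, weights):
    """Reconstruct a 3-way tensor from CP factors A (I x R), B (J x R),
    C (K x R) and per-component weights of length R."""
    A, B, C = factors
    return np.einsum('r,ir,jr,kr->ijk', weights, A, B, C)

def tensor_dropout_cp(factors, p=0.5, training=True, rng=None):
    """Illustrative tensor dropout on a CP factorization: each of the R
    rank-1 components is kept with probability 1 - p, and kept components
    are rescaled by 1/(1 - p) so the expected reconstruction is unchanged.
    At evaluation time all components are used."""
    if rng is None:
        rng = np.random.default_rng()
    R = factors[0].shape[1]
    if training:
        mask = rng.binomial(1, 1.0 - p, size=R) / (1.0 - p)
    else:
        mask = np.ones(R)
    return cp_reconstruct(factors, mask)
```

In a tensor regression layer the same mask would be applied to the components of the low-rank weight tensor before contracting with the input activations; this sketch only shows the reconstruction step.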