Imperial College London

Professor Danilo Mandic

Faculty of Engineering, Department of Electrical and Electronic Engineering

Professor of Signal Processing

Contact

+44 (0)20 7594 6271
d.mandic
Website

Assistant

Miss Vanessa Rodriguez-Gonzalez
+44 (0)20 7594 6267

 
Location

813, Electrical Engineering, South Kensington Campus


Publications

Citation

BibTeX format

@inproceedings{Mandic:2000:10.1109/ICASSP.2000.860132,
author = {Mandic, DP and Chambers, JA and Bozic, MM},
doi = {10.1109/ICASSP.2000.860132},
pages = {3406--3409},
title = {On global asymptotic stability of fully connected recurrent neural networks},
url = {http://dx.doi.org/10.1109/ICASSP.2000.860132},
year = {2000}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB - © 2000 IEEE. Conditions for global asymptotic stability (GAS) of a nonlinear relaxation process realized by a recurrent neural network (RNN) are provided. Existence, convergence, and robustness of such a process are analyzed. This is undertaken based upon the contraction mapping theorem (CMT) and the corresponding fixed point iteration (FPI). Upper bounds for such a process are shown to be the conditions of convergence for a commonly analyzed RNN with a linear state dependence.
AU - Mandic,DP
AU - Chambers,JA
AU - Bozic,MM
DO - 10.1109/ICASSP.2000.860132
EP - 3409
PY - 2000///
SN - 1520-6149
SP - 3406
TI - On global asymptotic stability of fully connected recurrent neural networks
UR - http://dx.doi.org/10.1109/ICASSP.2000.860132
ER -
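
The abstract establishes global asymptotic stability (GAS) of a nonlinear RNN relaxation via the contraction mapping theorem and the corresponding fixed point iteration. As an informal illustration only, not the paper's exact construction, the Python sketch below runs such a relaxation y_{k+1} = Phi(W y_k + b) with a logistic nonlinearity Phi of slope beta and checks the standard sufficient contraction condition (beta/4)*||W||_inf < 1, under which the iteration converges to a unique fixed point from any initial state. All function names and the example weights are hypothetical.

import numpy as np

def logistic(x, beta=1.0):
    """Logistic nonlinearity Phi with slope beta; its derivative is bounded by beta/4."""
    return 1.0 / (1.0 + np.exp(-beta * x))

def contraction_factor(W, beta=1.0):
    """Sufficient-condition estimate: the map y -> Phi(W y + b) is a contraction
    in the infinity norm whenever (beta / 4) * ||W||_inf < 1."""
    return (beta / 4.0) * np.linalg.norm(W, np.inf)

def relax(W, b, y0, beta=1.0, tol=1e-10, max_iter=10_000):
    """Fixed point iteration y_{k+1} = Phi(W y_k + b); returns the iterate and step count."""
    y = np.asarray(y0, dtype=float)
    for k in range(1, max_iter + 1):
        y_next = logistic(W @ y + b, beta)
        if np.max(np.abs(y_next - y)) < tol:
            return y_next, k
        y = y_next
    return y, max_iter

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 5
    W = 0.3 * rng.standard_normal((n, n))  # hypothetical small weights so the bound stays below 1
    b = rng.standard_normal(n)
    rho = contraction_factor(W)
    print(f"contraction factor {rho:.3f} (< 1 implies a unique, globally attractive fixed point)")
    y_star, iters = relax(W, b, np.zeros(n))
    print(f"converged in {iters} iterations; fixed point = {y_star}")

When the factor exceeds 1 the sufficient condition is silent rather than violated; the paper's contribution is precisely the derivation of such upper bounds for the fully connected recurrent architecture, which this sketch only loosely mirrors.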