Incremental neural learning by dynamic and spatial changing weights

Noriyasu Homma, Madan M. Gupta

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper, a new neural network model is presented for incremental learning tasks in which networks are required to learn new knowledge without forgetting the old. The essential core of the proposed neural network structure is its dynamic and spatial changing connection weights (DSCWs). A learning scheme is developed for the formulation of the dynamic changing weights, while structural adaptation is formulated by the spatial changing (growing) connection weights. To avoid disturbing past knowledge when new connections are created, a restoration mechanism is introduced using the DSCWs. The usefulness of the proposed model is demonstrated on a system identification task.
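The paper's DSCW formulation is not reproduced in this record; as a rough, hypothetical illustration of the general idea described in the abstract (growing new connections for novel inputs while protecting previously learned weights so old knowledge is preserved), here is a minimal Python sketch. The GrowingNetwork class, the Gaussian hidden units, the novelty threshold, and the plasticity-decay rule are assumptions of this sketch, not the authors' method.

```python
import numpy as np


class GrowingNetwork:
    """Toy incremental learner (illustration only, not the paper's DSCW model).

    A radial-basis-style network that adds a new hidden unit when an input is
    novel (a crude "spatial change" of the connection structure) and scales
    weight updates by a per-unit plasticity that decays with age (a crude
    "dynamic change" that protects, i.e. restores, older weights).
    """

    def __init__(self, input_dim, novelty_threshold=0.5, lr=0.1, decay=0.999):
        self.novelty_threshold = novelty_threshold
        self.lr = lr
        self.decay = decay
        self.centers = np.empty((0, input_dim))  # hidden-unit centers
        self.weights = np.empty(0)               # output weights
        self.plasticity = np.empty(0)            # how freely each weight may move

    def _activations(self, x):
        if self.centers.shape[0] == 0:
            return np.empty(0)
        d = np.linalg.norm(self.centers - x, axis=1)
        return np.exp(-d ** 2)  # Gaussian units of unit width (an assumption)

    def predict(self, x):
        a = self._activations(x)
        return float(a @ self.weights) if a.size else 0.0

    def learn(self, x, y):
        a = self._activations(x)
        if a.size == 0 or a.max() < self.novelty_threshold:
            # Spatial change: grow a new hidden unit centred at x, fully plastic.
            self.centers = np.vstack([self.centers, x])
            self.weights = np.append(self.weights, 0.0)
            self.plasticity = np.append(self.plasticity, 1.0)
            a = self._activations(x)
        error = y - float(a @ self.weights)
        # Dynamic change: gradient step scaled by plasticity, so older
        # (low-plasticity) connections are barely disturbed by new data.
        self.weights += self.lr * self.plasticity * error * a
        self.plasticity *= self.decay  # consolidate weights as units age
        return error


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    net = GrowingNetwork(input_dim=1)
    # First task: approximate sin(x) on [0, 3].
    for _ in range(2000):
        x = rng.uniform(0.0, 3.0, size=1)
        net.learn(x, np.sin(x[0]))
    # Second task: a new input region [5, 8]; new units are grown there
    # while the weights learned for the first region stay largely intact.
    for _ in range(2000):
        x = rng.uniform(5.0, 8.0, size=1)
        net.learn(x, np.cos(x[0]))
    print("hidden units:", net.centers.shape[0])
    print("old-region prediction at x=1.5:",
          round(net.predict(np.array([1.5])), 3),
          "(target sin(1.5) =", round(np.sin(1.5), 3), ")")
```

The sketch only mimics the qualitative behaviour claimed in the abstract (new structure for new knowledge, protection of old weights); the actual DSCW learning scheme, restoration mechanism, and system identification experiment are described in the paper itself.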

Original language: English
Title of host publication: IFAC Proceedings Volumes (IFAC-PapersOnline)
Editors: Gabriel Ferrate, Eduardo F. Camacho, Luis Basanez, Juan A. de la Puente
Publisher: IFAC Secretariat
Pages: 247-252
Number of pages: 6
Edition: 1
ISBN (Print): 9783902661746
DOIs
Publication status: Published - 2002
Event: 15th World Congress of the International Federation of Automatic Control, 2002 - Barcelona, Spain
Duration: 2002 Jul 21 – 2002 Jul 26

Publication series

Name: IFAC Proceedings Volumes (IFAC-PapersOnline)
Number: 1
Volume: 15
ISSN (Print): 1474-6670

Conference

Conference: 15th World Congress of the International Federation of Automatic Control, 2002
Country/Territory: Spain
City: Barcelona
Period: 02/7/21 – 02/7/26

Keywords

  • Brain models
  • Classification
  • Function approximation
  • Learning algorithms
  • Long-term memory and short-term memory
  • Neural networks
