Reassessment of catastrophic interference
Yamaguchi, Makoto
NeuroReport: October 25, 2004 - Volume 15 - Issue 15 - p 2423-2426
COMPUTATIONAL NEUROSCIENCE

Abstract
Connectionist models using the back-propagation learning rule are known to have a serious problem: they exhibit catastrophic interference (or forgetting) under sequential training. After the model learns a first set of patterns, training it on a second set causes its performance on the first set to deteriorate rapidly. The present study reconsiders this issue with three sets of simulations. With orthogonal input vectors, interference can be reasonably mild. The number of hidden units was critical to the degree of interference, in contrast to the suggestions of previous studies. The output coding scheme was also found to be critical, and the length of the input lists influenced the degree of interference. This study suggests that the interference problem has been overstated in the literature.

Department of Educational Psychology, Waseda University, Tokyo 169-8050, Japan
Corresponding Author: email@example.com
Received 6 July 2004; accepted 3 September 2004
© 2004 Lippincott Williams & Wilkins, Inc.
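
The sequential-training paradigm the abstract describes can be sketched in a minimal simulation. This is not the paper's actual model: the architecture, pattern sets, learning rate, and epoch counts below are illustrative assumptions. A small one-hidden-layer network is trained by plain back-propagation on a first set of associations (with orthogonal one-hot inputs), then on a second set, and the error on the first set is measured before and after the second phase:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class MLP:
    """One-hidden-layer network trained with vanilla back-propagation (MSE loss)."""
    def __init__(self, n_in, n_hid, n_out):
        self.W1 = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_hid)]
        self.b1 = [0.0] * n_hid
        self.W2 = [[random.uniform(-0.5, 0.5) for _ in range(n_hid)] for _ in range(n_out)]
        self.b2 = [0.0] * n_out

    def forward(self, x):
        h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(self.W1, self.b1)]
        y = [sigmoid(sum(w * hi for w, hi in zip(row, h)) + b)
             for row, b in zip(self.W2, self.b2)]
        return h, y

    def train(self, data, epochs=2000, lr=0.5):
        for _ in range(epochs):
            for x, t in data:
                h, y = self.forward(x)
                # Output-layer deltas: error times sigmoid derivative.
                dy = [(yi - ti) * yi * (1 - yi) for yi, ti in zip(y, t)]
                # Hidden-layer deltas, back-propagated through W2.
                dh = [hi * (1 - hi) * sum(self.W2[k][j] * dy[k] for k in range(len(dy)))
                      for j, hi in enumerate(h)]
                for k in range(len(dy)):
                    for j in range(len(h)):
                        self.W2[k][j] -= lr * dy[k] * h[j]
                    self.b2[k] -= lr * dy[k]
                for j in range(len(dh)):
                    for i in range(len(x)):
                        self.W1[j][i] -= lr * dh[j] * x[i]
                    self.b1[j] -= lr * dh[j]

def mse(net, data):
    return sum(sum((yi - ti) ** 2 for yi, ti in zip(net.forward(x)[1], t))
               for x, t in data) / len(data)

# Two pattern sets; inputs are orthogonal one-hot vectors (hypothetical data).
set_a = [([1, 0, 0, 0], [1, 0]), ([0, 1, 0, 0], [0, 1])]
set_b = [([0, 0, 1, 0], [1, 1]), ([0, 0, 0, 1], [0, 0])]

net = MLP(n_in=4, n_hid=8, n_out=2)
net.train(set_a)
err_before = mse(net, set_a)   # error on set A right after learning A
net.train(set_b)
err_after = mse(net, set_a)    # error on set A after subsequently learning B
print(f"error on set A after learning A:        {err_before:.4f}")
print(f"error on set A after then learning B:   {err_after:.4f}")
```

Even with orthogonal inputs, the shared hidden-to-output weights are altered by the second training phase, so the error on the first set typically rises; how far it rises is exactly the kind of question the paper's simulations vary (hidden-unit count, output coding, list length) to answer.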