RT Journal Article
T1 Elements for a general memory structure: properties of recurrent neural networks used to form situation models
A1 Makarov Slizneva, Valeriy
A1 Song, Yongli
A1 Velarde, Manuel G.
A1 Hübner, David
A1 Cruse, Holk
AB We study how individual memory items are stored, assuming that situations given in the environment can be represented in the form of synaptic-like couplings in recurrent neural networks. Previous numerical investigations have shown that specific architectures based on suppression or max units can successfully learn static or dynamic stimuli (situations). Here we provide a theoretical basis concerning the convergence of the learning process and the network response to a novel stimulus. We show that, besides learning "simple" static situations, an nD network can learn and replicate a sequence of up to n different vectors or frames. We find limits on the learning rate and show the coupling matrices that develop during training in different cases, including the extension of the network to nonlinear interunit coupling. Furthermore, we show that a specific coupling matrix provides the units with low-pass-filter properties, thus connecting networks constructed from static summation units with continuous-time networks. We also show under which conditions such networks can be used to perform arithmetic calculations by means of pattern completion.
PB Springer Verlag
SN 0340-1200
YR 2008
FD 2008-05
LK https://hdl.handle.net/20.500.14352/50132
UL https://hdl.handle.net/20.500.14352/50132
LA eng
DS Docta Complutense
RD 24 Apr 2025