Learning and generalization in neural networks with diffusion-limited architectures

Authors

  • Christopher Monterola ⋅ National Institute of Physics, University of the Philippines Diliman
  • Caesar Saloma ⋅ National Institute of Physics, University of the Philippines Diliman

Abstract

We study the ability of neural networks with diffusion-limited configurations to learn mathematical operations and, after being trained, to generalize to new inputs that are not part of the training set. This type of network configuration is important because real biological neural networks evolve spatially following the rule of diffusion-limited aggregation. In particular, the operations of signal transduction, differentiation, integration, thresholding, and number representation conversion are considered. Such operations are essential to the survival of a biological species; for example, differentiation is needed to recognize rates of change, while integration is important in quantifying sizes or magnitudes. The study also attempts to clarify a fundamental issue concerning configuration efficiency (the least number of neurons and the simplest type of neuron interconnections) in generalizing neural networks. Our results show that diffusion-limited networks are trainable and have excellent generalizing capability for the mathematical tasks described above.
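
The paper itself gives no code, but the growth rule named in the abstract is easy to illustrate. The sketch below is a minimal on-lattice diffusion-limited aggregation (DLA) in Python: random walkers are released from the boundary and stick to a seeded cluster on first contact, producing the branched geometry that diffusion-limited network architectures are modeled on. The lattice size, walker count, and launch rule are illustrative assumptions, not parameters from the paper.

```python
import random

SIZE = 61                        # lattice is SIZE x SIZE (assumed value)
CENTER = SIZE // 2
cluster = {(CENTER, CENTER)}     # single seed site at the center

def neighbors(x, y):
    """Four nearest lattice neighbors of site (x, y)."""
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def grow(n_particles=200):
    """Release random walkers from the lattice edge; each one sticks to
    the aggregate on first contact (the DLA growth rule)."""
    for _ in range(n_particles):
        # Launch a walker at a uniformly random edge site.
        side = random.choice("NSEW")
        if side == "N":
            x, y = random.randrange(SIZE), 0
        elif side == "S":
            x, y = random.randrange(SIZE), SIZE - 1
        elif side == "E":
            x, y = SIZE - 1, random.randrange(SIZE)
        else:
            x, y = 0, random.randrange(SIZE)
        while True:
            # Stick if any neighbor already belongs to the aggregate.
            if any(nb in cluster for nb in neighbors(x, y)):
                cluster.add((x, y))
                break
            # Otherwise take one random step; discard walkers that
            # step off the lattice.
            x, y = random.choice(neighbors(x, y))
            if not (0 <= x < SIZE and 0 <= y < SIZE):
                break

grow()
print(f"aggregate size: {len(cluster)} sites")
```

The resulting set of occupied sites forms the dendritic cluster whose branches would, in a study like this one, serve as the scaffold for placing neurons and their interconnections.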

Issue

Proceedings of the Samahang Pisika ng Pilipinas 13

Article ID

SPP-1995-IP-06

Section

Instrumentation and Computational Physics

Published

1995-10-21

How to Cite

[1] C Monterola and C Saloma, Learning and generalization in neural networks with diffusion-limited architectures, Proceedings of the Samahang Pisika ng Pilipinas 13, SPP-1995-IP-06 (1995). URL: https://proceedings.spp-online.org/article/view/SPP-1995-IP-06.