Minimal perceptrons for memorizing complex patterns

Marissa Pastor, Juyong Song, Danh Tai Hoang, Junghyo Jo*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Feedforward neural networks have been investigated to understand learning and memory, as well as applied to numerous practical problems in pattern classification. It is a rule of thumb that more complex tasks require larger networks. However, the design of optimal network architectures for specific tasks remains an unsolved fundamental problem. In this study, we consider three-layered neural networks for memorizing binary patterns. We developed a new complexity measure for binary patterns, and estimated the minimal network size for memorizing them as a function of their complexity. We derived the minimal network size for regular, random, and complex patterns. In particular, the minimal size for complex patterns, which are neither ordered nor disordered, was predicted by measuring their Hamming distances from known ordered patterns. Our predictions agree with simulations based on the back-propagation algorithm.
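For intuition, the following is a minimal sketch (not the authors' code) of the kind of experiment the abstract describes: a three-layered network trained by back-propagation to memorize binary patterns, together with a Hamming-distance helper. All sizes, labels, and hyperparameters here are illustrative assumptions, not values from the paper.

```python
# Sketch of memorizing binary patterns with a three-layered network
# (input/hidden/output) trained by back-propagation. Hypothetical setup:
# the paper's actual network sizes and training details may differ.
import numpy as np

rng = np.random.default_rng(0)

def hamming_distance(a, b):
    """Number of positions where two binary patterns differ."""
    return int(np.sum(a != b))

# P random binary patterns of length N, each given a random binary label.
N, P, H = 16, 8, 8            # input size, pattern count, hidden units (assumed)
X = rng.integers(0, 2, size=(P, N)).astype(float)
y = rng.integers(0, 2, size=(P, 1)).astype(float)

# Two weight matrices -> one hidden layer, i.e. a "three-layered" network
# in the input/hidden/output counting used by the abstract.
W1 = rng.normal(0, 0.5, size=(N, H))
W2 = rng.normal(0, 0.5, size=(H, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(5000):
    h = sigmoid(X @ W1)       # hidden activations
    out = sigmoid(h @ W2)     # network output
    delta = (out - y) * out * (1 - out)   # squared-error output delta
    # Back-propagate the error through both weight layers.
    dW2 = h.T @ delta
    dW1 = X.T @ ((delta @ W2.T) * h * (1 - h))
    W2 -= lr * dW2
    W1 -= lr * dW1

# Memorization check: does the trained network reproduce every label?
pred = (sigmoid(sigmoid(X @ W1) @ W2) > 0.5).astype(float)
print("memorized all patterns:", bool(np.all(pred == y)))
print("Hamming distance between first two patterns:",
      hamming_distance(X[0], X[1]))
```

In the paper's framing, one would shrink the hidden layer H until memorization first fails; the smallest H that still succeeds is the minimal network size for that pattern set, which the authors relate to the patterns' complexity.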

Original language: English
Pages (from-to): 31-37
Number of pages: 7
Journal: Physica A: Statistical Mechanics and its Applications
Volume: 462
DOIs
Publication status: Published - 15 Nov 2016
Externally published: Yes

