Enhancing neural-network performance via assortativity

Sebastiano De Franciscis*, Samuel Johnson, Joaquín J. Torres

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The performance of attractor neural networks has been shown to depend crucially on the heterogeneity of the underlying topology. We take this analysis a step further by examining the effect of degree-degree correlations (assortativity) on neural-network behavior. We make use of a method recently put forward for studying correlated networks and dynamics thereon, both analytically and computationally, which is independent of how the topology may have evolved. We show how the robustness to noise is greatly enhanced in assortative (positively correlated) neural networks, especially if it is the hub neurons that store the information.
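As a rough illustration of the two quantities the abstract refers to, the sketch below (not the authors' code) uses numpy and networkx to measure the degree assortativity of a heterogeneous graph and to run a small Hopfield-style attractor network on it under thermal noise. The graph model, the parameter values, and the `retrieval_overlap` helper are illustrative assumptions rather than the method of the paper.

```python
# Minimal sketch: degree assortativity of a network and the retrieval
# performance of an attractor (Hopfield-type) network on it under noise.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

# Heterogeneous (scale-free-like) topology; the graph model and sizes are
# illustrative choices, not those of the paper.
G = nx.barabasi_albert_graph(n=500, m=4, seed=0)

# Assortativity: Pearson correlation of the degrees at the two ends of each
# edge. r > 0 means hubs tend to connect to hubs (assortative mixing).
r = nx.degree_assortativity_coefficient(G)
print(f"degree assortativity r = {r:.3f}")

# Store one random binary pattern with Hebbian couplings restricted to the
# existing links (a sparse Hopfield-style rule).
N = G.number_of_nodes()
A = nx.to_numpy_array(G)              # adjacency matrix
xi = rng.choice([-1, 1], size=N)      # stored pattern
W = A * np.outer(xi, xi)              # Hebbian weights on links only

def retrieval_overlap(T, steps=30):
    """Noisy (Glauber-like) parallel dynamics at temperature T, started from
    the stored pattern; returns the final overlap m = (1/N) * sum_i xi_i s_i."""
    s = xi.copy()
    for _ in range(steps):
        h = W @ s                               # local fields
        p_up = 0.5 * (1.0 + np.tanh(h / T))     # P(s_i = +1)
        s = np.where(rng.random(N) < p_up, 1, -1)
    return float(np.mean(xi * s))

# Retrieval degrades as the noise level T grows; the paper's claim is that
# this degradation sets in later on assortative topologies.
for T in (0.5, 2.0, 8.0):
    print(f"T = {T:4.1f}  overlap m = {retrieval_overlap(T):+.3f}")
```

A fuller experiment along the lines of the abstract would additionally rewire the graph to tune the assortativity coefficient r up or down at fixed degree sequence, and compare the noise level at which the overlap collapses for each value of r.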

Original language: English
Article number: 036114
Journal: Physical Review E - Statistical, Nonlinear, and Soft Matter Physics
Volume: 83
Issue number: 3
DOIs
Publication status: Published - 25 Mar 2011

ASJC Scopus subject areas

  • Statistical and Nonlinear Physics
  • Statistics and Probability
  • Condensed Matter Physics
