Hopfield network/Origins

Origins of the Network

The Ising model of a neural network as a memory model was first proposed by William A. Little in 1974,[1] a contribution acknowledged by Hopfield in his 1982 paper.[2] Networks with continuous dynamics were developed by Hopfield in his 1984 paper.[3] A major advance in memory storage capacity was achieved by Krotov and Hopfield in 2016[4] through a change in the network dynamics and energy function. This idea was further extended by Demircigil and collaborators in 2017.[5]

The continuous dynamics of large-memory-capacity models were developed in a series of papers between 2016 and 2020.[4][6][7] Hopfield networks with large memory storage capacity are now called Dense Associative Memories or modern Hopfield networks.

Learning Task

  • Explain the role of the energy function in the convergence of Hopfield networks and in the recognition of trained input data (see the sketch below).
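
A minimal Python sketch, assuming the classical binary Hopfield model with Hebbian storage, of how the energy function E(s) = −½ sᵀW s addresses both points of the task: asynchronous updates never increase E, so the dynamics converge, and the trained patterns sit at or near local minima of E, so a corrupted probe is pulled back to the stored datum. The function names and parameters here are illustrative, not taken from the cited papers.

  import numpy as np

  rng = np.random.default_rng(0)

  def train_hebbian(patterns):
      """Hebbian weights W = (1/N) * sum_p x_p x_p^T with zero diagonal."""
      n = patterns.shape[1]
      W = patterns.T @ patterns / n
      np.fill_diagonal(W, 0.0)
      return W

  def energy(W, s):
      """Classical Hopfield energy E(s) = -1/2 s^T W s (thresholds omitted)."""
      return -0.5 * s @ W @ s

  def recall(W, s, steps=500):
      """Asynchronous updates: each single-neuron flip can only lower or keep
      the energy, so the state settles into a local minimum."""
      s = s.copy()
      n = len(s)
      for _ in range(steps):
          i = rng.integers(n)
          s[i] = 1 if W[i] @ s >= 0 else -1
      return s

  # Store two random +/-1 patterns, then recover one from a noisy probe.
  patterns = rng.choice([-1, 1], size=(2, 50))
  W = train_hebbian(patterns)
  probe = patterns[0].copy()
  probe[:10] *= -1                      # corrupt 10 of 50 bits
  out = recall(W, probe)
  print("energy before:", energy(W, probe), "after:", energy(W, out))
  print("matches stored pattern:", np.array_equal(out, patterns[0]))

Running the script typically shows the energy dropping as the corrupted probe relaxes back to the stored pattern, which is the behaviour the learning task asks you to explain.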

References

  1. Little, W. A. (1974). "The Existence of Persistent States in the Brain". Mathematical Biosciences 19 (1–2): 101–120. doi:10.1016/0025-5564(74)90031-5. 
  2. Hopfield, J. J. (1982). "Neural networks and physical systems with emergent collective computational abilities". Proceedings of the National Academy of Sciences 79 (8): 2554–2558. doi:10.1073/pnas.79.8.2554. PMID 6953413. PMC 346238. //www.ncbi.nlm.nih.gov/pmc/articles/PMC346238/. 
  3. Hopfield, J. J. (1984). "Neurons with graded response have collective computational properties like those of two-state neurons". Proceedings of the National Academy of Sciences 81 (10): 3088–3092. doi:10.1073/pnas.81.10.3088. PMID 6587342. PMC 345226. //www.ncbi.nlm.nih.gov/pmc/articles/PMC345226/. 
  4. Krotov, Dmitry; Hopfield, John (2016). "Dense Associative Memory for Pattern Recognition". Neural Information Processing Systems 29: 1172–1180. 
  5. Demircigil, Mete et al. (2017). "On a model of associative memory with huge storage capacity". Journal of Statistical Physics 168 (2): 288–299. doi:10.1007/s10955-017-1806-y. https://link.springer.com/article/10.1007/s10955-017-1806-y. 
  6. Ramsauer, Hubert et al. (2021). "Hopfield Networks is All You Need". International Conference on Learning Representations. 
  7. Krotov, Dmitry; Hopfield, John (2021). "Large associative memory problem in neurobiology and machine learning". International Conference on Learning Representations.