Modern Hopfield networks[1][2] (also known as Dense Associative Memories[3]) are generalizations of the classical Hopfield networks that break the linear scaling relationship between the number of input features and the number of stored memories. This is achieved by introducing stronger non-linearities (either in the energy function or in the neurons' activation functions), leading to super-linear[3] (even exponential[4]) memory storage capacity as a function of the number of feature neurons. Achieving this capacity still requires a sufficient number of hidden neurons.[5]
The key theoretical idea behind modern Hopfield networks is to use an energy function and an update rule that are more sharply peaked around the stored memories in the space of neuron configurations than those of the classical Hopfield network.[3]
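The sharply peaked energy can be illustrated with a minimal sketch of a Dense Associative Memory over binary (±1) neurons, assuming the polynomial energy F(x) = xⁿ from Krotov and Hopfield's formulation and a simple greedy bit-flip dynamics (the variable names, the exponent n = 3, and the corruption level below are illustrative choices, not part of the source):

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(state, memories, n=3):
    # Dense Associative Memory energy: E = -sum_mu F(xi_mu . sigma)
    # with the rapidly growing F(x) = x^n producing sharper peaks
    # around the stored patterns; n = 2 recovers the classical
    # Hopfield energy up to a constant.
    overlaps = memories @ state
    return -np.sum(overlaps.astype(float) ** n)

def recall(probe, memories, n=3, sweeps=5):
    # Asynchronous dynamics: flip each +/-1 neuron only if the
    # flip strictly lowers the energy, so energy never increases.
    state = probe.copy()
    for _ in range(sweeps):
        for i in range(state.size):
            flipped = state.copy()
            flipped[i] = -flipped[i]
            if energy(flipped, memories, n) < energy(state, memories, n):
                state = flipped
    return state

# Store random +/-1 patterns and clean up a corrupted probe.
d, k = 64, 10
memories = rng.choice([-1, 1], size=(k, d))
probe = memories[0].copy()
probe[:8] = -probe[:8]          # corrupt 8 of the 64 bits
restored = recall(probe, memories)
```

Because the nth-power energy valleys around each stored pattern are much deeper than the interference from the other patterns, descent from a moderately corrupted probe typically lands back on the original memory; with n = 2 the same code behaves like a classical Hopfield network and its capacity scales only linearly in d.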
[1] Ramsauer, Hubert; et al. (2021). "Hopfield Networks is All You Need". International Conference on Learning Representations. arXiv:2008.02217.
[2] "Hopfield Networks is All You Need". hopfield-layers. 2020-08-25. Archived from the original on 26 Mar 2023. Retrieved 2023-05-04.
[3] Krotov, Dmitry; Hopfield, John (2016). "Dense Associative Memory for Pattern Recognition". Neural Information Processing Systems. 29: 1172–1180. arXiv:1606.01164.
[4] Demircigil, Mete; et al. (2017). "On a model of associative memory with huge storage capacity". Journal of Statistical Physics. 168 (2): 288–299. arXiv:1702.01929. Bibcode:2017JSP...168..288D. doi:10.1007/s10955-017-1806-y. S2CID 119317128.
[5] Krotov, Dmitry; Hopfield, John (2021). "Large associative memory problem in neurobiology and machine learning". International Conference on Learning Representations. arXiv:2008.06996.