24 Apr 2025

A NeuroAI Reading List #

Introduction #

  1. Hassabis, D., Kumaran, D., Summerfield, C. & Botvinick, M. Neuroscience-inspired artificial intelligence. Neuron 95, 245–258 (2017).
  2. Macpherson, T. et al. Natural and artificial intelligence: a brief introduction to the interplay between AI and neuroscience research. Neural Netw. 144, 603–613 (2021).
  3. McCulloch & Pitts 1943 — A logical calculus of the ideas immanent in nervous activity (Link)
  4. John von Neumann — The Computer and the Brain
  5. Moravec, H. Mind Children: The Future of Robot and Human Intelligence (Harvard University Press, 1988).
  6. https://www.nature.com/articles/s41467-023-37180-x
  7. Potential benefits for AI: https://baicsworkshop.github.io/pdf/BAICS_10.pdf
    1. https://arxiv.org/abs/2303.13651
  8. Potential benefits for neuroscience: https://www.nature.com/articles/s41593-019-0520-2
    1. https://arxiv.org/abs/2209.03718

Convolutional Neural Networks #

  1. Hubel & Wiesel 1962 — Receptive fields, binocular interaction and functional architecture in the cat’s visual cortex
    • Inspired CNNs
  2. Fukushima 1980 — Neocognitron
    • First convolutional neural network (see the sketch after this list)
  3. LeCun & Bengio 1995 — Convolutional networks for images, speech, and time series.
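To make the shared thread concrete, below is a minimal NumPy sketch (an illustration for this list, not code from any of the papers) of the two ideas that run from Hubel & Wiesel through the Neocognitron to modern CNNs: each output unit sees only a small local receptive field, and all units share one set of kernel weights. The vertical-edge kernel is just a stand-in for an orientation-tuned simple cell.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide one shared kernel over the image: every output unit looks at a small
    local patch (its receptive field) and all units reuse the same weights."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Illustrative vertical-edge kernel, loosely analogous to an orientation-tuned
# V1 simple cell (Hubel & Wiesel, 1962).
vertical_edge = np.array([[1.0, 0.0, -1.0],
                          [1.0, 0.0, -1.0],
                          [1.0, 0.0, -1.0]])

image = np.random.default_rng(0).random((8, 8))
feature_map = np.maximum(conv2d_valid(image, vertical_edge), 0.0)  # simple rectification
print(feature_map.shape)  # (6, 6)
```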

Reinforcement learning #

  1. Thorndike 1932 — The fundamentals of learning
  2. Crow 1968 — Cortical synapses and reinforcement
  3. Rescorla 1972 — A theory of Pavlovian conditioning
  4. Klopf 1972 — Brain function and adaptive systems
  5. Schultz 1997 — A neural substrate of prediction and reward (a TD-error sketch follows this list)
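As a rough bridge from Rescorla's error-driven learning rule to Schultz's dopamine recordings, here is a small tabular TD(0) sketch (an illustrative example, not taken from any of the papers above): the prediction error `delta` is the quantity that phasic dopamine responses are proposed to report.

```python
import numpy as np

# Tabular TD(0) on a short chain of states ending in a reward. The prediction
# error delta = r + gamma * V(s') - V(s) is the "reward prediction error" that
# Schultz, Dayan & Montague (1997) relate to phasic dopamine firing.
n_states = 5
alpha, gamma = 0.1, 0.9
V = np.zeros(n_states + 1)  # value estimates; index n_states is the terminal state

for episode in range(500):
    for s in range(n_states):
        reward = 1.0 if s == n_states - 1 else 0.0  # reward delivered at the last step
        next_s = s + 1
        delta = reward + gamma * V[next_s] - V[s]   # prediction error
        V[s] += alpha * delta                       # error-driven value update

print(np.round(V[:n_states], 2))  # values ramp up as the rewarded state approaches
```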

Attention-based neural networks #

  1. Itti, L., Koch, C. & Niebur, E. A model of saliency-based visual attention for rapid scene analysis. IEEE Trans. Pattern Anal. Mach. Intell. 20, 1254–1259 (1998).
  2. Larochelle, H. & Hinton, G. Learning to combine foveal glimpses with a third-order Boltzmann machine. Adv. Neural Inf. Process. Syst. 23, 1243–1251 (2010).
  3. Xu, K. et al. Show, attend and tell: neural image caption generation with visual attention. In: Proceedings of the 32nd International Conference on Machine Learning (eds. Bach, F. & Blei, D.) vol. 37, 2048–2057 (PMLR, 2015).
  4. Vaswani, A. et al. Attention is all you need. Adv. Neural Inf. Process. Syst. 30, 6000–6010 (2017). (See the sketch after this list.)
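To contrast the saliency-map and glimpse models above with the transformer line of work, here is a small NumPy sketch of scaled dot-product attention as described in Vaswani et al. (2017); the shapes and variable names are illustrative only.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query scores every key; a softmax over those scores decides how much
    of each value is mixed into that query's output (Vaswani et al., 2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)  # softmax
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))   # 4 query positions, model dimension 8
K = rng.standard_normal((6, 8))   # 6 key/value positions
V = rng.standard_normal((6, 8))
output, attn = scaled_dot_product_attention(Q, K, V)
print(output.shape, attn.shape)   # (4, 8) (4, 6)
```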

The need for embodied AI #

  1. Brooks, R. A. Intelligence without representation. Artificial Intelligence 47, 139–159 https://doi.org/10.1016/0004-3702(91)90053-m (1991).

Architectural Design #

  1. Merel, J., Botvinick, M. & Wayne, G. Hierarchical motor control in mammals and machines. Nat. Commun. 10, 5489 (2019).

Flexibility #

  1. Zador, A. M. A critique of pure learning and what artificial neural networks can learn from animal brains. Nat. Commun. 10, 3770 (2019).
  2. Bommasani, R. et al. On the opportunities and risks of foundation models. https://arxiv.org/abs/2108.07258 (2021).
  3. Elman, J. L. Learning and development in neural networks: the importance of starting small. Cognition 48, 71–99 (1993).
  4. Lake, B. M., Ullman, T. D., Tenenbaum, J. B. & Gershman, S. J. Building machines that learn and think like people. Behav. Brain Sci. 40, e253 (2017).
  5. Doya, K. & Taniguchi, T. Toward evolutionary and developmental intelligence. Curr. Opin. Behav. Sci. 29, 91–96 https://doi.org/10.1016/j.cobeha.2019.04.006 (2019).
  6. Stanley, K. O., Clune, J., Lehman, J. & Miikkulainen, R. Designing neural networks through neuroevolution. Nat. Mach. Intell. 1, 24–35 (2019).
  7. Gupta, A., Savarese, S., Ganguli, S. & Fei-Fei, L. Embodied intelligence via learning and evolution. Nat. Commun. 12, 5721 (2021).
  8. Stöckl, C., Lang, D. & Maass, W. Structure induces computational function in networks with diverse types of spiking neurons. bioRxiv. https://doi.org/10.1101/2021.05.18.444689 (2022).
  9. Koulakov, A., Shuvaev, S., Lachi, D. & Zador, A. Encoding innate ability through a genomic bottleneck. bioRxiv https://doi.org/10.1101/2021.03.16.435261 (2022).
  10. Pehlevan, C. & Chklovskii, D. B. Neuroscience-inspired online unsupervised learning algorithms: artificial neural networks. IEEE Signal Process. Mag. 36, 88–96 (2019).

Efficiency #

  1. Patterson, D. et al. Carbon emissions and large neural network training. https://arxiv.org/abs/2104.10350 (2021).
  2. Sokoloff, L. The metabolism of the central nervous system in vivo. Handb. Physiol. Sect. I Neurophysiol. 3, 1843–1864 (1960).
  3. Boahen, K. Dendrocentric learning for synthetic intelligence. Nature 612, 43–50 (2022).

Noise and Neuromorphic Computing #

  1. Dobrunz, L. E. & Stevens, C. F. Heterogeneity of release probability, facilitation, and depletion at central synapses. Neuron 18, 995–1008 (1997). (See the sketch after this list.)
  2. Attwell, D. & Laughlin, S. B. An energy budget for signaling in the grey matter of the brain. J. Cereb. Blood Flow. Metab. 21, 1133–1145 (2001).
  3. Lennie, P. The cost of cortical computation. Curr. Biol. 13, 493–497 (2003).
  4. Davies, M. et al. Advancing neuromorphic computing with loihi: a survey of results and outlook. Proc. IEEE Inst. Electr. Electron. Eng. 109, 911–934 (2021).
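The papers above treat synaptic noise and energy cost as design constraints rather than nuisances. As a toy illustration (the Beta-distributed release probabilities and Gaussian weights are assumptions of this sketch, not figures from the papers), here is what heterogeneous, probabilistic release looks like for a single postsynaptic cell; sparse, event-driven transmission of this kind is also the regime that neuromorphic hardware such as Loihi targets.

```python
import numpy as np

rng = np.random.default_rng(0)

n_synapses = 100
release_p = rng.beta(2, 5, size=n_synapses)        # heterogeneous release probabilities
weights = rng.normal(0.5, 0.1, size=n_synapses)    # synaptic strengths

presynaptic_spikes = rng.random(n_synapses) < 0.3  # which inputs fired this time step
# A spike is transmitted only if the synapse actually releases (probability release_p)
released = presynaptic_spikes & (rng.random(n_synapses) < release_p)
postsynaptic_drive = weights[released].sum()

print(f"{released.sum()} of {presynaptic_spikes.sum()} active synapses released; "
      f"drive = {postsynaptic_drive:.2f}")
```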

Structure and function #

  1. https://www.cell.com/trends/cognitive-sciences/fulltext/S1364-6613%2820%2930026-7
  2. https://www.nature.com/articles/s42256-023-00748-9