AI
24 Apr 2025
A NeuroAI Reading List #
Introduction #
- Hassabis, D., Kumaran, D., Summerfield, C. & Botvinick, M. Neuroscience-inspired artificial intelligence. Neuron 95, 245–258 (2017).
- Macpherson, T. et al. Natural and artificial intelligence: a brief introduction to the interplay between AI and neuroscience research. Neural Netw. 144, 603–613 (2021).
- McCulloch & Pitts 1943 — A logical calculus of the ideas immanent in nervous activity (Link)
- John von Neumann — The Computer and the Brain
- Moravec, H. Mind Children: The Future of Robot and Human Intelligence (Harvard University Press, 1988).
- https://www.nature.com/articles/s41467-023-37180-x
- Potential benefits for AI: https://baicsworkshop.github.io/pdf/BAICS_10.pdf
- Potential benefits for neuroscience: https://www.nature.com/articles/s41593-019-0520-2
Convolutional Neural Networks #
- Hubel & Wiesel 1962 — Receptive fields, binocular interaction and functional architecture in the cat's visual cortex
  - Inspired CNNs
- Fukushima 1980 — Neocognitron
  - First convolutional neural network
- LeCun & Bengio 1995 — Convolutional networks for images, speech, and time series
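The core idea these papers trace — a unit that responds only to a small local patch of the input, with the same weights reused at every location — can be sketched in a few lines. This is an illustrative toy, not any specific model from the papers above:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation: each output unit sees only a local
    patch of the input, echoing the local receptive fields that Hubel &
    Wiesel described and that Fukushima's Neocognitron built on."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A simple oriented-edge detector applied to an image containing one
# vertical dark-to-bright edge.
image = np.zeros((5, 5))
image[:, 2:] = 1.0
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)
response = conv2d(image, kernel)  # strongest where the edge falls in the patch
```

Sharing the same kernel across all positions is what makes the detector translation-equivariant — the property CNNs inherit from this lineage.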
Reinforcement learning #
- Thorndike 1932 — The fundamentals of learning
- Crow 1968 — Cortical synapses and reinforcement
- Rescorla & Wagner 1972 — A theory of Pavlovian conditioning
- Klopf 1972 — Brain function and adaptive systems
- Schultz 1997 — A neural substrate of prediction and reward
Attention-based neural networks #
- Itti, L., Koch, C. & Niebur, E. A model of saliency-based visual attention for rapid scene analysis. IEEE Trans. Pattern Anal. Mach. Intell. 20, 1254–1259 (1998).
- Larochelle, H. & Hinton, G. Learning to combine foveal glimpses with a third-order Boltzmann machine. Adv. Neural Inform. Process. Syst. 23, 1243–1251 (2010).
- Xu, K. et al. Show, attend and tell: neural image caption generation with visual attention. In: Proceedings of the 32nd International Conference on Machine Learning (eds. Bach, F. & Blei, D.) vol. 37, 2048–2057 (PMLR, 2015).
- Vaswani, A. et al. Attention is all you need. Adv. Neural Inf. Process. Syst. 30, 6000–6010 (2017).
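The saliency models above select where to look; the transformer replaces that hard selection with a soft, learned weighting. The scaled dot-product attention of Vaswani et al. (2017) can be sketched directly (a bare NumPy illustration, without the multi-head projections or masking of the full model):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Each query attends to every key; the values are mixed according
    to softmax-normalized query-key similarity, scaled by sqrt(d_k) to
    keep the dot products in a stable range."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # query-key similarities
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))  # 2 queries
K = rng.standard_normal((3, 4))  # 3 keys
V = rng.standard_normal((3, 4))  # 3 values
out, weights = scaled_dot_product_attention(Q, K, V)
```

The softmax rows form a probability distribution over inputs — a soft analogue of the winner-take-all saliency selection in Itti, Koch & Niebur.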
The need for embodied AI #
- Brooks, R. A. Intelligence without representation. Artificial Intelligence. 47, 139–159 https://doi.org/10.1016/0004-3702(91)90053-m (1991).
Architectural Design #
- Merel, J., Botvinick, M. & Wayne, G. Hierarchical motor control in mammals and machines. Nat. Commun. 10, 5489 (2019).
Flexibility #
- Zador, A. M. A critique of pure learning and what artificial neural networks can learn from animal brains. Nat. Commun. 10, 3770 (2019).
- Bommasani, R. et al. On the opportunities and risks of foundation models. https://arxiv.org/abs/2108.07258 (2021).
- Elman, J. L. Learning and development in neural networks: the importance of starting small. Cognition 48, 71–99 (1993).
- Lake, B. M., Ullman, T. D., Tenenbaum, J. B. & Gershman, S. J. Building machines that learn and think like people. Behav. Brain Sci. 40, e253 (2017).
- Doya, K. & Taniguchi, T. Toward evolutionary and developmental intelligence. Curr. Opin. Behav. Sci. 29, 91–96 https://doi.org/10.1016/j.cobeha.2019.04.006 (2019).
- Stanley, K. O., Clune, J., Lehman, J. & Miikkulainen, R. Designing neural networks through neuroevolution. Nat. Mach. Intell. 1, 24–35 (2019).
- Gupta, A., Savarese, S., Ganguli, S. & Fei-Fei, L. Embodied intelligence via learning and evolution. Nat. Commun. 12, 5721 (2021).
- Stöckl, C., Lang, D. & Maass, W. Structure induces computational function in networks with diverse types of spiking neurons. bioRxiv. https://doi.org/10.1101/2021.05.18.444689 (2022).
- Koulakov, A., Shuvaev, S., Lachi, D. & Zador, A. Encoding innate ability through a genomic bottleneck. bioRxiv. https://doi.org/10.1101/2021.03.16.435261 (2022).
- Pehlevan, C. & Chklovskii, D. B. Neuroscience-inspired online unsupervised learning algorithms: artificial neural networks. IEEE Signal Process. Mag. 36, 88–96 (2019).
Efficiency #
- Patterson, D. et al. Carbon emissions and large neural network training. https://arxiv.org/abs/2104.10350 (2021).
- Sokoloff, L. The metabolism of the central nervous system in vivo. Handb. Physiol. Sect. I Neurophysiol. 3, 1843–1864 (1960).
- Boahen, K. Dendrocentric learning for synthetic intelligence. Nature 612, 43–50 (2022).
Noise and Neuromorphic Computing #
- Dobrunz, L. E. & Stevens, C. F. Heterogeneity of release probability, facilitation, and depletion at central synapses. Neuron 18, 995–1008 (1997).
- Attwell, D. & Laughlin, S. B. An energy budget for signaling in the grey matter of the brain. J. Cereb. Blood Flow. Metab. 21, 1133–1145 (2001).
- Lennie, P. The cost of cortical computation. Curr. Biol. 13, 493–497 (2003).
- Davies, M. et al. Advancing neuromorphic computing with loihi: a survey of results and outlook. Proc. IEEE Inst. Electr. Electron. Eng. 109, 911–934 (2021).
Structure and function #
- https://www.cell.com/trends/cognitive-sciences/fulltext/S1364-6613%2820%2930026-7
- https://www.nature.com/articles/s42256-023-00748-9