Computational neuroscience

Computational neuroscience (also known as theoretical neuroscience or mathematical neuroscience) is a branch of neuroscience which employs mathematics, computer science, theoretical analysis and abstractions of the brain to understand the principles that govern the development, structure, physiology and cognitive abilities of the nervous system.[1][2][3][4]

Computational neuroscience employs computational simulations[5] to validate and solve mathematical models, and so can be seen as a sub-field of theoretical neuroscience; however, the two fields are often synonymous.[6] The term mathematical neuroscience is also sometimes used to stress the quantitative nature of the field.[7]


Computational neuroscience focuses on the description of biologically plausible neurons (and neural systems), their physiology and their dynamics. It is therefore not directly concerned with the biologically unrealistic models used in connectionism, control theory, cybernetics, quantitative psychology, machine learning, artificial neural networks, artificial intelligence and computational learning theory,[8][9][10] although mutual inspiration exists and there is sometimes no strict boundary between the fields;[11][12][13] the level of model abstraction in computational neuroscience depends on the research scope and the granularity at which biological entities are analyzed.


Models in theoretical neuroscience are aimed at capturing the essential features of the biological system at multiple spatial-temporal scales, from membrane currents and chemical coupling, through network oscillations, columnar and topographic architecture, and nuclei, all the way up to psychological faculties like memory, learning and behavior. These computational models frame hypotheses that can be directly tested by biological or psychological experiments.

History

The term 'computational neuroscience' was introduced by Eric L. Schwartz, who organized a conference, held in 1985 in Carmel, California, at the request of the Systems Development Foundation to provide a summary of the current status of a field which until that point was referred to by a variety of names, such as neural modeling, brain theory and neural networks. The proceedings of this definitional meeting were published in 1990 as the book Computational Neuroscience.[14] The first of the annual open international meetings focused on Computational Neuroscience was organized by James M. Bower and John Miller in San Francisco, California in 1989.[15] The first graduate educational program in computational neuroscience was organized as the Computational and Neural Systems Ph.D. program at the California Institute of Technology in 1985.


The early historical roots of the field[16] can be traced to the work of people including Louis Lapicque, Hodgkin & Huxley, Hubel and Wiesel, and David Marr. Lapicque introduced the integrate and fire model of the neuron in a seminal article published in 1907,[17] a model still popular in artificial neural network studies because of its simplicity (see a recent review[18]).
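In its leaky form, the integrate-and-fire model describes the membrane potential V by tau_m dV/dt = -(V - V_rest) + R_m I(t), with a spike emitted and V reset to V_reset whenever V reaches a threshold V_th. The following is only a minimal numerical sketch of that dynamic; the parameter values and the function name simulate_lif are illustrative assumptions, not taken from Lapicque's paper or from this article's sources.

import numpy as np

def simulate_lif(T=0.5, dt=1e-4, tau_m=0.02, V_rest=-0.065,
                 V_th=-0.050, V_reset=-0.065, R_m=1e7, I_ext=2e-9):
    """Leaky integrate-and-fire neuron, forward-Euler integration (SI units)."""
    steps = int(T / dt)
    t = np.arange(steps) * dt
    V = np.full(steps, V_rest)
    spike_times = []
    for i in range(1, steps):
        # Membrane equation: tau_m * dV/dt = -(V - V_rest) + R_m * I_ext
        dV = (-(V[i - 1] - V_rest) + R_m * I_ext) / tau_m
        V[i] = V[i - 1] + dt * dV
        if V[i] >= V_th:             # threshold crossing: emit a spike ...
            spike_times.append(t[i])
            V[i] = V_reset           # ... and reset the membrane potential
    return t, V, spike_times

t, V, spike_times = simulate_lif()
print(f"{len(spike_times)} spikes in 0.5 s of simulated time")

With the assumed constant input current of 2 nA the sketch fires regularly; reducing I_ext below roughly 1.5 nA keeps the steady-state voltage under threshold and the model stays silent.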


About 40 years later, Hodgkin and Huxley developed the voltage clamp and created the first biophysical model of the action potential. Hubel and Wiesel discovered that neurons in the primary visual cortex, the first cortical area to process information coming from the retina, have oriented receptive fields and are organized in columns.[19] David Marr's work focused on the interactions between neurons, suggesting computational approaches to the study of how functional groups of neurons within the hippocampus and neocortex interact, store, process, and transmit information. Computational modeling of biophysically realistic neurons and dendrites began with the work of Wilfrid Rall, with the first multicompartmental model using cable theory.
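For reference, the membrane equation at the core of the Hodgkin–Huxley model, written here in its conventional textbook form (the notation is the standard one and is not drawn from this article's cited sources), is

C_m \frac{dV}{dt} = \bar{g}_{\mathrm{Na}}\, m^3 h\, (E_{\mathrm{Na}} - V) + \bar{g}_{\mathrm{K}}\, n^4 (E_{\mathrm{K}} - V) + \bar{g}_L (E_L - V) + I_{\mathrm{ext}},

where the gating variables x \in \{m, h, n\} each follow first-order kinetics \frac{dx}{dt} = \alpha_x(V)(1 - x) - \beta_x(V)\, x with voltage-dependent rate functions \alpha_x and \beta_x.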

Further reading

Chklovskii, D. B. (2004). "Synaptic connectivity and neuronal morphology: two sides of the same coin". Neuron. 43 (5): 609–17. doi:10.1016/j.neuron.2004.08.012. PMID 15339643. S2CID 16217065.

Sejnowski, Terrence J.; Churchland, Patricia Smith (1992). The Computational Brain. Cambridge, Mass: MIT Press. ISBN 978-0-262-03188-2.

Gerstner, W.; Kistler, W.; Naud, R.; Paninski, L. (2014). Neuronal Dynamics. Cambridge, UK: Cambridge University Press. ISBN 9781107447615.

Dayan, P.; Abbott, L. F. (2001). Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems. Cambridge, Mass: MIT Press. ISBN 978-0-262-04199-7.

Eliasmith, Chris; Anderson, Charles H. (2003). Neural Engineering: Representation, Computation, and Dynamics in Neurobiological Systems. Cambridge, Mass: MIT Press. ISBN 978-0-262-05071-5.

Bialek, William; Rieke, Fred; Warland, David; de Ruyter van Steveninck, Rob (1999). Spikes: Exploring the Neural Code. Cambridge, Mass: MIT Press. ISBN 978-0-262-68108-7.

Schutter, Erik de (2001). Computational Neuroscience: Realistic Modeling for Experimentalists. Boca Raton: CRC. ISBN 978-0-8493-2068-2.

Sejnowski, Terrence J.; Hemmen, J. L. van (2006). 23 Problems in Systems Neuroscience. Oxford: Oxford University Press. ISBN 978-0-19-514822-0.

Arbib, Michael A.; Amari, Shun-ichi; Arbib, Prudence H. (2002). The Handbook of Brain Theory and Neural Networks. Cambridge, Massachusetts: The MIT Press. ISBN 978-0-262-01197-6.

Zhaoping, Li (2014). Understanding Vision: Theory, Models, and Data. Oxford, UK: Oxford University Press. ISBN 978-0199564668.

Software

BRIAN, a Python-based simulator.

Budapest Reference Connectome, a web-based 3D visualization tool to browse connections in the human brain.

Emergent, neural simulation software.

GENESIS, a general neural simulation system.

NEST, a simulator for spiking neural network models that focuses on the dynamics, size and structure of neural systems rather than on the exact morphology of individual neurons.

Journals

Journal of Mathematical Neuroscience

Journal of Computational Neuroscience

Neural Computation

Cognitive Neurodynamics

Frontiers in Computational Neuroscience

PLoS Computational Biology

Frontiers in Neuroinformatics