Modeling in neuroscience - and elsewhere

Jean-Pierre Nadal

Master Mathématiques Vision Apprentissage (MVA),
Ecole Normale Supérieure Paris-Saclay
For general information on the master's program, see the MVA website and agenda.
Course webpage in French here.

The course is given in French or English depending on the students (in practice, English is usually preferred).
For practical information and registration, see below.
This page was last updated on January 9, 2023.



Regular attendance is mandatory in order to be allowed to validate the course (absences must be justified). Before each class, students should fill in an attendance sheet.


Presentation

This course gives an introduction to modeling in neuroscience - a domain called computational neuroscience - making use of tools from information theory, Bayesian inference, dynamical systems, and statistical physics.

The main topics to be discussed are memory, neural coding and decision making in natural neural systems.

The course highlights the links and interactions between computational neuroscience, signal processing and machine learning. It also offers openings onto other themes - in particular, complex systems in the social sciences.

In terms of form, the course deals as much with formal (mathematical), algorithmic and quantitative aspects as with qualitative ones (historical aspects, conceptual contributions of modeling, interpretation of model analysis for the understanding of human and animal cognition).

Main topics to be discussed: more details below.


General information

Academic year 2023-2024

Intended audience and registration

The course is intended for MVA students, but is also open (on request, within the limits of available places) to students of other masters, as well as to PhD students and postdocs, in mathematics, computer/data science, engineering science, or physics.

MVA students:
registration is through the eCampus platform, but do not hesitate to send me an email with 2 or 3 lines on your motivation for taking the course.

Non-MVA students:
To attend the course, send me an email asking to attend (please include 2 or 3 lines of motivation). With the acceptance email you will receive the password to register with the MVA secretariat as an "auditeur externe" (external student),
see here.

Registration is mandatory to access the course material and to be on the course mailing list.

Planning and location

20h of class + exam.

Classes will take place at ENS Paris-Saclay, room 2E29 (bâtiment Sud-Ouest/South-West building),
on Tuesdays, from January 9 to February 27, 2024.

First class on Tuesday, January 9, from 9:30am to 11:30am (2-hour class).
On the following Tuesdays, from January 16 to February 27, classes run from 9:30am to 12:45pm (3-hour class with a 15-minute half-time break). No class on February 6.

Course material

General information is kept up to date on the present page - a public page hosted by the Physics department of the ENS (Paris).
A general bibliography can be found
here.

However, information and course material (slides, articles) specific to the current academic year will be provided on the course website on the eCampus platform of the ENS Paris-Saclay, with access restricted to registered students:
https://ecampus.paris-saclay.fr/course/index.php?categoryid=8662.

Taking the Course for credits

Credits: 5 ECTS.

Regular attendance is mandatory in order to be allowed to validate the course (absences must be justified).

Validating the course requires writing a report presenting a critical account of an article together with a micro-project based on this article, and giving an oral presentation. At the end of this oral, some questions on knowledge and understanding of the course will also be asked.
For the choice of an article, as an indication, see the articles that will be listed as course material, as well as the articles chosen for validation in recent years (see the
course website on the eCampus platform).

Oral exam dates: March 12, 19 or 26 (to be confirmed, number of sessions according to the number of students).

For more details on the validation modalities, see here.

No prerequisite
There is no strong prerequisite. The necessary concepts and tools are introduced as and when needed (notably tools from statistical physics and information theory). Some knowledge of neurobiology, information theory, dynamical systems or statistical physics will be helpful but is not necessary.

Contact
For any question, contact me by email: jean-pierre.nadal "AT" phys.ens.fr


Main topics to be discussed

Working memory

- "the magical number 7, plus or minus 2" (cognitive psychologist Miller's formula, 1956). Our short-term memory is just good enough to momentarily remember a telephone number consisting of 10 digits (or 5 pairs of digits). Can we understand this low capacity of short-term memory, and its universality (a capacity independent of the type of objects memorised: numbers, letters, words, ...)?
- ergodicity breaking: how can we understand that, although very noisy, neuronal activity can remain specific to a briefly presented stimulus for a long time?

Tools and models: statistical physics / Markov random fields; "attractor networks" and the limit of very large networks; Hebbian plasticity; the Hopfield model. Related topics: statistical physics of disordered systems (e.g. "spin glasses"); combinatorial optimisation (e.g. K-satisfiability); formation of coalitions (between countries, between firms); emergence of social norms.
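As a concrete illustration of the attractor-network picture, here is a minimal sketch of the Hopfield model: random patterns stored with a Hebbian rule become fixed points of the asynchronous dynamics, so a noisy cue relaxes back to the stored memory. (Network size, number of patterns, noise level and number of update sweeps are arbitrary choices for the example.)

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 5                              # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian storage: W_ij = (1/N) sum_mu xi_i^mu xi_j^mu, no self-coupling
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def recall(state, sweeps=10):
    """Asynchronous dynamics: update one neuron at a time, in random order."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Cue the network with a corrupted version of pattern 0 (about 20% of bits flipped)
noisy = patterns[0] * np.where(rng.random(N) < 0.2, -1, 1)
retrieved = recall(noisy)
overlap = retrieved @ patterns[0] / N      # overlap 1 means perfect retrieval
```

At this low memory load (P/N = 0.05, well below the critical capacity of the Hopfield model), retrieval from a 20%-corrupted cue is essentially perfect.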

Supervised Learning

In the cerebellum, which is involved in motor learning and control, Purkinje cells are considered to play a major role in learning motor sequences. Learning results in changes in the efficacies of the synapses controlling the transfer of information from other areas to the Purkinje cells. Experiments indicate that a significant fraction - up to 80%! - of these synapses are silent, transmitting no signal. We shall see that this observation becomes understandable if we assume that, paradoxically, learning is optimised.

Tools and models: statistical learning, Perceptron.
Related topics: Support Vector Machines (SVM), and, obviously, deep learning.
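A minimal sketch of the perceptron learning rule on linearly separable data (the "teacher" setup and all parameter values below are illustrative choices, not part of the course material):

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 10
teacher = rng.standard_normal(dim)            # ground-truth separating direction
X = rng.standard_normal((200, dim))
margins = X @ teacher
keep = np.abs(margins) > 1.0                  # discard points too close to the
X, y = X[keep], np.sign(margins[keep])        # boundary, ensuring a clear margin

w = np.zeros(dim)
for _ in range(1000):                         # epochs; convergence is guaranteed
    errors = 0                                # for separable data with a margin
    for x, t in zip(X, y):
        if np.sign(x @ w) != t:               # misclassified example:
            w += t * x                        # perceptron update rule
            errors += 1
    if errors == 0:                           # no mistakes in a full pass: done
        break

accuracy = np.mean(np.sign(X @ w) == y)       # 1.0 after convergence
```

By the perceptron convergence theorem, the number of updates is bounded once the data is separable with a margin, which is why the loop terminates with all points correctly classified.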

Unsupervised learning - neural coding

Immediately after birth, the visual system undergoes important changes depending on the visual stimuli received. What principles govern this "self-organisation"? More generally, can we characterise the nature of the adaptation of a sensory system to the environment?

Tools and models: unsupervised Hebbian learning; information theory: mutual information (Shannon); histogram equalisation; redundancy reduction (Barlow); principal component analysis (PCA), independent component analysis (ICA).
Related topic: "natural image" statistics.
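The link between Hebbian plasticity and PCA can be sketched with Oja's rule: a Hebbian update with a normalising decay term, whose weight vector converges to the leading principal component of the inputs. (The input statistics and learning rate below are arbitrary choices for the example.)

```python
import numpy as np

rng = np.random.default_rng(2)
dim = 5
# Gaussian inputs with a dominant variance direction along axis 0
principal = np.zeros(dim)
principal[0] = 1.0
C = np.eye(dim) * 0.1
C[0, 0] = 2.0                               # leading eigenvector is axis 0
X = rng.multivariate_normal(np.zeros(dim), C, size=5000)

w = rng.standard_normal(dim)
w /= np.linalg.norm(w)
eta = 0.01                                  # learning rate
for x in X:
    y = w @ x                               # linear neuron output
    w += eta * y * (x - y * w)              # Oja's rule: Hebbian term + decay

alignment = abs(w @ principal)              # ~1 if w found the first PC
```

The decay term -eta * y**2 * w keeps the weight norm near 1, so no explicit renormalisation step is needed.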

Population coding

In sensory areas, one often observes distributed neuronal representations, in which each neuron has a specific activity for a certain range of stimuli. Similar representations are observed in motor areas. In the rat, for example, cells code for head direction, each cell having a maximal response for a particular direction (the "preferred direction" of the cell). What makes this type of coding effective? Can we say that each cell "codes for its preferred stimulus" (e.g. is direction information simply given by the activity of the neurons whose preferred direction matches it)?

Tools and models: information theory: mutual information (Shannon), estimation theory: Fisher information; Poisson process (as a model for neurons emitting spikes).
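A minimal simulation of such a distributed code, assuming bell-shaped tuning curves on a ring and Poisson spike counts, decoded with the classic population-vector estimator (all parameter values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n_cells = 32
preferred = np.linspace(0, 2 * np.pi, n_cells, endpoint=False)

def tuning(theta, r_max=20.0, kappa=2.0):
    """Mean spike counts (per trial window) for stimulus direction theta."""
    return r_max * np.exp(kappa * (np.cos(theta - preferred) - 1))

true_theta = 1.0
trials = 200
counts = rng.poisson(tuning(true_theta), size=(trials, n_cells))

# Population vector: sum preferred directions weighted by spike counts
vec = counts @ np.exp(1j * preferred)
estimates = np.angle(vec)                              # decoded direction, per trial
err = np.angle(np.exp(1j * (estimates - true_theta)))  # circular error
mean_abs_err = np.mean(np.abs(err))
```

Even though each cell is noisy (Poisson), pooling a few hundred spikes across the population yields a decoding error of only a few degrees, which is the point of distributed coding.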

Related subject: ethology. The peacock courts the peahen by fanning out its tail. In some species of swallows, males have red plumage at the throat, and females prefer the males with the most beautiful red. Why these behaviours - and what do they have to do with this course?

Categorical Perception, Perceptual decision making

A speech signal is a continuous signal, a modulation of pressure. Our hearing system makes us perceive it in a sequence of discrete elements - phonemes, syllables, words. Similarly, the visual system makes us perceive objects from continuous sensory inputs. Understanding the mechanisms underlying categorical perception is an important theme in neuroscience and cognitive science. Many laboratory experiments highlight generic properties of categorical perception when categories can be ambiguous (close phonemes, cat/dog, etc). We will see how the hypothesis of efficient coding makes it possible to account for these properties.

Categorical perception is a simple but remarkable example of decision making (deciding which category is present). A lot of work has been done on this aspect, particularly through the analysis of reaction times. For instance one finds that the more ambiguous the stimulus, the greater the average time. We will see how to link the characteristics of decision times with those of neuronal coding. Finally, we will see that some properties require the use of models taking into account neuronal dynamics (attractor networks), with therefore more biophysics. These models make it possible to account for the level of confidence in a decision.

Tools and models: information theory, Bayesian inference. Accumulation of evidence: random walks, Brownian motion. Neural dynamics: attractor neural networks.
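The accumulation-of-evidence picture can be sketched with a drift-diffusion model: noisy evidence accumulates until it reaches one of two bounds, and a more ambiguous stimulus (smaller drift) yields a longer mean reaction time and lower accuracy, as stated above. (All parameter values below are illustrative.)

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_ddm(drift, bound=1.0, sigma=1.0, dt=1e-3, n_trials=500):
    """Return (mean decision time, accuracy) for a given drift rate."""
    times, correct = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < bound:                # accumulate until a bound is hit
            x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
        times.append(t)
        correct.append(x >= bound)           # upper bound = correct response
    return np.mean(times), np.mean(correct)

rt_easy, acc_easy = simulate_ddm(drift=2.0)  # clear stimulus
rt_hard, acc_hard = simulate_ddm(drift=0.5)  # ambiguous stimulus
```

The simulation reproduces the qualitative laboratory finding: the ambiguous condition produces both longer mean reaction times and more errors than the clear one.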

Categorization: Artificial networks vs. human brain

The classes on supervised learning and on categorical perception will be an opportunity to discuss the similarities and differences between artificial and natural neural systems. We will see how advances in machine learning allow innovation in the analysis of neuronal activity, and conversely how neuroscience and cognitive science can shed light on the nature of representations constructed by deep networks.

