a statistical physics approach

Suggestions for further reading:

Lecture 2: Representations: Auto-encoders, Restricted Boltzmann Machines and Sparse feature learning

Lecture 3: Restricted Boltzmann Machines: connections with graphical models, phase transitions & applications

Hand-written notes: Transition in PCA and the Marchenko-Pastur eigenvalue distribution, Inference of a symmetry-breaking direction, Linear auto-encoders and PCA, Dynamics of Oja's rule
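As a quick illustration of the last topic in the notes, the sketch below simulates Oja's learning rule, w ← w + η y (x − y w) with y = wᵀx, on synthetic data with one dominant direction, and checks that the weight vector aligns with the leading principal component of the empirical covariance. The data model and all parameter values are illustrative choices, not taken from the notes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data with one dominant direction u:
# covariance is I + 9 u u^T, so the top eigenvector is u.
d = 10
u = np.ones(d) / np.sqrt(d)
X = rng.normal(size=(20000, d)) + 3.0 * rng.normal(size=(20000, 1)) * u

# Oja's rule: w <- w + eta * y * (x - y * w), with y = w . x
w = rng.normal(size=d)
w /= np.linalg.norm(w)
eta = 1e-3
for x in X:
    y = w @ x
    w += eta * y * (x - y * w)

# Compare with the leading eigenvector of the empirical covariance.
C = np.cov(X, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)
v = eigvecs[:, -1]  # top principal component
overlap = abs(w @ v) / np.linalg.norm(w)
print(overlap)      # should be close to 1
```

The rule's self-normalizing term −η y² w keeps ‖w‖ near 1 for small η, so no explicit renormalization is needed inside the loop.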

References:

- Information Theory, Inference, and Learning Algorithms. David MacKay, Cambridge University Press (2003)
- Introduction to the Theory of Neural Computation. John Hertz, Anders Krogh, Richard Palmer, Santa Fe Institute series (1991)
- Representation Learning: A Review and New Perspectives. Yoshua Bengio, Aaron Courville, Pascal Vincent. IEEE Transactions on Pattern Analysis and Machine Intelligence 35, 1798-1828 (2013)
- Nonlinear Hebbian Learning as a Unifying Principle in Receptive Field Formation. Carlos S.N. Brito, Wulfram Gerstner. PLoS Computational Biology 12: e1005070 (2016)

Self-references:

- Statistical physics and representations in real and artificial neural networks. Simona Cocco, Remi Monasson, Lorenzo Posani, Sophie Rosay, Jerome Tubiana. Physica A 504, 45-76 (2018)
- Emergence of compositional representations in restricted Boltzmann machines. J. Tubiana, R. Monasson. Physical Review Letters 118, 138501 (2017)
- Inverse statistical physics of protein sequences: a key issues review. Simona Cocco, Christoph Feinauer, Matteo Figliuzzi, Remi Monasson, Martin Weigt. Rep. Prog. Phys. 81, 032601 (2018)
- Learning protein constitutive motifs from sequence data. J. Tubiana, S. Cocco, R. Monasson. eLife 8: e39397 (2019)