Blog

Timeline of Deep Learning

A timeline highlighting key milestones in the development of deep learning.

Phase Transition Point in Classical Dynamics

A brief derivation of the stability/chaos transition in a classical random dynamical system using mean-field theory.

Timeline for Statistical Mechanics of Neural Networks

A timeline highlighting key developments in the application of statistical mechanics to neural networks.

DMFT Method of Random Recurrent Neural Networks

This post introduces the dynamical mean-field theory (DMFT) of random recurrent neural networks, proposed by H. Sompolinsky et al. in 1988. DMFT provides a powerful framework for analyzing the dynamics of large-scale random RNNs, which are widely used in neuroscience and machine learning.

Replica Method for Boolean Perceptron with Continuous Weight

A brief derivation of the asymptotic generalization error of a Boolean perceptron with continuous weights using the replica method. This problem was also the final assignment in the 'Statistical Mechanics of Neural Networks' course at SYSU, Fall 2024.

Scaling of Entropy Fluctuations and Complexity Across Free-Energy Levels

This problem was an assignment in the 'Statistical Mechanics of Neural Networks' course at SYSU, Fall 2024. In this post, we analyze how the scaling of entropy fluctuations affects the configurational entropy (complexity) across different free-energy levels in a simple toy model.

Replica Method for the Sherrington-Kirkpatrick Model

This note reviews the replica-symmetric solution of the SK model with sufficiently detailed derivations, and then presents numerically computed phase diagrams of the order parameters and the free energy.

Errata for "Replica calculations for the SK model"

This post lists errata for Haozhe Shan's note "Replica calculations for the SK model".

Cavity Method for the Sherrington-Kirkpatrick Model
