This short note sketches a mean-field route to the stability/chaos transition in a simple classical random dynamical system. The model is a continuous-time analogue of a random neural network and the result is the familiar effective-gain threshold at which the maximal Lyapunov exponent changes sign.

Model

We consider

\[\partial_t x_i(t) \;=\; -\,x_i(t) \;+\; \sum_j J_{ij}\,\phi\big(x_j(t)\big) \tag{1}\]

where $J_{ij}\sim\mathcal{N}(0, g^2/N)$, $\phi$ is a smooth nonlinearity (e.g. $\tanh$), and $g$ is a gain parameter.

In the large-$N$ limit the dynamics is self-averaging: empirical averages over sites $i$ coincide with averages over the disorder $J$.
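The model is easy to probe directly. The following is a minimal Euler-discretized simulation sketch (the size $N$, gain $g$, time step, run length, and the choice $\phi=\tanh$ are illustrative, not prescribed by the note) that estimates the stationary one-time correlation $\langle x_i^2\rangle$:

```python
import numpy as np

# Euler simulation of dx/dt = -x + J phi(x) with J_ij ~ N(0, g^2/N).
# N, g, dt, and the run length are illustrative choices.
rng = np.random.default_rng(0)
N, g, dt, steps = 500, 1.5, 0.05, 4000
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
x = rng.normal(0.0, 0.5, size=N)

for _ in range(steps):
    x += dt * (-x + J @ np.tanh(x))

# Equal-time correlation <x_i(t)^2>, averaged over sites.
delta0 = np.mean(x**2)
print(f"Delta(0) ~ {delta0:.3f}")
```

For $g>1$ the zero fixed point is unstable, so the estimated variance settles to a nonzero value.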

We will use the two-time correlations

\[\Delta(\tau)\;=\;\big\langle x_i(t)\,x_i(t+\tau)\big\rangle \qquad C_\phi(\tau)\;=\;\big\langle \phi\big(x_j(t)\big)\,\phi\big(x_j(t+\tau)\big)\big\rangle \tag{2}\]

assuming stationarity so that the correlations depend on the time difference $\tau$ only. (Indices are dummy under the mean-field average.)

Tangent dynamics and a two-time PDE

Introduce a small perturbation $x_i\mapsto x_i+\delta x_i$. The perturbed dynamics is

\[\partial_t\big(x_i+\delta x_i\big) \;=\; -\big(x_i+\delta x_i\big) \;+\; \sum_j J_{ij}\,\phi\big(x_j+\delta x_j\big) \tag{3}\]

Subtracting (1) from (3) and keeping terms linear in $\delta x$ yields the tangent (variational) equation

\[(\partial_t+1)\,\delta x_i(t) \;=\; \sum_j J_{ij}\,\phi'\big(x_j(t)\big)\,\delta x_j(t) \tag{4}\]

The same equation at a shifted time $t+\tau$ is

\[(\partial_{t+\tau}+1)\,\delta x_i(t+\tau) \;=\; \sum_j J_{ij}\,\phi'\big(x_j(t+\tau)\big)\,\delta x_j(t+\tau) \tag{5}\]

Multiply (4) and (5), average over $i$ and over the disorder, and define the correlators

\[\Delta_\delta(t,\tau)\;=\;\big\langle \delta x_i(t)\,\delta x_i(t+\tau)\big\rangle \qquad C_{\phi'}(t,\tau)\;=\;\big\langle \phi'\big(x_j(t)\big)\,\phi'\big(x_j(t+\tau)\big)\big\rangle \tag{6}\]

Using $J_{ij}$-independence and $\mathrm{Var}(J_{ij})=g^2/N$ gives the closed large-$N$ equation

\[\Big[(\partial_t+1)(\partial_{t+\tau}+1)\Big]\;\Delta_\delta(t,\tau) \;=\; g^2\,C_{\phi'}(t,\tau)\,\Delta_\delta(t,\tau) \tag{7}\]

Now treat $t$ and $t+\tau$ as independent times and switch to “center” and “difference” variables

\[T \;=\; t+(t+\tau) \qquad \tau \;=\; (t+\tau)-t \quad\Longrightarrow\quad \partial_t=\partial_T-\partial_\tau \qquad \partial_{t+\tau}=\partial_T+\partial_\tau \tag{8}\]
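Since $\partial_T$ and $\partial_\tau$ commute, the product of the two first-order operators in (7) factorizes neatly under this change of variables:

\[(\partial_t+1)(\partial_{t+\tau}+1) \;=\; \big(\partial_T-\partial_\tau+1\big)\big(\partial_T+\partial_\tau+1\big) \;=\; (1+\partial_T)^2 - \partial_\tau^2\]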

Then (7) becomes

\[\big[(1+\partial_T)^2 - \partial_\tau^2\big]\,\Delta_\delta(T,\tau) \;=\; g^2\,C_{\phi'}(T,\tau)\,\Delta_\delta(T,\tau) \tag{9}\]

Mean-field closure for the unperturbed correlations

For the unperturbed process (1), the standard mean-field closure gives (under stationarity or after Fourier transforming in $T$)

\[\big(1 - \partial_\tau^2\big)\,\Delta(\tau)\;=\; g^2\,C_\phi(\tau) \tag{10}\]

It is convenient to view (10) as a Newton equation for a “particle” at coordinate $\Delta$ in an effective potential $V(\Delta)$:

\[\ddot{\Delta}(\tau) \;=\; -\,\frac{\partial V}{\partial \Delta} \qquad V(\Delta) \;=\; -\frac{1}{2}\,\Delta^2 \;+\; g^2\,V_2(\Delta) \qquad \frac{\partial V_2}{\partial \Delta} \;=\; C_\phi(\tau) \tag{11}\]

where dots denote $\tau$-derivatives. The dependence $C_\phi(\tau)=C_\phi\big(\Delta(\tau)\big)$ follows from a Gaussian closure: for large $N$ the field $x$ is approximately Gaussian and two-time statistics of $\phi(x)$ are functionals of $\Delta$.

A related identity under the same closure is

\[C_{\phi'}(\tau) \;=\; \frac{\partial C_\phi(\tau)}{\partial \Delta(\tau)} \;\equiv\; \frac{\partial^2 V_2}{\partial \Delta^2} \tag{12}\]
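This Gaussian identity (Price's theorem) can be checked numerically. The sketch below evaluates both sides for a zero-mean bivariate Gaussian pair with $\phi=\tanh$ via Gauss–Hermite quadrature; the variance and covariance values are arbitrary illustrative inputs, and the $\Delta$-derivative is taken by central finite difference at fixed variances:

```python
import numpy as np

def gauss2d(f, var, cov, n=60):
    """E[f(x, y)] for a zero-mean bivariate Gaussian with equal variances
    `var` and covariance `cov`, via tensor-product Gauss-Hermite quadrature."""
    t, w = np.polynomial.hermite.hermgauss(n)
    z = np.sqrt(2.0) * t            # standard-normal nodes
    w = w / np.sqrt(np.pi)          # standard-normal weights
    rho = cov / var
    Z1, Z2 = np.meshgrid(z, z)
    W = np.outer(w, w)
    x = np.sqrt(var) * Z1
    y = np.sqrt(var) * (rho * Z1 + np.sqrt(1.0 - rho**2) * Z2)
    return float(np.sum(W * f(x, y)))

var, cov, h = 1.0, 0.4, 1e-3        # illustrative values with |cov| < var
phi = np.tanh
dphi = lambda u: 1.0 / np.cosh(u) ** 2   # phi'(u) for phi = tanh

lhs = gauss2d(lambda x, y: dphi(x) * dphi(y), var, cov)   # C_{phi'}
rhs = (gauss2d(lambda x, y: phi(x) * phi(y), var, cov + h)
       - gauss2d(lambda x, y: phi(x) * phi(y), var, cov - h)) / (2 * h)
print(lhs, rhs)
```

The two numbers agree to within the finite-difference error, which is what (12) asserts.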

Schrödinger reduction for the growth mode

Assume stationarity in the background ($C_{\phi'}(T,\tau)\to C_{\phi'}(\tau)$) and look for separable solutions to (9) of the form

\[\Delta_\delta(T,\tau)\;=\; e^{\frac{k}{2}T}\,\psi(\tau) \tag{13}\]

(Recall that at equal times $\tau=0$ the center time is $T=2t$.)

Plugging (13) into (9) gives a time-independent “Schrödinger” equation

\[\Big(-\partial_\tau^2 \;+\; U(\tau)\Big)\,\psi(\tau) \;=\; \mathcal{E}\,\psi(\tau) \qquad U(\tau) \;=\; -\,\frac{\partial^2 V}{\partial \Delta^2}(\tau) \qquad \mathcal{E} \;=\; 1 - \big(1+\tfrac{k}{2}\big)^2 \tag{14}\]

The ground-state eigenvalue controls the largest growth rate. Near $\tau=0$ one may approximate $U(\tau)$ by its value at the origin. Using (11)–(12) one finds

\[U(0) \;=\; -\Big(-1 + g^2\,\frac{\partial^2 V_2}{\partial \Delta^2}\Big) \;=\; 1 - g^2\,C_{\phi'}(0) \tag{15}\]

With a flat potential approximation around $\tau=0$, the ground-state energy is

\[E_0 \;=\; U(0) \;=\; 1 - G^2 \qquad G^2 \;\equiv\; g^2\,C_{\phi'}(0) \;=\; g^2\,\big\langle \phi'\big(x\big)^2\big\rangle \tag{16}\]

where the last equality uses stationarity of the one-time marginal of $x$.

Equating $E_0$ with $\mathcal{E}$ from (14) gives

\[1 - G^2 \;=\; 1 - \big(1+\tfrac{k}{2}\big)^2 \quad\Longrightarrow\quad \frac{k}{2} \;=\; G - 1 \tag{17}\]

where we picked the positive branch corresponding to growth.

Maximal Lyapunov exponent and the transition

The maximal Lyapunov exponent is

\[\lambda_{\max} \;=\; \lim_{t\to\infty}\frac{1}{t}\log\frac{\|\delta x(t)\|_2}{\|\delta x(0)\|_2} \;=\; \lim_{t\to\infty}\frac{1}{2t}\log\sum_i \big(\delta x_i(t)\big)^2 \tag{18}\]

With $T=2t$ and the ansatz (13),

\[\lambda_{\max} \;=\;\frac{1}{T}\log e^{\frac{k}{2}T} \;=\; \frac{k}{2} \tag{19}\]

Combining (17)–(19) yields the compact expression

\[\boxed{\;\lambda_{\max} \;=\; G - 1,\qquad G^2 = g^2\,\langle \phi'(x)^2\rangle\;} \tag{20}\]

Thus the phase transition occurs at

\[G_c = 1 \quad\Longleftrightarrow\quad g_c^2\,\langle \phi'(x)^2\rangle = 1 \tag{21}\]

below which perturbations decay ($\lambda_{\max}<0$) and above which they grow exponentially ($\lambda_{\max}>0$). For a linear system $\phi(x)=x$, one has $\langle \phi'(x)^2\rangle=1$ and the formula reduces to the intuitive $\lambda_{\max}=g-1$.
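The threshold can be checked against direct simulation. The sketch below (all sizes, step sizes, and the two test gains $g=2$ and $g=0.5$ are illustrative choices) integrates the network together with the tangent equation (4) and measures $\lambda_{\max}$ by repeated renormalization of the tangent vector. Since the ground-state energy of (14) can only exceed the minimum of $U(\tau)$, the flat-potential value $G-1$ bounds $\lambda_{\max}$ from above, so one expects $0<\lambda_{\max}\lesssim G-1$ above threshold and $\lambda_{\max}<0$ below:

```python
import numpy as np

def lyapunov(g, N=400, dt=0.05, burn=2000, steps=8000, seed=1):
    """Largest Lyapunov exponent of dx/dt = -x + J tanh(x), together with
    the empirical effective gain G = g * sqrt(<phi'(x)^2>).  All sizes and
    run lengths are illustrative, not tuned."""
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), (N, N))
    x = rng.normal(0.0, 1.0, N)
    for _ in range(burn):                  # relax onto the attractor
        x += dt * (-x + J @ np.tanh(x))
    v = rng.normal(size=N)
    v /= np.linalg.norm(v)
    log_growth, dphi2 = 0.0, 0.0
    for _ in range(steps):
        d = 1.0 / np.cosh(x) ** 2          # phi'(x) for phi = tanh
        v += dt * (-v + J @ (d * v))       # tangent dynamics, Eq. (4)
        x += dt * (-x + J @ np.tanh(x))
        nrm = np.linalg.norm(v)
        log_growth += np.log(nrm)          # accumulate growth, then renormalize
        v /= nrm
        dphi2 += np.mean(d ** 2)
    lam = log_growth / (steps * dt)
    G = g * np.sqrt(dphi2 / steps)
    return lam, G

lam_chaos, G_chaos = lyapunov(2.0)    # above threshold: expect lam > 0
lam_stable, _ = lyapunov(0.5)         # below threshold: expect lam < 0
print(lam_chaos, G_chaos - 1.0, lam_stable)
```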

Remarks

  • The effective gain $G$ automatically incorporates the nonlinearity through the stationary variance of $x$. In practice, $G$ must be found self-consistently from (10)–(12).
  • The flat-$U(\tau)$ approximation around $\tau=0$ captures the threshold and leading behavior. Keeping the full $\tau$-dependence refines the spectrum but does not move the transition at $G_c=1$.
  • For sigmoids such as $\phi(x)=\tanh x$, saturation reduces $\langle\phi'(x)^2\rangle$ below $1$, pushing $g_c$ upward compared to the linear case.
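The self-consistency noted in the first remark can be illustrated in its crudest truncation: approximating $\Delta(\tau)\approx\Delta_0$ as constant, (10) collapses to the scalar fixed point $\Delta_0 = g^2 C_\phi(0)$ with $C_\phi(0)=\langle\phi(x)^2\rangle$ under the Gaussian closure. A sketch with $\phi=\tanh$ (the gain value, quadrature order, and iteration count are illustrative):

```python
import numpy as np

def gauss_mean(f, var, n=80):
    """E[f(x)] for x ~ N(0, var) via Gauss-Hermite quadrature."""
    t, w = np.polynomial.hermite.hermgauss(n)
    return float(np.sum(w * f(np.sqrt(2.0 * var) * t)) / np.sqrt(np.pi))

g = 1.5
delta0 = 1.0
for _ in range(200):   # fixed-point iteration Delta_0 <- g^2 <phi(x)^2>
    delta0 = g**2 * gauss_mean(lambda u: np.tanh(u) ** 2, delta0)

# Effective gain from the converged variance: G^2 = g^2 <phi'(x)^2>.
G = g * np.sqrt(gauss_mean(lambda u: 1.0 / np.cosh(u) ** 4, delta0))
print(delta0, G, G - 1.0)
```

For $g>1$ the iteration converges to a nonzero $\Delta_0$ and yields $G>1$, consistent with the chaotic phase.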