In Convergence of Sequences (1) we focused on scalar sequences.
Today we will study the local stability of nonlinear multidimensional sequences defined by recursions like
\[x_{t} = f(x_{t-1}),\quad t=1,2,\dots\] where \(f:\mathbb{R}^n \to \mathbb{R}^n\) is a nonlinear map and \(x_0 \in \mathbb{R}^n\) is an initial vector. Assume there is a stationary point \(\overline{x}\) such that \[f(\overline{x})=\overline{x}.\]
We want to understand the behavior of \(x_t\) as \(t\to\infty\) when we start near \(\overline{x}\).
At first order, we can study the linearized dynamics around \(\overline{x}\): \[y_t = A y_{t-1},\quad A = f'(\overline{x}),\quad y_t = x_t - \overline{x}.\]
The recursion \(y_t = A y_{t-1}\) has a closed form solution \[ y_t = A^t\,y_0. \]
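The closed form is easy to confirm numerically; a minimal sketch assuming NumPy, with an illustrative matrix and initial vector:

```python
import numpy as np

# Sanity check of the closed form: iterating y_t = A y_{t-1} ten times
# gives the same result as applying A^10 to y_0 directly.
A = np.array([[0.5, 0.3],
              [0.1, 0.4]])
y = y0 = np.array([1.0, -2.0])
for _ in range(10):
    y = A @ y
print(np.allclose(y, np.linalg.matrix_power(A, 10) @ y0))  # -> True
```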
Two questions drive the dynamics: does \(y_t = A^t y_0\) converge to \(0\) for every initial \(y_0\), and if so, at what rate?
The answers depend on how we measure “size” and on whether \(A\) is normal (commutes with \(A^*\)) or non-normal.
We use here the Euclidean norm \(\|x\|_2\) on vectors and the induced operator norm \(\|A\|_2 = \max_{\|x\|_2=1} \|Ax\|_2\) on matrices.
The operator norm yields a simple estimate of the convergence rate: \[\frac{\|y_t\|_2}{\|y_{t-1}\|_2} = \frac{\|A y_{t-1}\|_2}{\|y_{t-1}\|_2} \le \|A\|_2.\]
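The one-step bound can be checked directly; a minimal sketch assuming NumPy, with an illustrative matrix and a random vector:

```python
import numpy as np

# The operator norm bounds the one-step growth: ||A y||_2 / ||y||_2 <= ||A||_2.
A = np.array([[0.9, 100.0],
              [0.0, 0.9]])
opnorm = np.linalg.norm(A, 2)                    # largest singular value
y = np.random.default_rng(0).standard_normal(2)  # arbitrary nonzero vector
ratio = np.linalg.norm(A @ y) / np.linalg.norm(y)
print(ratio <= opnorm + 1e-12)                   # -> True
```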
Definition (spectral radius): \[ \rho(A) := \max\{ |\lambda| : \lambda \in \sigma(A)\}. \]
Stability (discrete time): \[ \rho(A) < 1 \quad\Longleftrightarrow\quad A^t \to 0\ (t\to\infty). \]
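The stability criterion is easy to observe numerically; a minimal sketch assuming NumPy, with an illustrative stable matrix:

```python
import numpy as np

# With rho(A) < 1, the powers A^t vanish as t grows.
A = np.array([[0.5, 0.3],
              [0.1, 0.4]])
rho = max(abs(np.linalg.eigvals(A)))   # spectral radius
At = np.linalg.matrix_power(A, 200)    # A^t for large t
print(rho < 1, np.allclose(At, 0))     # -> True True
```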
Intuition: eigenvalues give the long-run scaling of invariant “modes” (exactly for diagonalizable/normal matrices; asymptotically for all).
But: eigenvalues alone can miss large short-run growth when \(A\) is non-normal.
A canonical 2×2 example: \[ A = \begin{bmatrix}0.9 & M\\ 0 & 0.9\end{bmatrix},\quad M\gg 1. \]
We have the closed form:
\[A^k = \begin{bmatrix}0.9^k & kM\,0.9^{k-1}\\ 0 & 0.9^k\end{bmatrix}\]
Hence \(\|A^k\|_2\) grows like \(kM\,0.9^{k-1}\) for large \(M\): the powers grow over the first iterations (roughly until \(k \approx -1/\ln 0.9 \approx 9.5\)) before eventually decaying to zero as \(k\to\infty\), even though \(\rho(A)=0.9<1\).
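The transient hump can be seen directly; a minimal sketch assuming NumPy, with \(M=100\) as an illustrative choice:

```python
import numpy as np

# Transient growth for a non-normal matrix: rho(A) = 0.9 < 1,
# yet ||A^k||_2 is large for small k before decaying.
M = 100.0
A = np.array([[0.9, M],
              [0.0, 0.9]])
norms = [np.linalg.norm(np.linalg.matrix_power(A, k), 2) for k in range(1, 200)]
print(max(norms))    # large transient peak
print(norms[-1])     # eventual decay toward zero
```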
Even though the spectral radius is not a norm, it is related to matrix norms in the following way.
For any matrix norm \(\|\cdot\|\) (in particular, any induced operator norm), we have Gelfand's formula for the spectral radius: \[ \boxed{\ \rho(A) = \lim_{k\to\infty} \lVert A^k\rVert^{1/k}\ } \]
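Gelfand's formula can be watched converge; a minimal sketch assuming NumPy, reusing the non-normal example, where convergence is visibly slow:

```python
import numpy as np

# Gelfand's formula in action: ||A^k||_2^(1/k) -> rho(A) = 0.9,
# slowly here because A is non-normal.
A = np.array([[0.9, 100.0],
              [0.0, 0.9]])
rho = max(abs(np.linalg.eigvals(A)))
estimates = [np.linalg.norm(np.linalg.matrix_power(A, k), 2) ** (1.0 / k)
             for k in (10, 100, 1000)]
print(rho, estimates)   # estimates approach 0.9 from above
```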
Interpretation: for large \(k\), \(\|A^k\| \approx \rho(A)^k\), so the spectral radius is the asymptotic per-period growth (or decay) factor of the powers of \(A\), whatever norm we use.
Remark: when \(A\) is non-normal, the convergence \(\|A^k\|^{1/k}\to\rho(A)\) can be very slow, so \(\rho(A)\) describes only the asymptotic rate, not the transient behavior.
The classic power iteration targets the eigenvalue of largest modulus, i.e. it can estimate \(\rho(A)\) under standard conditions (dominant eigenvalue, start vector not orthogonal to its eigenvector).
Algorithm (basic form)
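A minimal sketch of the basic form, assuming NumPy; the Rayleigh-quotient return value and the random start vector are illustrative choices:

```python
import numpy as np

# Basic power iteration: estimates the eigenvalue of largest modulus,
# hence rho(A) when a dominant eigenvalue exists.
def power_iteration(A, n_iter=500, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])   # generic start vector
    for _ in range(n_iter):
        x = A @ x
        x /= np.linalg.norm(x)            # renormalize to avoid over/underflow
    return x @ A @ x                      # Rayleigh-quotient estimate

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(power_iteration(A))   # close to the dominant eigenvalue (5 + sqrt(5))/2
```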
Convergence intuition: decompose \(x_0\) in the eigenbasis; each iteration amplifies the component along the dominant eigenvector by \(|\lambda_1|\) and the remaining components by at most \(|\lambda_2|\), so the relative error shrinks like \((|\lambda_2|/|\lambda_1|)^k\).
Now we want to study the convergence of nonlinear sequences defined by recursions like:
\[x_{n+1} = f(x_n),\quad n=0,1,2,\dots\]
with steady-state \(\overline{x}\) such that \(f(\overline{x})=\overline{x}\).
Denote by \(A=f'(\overline{x})\) the Jacobian matrix of \(f\) at \(\overline{x}\).
Under which condition on \(A\) can we guarantee local convergence of the iterates \(x_t\) to \(\overline{x}\)? At which rate?
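The standard sufficient condition \(\rho(f'(\overline{x})) < 1\) can be illustrated numerically; a minimal sketch assuming NumPy, on a toy map invented for this example with fixed point \(\overline{x}=0\):

```python
import numpy as np

# Toy nonlinear map f(x) = A x + small quadratic terms, with f(0) = 0
# and A = f'(0); here rho(A) = 0.7 < 1, so iterates started near 0 converge.
A = np.array([[0.5, 0.2],
              [0.1, 0.6]])

def f(x):
    return A @ x + 0.1 * np.array([x[0] ** 2, x[0] * x[1]])

x = np.array([0.1, -0.1])   # start near the fixed point
for _ in range(100):
    x = f(x)
print(np.allclose(x, 0))    # -> True: x_t converges to xbar = 0
```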
Instead of studying the stability of \(\overline{x}\) (that is, convergence for any \(x_0 \in \mathbb{R}^n\) near \(\overline{x}\)), we can also characterize the different dynamics obtained for different initial conditions \(x_0\) (read: different initial shocks to the economy).
We can define two sets of initial conditions: the stable set of points \(x_0\) whose trajectories converge to \(\overline{x}\), and the unstable set of points whose trajectories leave any small neighborhood of \(\overline{x}\).
Even when \(\rho(A)>1\), local dynamics of \(f\) can be characterized by \(A=f'(\overline{x})\)
\(\implies\) this is formalized in the stable manifold theorem
Definition (numerical radius):
\[ w(A) := \max_{\|x\|_2=1} |\langle Ax,x\rangle|. \]
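The numerical radius can be approximated via the standard characterization \(w(A) = \max_\theta \lambda_{\max}\big((e^{i\theta}A + e^{-i\theta}A^*)/2\big)\); a minimal sketch assuming NumPy, with a grid over \(\theta\) as an illustrative discretization:

```python
import numpy as np

# Numerical radius via rotated Hermitian parts, maximized over a grid of angles.
def numerical_radius(A, n_grid=2000):
    thetas = np.linspace(0.0, 2.0 * np.pi, n_grid, endpoint=False)
    w = 0.0
    for t in thetas:
        H = (np.exp(1j * t) * A + np.exp(-1j * t) * A.conj().T) / 2
        w = max(w, np.linalg.eigvalsh(H)[-1])  # largest eigenvalue of Hermitian H
    return w

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
print(numerical_radius(A))   # -> 0.5 for this Jordan block, while rho(A) = 0
```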