Eigenvalues and eigenvectors are foundational concepts in linear algebra, with applications across domains such as physics, computer graphics, and machine learning. They are instrumental in decomposing complex matrix transformations, which in turn simplifies many numerical computations.
An eigenvector of a square matrix $A$ is a non-zero vector $v$ that, when multiplied by $A$, results in a scaled version of $v$. The scalar factor is the eigenvalue corresponding to that eigenvector. In mathematical terms, this relationship is described as:
$$ A v = \lambda v $$
where:

- $A$ is a square matrix,
- $v$ is a non-zero vector, the eigenvector of $A$,
- $\lambda$ is a scalar, the eigenvalue corresponding to $v$.
**Eigenvalues:** The eigenvalues are found by solving the characteristic equation $\det(A - \lambda I) = 0$, where $I$ is the identity matrix of the same dimension as $A$ and $\det(\cdot)$ denotes the determinant. The roots of this equation are the eigenvalues.

**Eigenvectors:** Once an eigenvalue $\lambda$ is found, its corresponding eigenvectors are obtained by substituting it into $(A - \lambda I)v = 0$ and computing the null space of $A - \lambda I$.
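In practice, these two steps are usually delegated to a numerical library rather than done by hand. As a rough illustration, here is a minimal NumPy sketch that solves the eigenproblem for the $2 \times 2$ matrix analyzed in the example below:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding unit-norm eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining relation A v = lambda v for each pair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

print("eigenvalues:", eigenvalues)  # [5. 2.] (order is not guaranteed)
```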
Consider the $2 \times 2$ matrix:

$$A = \begin{bmatrix} 4 & 1 \\ 2 & 3 \end{bmatrix}$$
Solving the characteristic equation $\det(A - \lambda I) = 0$ gives

$$\det \begin{bmatrix} 4 - \lambda & 1 \\ 2 & 3 - \lambda \end{bmatrix} = (4 - \lambda)(3 - \lambda) - (1)(2) = \lambda^2 - 7\lambda + 10 = 0$$

The roots of this quadratic are $\lambda_1 = 2$ and $\lambda_2 = 5$; these are the eigenvalues of $A$.
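As a quick numerical check, the quadratic can be solved with `np.roots`, and the eigenvalues can be computed directly with `np.linalg.eigvals`; both should agree:

```python
import numpy as np

# Coefficients of lambda^2 - 7*lambda + 10, highest degree first.
print(np.roots([1.0, -7.0, 10.0]))   # [5. 2.]

# Eigenvalues computed directly from the matrix match the roots.
A = np.array([[4.0, 1.0], [2.0, 3.0]])
print(np.linalg.eigvals(A))          # [5. 2.]
```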
To find the corresponding eigenvectors:
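For comparison with the hand computation, here is a minimal SciPy sketch of this step: for each eigenvalue, the eigenvectors span the null space of $A - \lambda I$, which `scipy.linalg.null_space` computes numerically (it returns unit-norm basis vectors, so a hand-derived eigenvector may differ by a scalar factor):

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

for lam in (2.0, 5.0):
    # Eigenvectors for lam span the null space of (A - lam*I).
    basis = null_space(A - lam * np.eye(2))
    print(f"lambda = {lam}:\n{basis}")
```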