Covariance Matrix
A covariance matrix is a symmetric square matrix that summarizes the variances of, and covariances between, the variables in a dataset.
Each covariance is a measure of how two variables change together.
If you have a dataset with n variables, the covariance matrix will be an n x n matrix. The element in the i-th row and j-th column represents the covariance between variables i and j.
The diagonal elements of the covariance matrix represent the variances of the individual variables, while the off-diagonal elements represent the covariances between pairs of variables.
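As a sketch of the structure described above, the following pure-Python snippet builds an n × n covariance matrix for a small made-up dataset (the variable names and data values are illustrative, not from the text):

```python
def mean(xs):
    return sum(xs) / len(xs)

def cov(x, y):
    """Population covariance of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / len(x)

def covariance_matrix(variables):
    """n x n matrix whose (i, j) entry is cov(variables[i], variables[j])."""
    n = len(variables)
    return [[cov(variables[i], variables[j]) for j in range(n)] for i in range(n)]

# Three variables, five observations each (made-up numbers).
data = [
    [2.1, 2.5, 3.6, 4.0, 4.8],      # X1
    [8.0, 10.0, 12.0, 14.0, 16.0],  # X2
    [1.0, 0.8, 0.6, 0.4, 0.2],      # X3
]
C = covariance_matrix(data)
# C is 3 x 3: diagonal entries are the variances of X1, X2, X3;
# off-diagonal entries are the pairwise covariances.
```

In practice a library routine such as NumPy's `np.cov` would be used instead; the hand-rolled version here only exists to make the i-th row / j-th column structure explicit.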
Mathematically, for a dataset with variables X₁, X₂, ..., Xₙ, the covariance between variables i and j is computed as:

cov(Xᵢ, Xⱼ) = E[(Xᵢ − μᵢ)(Xⱼ − μⱼ)]

Where:
E[ ] denotes the expected value (or average)
μᵢ and μⱼ represent the means of variables Xᵢ and Xⱼ, respectively.
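The covariance formula cov(Xᵢ, Xⱼ) = E[(Xᵢ − μᵢ)(Xⱼ − μⱼ)], with E[·] taken as the sample average, can be read off directly in code. The data values below are made up; Y is chosen to be exactly 2X so the result is easy to check:

```python
X = [1.0, 2.0, 3.0, 4.0]
Y = [2.0, 4.0, 6.0, 8.0]  # Y = 2X exactly

mu_x = sum(X) / len(X)    # μx, the mean of X
mu_y = sum(Y) / len(Y)    # μy, the mean of Y

# E[(X - μx)(Y - μy)]: the average of the products of deviations.
cov_xy = sum((x - mu_x) * (y - mu_y) for x, y in zip(X, Y)) / len(X)
# Since Y = 2X, cov(X, Y) = 2 * var(X) = 2 * 1.25 = 2.5.
```

Note this uses the population convention (dividing by the number of observations); many libraries default to the sample convention (dividing by one less), which only changes the scale.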
Symmetry: The covariance matrix is always symmetric because cov(Xᵢ, Xⱼ) = cov(Xⱼ, Xᵢ).
Diagonal elements: The diagonal elements of the covariance matrix represent the variances of the individual variables: cov(Xᵢ, Xᵢ) = var(Xᵢ).
Positive semi-definiteness: The covariance matrix is positive semi-definite, meaning vᵀΣv ≥ 0 for every vector v; equivalently, all of its eigenvalues are non-negative.
The covariance matrix is used to capture the relationships and dependencies between variables in multivariate data.