πŸ“€SVD

Singular Value Decomposition (a mathematical technique)

What is SVD?

Singular value decomposition (SVD) is a matrix factorization technique that can be used to reduce the dimensionality of data, denoise it, and visualize it. It is a fundamental technique in machine learning and data science.

It works by decomposing a matrix into three matrices:

  • A matrix of left singular vectors

  • A diagonal matrix of singular values

  • A matrix of right singular vectors

Formula

Given a matrix A, the SVD factorizes it into the following form:

A = UΞ£V^T

Where:

  • U is an orthogonal matrix whose columns represent the left singular vectors of A.

  • Ξ£ is a diagonal matrix with non-negative values, known as singular values.

  • V^T is the transpose of an orthogonal matrix V, whose columns represent the right singular vectors of A.

The singular values in Ξ£ are arranged in descending order along the diagonal. They indicate the importance or significance of the corresponding singular vectors in U and V^T. The first singular vector pair captures the most significant pattern or structure in the matrix, while subsequent pairs capture less important patterns.
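
As a quick illustration, here is a minimal sketch using NumPy (the matrix values are made up for the example): it computes the three factors with numpy.linalg.svd and checks that multiplying them back together recovers A.

```python
import numpy as np

# Illustrative matrix (values chosen for the example, not from the text)
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# Economy-size SVD: U is 3x2, S holds the 2 singular values, Vt is 2x2
U, S, Vt = np.linalg.svd(A, full_matrices=False)

print("Singular values (descending):", S)

# Reassemble U Ξ£ V^T and confirm it reproduces A
print("Reconstruction matches A:", np.allclose(A, U @ np.diag(S) @ Vt))
```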

How SVD Works

The left singular vectors are the eigenvectors of AA^T, and the right singular vectors are the eigenvectors of A^TA. When A is a mean-centered data matrix, A^TA is proportional to the covariance matrix of the data, which is what links SVD to principal component analysis.

Singular values

The singular values are the square roots of the eigenvalues of A^TA (equivalently, of AA^T). When A is a mean-centered data matrix, these eigenvalues are proportional to the variances of the data along the principal components. The singular values are arranged in descending order: the largest singular value corresponds to the first principal component, the second largest to the second principal component, and so on.
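
A small numerical check of this relationship, assuming a randomly generated, mean-centered data matrix, might look like the following.

```python
import numpy as np

# Randomly generated data (an assumption for the example), then mean-centered
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
A = X - X.mean(axis=0)

# Singular values of A
singular_values = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of A^T A, sorted in descending order to match
eigenvalues = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]

print(np.allclose(singular_values, np.sqrt(eigenvalues)))  # True
```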

Left Singular Vectors

The left singular vectors are the eigenvectors of AA^T, arranged in the same order as the singular values. When the columns of A hold the data samples, the left singular vector corresponding to the first singular value gives the direction of the first principal component, the one corresponding to the second singular value gives the direction of the second principal component, and so on.

Right Singular Vectors

The right singular vectors are the eigenvectors of A^TA, also arranged in the same order as the singular values. When the rows of A hold the data samples (the more common convention), the right singular vectors give the principal component directions: the first right singular vector corresponds to the first principal component, the second to the second, and so on.
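
Both eigenvector relationships can be checked directly. The sketch below uses a small made-up matrix and verifies that each left singular vector satisfies (AA^T)u = Οƒ^2u and each right singular vector satisfies (A^TA)v = Οƒ^2v.

```python
import numpy as np

# Small made-up matrix for the check
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 1.0]])

U, S, Vt = np.linalg.svd(A, full_matrices=False)
V = Vt.T

# For each singular value sigma: (A A^T) u = sigma^2 u and (A^T A) v = sigma^2 v
for i, sigma in enumerate(S):
    left_ok = np.allclose(A @ A.T @ U[:, i], sigma**2 * U[:, i])
    right_ok = np.allclose(A.T @ A @ V[:, i], sigma**2 * V[:, i])
    print(f"sigma={sigma:.3f}  left eigenvector: {left_ok}  right eigenvector: {right_ok}")
```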

Reducing the dimensionality

SVD can be used to reduce the dimensionality of a dataset by projecting the data onto a lower-dimensional subspace. The subspace is spanned by the singular vectors corresponding to the largest singular values (the right singular vectors when samples are stored as rows of A). The smaller singular values are discarded, as the directions they correspond to account for little of the variance in the data.
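
A minimal sketch of this projection, with a randomly generated data matrix and k = 2 chosen purely for illustration, could look like this.

```python
import numpy as np

# Randomly generated data matrix (rows are samples), mean-centered first
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
A = X - X.mean(axis=0)

U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Keep the two largest singular values/vectors and project onto them
k = 2
A_reduced = A @ Vt[:k].T          # equivalently U[:, :k] * S[:k]; shape (200, 2)
print(A_reduced.shape)
```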

Denoising a dataset

SVD can also be used to denoise a dataset. The signal in the data is typically concentrated in the directions of the largest singular values, while noise is spread across all directions. By discarding the components associated with the smaller singular values, much of the noise can be removed while most of the signal is kept.
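
The sketch below illustrates the idea on synthetic data: a rank-2 matrix is corrupted with noise, and keeping only the two largest singular values in the reconstruction brings it closer to the clean matrix again. The sizes, rank, and noise level are assumptions for the example.

```python
import numpy as np

# Build a rank-2 "signal" matrix and corrupt it with noise
rng = np.random.default_rng(2)
clean = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 30))
noisy = clean + 0.1 * rng.normal(size=clean.shape)

U, S, Vt = np.linalg.svd(noisy, full_matrices=False)

# Keep only the two largest singular values in the reconstruction
k = 2
denoised = U[:, :k] @ np.diag(S[:k]) @ Vt[:k]

print("Error before denoising:", np.linalg.norm(noisy - clean))
print("Error after denoising: ", np.linalg.norm(denoised - clean))  # typically smaller
```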
