Covariance Matrix

A covariance matrix is a symmetric square matrix that summarizes the variances and covariances between variables in a dataset.

What is its value?

Each entry measures how a pair of variables changes together: a positive covariance means the two tend to increase together, while a negative covariance means one tends to increase as the other decreases.

Example

If you have a dataset with n variables, the covariance matrix will be an n × n matrix. The element in the i-th row and j-th column represents the covariance between variables Xᵢ and Xⱼ.

The diagonal elements of the covariance matrix represent the variances of the individual variables, while the off-diagonal elements represent the covariances between pairs of variables.
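
As a minimal sketch (using NumPy and made-up numbers, so the data values are only illustrative), a dataset with 3 variables yields a 3 × 3 covariance matrix whose diagonal holds the variances:

```python
import numpy as np

# Hypothetical data: 3 variables observed over 5 samples.
# Rows are variables, columns are observations (np.cov's default layout).
data = np.array([
    [2.1,  2.5,  3.6,  4.0,  4.8],   # X1
    [8.0, 10.0, 12.0, 14.0, 16.0],   # X2
    [1.0,  0.8,  0.9,  0.5,  0.3],   # X3
])

cov = np.cov(data)           # 3 variables -> a 3 x 3 covariance matrix
print(cov.shape)             # (3, 3)
print(np.diag(cov))          # diagonal entries: var(X1), var(X2), var(X3)
print(cov[0, 1], cov[1, 0])  # off-diagonal: cov(X1, X2) equals cov(X2, X1)
```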

Formula

Mathematically, for a dataset with variables X₁, X₂, ..., Xₙ, the covariance between variables Xᵢ and Xⱼ is computed as:

cov(Xᵢ, Xⱼ) = E[(Xᵢ − μᵢ)(Xⱼ − μⱼ)]

Where:

  • E[·] denotes the expected value (or average).

  • μᵢ and μⱼ represent the means of variables Xᵢ and Xⱼ, respectively.
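
A quick NumPy sketch of this formula (with hypothetical data; note that np.cov divides by n − 1 by default, so bias=True is passed to match the plain expected-value definition above):

```python
import numpy as np

x = np.array([2.1, 2.5, 3.6, 4.0, 4.8])
y = np.array([8.0, 10.0, 12.0, 14.0, 16.0])

# cov(X, Y) = E[(X - mu_x)(Y - mu_y)], estimated as the mean of the products
manual = np.mean((x - x.mean()) * (y - y.mean()))

# bias=True makes np.cov divide by n (not n - 1), matching the formula above
library = np.cov(x, y, bias=True)[0, 1]

assert np.isclose(manual, library)
```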

The important properties of the covariance matrix

  1. Symmetry: The covariance matrix is always symmetric because cov(Xᵢ, Xⱼ) = cov(Xⱼ, Xᵢ).

  2. Diagonal elements: The diagonal elements of the covariance matrix represent the variances of the individual variables: cov(Xᵢ, Xᵢ) = var(Xᵢ).

  3. Positive semi-definiteness: The covariance matrix is positive semi-definite, which means that its eigenvalues are non-negative.
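
A short NumPy sketch (on randomly generated, hypothetical data) that checks all three properties numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(4, 100))   # 4 variables, 100 observations
cov = np.cov(data)

# 1. Symmetry: cov(Xi, Xj) = cov(Xj, Xi)
assert np.allclose(cov, cov.T)

# 2. Diagonal elements are the variances of the individual variables
#    (ddof=1 matches np.cov's default n - 1 divisor)
assert np.allclose(np.diag(cov), np.var(data, axis=1, ddof=1))

# 3. Positive semi-definiteness: all eigenvalues are non-negative
eigenvalues = np.linalg.eigvalsh(cov)  # eigvalsh: for symmetric matrices
assert np.all(eigenvalues >= -1e-10)   # tolerate tiny floating-point negatives
```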

Summary

The covariance matrix is used to capture the relationships and dependencies between variables in multivariate data.
