
Embeddings

The textual inversion


Last updated 1 year ago


Overview

Embedding is the result of textual inversion, a method to define new keywords in a model without modifying it. The method has gained attention because it is capable of injecting new styles or objects into a model with as few as 3 to 5 sample images.

How does textual inversion work?

What is remarkable about textual inversion is NOT the ability to add new styles or objects; other fine-tuning methods can do that as well or better. It is the fact that it can do so without changing the model.

First, you define a new keyword that's not in the model for the new object or style. That new keyword gets tokenized (that is, represented by a number) just like any other keyword in the prompt.

Each token is then converted to a unique embedding vector to be used by the model for image generation.

Textual inversion finds the embedding vector of the new keyword that best represents the new style or object, without changing any part of the model. You can think of it as finding a way within the language model to describe the new concept.
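The idea above can be sketched in plain Python. This is a toy illustration with made-up numbers, not the real training code: the embedding table is frozen, and only the vector for the new token (here called "S*") is optimized by gradient descent toward a stand-in training signal.

```python
# Toy sketch of textual inversion (all values hypothetical): the model's
# embedding table is frozen; only the vector for the new keyword "S*"
# is optimized to match the training signal from the sample images.
vocab = {"a": 0, "photo": 1, "of": 2, "S*": 3}        # "S*" is the new token
embeddings = [[0.1 * (i + j) for j in range(4)] for i in range(4)]
frozen = [row[:] for row in embeddings[:3]]           # existing rows stay fixed

target = [1.0, -2.0, 0.5, 3.0]                        # stand-in training signal
for _ in range(200):                                  # gradient descent on row 3 only
    embeddings[3] = [v - 0.05 * 2 * (v - t)           # d/dv (v - t)^2 = 2(v - t)
                     for v, t in zip(embeddings[3], target)]

assert embeddings[:3] == frozen                       # the model is untouched
assert all(abs(v - t) < 1e-3 for v, t in zip(embeddings[3], target))
```

Only the new token's row moves; every existing row is byte-for-byte unchanged, which is why the resulting embedding file is tiny and the base model needs no modification.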

Example of embeddings

Embedding an object

A toy cat embedding can be combined with other existing concepts (boat, backpack) in the model.

Embedding a style

Where to find embeddings?

How to use embeddings?

Web interface

The downside of the web interface is that you cannot use the embedding with a different model or change any parameters.

AUTOMATIC1111

Next, rename the file to the keyword you want to trigger this embedding with. It has to be something that does not already exist in the model; marc_allante.bin is a good choice.

Put it in the embeddings folder in the GUI’s working directory: `stable-diffusion-webui/embeddings`
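The rename-and-place step can be scripted. The paths below are hypothetical (a stand-in file is created instead of a real download), but the layout matches the `stable-diffusion-webui/embeddings` convention mentioned above:

```python
from pathlib import Path

# Hypothetical paths: the downloaded embedding file is renamed to the
# trigger keyword and placed in the webui's embeddings folder.
downloaded = Path("learned_embeds.bin")
downloaded.write_bytes(b"")                     # stand-in for the real download

emb_dir = Path("stable-diffusion-webui/embeddings")
emb_dir.mkdir(parents=True, exist_ok=True)      # create the folder if missing
downloaded.rename(emb_dir / "marc_allante.bin") # filename becomes the keyword

print(sorted(p.name for p in emb_dir.iterdir()))
# -> ['marc_allante.bin']
```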

Restart the GUI. In the startup terminal, you should see a message like:

Loaded a total of 1 textual inversion embeddings.
Embeddings: marc_allante

Use the filename as part of the prompt to apply the embedding:

(marc_allante:1.2)  a dog
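The `(keyword:1.2)` notation is AUTOMATIC1111's attention syntax: the weighted token's influence is scaled by the given factor, while plain tokens default to 1.0. A minimal parser sketch (my own illustration, not AUTOMATIC1111's actual code) shows how such a prompt decomposes:

```python
import re

# Minimal sketch of the "(keyword:weight)" prompt syntax: weighted
# tokens carry an explicit attention factor, plain tokens default to 1.0.
def parse_prompt(prompt):
    pattern = re.compile(r"\(([^:()]+):([\d.]+)\)|(\S+)")
    pairs = []
    for m in pattern.finditer(prompt):
        if m.group(1):                                  # "(keyword:weight)" form
            pairs.append((m.group(1).strip(), float(m.group(2))))
        else:                                           # bare token, weight 1.0
            pairs.append((m.group(3), 1.0))
    return pairs

print(parse_prompt("(marc_allante:1.2)  a dog"))
# -> [('marc_allante', 1.2), ('a', 1.0), ('dog', 1.0)]
```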

Checking which embeddings are in use in AUTOMATIC1111

There’s a button between the trash and the copy buttons:

Click it and you will see all the embeddings that are available. They are all under the Textual Inversion tab.

Clicking any of them inserts it into the prompt. This function is especially useful for eliminating the tedious work of making sure you have entered the embedding's trigger keyword correctly.

Pros and cons of using embedding

Pros

  • It is small in size (100 KB or less).

Cons

  • Sometimes it is not clear which model an embedding is supposed to be used with.

Credit

Hugging Face hosts the Stable Diffusion Concept Library, which is a repository of a large number of custom embeddings.

Civitai is another great site for browsing models, including embeddings. Filter by textual inversion to view embeddings only.

The Stable Diffusion Conceptualizer is a great way to try out embeddings without downloading them.

First, download an embedding file from the Concept Library. It is the file named learned_embeds.bin. Make sure you don't right-click and save in the screen below; that would save the webpage it links to. Instead, click the file name, then click the download button on the next page.

[Image: A new embedding is found for the new token S* through textual inversion. Source: original research article.]
[Image: Example of embedding an object.]
[Image: Example of embedding a style.]