Shortest Paths and Minimum Spanning Trees

Shortest Paths

The shortest path problem asks for a path of minimum total edge weight between two vertices in a weighted graph.

Dijkstra's algorithm

import heapq

def dijkstra(graph, start):
    # Every vertex starts at infinite distance, except the start vertex.
    distances = {vertex: float('inf') for vertex in graph}
    distances[start] = 0
    heap = [(0, start)]

    while heap:
        distance, current_vertex = heapq.heappop(heap)
        # Skip stale heap entries for vertices already settled with a shorter distance.
        if distance > distances[current_vertex]:
            continue
        for neighbor, weight in graph[current_vertex].items():
            path_cost = distance + weight
            # Relax the edge if it gives a shorter path to the neighbor.
            if path_cost < distances[neighbor]:
                distances[neighbor] = path_cost
                heapq.heappush(heap, (path_cost, neighbor))

    return distances


# Example graph
graph = {
    'A': {'B': 3, 'C': 1},
    'B': {'A': 3, 'C': 4, 'D': 2},
    'C': {'A': 1, 'B': 4, 'D': 1},
    'D': {'B': 2, 'C': 1}
}

# Test case
assert dijkstra(graph, 'A') == {'A': 0, 'B': 3, 'C': 1, 'D': 2}
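
The version above returns only the distances. Below is a minimal sketch of a variant that also records each vertex's predecessor so the actual shortest path can be reconstructed; the names dijkstra_with_paths and reconstruct_path are illustrative, not part of the original notes.

def dijkstra_with_paths(graph, start):
    distances = {vertex: float('inf') for vertex in graph}
    distances[start] = 0
    predecessors = {vertex: None for vertex in graph}
    heap = [(0, start)]

    while heap:
        distance, current_vertex = heapq.heappop(heap)
        if distance > distances[current_vertex]:
            continue
        for neighbor, weight in graph[current_vertex].items():
            path_cost = distance + weight
            if path_cost < distances[neighbor]:
                distances[neighbor] = path_cost
                predecessors[neighbor] = current_vertex
                heapq.heappush(heap, (path_cost, neighbor))

    return distances, predecessors

def reconstruct_path(predecessors, start, end):
    # Walk backwards from the target to the start using the recorded predecessors.
    path = []
    vertex = end
    while vertex is not None:
        path.append(vertex)
        vertex = predecessors[vertex]
    path.reverse()
    return path if path[0] == start else []

# With the example graph above, the shortest A-to-D path is A -> C -> D with cost 2.
distances, predecessors = dijkstra_with_paths(graph, 'A')
assert reconstruct_path(predecessors, 'A', 'D') == ['A', 'C', 'D']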

Bellman-Ford algorithm

def bellman_ford(graph, start):
    distances = {vertex: float('inf') for vertex in graph}
    distances[start] = 0

    # Relax every edge |V| - 1 times; after that, all shortest paths are settled
    # as long as the graph contains no negative-weight cycle.
    for _ in range(len(graph) - 1):
        for vertex in graph:
            for neighbor, weight in graph[vertex].items():
                path_cost = distances[vertex] + weight
                if path_cost < distances[neighbor]:
                    distances[neighbor] = path_cost

    return distances

# Example graph
graph = {
    'A': {'B': -1, 'C': 4},
    'B': {'C': 3, 'D': 2, 'E': 2},
    'C': {},
    'D': {'B': 1, 'C': 5},
    'E': {'D': -3}
}

# Test case
assert bellman_ford(graph, 'A') == {'A': 0, 'B': -1, 'C': 2, 'D': -2, 'E': 1}
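
Unlike Dijkstra's algorithm, Bellman-Ford tolerates negative edge weights (as in the example graph above), but its distances are only correct when no negative-weight cycle is reachable from the start. A minimal sketch of the standard extra pass that detects such a cycle is shown below; the name bellman_ford_checked is illustrative.

def bellman_ford_checked(graph, start):
    distances = {vertex: float('inf') for vertex in graph}
    distances[start] = 0

    for _ in range(len(graph) - 1):
        for vertex in graph:
            for neighbor, weight in graph[vertex].items():
                if distances[vertex] + weight < distances[neighbor]:
                    distances[neighbor] = distances[vertex] + weight

    # One extra pass: if any edge can still be relaxed, a negative cycle is reachable.
    for vertex in graph:
        for neighbor, weight in graph[vertex].items():
            if distances[vertex] + weight < distances[neighbor]:
                raise ValueError("Graph contains a negative-weight cycle")

    return distances

# The example graph has negative edges but no negative cycle, so the result is unchanged.
assert bellman_ford_checked(graph, 'A') == {'A': 0, 'B': -1, 'C': 2, 'D': -2, 'E': 1}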

Minimum Spanning Trees

The minimum spanning tree problem asks for a spanning tree of a connected, weighted graph whose total edge weight is as small as possible.

Prim's algorithm

import heapq

def prim(graph, start):
    visited = set()
    edges = [(0, start)]
    min_spanning_tree = []

    while edges:
        weight, vertex = heapq.heappop(edges)
        if vertex not in visited:
            visited.add(vertex)
            # The start vertex has no incoming edge, so it is not part of the tree's edge list.
            if vertex != start:
                min_spanning_tree.append((weight, vertex))
            for neighbor, edge_weight in graph[vertex].items():
                if neighbor not in visited:
                    heapq.heappush(edges, (edge_weight, neighbor))

    return min_spanning_tree

# Example graph
graph = {
    'A': {'B': 2, 'C': 3},
    'B': {'A': 2, 'C': 1, 'D': 1},
    'C': {'A': 3, 'B': 1, 'D': 2},
    'D': {'B': 1, 'C': 2}
}

# Test case
assert prim(graph, 'A') == [(2, 'B'), (1, 'C'), (1, 'D')]
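
The list above records only the weight and the newly added vertex, not which existing vertex it was attached to. A minimal sketch of a variant that also records the attachment point, so the tree edges are explicit, is shown below; the name prim_edges is illustrative.

def prim_edges(graph, start):
    visited = set()
    edges = [(0, None, start)]  # (weight, parent, vertex); the start has no parent
    min_spanning_tree = []

    while edges:
        weight, parent, vertex = heapq.heappop(edges)
        if vertex not in visited:
            visited.add(vertex)
            if parent is not None:
                min_spanning_tree.append((weight, parent, vertex))
            for neighbor, edge_weight in graph[vertex].items():
                if neighbor not in visited:
                    heapq.heappush(edges, (edge_weight, vertex, neighbor))

    return min_spanning_tree

# Same example graph: B attaches to A, then C and D both attach to B.
assert prim_edges(graph, 'A') == [(2, 'A', 'B'), (1, 'B', 'C'), (1, 'B', 'D')]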

Kruskal's algorithm

def kruskal(graph):
    # Union-find (disjoint set) structures used for cycle detection.
    parents = {}
    rank = {}
    for vertex in graph:
        parents[vertex] = vertex
        rank[vertex] = 0

    def find(vertex):
        # Path compression: point each visited vertex directly at its root.
        if parents[vertex] != vertex:
            parents[vertex] = find(parents[vertex])
        return parents[vertex]

    def union(vertex1, vertex2):
        # Union by rank: attach the shallower tree under the deeper one.
        root1 = find(vertex1)
        root2 = find(vertex2)
        if root1 != root2:
            if rank[root1] > rank[root2]:
                parents[root2] = root1
            else:
                parents[root1] = root2
                if rank[root1] == rank[root2]:
                    rank[root2] += 1

    min_spanning_tree = []
    # Each undirected edge appears twice (once per direction); the duplicate is
    # skipped below because both endpoints already share a root by then.
    edges = [(weight, vertex1, vertex2) for vertex1 in graph for vertex2, weight in graph[vertex1].items()]
    edges.sort()
    for weight, vertex1, vertex2 in edges:
        if find(vertex1) != find(vertex2):
            union(vertex1, vertex2)
            min_spanning_tree.append((weight, vertex1, vertex2))

    return min_spanning_tree

# Example graph
graph = {
    'A': {'B': 7, 'D': 5},
    'B': {'A': 7, 'C': 8, 'D': 9, 'E': 7},
    'C': {'B': 8, 'E': 5},
    'D': {'A': 5, 'B': 9, 'E': 15, 'F': 6},
    'E': {'B': 7, 'C': 5, 'D': 15, 'F': 8, 'G': 9},
    'F': {'D': 6, 'E': 8, 'G': 11},
    'G': {'E': 9, 'F': 11}
}

# Test case
assert kruskal(graph) == [(5, 'A', 'D'), (5, 'C', 'E'), (6, 'D', 'F'), (7, 'A', 'B'), (7, 'B', 'E'), (9, 'E', 'G')]
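
As a quick sanity check, the total weight of the tree returned above can be summed directly; for this example graph it comes to 39.

assert sum(weight for weight, _, _ in kruskal(graph)) == 39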