
Graphs: From Basics to Optimization

How to traverse, search, and arrange the data structure behind networks, social media, and maps.


Graphs are everywhere. Your social network is a graph. The internet is a graph. Maps are graphs. A family tree is a graph. Yet, graphs are often explained poorly: drawn as random nodes and squiggly lines, with algorithms presented as mysterious procedures.

In this article, we'll build intuition through interactive visualization. You'll trace algorithms step-by-step, see how different approaches solve the same problem, and understand why graphs matter in computer science.

What we're covering

You'll trace through:

  • What graphs are and the basic terminology
  • Depth-first search (DFS) and breadth-first search (BFS)
  • Shortest path algorithms (Dijkstra's algorithm)
  • Graph layout algorithms and why positioning matters
  • The minimum crossing number problem and optimization

What Is a Graph?

A graph G is a data structure formally defined as a pair of two sets: G = (V, E), where:

  • V is a set of vertices (or nodes), independent entities that represent objects or concepts.
  • E is a set of edges (or links), pairs of vertices representing relationships.

For example, in a social network, each person is a vertex. Each friendship is an edge connecting two vertices. In a road network, each city is a vertex. Each road is an edge.

An edge connecting vertices u and v is denoted (u, v). For a simple graph, there's at most one edge between any pair of distinct vertices, and no self-loops (edges from a vertex to itself).

Directed vs Undirected

Graphs can be directed or undirected:

  • Undirected: Edges go both ways. An undirected edge (u, v) can be traversed from u to v or from v to u. Friendship on Facebook is undirected.
  • Directed: Edges have direction. A directed edge (u, v) only goes from u to v. Following on Twitter is directed: you can follow someone who doesn't follow you back.

An undirected edge (u, v) is equivalent to two directed edges: (u, v) and (v, u). So any directed graph algorithm can be applied to undirected graphs by treating each undirected edge as two directed edges.
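
This equivalence is easy to make concrete. A small sketch (the names are illustrative) that stores each undirected edge as two directed ones:

```javascript
// Build a directed adjacency list from an undirected edge list:
// each undirected edge {u, v} becomes the two directed edges u->v and v->u.
function buildAdjacencyList(vertices, undirectedEdges) {
  const adj = {};
  for (const v of vertices) adj[v] = [];
  for (const [u, v] of undirectedEdges) {
    adj[u].push(v);  // u -> v
    adj[v].push(u);  // v -> u, the mirror edge
  }
  return adj;
}

const adj = buildAdjacencyList(['A', 'B', 'C'], [['A', 'B'], ['B', 'C']]);
// adj.B is ['A', 'C']: B can be traversed to either neighbor
```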

Weighted Graphs

Edges can have weights: numbers representing cost, distance, capacity, or probability. A weighted graph is also called a network.

In a road network, the weight of an edge might be the distance (in km) between two cities, or the travel time in traffic. In a communication network, it might be the bandwidth available. Weights can be positive, negative, or zero, depending on the application.

Graph Structure (interactive visualization: vertices A–D)

Key Properties

Several properties characterize graphs:

  • Degree: The number of edges connected to a vertex. In the graph above, vertex B has degree 3 (connected to A, C, and D). In directed graphs, we distinguish in-degree (incoming edges) and out-degree (outgoing edges).
  • Connected: An undirected graph is connected if there's a path from every vertex to every other vertex. A directed graph is strongly connected if every vertex can reach every other vertex following directed edges.
  • Acyclic: A graph is acyclic if it contains no cycles (sequences of edges that start and end at the same vertex). A directed acyclic graph is called a DAG.

Graph Representations: Adjacency List vs Matrix

When storing a graph, we need to decide how to represent edges. Two main strategies:

Adjacency List: For each vertex, store a list of edges starting from that vertex. This is a dictionary mapping each vertex to its outgoing edges.

adjacencyList = {
  A: [(A, B), (A, C)],
  B: [(B, D)],
  C: [],
  D: []
};

Adjacency Matrix: A 2D array where matrix[i][j] is 1 if there's an edge from vertex i to vertex j, 0 otherwise. (Or the edge weight if weighted.)

adjacencyMatrix = [
  [0, 1, 1, 0],  // A → B, C
  [0, 0, 0, 1],  // B → D
  [0, 0, 0, 0],  // C (no outgoing edges)
  [0, 0, 0, 0]   // D (no outgoing edges)
];

The choice depends on whether your graph is sparse or dense:

  • Sparse graphs: Few edges (|E| ≈ |V|). Examples: social networks (each person has a limited number of friends), road networks. Use adjacency lists: O(V + E) space, less wasted memory.
  • Dense graphs: Many edges (|E| ≈ V²). Examples: complete graphs, heavily connected networks. Use adjacency matrix: O(V²) space, but faster edge lookups and operations.

Performance comparison: Adjacency lists excel at listing neighbors (O(1) per neighbor), but checking if a specific edge exists takes O(degree). Adjacency matrices check edges in O(1) but need O(V²) space regardless of edge count. For most algorithms, adjacency lists are preferred unless you know you have a dense graph.
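
The lookup trade-off is visible in code. A sketch using the four-vertex graph from above (the vertex-to-index mapping for the matrix is an assumption):

```javascript
// Edge-existence check under both representations (A=0, B=1, C=2, D=3)
const adjacencyList = { A: ['B', 'C'], B: ['D'], C: [], D: [] };
const adjacencyMatrix = [
  [0, 1, 1, 0],
  [0, 0, 0, 1],
  [0, 0, 0, 0],
  [0, 0, 0, 0],
];
const index = { A: 0, B: 1, C: 2, D: 3 };

// List: scan u's neighbor array, O(degree of u)
function hasEdgeList(u, v) {
  return adjacencyList[u].includes(v);
}

// Matrix: a single array access, O(1)
function hasEdgeMatrix(u, v) {
  return adjacencyMatrix[index[u]][index[v]] === 1;
}
```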


Finding Routes: BFS and Dijkstra

Now we can tackle the real problem: finding the best route for a delivery.

Imagine a parcel needs to go from a warehouse to a customer's home. The graph represents the road network: each intersection is a vertex, each road is a directed edge (respecting one-way streets). We need the shortest route.

Problem

Given a graph and starting/ending vertices, find the shortest path between them.

The definition of "shortest" matters. In an unweighted graph (where all edges represent one block), shortest means fewest hops. In a weighted graph (where edges have real distances), shortest means minimum total distance.

Breadth-First Search: Minimum Hops

Let's start simple: imagine a perfectly regular city grid (like downtown San Francisco) where all blocks are roughly equal. We can count blocks instead of measuring distance.

Breadth-first search (BFS) finds the path with the fewest hops. It works by exploring outward from the start like a wave: first all neighbors, then all neighbors' neighbors, and so on. When we reach the destination, we've found the shortest path.

Try stepping through BFS on the graph below. Watch how it explores outward in layers:

Breadth-First Search (interactive visualization: vertices A–F)

How BFS Works

BFS uses a queue (FIFO: First In, First Out). The algorithm maintains a "frontier" of vertices to explore, always processing the oldest discovered vertex first.

function bfs(graph, start, goal) {
  const queue = [start];
  const distances = new Map();
  const parents = new Map();

  distances.set(start, 0);
  parents.set(start, null);

  while (queue.length > 0) {
    const v = queue.shift();  // Dequeue the first vertex

    if (v === goal) {
      return { distances, parents };  // Found the goal
    }

    // Explore all neighbors
    for (let edge of graph.adjacencyList[v]) {
      const u = edge.destination;

      if (!distances.has(u)) {
        // First time discovering this vertex
        distances.set(u, distances.get(v) + 1);
        parents.set(u, v);
        queue.push(u);  // Enqueue for later exploration
      }
    }
  }

  return { distances, parents };  // Goal unreachable
}

Why does BFS work? The key insight: BFS explores vertices in order of their distance from the start. Because we always process the nearest unexplored vertices first (thanks to the queue), when we reach the goal, we've taken the shortest path.

The queue naturally preserves this ordering: vertices are added in layers (all at distance 1, then distance 2, etc.). By the time we process a vertex at distance d, all vertices at distance d-1 have already been processed.

  • Time complexity: O(V + E) – visit each vertex once, explore each edge once
  • Space complexity: O(V) – the queue holds at most V vertices
  • When to use: Finding shortest paths in unweighted graphs, exploring layer by layer

Reconstructing the Path

BFS gives us distances and parents, but how do we reconstruct the actual path? We trace backward from the goal to the start using the parents map:

function reconstructPath(parents, destination) {
  if (!parents.has(destination)) {
    return null;  // Unreachable
  }

  const path = [destination];
  let current = destination;

  while (parents.get(current) !== null) {
    current = parents.get(current);
    path.unshift(current);  // Add to front
  }

  return path;  // Returns [start, ..., destination]
}
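
Restated as a standalone check (using `parents.has` rather than comparing against `undefined`), with a parents map like the one BFS would produce on a path A → B → D:

```javascript
// Backtracking from the destination to the start via the parents map
function reconstructPath(parents, destination) {
  if (!parents.has(destination)) return null;  // never discovered
  const path = [destination];
  let current = destination;
  while (parents.get(current) !== null) {
    current = parents.get(current);
    path.unshift(current);  // prepend, so the path reads start -> destination
  }
  return path;
}

const parents = new Map([['A', null], ['B', 'A'], ['D', 'B']]);
const path = reconstructPath(parents, 'D');
// path is ['A', 'B', 'D']
```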

Watch the step-by-step reconstruction of a path using the parents map:

Path Reconstruction (interactive visualization: vertices A–E, tracing the path back from E)

Depth-First Search (DFS)

Depth-first search uses a different strategy: go as deep as possible along one path before backtracking. It's useful for exploring graph structure, detecting cycles, and finding connected components, but not for shortest paths.

function dfs(v, state = { visited: new Set(), inTime: {}, outTime: {}, time: 0 }) {
  state.time++;
  state.inTime[v] = state.time;  // Record when we enter the vertex
  state.visited.add(v);

  for (let edge of adjacencyList[v]) {
    const u = edge.destination;

    if (!state.visited.has(u)) {
      dfs(u, state);  // state is shared, so the clock keeps ticking across calls
    }
  }

  state.time++;
  state.outTime[v] = state.time;  // Record when we leave the vertex

  return state;
}

  • Time complexity: O(V + E) – same as BFS
  • Space complexity: O(V) – recursion stack can go V deep
  • When to use: Detecting cycles, finding connected components, topological sorting, finding strongly connected components

Why Not Use DFS for Shortest Paths?

DFS explores as far as possible along each branch before backtracking. This means it might find a long path to the goal before finding a short one. By the time it returns, it doesn't know if the path is actually shortest.

BFS, conversely, explores systematically by distance. Once it reaches the goal, we know no shorter path exists.
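
A tiny demonstration on an assumed example graph where A has both a direct edge to D and a longer detour through B and C:

```javascript
// A -> D exists directly, but DFS dives into B first
const adjacency = { A: ['B', 'D'], B: ['C'], C: ['D'], D: [] };

// DFS that returns the FIRST path it finds to the goal (not the shortest)
function dfsFirstPath(v, goal, visited = new Set(), path = [v]) {
  if (v === goal) return path;
  visited.add(v);
  for (const u of adjacency[v]) {
    if (!visited.has(u)) {
      const found = dfsFirstPath(u, goal, visited, [...path, u]);
      if (found) return found;
    }
  }
  return null;
}

const dfsPath = dfsFirstPath('A', 'D');
// DFS returns A -> B -> C -> D (3 hops); BFS would find A -> D (1 hop)
```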


Weighted Networks: Dijkstra's Algorithm

BFS works perfectly for unweighted graphs where all edges are equal. But real road networks have varying distances. One block might be 100m, another 500m. We need to handle weighted edges.

Problem

Given a weighted graph with non-negative edge weights, find the path with minimum total weight from start to destination.

In road networks, weight = distance (km). In flight networks, weight = cost. In social networks, weight = trust level. The algorithm is the same; only the interpretation changes.

Dijkstra's algorithm (1959) solves this using a greedy approach: always expand the nearest unvisited vertex next. This ensures optimality because, with non-negative weights, once we visit a vertex, we've found the true shortest path to it.

Dijkstra's Algorithm (interactive visualization: weighted graph with vertices A–E; tentative distances such as A→A = 0, A→B = 4, A→D = 2 update as the algorithm runs)

How Dijkstra Works

  1. Initialize: start vertex has distance 0, all others have distance ∞.
  2. Repeat until all vertices are visited:
    1. Pick the unvisited vertex with the smallest known distance. This is the greedy choice.
    2. For each unvisited neighbor u of this vertex, check: is the path through the current vertex shorter than the previously known path to u?
    3. If yes, update u's distance and remember we came from the current vertex.

function dijkstra(graph, start, goal) {
  const distances = new Map();
  const parents = new Map();
  const visited = new Set();

  // Initialize all vertices
  for (let v of graph.vertices) {
    distances.set(v, v === start ? 0 : Infinity);
    parents.set(v, null);
  }

  while (visited.size < graph.vertices.length) {
    // Find the nearest unvisited vertex
    let current = null;
    let minDist = Infinity;

    for (let v of graph.vertices) {
      if (!visited.has(v) && distances.get(v) < minDist) {
        current = v;
        minDist = distances.get(v);
      }
    }

    if (current === null || minDist === Infinity) break;  // No reachable unvisited vertices
    visited.add(current);

    if (current === goal) break;  // Found the goal

    // Relax edges: update neighbors' distances
    for (let edge of graph.adjacencyList[current]) {
      const neighbor = edge.destination;
      const weight = edge.weight;

      const newDist = distances.get(current) + weight;
      if (newDist < distances.get(neighbor)) {
        distances.set(neighbor, newDist);
        parents.set(neighbor, current);
      }
    }
  }

  return { distances, parents };
}

The key difference from BFS: BFS uses a plain queue and counts hops. Dijkstra conceptually uses a priority queue that always yields the minimum-distance vertex next (the linear scan above plays that role), ensuring vertices are processed in order of their true distance from the start.

  • Time complexity: O((V + E) log V) with a binary heap priority queue; O(V²) with a simple array (acceptable for small graphs)
  • Why it works: Greedy choice is optimal. Once we visit a vertex, its distance is final because all remaining vertices are farther away (non-negative weights).
  • Limitation: Fails with negative weights. (The guarantee breaks: a longer path through a negative edge might actually be shorter.)
  • For negative weights: Use Bellman-Ford algorithm instead (slower: O(VE), but handles negatives).
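
The linear scan above gives the O(V²) variant. For the O((V + E) log V) bound, a priority queue replaces the scan. A self-contained sketch with a minimal binary min-heap and lazy deletion of stale entries (the adjacency format, vertex → [{to, weight}], is an assumption for this example):

```javascript
// Minimal binary min-heap over [distance, vertex] pairs
class MinHeap {
  constructor() { this.a = []; }
  push(item) {
    this.a.push(item);
    let i = this.a.length - 1;
    while (i > 0) {                       // sift up
      const p = (i - 1) >> 1;
      if (this.a[p][0] <= this.a[i][0]) break;
      [this.a[p], this.a[i]] = [this.a[i], this.a[p]];
      i = p;
    }
  }
  pop() {                                 // remove the minimum-distance entry
    const top = this.a[0];
    const last = this.a.pop();
    if (this.a.length > 0) {
      this.a[0] = last;
      let i = 0;
      while (true) {                      // sift down
        const l = 2 * i + 1, r = 2 * i + 2;
        let m = i;
        if (l < this.a.length && this.a[l][0] < this.a[m][0]) m = l;
        if (r < this.a.length && this.a[r][0] < this.a[m][0]) m = r;
        if (m === i) break;
        [this.a[m], this.a[i]] = [this.a[i], this.a[m]];
        i = m;
      }
    }
    return top;
  }
  get size() { return this.a.length; }
}

function dijkstraHeap(adjacency, start) {
  const distances = new Map([[start, 0]]);
  const done = new Set();
  const heap = new MinHeap();
  heap.push([0, start]);
  while (heap.size > 0) {
    const [d, v] = heap.pop();
    if (done.has(v)) continue;            // stale entry: already finalized
    done.add(v);
    for (const { to, weight } of adjacency[v]) {
      const nd = d + weight;
      if (!distances.has(to) || nd < distances.get(to)) {
        distances.set(to, nd);
        heap.push([nd, to]);              // may duplicate; lazy deletion handles it
      }
    }
  }
  return distances;
}

const adjacency = {
  A: [{ to: 'B', weight: 4 }, { to: 'D', weight: 2 }],
  B: [{ to: 'C', weight: 1 }],
  C: [{ to: 'E', weight: 3 }],
  D: [{ to: 'B', weight: 1 }, { to: 'E', weight: 7 }],
  E: [],
};
const dist = dijkstraHeap(adjacency, 'A');
// shortest A -> E: A -> D -> B -> C -> E = 2 + 1 + 1 + 3 = 7
```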

A* Algorithm: Dijkstra with Intuition

Dijkstra explores in all directions equally, potentially visiting many vertices. But if we're navigating from point A to point B on a map, we should prioritize directions toward B.

The A* algorithm adds a heuristic: an estimate of the remaining distance to the goal. It then prioritizes vertices that seem closest to the goal. With a good heuristic, A* visits far fewer vertices than Dijkstra while still finding the optimal path.

Example heuristic: straight-line distance in GPS navigation. Even though you can't fly in a straight line, the heuristic guides the search toward the destination.
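
As a sketch of the idea, here is a compact A* on an assumed example graph with 2D coordinates, using straight-line distance as the heuristic. The open set is scanned linearly for clarity; a production version would use a priority queue:

```javascript
// Assumed example: coordinates and a small weighted graph
const coords = { A: [0, 0], B: [1, 0], C: [2, 0], D: [1, 1] };
const adjacency = {
  A: [{ to: 'B', weight: 1 }, { to: 'D', weight: 2 }],
  B: [{ to: 'C', weight: 1 }],
  D: [{ to: 'C', weight: 2 }],
  C: [],
};

function heuristic(u, v) {  // straight-line (Euclidean) distance
  const dx = coords[u][0] - coords[v][0];
  const dy = coords[u][1] - coords[v][1];
  return Math.sqrt(dx * dx + dy * dy);
}

function aStar(start, goal) {
  const g = new Map([[start, 0]]);  // best known cost from start
  const open = new Set([start]);
  const closed = new Set();
  while (open.size > 0) {
    // pick the open vertex minimizing f = g + h (the A* priority)
    let current = null;
    for (const v of open) {
      if (current === null ||
          g.get(v) + heuristic(v, goal) < g.get(current) + heuristic(current, goal)) {
        current = v;
      }
    }
    if (current === goal) return g.get(goal);
    open.delete(current);
    closed.add(current);
    for (const { to, weight } of adjacency[current]) {
      if (closed.has(to)) continue;
      const tentative = g.get(current) + weight;
      if (!g.has(to) || tentative < g.get(to)) {
        g.set(to, tentative);
        open.add(to);
      }
    }
  }
  return Infinity;  // goal unreachable
}

const cost = aStar('A', 'C');
// A -> B -> C costs 2, cheaper than A -> D -> C at 4
```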

Exploring Edge Weights

Dijkstra's algorithm shows us how weights affect shortest paths. Try adjusting the edge weights below to see how the shortest distances change in real-time. Notice how lowering a weight can redirect traffic through different edges:

Adjust Edge Weights (interactive visualization: weights AB = 4, AD = 2, BC = 1, BD = 5, CE = 3, DE = 6; shortest path A→E = 8)

Graph Visualization: Layout Algorithms

Algorithms let us compute optimal paths through graphs. But humans also need to understand graphs by looking at them. How should we position vertices on a screen?

Problem

Given a graph, arrange its vertices in 2D space to make the structure visually clear.

There's no universal "best" layout. Different graphs need different strategies. A social network needs to show communities clustering. An organizational chart needs hierarchical structure. A circuit diagram needs minimal edge crossings.

Layout Algorithms (interactive visualization: vertices A–F)

Layout Strategies

Random Layout

Place vertices randomly. O(V) time. Fast, but produces tangled, hard-to-read diagrams. Only use when speed matters more than aesthetics.

Circular Layout

Arrange vertices in a circle. O(V) time. Works well for small, symmetric graphs. For larger graphs, many edges still cross, reducing clarity.
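
The circular layout fits in a few lines. A sketch (the names are illustrative) placing vertex i of n at angle 2πi/n:

```javascript
// Evenly space vertices on a circle of the given radius
function circularLayout(vertices, centerX, centerY, radius) {
  return vertices.map((v, i) => {
    const angle = (2 * Math.PI * i) / vertices.length;
    return {
      id: v,
      x: centerX + radius * Math.cos(angle),
      y: centerY + radius * Math.sin(angle),
    };
  });
}

const layout = circularLayout(['A', 'B', 'C', 'D'], 0, 0, 100);
// A lands at (100, 0); C, diametrically opposite, at (-100, 0)
```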

Hierarchical Layout (Sugiyama Algorithm)

Arrange vertices in layers; used for directed acyclic graphs (DAGs). All edges point downward (or in one consistent direction), and a crossing-reduction pass reorders vertices within each layer. Standard for flowcharts, dependency diagrams, and organizational hierarchies.

Force-Directed Layout (Fruchterman-Reingold)

Simulate physics: treat vertices as repelling particles and edges as springs. Vertices push each other apart (repulsion) while edges pull connected vertices together (attraction). The system converges to an equilibrium: a natural, readable layout.

function forceDirectedLayout(graph, iterations = 100) {
  const vertices = graph.vertices;
  const forces = {};

  // Start with random positions
  for (let v of vertices) {
    v.x = Math.random() * 400;
    v.y = Math.random() * 400;
    forces[v.id] = { x: 0, y: 0 };
  }

  for (let iter = 0; iter < iterations; iter++) {
    // Reset forces
    for (let v of vertices) {
      forces[v.id] = { x: 0, y: 0 };
    }

    // Repulsion: vertices push each other away
    for (let i = 0; i < vertices.length; i++) {
      for (let j = i + 1; j < vertices.length; j++) {
        const v1 = vertices[i];
        const v2 = vertices[j];
        const dx = v1.x - v2.x;
        const dy = v1.y - v2.y;
        const distance = Math.sqrt(dx * dx + dy * dy) + 0.01;
        const force = 100 / (distance * distance);  // Inverse square law

        forces[v1.id].x += (dx / distance) * force;
        forces[v1.id].y += (dy / distance) * force;
        forces[v2.id].x -= (dx / distance) * force;
        forces[v2.id].y -= (dy / distance) * force;
      }
    }

    // Attraction: connected vertices pull together
    for (let edge of graph.edges) {
      const v1 = graph.getVertex(edge.from);
      const v2 = graph.getVertex(edge.to);
      const dx = v2.x - v1.x;
      const dy = v2.y - v1.y;
      const distance = Math.sqrt(dx * dx + dy * dy) + 0.01;
      const force = distance * 0.1;  // Spring-like: grows with distance

      forces[v1.id].x += (dx / distance) * force;
      forces[v1.id].y += (dy / distance) * force;
      forces[v2.id].x -= (dx / distance) * force;
      forces[v2.id].y -= (dy / distance) * force;
    }

    // Apply forces, damping velocities so the system converges
    const damping = 0.99;
    for (let v of vertices) {
      v.vx = (v.vx || 0) * damping + forces[v.id].x * 0.01;
      v.vy = (v.vy || 0) * damping + forces[v.id].y * 0.01;
      v.x += v.vx;
      v.y += v.vy;
    }
  }

  return vertices;
}

  • Time: O(V²) per iteration (all-pairs repulsion). Usually converges in 50-200 iterations.
  • Result: Natural-looking, readable layouts. Clusters emerge naturally.
  • Best for: General-purpose graph visualization, network analysis, social network exploration.

Planar Graphs and Minimum Crossing Number

We've seen how to arrange vertices aesthetically (force-directed layout). But visual clarity also depends on one specific thing: edge crossings. Every crossing makes a diagram harder to read.

Problem

Given a graph, find a vertex arrangement that minimizes the number of edge crossings.

Planarity

A graph is planar if it can be drawn in 2D with zero crossings. Formally, a planar graph is one that admits a planar embedding: a one-to-one correspondence to a plane graph in which vertices are points in ℝ² (the 2D plane) and edges are non-intersecting curves (typically straight line segments, or more generally simple arcs).

Key requirements for a valid plane graph embedding:

  • Vertices in 2D: Each vertex is a point in the Euclidean plane.
  • Non-intersecting edges: Edges are curves between vertices that don't cross except at endpoints.
  • No edge-vertex overlaps: An edge cannot pass through or touch a vertex other than its endpoints.

But can any graph be drawn this way? The answer is no. Some graphs are fundamentally non-planar: no arrangement of vertices can eliminate all edge crossings. The result that characterizes them is Kuratowski's theorem.

Kuratowski's Theorem and Forbidden Minors

Kuratowski's theorem states: A graph is planar if and only if it contains no subgraph that is a subdivision of K₅ or K₃,₃.

In other words, K₅ and K₃,₃ are the forbidden graphs for planarity: they are the simplest (fewest vertices/edges) non-planar graphs, and every other non-planar graph contains a subdivision of one of them. (Wagner's closely related theorem restates this in terms of graph minors.) This is a remarkable result: despite infinitely many possible graphs, all non-planarity can be traced back to just two base graphs.

Algorithms like Hopcroft-Tarjan can test planarity in O(V) time and, for planar graphs, compute a planar embedding (a zero-crossing drawing) automatically.

Non-Planar Graphs (interactive visualization: vertices 1–5)

Complete and Bipartite Graphs

To understand non-planar graphs, we need to study the two simplest non-planar graphs mentioned by Kuratowski.

A complete graph Kn is one where every vertex is connected to every other vertex. With n vertices, Kn has exactly n(n-1)/2 edges, the maximum possible for a simple graph. Complete graphs are dense by definition.

Complete graphs grow in complexity quickly:

  • K₃ (3 vertices, 3 edges): Planar; it's just a triangle.
  • K₄ (4 vertices, 6 edges): Planar; draw it as a triangle with the fourth vertex inside, connected to all three corners.
  • K₅ (5 vertices, 10 edges): Non-planar; the first complete graph that cannot be drawn without edge crossings.

A complete bipartite graph Km,n is a graph where vertices are partitioned into two groups (of sizes m and n), and every vertex in group A is connected to every vertex in group B, but vertices within the same group are never connected. Km,n has exactly m × n edges.

Similar to complete graphs:

  • K₂,₂: Planar (4 vertices, 4 edges, forms a cycle).
  • K₂,₃ and K₃,₂: Planar.
  • K₃,₃ (two groups of 3, 9 edges total): Non-planar; the second Kuratowski graph.

Why are K₅ and K₃,₃ special? They are the smallest non-planar graphs. Any graph obtained by removing vertices or edges from K₅ is planar, and the same holds for K₃,₃. This minimality is why Kuratowski singled them out: they are the atoms of non-planarity.

Testing for Planarity: Euler's Constraints

While Kuratowski's theorem is elegant, checking for K₅ and K₃,₃ subgraphs can be expensive. A faster necessary condition for planarity (though not sufficient) comes from Euler's work on graphs:

For any simple, connected planar graph with V vertices and E edges:

  • If V ≥ 3: E ≤ 3V - 6
  • If V ≥ 3 and the graph has no cycles of length 3: E ≤ 2V - 4

These inequalities are necessary but not sufficient: if a graph violates them, it is definitely non-planar, but satisfying them doesn't guarantee planarity. Still, they provide a quick O(1) test to rule out planarity early: if a graph has too many edges for its vertex count, it must be non-planar.

Example: K₅ has 5 vertices and 10 edges. Check: 10 ≤ 3(5) - 6 = 9? No, 10 > 9, so K₅ violates Euler's constraint and must be non-planar. Similarly, K₃,₃ has 6 vertices and 9 edges: 9 ≤ 2(6) - 4 = 8? No, so it's also caught by the second constraint.
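
These bounds make a one-line pre-check. A sketch (the function name is illustrative); returning false means "definitely non-planar", while true means only "not ruled out":

```javascript
// Necessary conditions for planarity from Euler's formula
function passesEulerCheck(V, E, triangleFree = false) {
  if (V < 3) return true;                 // the bounds only apply for V >= 3
  if (triangleFree) return E <= 2 * V - 4;
  return E <= 3 * V - 6;
}

// K5: 5 vertices, 10 edges, has triangles -> fails the check
// K3,3: 6 vertices, 9 edges, bipartite so triangle-free -> fails the check
// K4: 4 vertices, 6 edges -> passes (and is in fact planar)
```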

Non-Planar Graphs: The Optimization Problem

Most real graphs are non-planar: they require at least some crossings. The question becomes: how many crossings are unavoidable, and can we find an arrangement that achieves that minimum?

This is the minimum crossing number problem: given a graph, find the layout with the fewest edge crossings. Unfortunately, this problem is NP-hard: no efficient exact algorithm is known.

Crossing Minimization (interactive visualization: crossings decreasing over iterations)

Crossing Number Definitions

The crossing number cr(G) of a graph G is the minimum number of edge crossings in any 2D embedding of G. For planar graphs, cr(G) = 0. For non-planar graphs, cr(G) > 0.

Example crossing numbers:

  • K₅: cr(K₅) = 1 (one crossing unavoidable, as shown in the visualization)
  • K₃,₃: cr(K₃,₃) = 1 (also one crossing unavoidable)
  • K₆: cr(K₆) = 3 (optimal layout requires exactly 3 crossings)

Determining the exact crossing number is NP-hard in general, but researchers have developed formulas for special cases. Guy's conjecture (1960) proposes that the crossing number of the complete graph Kn is:

cr(Kn) = ⌊n/2⌋ × ⌊(n-1)/2⌋ × ⌊(n-2)/2⌋ × ⌊(n-3)/2⌋ / 4

Similarly, Zarankiewicz's conjecture (1954) proposes the crossing number for complete bipartite graphs Km,n:

cr(Km,n) = ⌊m/2⌋ × ⌊(m-1)/2⌋ × ⌊n/2⌋ × ⌊(n-1)/2⌋

Both formulas have been proven as upper bounds (the crossing number is at most these values), and they match known values for small cases. They remain unproven as lower bounds; that is, we don't yet know whether they give the exact minimum in general.
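
Both conjectured formulas are easy to evaluate directly. A quick sketch (the function names are illustrative):

```javascript
const fl = Math.floor;

// Guy's conjectured cr(Kn)
function guyConjecture(n) {
  return (fl(n / 2) * fl((n - 1) / 2) * fl((n - 2) / 2) * fl((n - 3) / 2)) / 4;
}

// Zarankiewicz's conjectured cr(Km,n)
function zarankiewicz(m, n) {
  return fl(m / 2) * fl((m - 1) / 2) * fl(n / 2) * fl((n - 1) / 2);
}

// Matches the known small cases: cr(K5) = 1, cr(K6) = 3, cr(K3,3) = 1
```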

Rectilinear vs. Curved Edges

All our examples so far have used straight-line (rectilinear) edges. But an important question arises: can using curved edges reduce crossings?

The answer is yes. The rectilinear crossing number (using only straight lines) can be larger than the crossing number (allowing curves). For instance, there exist graphs where cr(G) = 4 but the minimum number of crossings using only straight-line edges is 12.

However, this comes with trade-offs:

  • Simplicity: Straight-line edges are computationally simpler to handle and test for intersections.
  • Visual clarity: Curves add flexibility but increase visual complexity.
  • Search space: Finding optimal curves requires optimizing many more parameters: vertex positions contribute O(V) of them, while curve control points add O(E) more, a significantly larger search space.

For practical purposes, the advantage of curves is often outweighed by increased computational cost. Most graph visualization systems prefer straight-line layouts unless dealing with very dense, highly non-planar graphs where curves noticeably reduce clutter.
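
Counting crossings in a straight-line layout is itself a useful primitive: it is the objective a crossing-minimization optimizer evaluates. A self-contained sketch using the standard orientation test (the example layout is illustrative; degenerate collinear cases are ignored):

```javascript
// Sign of the cross product (q - p) x (r - p): which side of pq is r on?
function orientation(p, q, r) {
  return Math.sign((q.x - p.x) * (r.y - p.y) - (q.y - p.y) * (r.x - p.x));
}

// Proper intersection of segments ab and cd (collinear touching ignored)
function segmentsCross(a, b, c, d) {
  return orientation(a, b, c) !== orientation(a, b, d) &&
         orientation(c, d, a) !== orientation(c, d, b);
}

function countCrossings(positions, edges) {
  let crossings = 0;
  for (let i = 0; i < edges.length; i++) {
    for (let j = i + 1; j < edges.length; j++) {
      const [a, b] = edges[i], [c, d] = edges[j];
      if (a === c || a === d || b === c || b === d) continue;  // shared endpoint
      if (segmentsCross(positions[a], positions[b], positions[c], positions[d])) {
        crossings++;
      }
    }
  }
  return crossings;
}

// An X shape: two edges that cross exactly once
const positions = {
  A: { x: 0, y: 0 }, B: { x: 2, y: 2 },
  C: { x: 0, y: 2 }, D: { x: 2, y: 0 },
};
const n = countCrossings(positions, [['A', 'B'], ['C', 'D']]);
// n is 1
```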

Why Minimize Crossings?

  • Readability: Each crossing is a visual distraction. Fewer crossings mean clearer diagrams. Studies show humans understand graphs with fewer crossings faster and more accurately.
  • VLSI Design: In circuit design, crossing edges must be on different layers. More crossings mean more layers, which increases cost and complexity.
  • Network Analysis: In social networks, biological networks, or citation networks, fewer crossings reveal structure (communities, hierarchies, bottlenecks).
  • Knowledge Graphs & UML Diagrams: The difference between a readable and incomprehensible diagram is often just the number of crossings.

Approximation Algorithms

Since exact algorithms are intractable, we use approximations that trade quality for speed:

  • Heuristics (Fast): "Keep central vertices in the middle" or "circular arrangement with chord minimization." O(V log V) or O(V²), but local-optimum quality.
  • Gradient Descent (Medium): Iteratively move vertices in the direction that reduces crossings. O(V²) per iteration, converges to a local minimum.
  • Simulated Annealing (Good): Occasionally accept worse layouts with decreasing probability, escaping local minima. O(V²) per iteration, better quality than gradient descent.
  • Genetic Algorithms (Good): Treat layouts as chromosomes. Breed good solutions and mutate them. Often converges faster than simulated annealing for large graphs.

In practice, the choice depends on your graph size and how much time you can spend:

  • Real-time visualization? Use a fast heuristic.
  • Publication-quality diagrams? Run simulated annealing overnight.
  • Huge graphs? Genetic algorithms often scale better.

Topological Sorting for DAGs

If your graph is a directed acyclic graph (DAG), you get a strong head start: topological sorting yields a layering in which every edge points in one consistent direction, so crossings only need to be reduced within layers (and a planar DAG can reach zero crossings this way).

Topological sort orders vertices so that every edge goes from an earlier vertex to a later one. Combined with hierarchical layout (placing vertices at levels), it produces the clean, layered drawings used for flowcharts and dependency diagrams.

DFS can compute a topological sort in O(V + E) time by ordering vertices by decreasing out-time (finish time).
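
The out-time trick can be sketched directly: push each vertex when its DFS call finishes, then reverse (the example DAG is illustrative):

```javascript
// Topological sort via DFS finish order
function topologicalSort(adjacency) {
  const visited = new Set();
  const order = [];
  function visit(v) {
    visited.add(v);
    for (const u of adjacency[v]) {
      if (!visited.has(u)) visit(u);
    }
    order.push(v);  // v finishes only after all its descendants
  }
  for (const v of Object.keys(adjacency)) {
    if (!visited.has(v)) visit(v);
  }
  return order.reverse();  // decreasing out-time = topological order
}

// A DAG: A -> B -> D and A -> C -> D
const order = topologicalSort({ A: ['B', 'C'], B: ['D'], C: ['D'], D: [] });
// 'A' comes first; 'D' comes last
```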


Putting It Together: From Delivery to Optimization

We started with a simple problem: finding the shortest delivery route. Along the way, we discovered an entire landscape of graph algorithms:

  • BFS – For unweighted graphs, find shortest paths (fewest hops) in O(V + E) time
  • DFS – For structure analysis: find connected components, detect cycles, topological sort in O(V + E) time
  • Dijkstra – For weighted graphs, find shortest paths with cost/distance in O((V + E) log V) time
  • Layout algorithms – Make graphs human-readable: force-directed for general graphs, hierarchical for DAGs, circular for small graphs
  • Optimization algorithms – For hard problems like minimum crossing number, use heuristics (fast, okay quality) or metaheuristics (slow, better quality)

The Real Journey

The progression reflects real-world complexity:

  1. Simple grid city: All blocks equal → BFS counts blocks → deliver via shortest hop-path
  2. Real city with varied streets: Blocks have different lengths → Dijkstra weighs distances → deliver via shortest distance
  3. Visualizing the network: Humans need to understand the routes → layout algorithms arrange vertices → force-directed shows structure
  4. Circuit design problem: Crossings are costly → minimize crossing number → use optimization algorithms

Choosing the Right Algorithm

The art of algorithm design is recognizing your problem and choosing accordingly:

  • Social network: Need fast traversal? DFS. Find friends of friends? BFS. Mutual friendships? Strongly connected components.
  • Navigation app: Shortest route? Dijkstra. Fastest route with turn penalties? Weighted Dijkstra. Route to nearest restaurant? BFS from all restaurants.
  • Circuit layout: Can it be drawn without crossings? Test planarity. If not, optimize crossing number.
  • Protein interactions: Visualize clusters? Force-directed layout. Find critical pathways? Dijkstra. Detect feedback loops? DFS cycle detection.

Next steps: Implement Dijkstra's algorithm and test it on a real road network. Use a force-directed layout to visualize a social network. Or tackle minimum crossing number on a circuit diagram. The best way to internalize graphs is to build with them.