Graphs, Vectors, and Matrices
Daniel A. Spielman, Yale University
AMS Josiah Willard Gibbs Lecture, January 6, 2016

From Applied to Pure Mathematics
Algebraic and Spectral Graph Theory
Sparsification: approximating graphs by graphs with fewer edges
The Kadison-Singer problem

A Social Network Graph
“vertex” or “node”; “edge” = a pair of nodes

A Big Social Network Graph

A Graph G = (V, E)
V = vertices, E = edges (pairs of vertices)
(figure: an example graph on vertices 1–12)

The Graph of a Mesh

Examples of Graphs
(figures: example graphs on 12 vertices)

How to understand large-scale structure
Draw the graph
Identify communities and hierarchical structure
Use physical metaphors: edges as resistors or rubber bands
Examine processes: diffusion of gas / random walks

The Laplacian quadratic form of G
x^T L_G x = sum over edges (a,b) of (x(a) − x(b))^2
(figure: an example assignment of values 0, 0.5, 1, 1.5 to the vertices)

The Laplacian matrix of G
L_G is the symmetric matrix satisfying x^T L_G x = sum_{(a,b) in E} (x(a) − x(b))^2; concretely, L_G = D_G − A_G, the degree matrix minus the adjacency matrix.
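A minimal sketch of the construction (my own Python, not from the lecture, with a hypothetical example graph): build L = D − A and check that x^T L x equals the sum of squared differences across the edges.

```python
import numpy as np

# Hypothetical example graph on 4 vertices.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

A = np.zeros((n, n))
for a, b in edges:
    A[a, b] = A[b, a] = 1.0
D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # Laplacian matrix

x = np.array([1.0, 0.5, 1.5, 0.0])  # arbitrary values on the vertices
quad = x @ L @ x
by_edges = sum((x[a] - x[b]) ** 2 for a, b in edges)
assert np.isclose(quad, by_edges)   # the two expressions agree
```

Since every term (x(a) − x(b))^2 is non-negative, the same identity shows L_G is positive semidefinite.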

Graphs as Resistor Networks
View edges as resistors connecting vertices.
Apply voltages at some vertices. Measure induced voltages and current flow.
(figure: 1 V applied at one vertex, 0 V at another)

Graphs as Resistor Networks
Induced voltages minimize x^T L_G x subject to the fixed boundary voltages.
(figure: with 1 V and 0 V applied, induced voltages such as 0.375 V, 0.5 V, 0.625 V appear at the interior vertices)

Graphs as Resistor Networks
Induced voltages minimize x^T L_G x subject to constraints.
Effective conductance = the current that flows when one volt is applied between the terminals = the minimum value of the quadratic form.
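A sketch of the computation (my own illustration): because induced voltages minimize x^T L x subject to the boundary values, the interior voltages solve the linear system L_II x_I = −L_IB x_B. The example below is a hypothetical path graph 0-1-2-3 of unit resistors with 1 V at one end and 0 V at the other.

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3)]   # three unit resistors in series
n = 4
L = np.zeros((n, n))
for a, b in edges:
    L[a, a] += 1; L[b, b] += 1
    L[a, b] -= 1; L[b, a] -= 1

boundary, interior = [0, 3], [1, 2]
x_B = np.array([1.0, 0.0])                    # fixed voltages: 1 V and 0 V
L_II = L[np.ix_(interior, interior)]
L_IB = L[np.ix_(interior, boundary)]
x_I = np.linalg.solve(L_II, -L_IB @ x_B)      # induced interior voltages
# Voltage drops linearly along the path: 1, 2/3, 1/3, 0.

x = np.zeros(n)
x[boundary], x[interior] = x_B, x_I
eff_cond = x @ L @ x   # minimum of the quadratic form = effective conductance
# Three unit resistors in series have effective conductance 1/3.
```

The grounded system L_II is positive definite, so the minimizer is unique.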

Weighted Graphs
Each edge (a,b) is assigned a non-negative real weight w_{a,b} measuring the strength of the connection (= 1/resistance).

Spectral Graph Drawing (Hall ’70)
Want to map the vertices to the real line, x : V → R, with most edges short.
Edges are drawn as curves for visibility.

Spectral Graph Drawing (Hall ’70)
Want to map the vertices to the real line with most edges short:
minimize x^T L_G x = sum_{(a,b) in E} (x(a) − x(b))^2.
To fix the scale, require ||x|| = 1.

Courant-Fischer Theorem
min_{||x|| = 1} x^T L_G x = λ_1, where λ_1 is the smallest eigenvalue of L_G and the minimizing x is the corresponding eigenvector.
For L_G, λ_1 = 0 and the corresponding eigenvector is a constant vector.
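A quick numerical check of these facts (my own code, on a hypothetical example graph): the smallest Laplacian eigenvalue is 0 with a constant eigenvector, and by Courant-Fischer the Rayleigh quotient of any x orthogonal to the constant vector is at least λ_2.

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4
L = np.zeros((n, n))
for a, b in edges:
    L[a, a] += 1; L[b, b] += 1
    L[a, b] -= 1; L[b, a] -= 1

vals, vecs = np.linalg.eigh(L)        # eigenvalues in increasing order
assert np.isclose(vals[0], 0.0)       # λ_1 = 0
v1 = vecs[:, 0]
assert np.allclose(v1, v1.mean())     # its eigenvector is constant

lam2 = vals[1]
rng = np.random.default_rng(0)
x = rng.standard_normal(n)
x -= x.mean()                         # project onto the space ⟂ 1
assert (x @ L @ x) / (x @ x) >= lam2 - 1e-9   # Rayleigh quotient ≥ λ_2
```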

Spectral Graph Drawing (Hall ’70)
Minimize x^T L_G x such that ||x|| = 1 and x ⟂ 1 (ruling out the constant vector).
Courant-Fischer Theorem: the solution is ψ_2, the eigenvector of the second-smallest eigenvalue λ_2.
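A sketch of Hall's drawing in code (my own Python): embed the n-cycle using the eigenvectors of the second- and third-smallest Laplacian eigenvalues. For a cycle the resulting picture is, up to rotation, a regular polygon, so every vertex lands at the same distance from the origin.

```python
import numpy as np

n = 10                          # the n-cycle
L = np.zeros((n, n))
for a in range(n):
    b = (a + 1) % n
    L[a, a] += 1; L[b, b] += 1
    L[a, b] -= 1; L[b, a] -= 1

vals, vecs = np.linalg.eigh(L)
psi2, psi3 = vecs[:, 1], vecs[:, 2]      # coordinates of the drawing
coords = np.column_stack([psi2, psi3])

# The drawing is a circle: all radii equal sqrt(2/n).
radii = np.linalg.norm(coords, axis=1)
assert np.allclose(radii, np.sqrt(2.0 / n))
```

For the cycle, λ_2 = λ_3 and the eigenspace is spanned by sampled cosine and sine, which is why any orthonormal basis of it gives constant radii.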

Spectral Graph Drawing (Hall ’70)
(figure: x^T L_G x = the area under the blue curves drawn along the edges)

Space the points evenly

And, move them to the circle

Finish by putting me back in the center

Spectral Graph Drawing (Hall ’70)
Now map each vertex a to a point (x(a), y(a)) in R^2 with most edges short:
minimize x^T L_G x + y^T L_G y
such that ||x|| = ||y|| = 1, x ⟂ 1, y ⟂ 1, and x ⟂ y, to prevent the drawing from collapsing onto a line.
Courant-Fischer Theorem: the solution is (ψ_2, ψ_3), up to rotation.

Spectral Graph Drawing (Hall ’70)
(figures: an arbitrary drawing and the spectral drawing of the same 9-vertex graph)

Spectral Graph Drawing (Hall ’70)
(figures: original drawing vs. spectral drawing)

Dodecahedron
Best embedded by the first three eigenvectors.

Spectral drawing of the Erdos graph: an edge between co-authors of papers.

When there is a “nice” drawing:
Most edges are short.
Vertices are spread out and don’t clump too much.
Then λ_2 is close to 0.
When λ_2 is big, we say there is no nice picture of the graph.

Expanders: when λ_2 is big
Formally: infinite families of graphs of constant degree d and large λ_2.
Examples: random d-regular graphs, Ramanujan graphs.
They have no communities or clusters.
Incredibly useful in Computer Science:
they act like random graphs (pseudo-random),
and are used in many important theorems and algorithms.

Good Expander Graphs
d-regular graphs with all nonzero Laplacian eigenvalues close to d.
Courant-Fischer: x^T L_G x ≈ d ||x||^2 for all x ⟂ 1.
For the complete graph K_n on n vertices, every nonzero eigenvalue equals n, so x^T L_{K_n} x = n ||x||^2 for all x ⟂ 1.
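A check of the complete-graph calculation (my own code): the Laplacian of K_n is L = n·I − J, so its eigenvalues are 0 (on the constant vector) and n (with multiplicity n − 1), and the quadratic form is exactly n·||x||^2 on the space orthogonal to the constant vector.

```python
import numpy as np

n = 8
J = np.ones((n, n))
L = n * np.eye(n) - J               # Laplacian of the complete graph K_n
vals = np.sort(np.linalg.eigvalsh(L))
assert np.isclose(vals[0], 0.0)
assert np.allclose(vals[1:], n)     # all nonzero eigenvalues equal n

# Hence x^T L x = n * ||x||^2 for every x ⟂ 1:
rng = np.random.default_rng(1)
x = rng.standard_normal(n)
x -= x.mean()
assert np.isclose(x @ L @ x, n * (x @ x))
```

In this sense K_n is the ideal expander; the catch is that it is dense, which is exactly what sparsification addresses next.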

(S-Teng ’04) Sparse Approximations of Graphs
A graph H is a sparse approximation of G if H has few edges and L_H ≈_ε L_G.
Few: the number of edges in H is O(n log n) or even O(n), where n = |V|.
L_H ≈_ε L_G if (1 − ε) x^T L_G x ≤ x^T L_H x ≤ (1 + ε) x^T L_G x for all x.

Why we sparsify graphs
To save memory when storing graphs.
To speed up algorithms:
flow problems in graphs (Benczur-Karger ’96),
linear equations in Laplacians (S-Teng ’04).

Graph Sparsification Theorems
For every ε > 0 and every G, there is an H ≈_ε G with O(n/ε^2) edges (Batson-S-Srivastava ’09).
By careful random sampling, one can quickly get O(n log n / ε^2) edges (S-Srivastava ’08).

Laplacian Matrices
L_G = sum over edges (a,b) of w_{a,b} (e_a − e_b)(e_a − e_b)^T:
a sum of outer products of vectors, one per edge.

Matrix Sparsification
Given vectors v_1, …, v_m with A = sum_i v_i v_i^T,
find a small subset S and scalars c_i > 0 with B = sum_{i in S} c_i v_i v_i^T ≈ A:
a subset of the vectors, scaled up.

Simplification of Matrix Sparsification
A ≈_ε B is equivalent to I ≈_ε A^{−1/2} B A^{−1/2}.
Set u_i = A^{−1/2} v_i. Then sum_i u_i u_i^T = I.
We need sum_{i in S} c_i u_i u_i^T ≈_ε I.
“Decomposition of the identity” / “Parseval frame” / “Isotropic position”
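The reduction above can be sketched in a few lines (my own code, with randomly generated vectors): whitening arbitrary vectors by A^{−1/2} puts them into isotropic position, so their outer products sum to the identity.

```python
import numpy as np

rng = np.random.default_rng(2)
m, d = 40, 5
V = rng.standard_normal((m, d))          # rows are the vectors v_i
A = V.T @ V                              # A = sum_i v_i v_i^T

# A^{-1/2} via the eigendecomposition of the positive-definite A.
w, Q = np.linalg.eigh(A)
A_inv_sqrt = Q @ np.diag(w ** -0.5) @ Q.T
U = V @ A_inv_sqrt                       # rows are u_i = A^{-1/2} v_i

assert np.allclose(U.T @ U, np.eye(d))   # decomposition of the identity
```

The identity holds because U^T U = A^{−1/2} (V^T V) A^{−1/2} = A^{−1/2} A A^{−1/2} = I.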

Matrix Sparsification by Sampling (Rudelson ’99, Ahlswede-Winter ’02, Tropp ’11)
For sum_i u_i u_i^T = I:
choose each i with probability p_i proportional to ||u_i||^2; if i is chosen, set c_i = 1/p_i.
(For a graph edge, ||u_i||^2 is the weight times the effective resistance, i.e. the weight over the effective conductance.)
With high probability, this chooses O(n log n / ε^2) vectors and sum_{i in S} c_i u_i u_i^T ≈_ε I.
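A sketch of the sampling scheme (my own code; the oversampling constant is a hypothetical choice, not a value from the lecture): keep vector i with probability proportional to ||u_i||^2 and rescale the kept vectors by 1/√p_i, so the kept sum approximates the identity with far fewer vectors.

```python
import numpy as np

rng = np.random.default_rng(3)
m, d = 4000, 4
V = rng.standard_normal((m, d))
w, Q = np.linalg.eigh(V.T @ V)
U = V @ (Q @ np.diag(w ** -0.5) @ Q.T)     # isotropic: U.T @ U = I

oversample = 60.0                          # hypothetical constant
p = np.minimum(1.0, oversample * (U ** 2).sum(axis=1))
keep = rng.random(m) < p
kept = U[keep] / np.sqrt(p[keep, None])    # each kept u_i scaled by 1/sqrt(p_i)
B = kept.T @ kept                          # = sum_{i in S} (1/p_i) u_i u_i^T

# Far fewer vectors than m, yet B approximates the identity.
assert keep.sum() < m / 4
assert np.linalg.norm(B - np.eye(d), 2) < 0.6
```

The expected number of kept vectors is at most oversample × d, since the ||u_i||^2 sum to d; the concentration behind the ≈_ε guarantee is exactly the matrix Chernoff bounds cited on the slide.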

Optimal(?) Matrix Sparsification (Batson-S-Srivastava ’09)
For u_1, …, u_m with sum_i u_i u_i^T = I:
one can choose O(n/ε^2) vectors and nonzero values for the c_i so that sum_{i in S} c_i u_i u_i^T ≈_ε I.
What are the c_i!?

The Kadison-Singer Problem ’59
Equivalent to:
Anderson’s Paving Conjectures (’79, ’81)
Bourgain-Tzafriri Conjecture (’91)
Feichtinger Conjecture (’05)
Many others
Implied by:
Weaver’s KS_2 conjecture (’04)

Weaver’s Conjecture: Isotropic vectors
Vectors v_1, …, v_m are isotropic if sum_i v_i v_i^T = I;
equivalently, sum_i <v_i, x>^2 = 1 for every unit vector x.

Partition into approximately ½-Isotropic Sets
Want a partition into S_1 and S_2 with sum_{i in S_j} <v_i, x>^2 ≈ 1/2 for every unit vector x,
because sum_{i in S_1} <v_i, x>^2 + sum_{i in S_2} <v_i, x>^2 = 1.

Big vectors make this difficult
For example, if some v_i is itself a unit vector, then whichever set contains it already has sum at least 1 in the direction x = v_i.

Weaver’s Conjecture KS_2
There exist positive constants α and ε so that if all ||v_i||^2 ≤ α and sum_i v_i v_i^T = I,
then there exists a partition into S_1 and S_2 with ||sum_{i in S_j} v_i v_i^T|| ≤ 1 − ε for j = 1, 2.

Theorem (Marcus-S-Srivastava ’15)
For all α > 0: if all ||v_i||^2 ≤ α and sum_i v_i v_i^T = I,
then there exists a partition into S_1 and S_2 with ||sum_{i in S_j} v_i v_i^T|| ≤ (1 + √(2α))^2 / 2 = 1/2 + O(√α).
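A toy sanity check (my own example, not from the lecture): for a small isotropic set with ||v_i||^2 = 1/2, brute-force search over partitions finds one where both halves have operator norm 1/2, the ideal split the theorem aims for.

```python
import numpy as np
from itertools import product

# Four vectors in R^2 with sum v v^T = I and ||v||^2 = 1/2 each.
vs = [np.array(v) / np.sqrt(2.0) for v in [(1, 0), (1, 0), (0, 1), (0, 1)]]
assert np.allclose(sum(np.outer(v, v) for v in vs), np.eye(2))

best = 2.0
for signs in product([0, 1], repeat=len(vs)):
    if 0 < sum(signs) < len(vs):              # skip the empty partitions
        halves = [sum(np.outer(v, v) for v, s in zip(vs, signs) if s == j)
                  for j in (0, 1)]
        best = min(best, max(np.linalg.norm(h, 2) for h in halves))

print(best)  # the balanced split gives I/2 on each side: best = 0.5
```

Of course the content of the theorem is that such a partition exists for every isotropic set of small vectors, without any search.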

We want the largest eigenvalue of each sum_{i in S_j} v_i v_i^T to be at most 1/2 + O(√α).
Consider the expected characteristic polynomial of a random partition.

Proof Outline
1. Prove the expected characteristic polynomial has real roots.
2. Prove its largest root is at most 1/2 + O(√α).
3. Prove the polynomials form an interlacing family, so there exists a partition whose polynomial has largest root at most that of the expected polynomial.

Interlacing
Polynomial q interlaces p if their roots alternate:
λ_n ≤ μ_{n−1} ≤ λ_{n−1} ≤ … ≤ μ_1 ≤ λ_1, where p has roots λ_i and q has roots μ_i.
Example: p'(x) interlaces p(x) for every real-rooted p.

Common Interlacing
p_1 and p_2 have a common interlacing if one can partition the line into intervals so that each interval contains one root from each polynomial.
(figure: the roots of the two polynomials paired off into intervals)

Common Interlacing
If p_1 and p_2 have a common interlacing, then for some i the largest root of p_i is at most the largest root of the average (p_1 + p_2)/2.

Without a common interlacing
(figures: without a common interlacing the conclusion can fail; for instance, the average of two real-rooted polynomials need not be real rooted)

Common Interlacing
If p_1 and p_2 have a common interlacing, then for some i the largest root of p_i is at most the largest root of the average.

Common Interlacing
p_1 and p_2 have a common interlacing iff t p_1 + (1 − t) p_2 is real rooted for all 0 ≤ t ≤ 1.
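The characterization can be illustrated numerically (my own examples): a pair of quadratics with a common interlacing has every convex combination real rooted, while a pair without one fails at some t.

```python
import numpy as np

def all_combos_real_rooted(c1, c2, steps=101):
    """c1, c2: polynomial coefficient arrays, highest degree first."""
    for t in np.linspace(0.0, 1.0, steps):
        roots = np.roots(t * c1 + (1 - t) * c2)
        if np.abs(roots.imag).max() > 1e-8:
            return False
    return True

# Roots {0, 2} and {1, 3}: these have a common interlacing.
p1 = np.array([1.0, -2.0, 0.0])    # x(x - 2)
p2 = np.array([1.0, -4.0, 3.0])    # (x - 1)(x - 3)
assert all_combos_real_rooted(p1, p2)

# Roots {0, 1} and {2, 3}: no common interlacing; at t = 1/2 the
# average x^2 - 3x + 3 has negative discriminant, hence complex roots.
q1 = np.array([1.0, -1.0, 0.0])    # x(x - 1)
q2 = np.array([1.0, -5.0, 6.0])    # (x - 2)(x - 3)
assert not all_combos_real_rooted(q1, q2)
```

In the second pair, any interval containing the root 0 of q1 and a root of q2 would have to contain the root 1 as well, which is why no common interlacing exists.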

Interlacing Family of Polynomials
{p_s} is an interlacing family if its members can be placed on the leaves of a tree so that, when every internal node is labeled with the average of the leaves below it, siblings have common interlacings.

Interlacing Family of Polynomials
Theorem: for an interlacing family there is an s so that the largest root of p_s is at most the largest root of the average of the family.

Our family is interlacing
Form the other polynomials in the tree by fixing the choices of where some of the vectors go.

Summary
1. Prove the expected characteristic polynomial has real roots.
2. Prove its largest root is at most 1/2 + O(√α).
3. Prove the polynomials form an interlacing family, so there exists a partition whose polynomial has largest root at most that of the expected polynomial.

To learn more about Laplacians, see:
My class notes from “Spectral Graph Theory” and “Graphs and Networks”.
My web page on Laplacian linear equations, sparsification, etc.
To learn more about Kadison-Singer:
Papers in the Annals of Mathematics and the survey from the ICM, available on arXiv and my web page.