Positive Semidefinite Matrices & Semidefinite Programming
Symmetric Matrices & Spectral Theorem
A matrix $A \in \mathbb{R}^{n \times n}$ is symmetric if $A = A^\top$. We write $\mathbb{S}^n$ for the set of symmetric $n \times n$ real matrices.
Spectral Theorem: If $A \in \mathbb{S}^n$ has eigenvalues $\lambda_1, \ldots, \lambda_n$ (counted with multiplicity), then
- all eigenvalues of $A$ are real,
- there exists an orthonormal basis $q_1, \ldots, q_n$ of $\mathbb{R}^n$ consisting of eigenvectors of $A$.

In other words, we can write
$$A = Q \Lambda Q^\top = \sum_{i=1}^{n} \lambda_i q_i q_i^\top,$$
where $Q$ is the orthogonal matrix whose columns are $q_1, \ldots, q_n$ and $\Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_n)$.
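As a quick numerical illustration (a minimal sketch assuming NumPy is available; the matrix below is an arbitrary choice), the spectral decomposition of a symmetric matrix can be computed with `numpy.linalg.eigh`:

```python
import numpy as np

# An arbitrary small symmetric matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is specialized to symmetric matrices: it returns real eigenvalues
# and an orthonormal matrix Q whose columns are eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)
Lambda = np.diag(eigenvalues)

print(np.allclose(A, Q @ Lambda @ Q.T))   # True: A = Q Lambda Q^T
print(np.allclose(Q.T @ Q, np.eye(3)))    # True: the eigenvector basis is orthonormal
```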
If a symmetric matrix $A \in \mathbb{S}^n$ satisfies one (and hence all) of the following equivalent conditions, we say that $A$ is positive semidefinite (PSD) and write $A \succeq 0$:
- all eigenvalues of $A$ are non-negative;
- $A = B^\top B$ for some matrix $B \in \mathbb{R}^{k \times n}$, where $k \leq n$. The smallest such value of $k$ is the rank of $A$;
- $x^\top A x \geq 0$ for all $x \in \mathbb{R}^n$;
- $A = L D L^\top$ for some diagonal matrix $D$ with non-negative diagonal entries and some lower triangular matrix $L$ with diagonal elements equal to $1$;
- $A$ is in the convex hull of the set of rank-one matrices $x x^\top$ for $x \in \mathbb{R}^n$;
- $A = Q D Q^\top$ for some diagonal matrix $D$ with non-negative diagonal entries and some orthonormal matrix $Q$;
- $A$ is symmetric and all principal minors of $A$ are non-negative. Here, by principal minors we mean the determinants of the submatrices of $A$ obtained by deleting the same set of rows and columns.
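As a small sanity check of some of these characterizations (a sketch assuming NumPy; the matrix, sample size, and tolerance are arbitrary choices), one can build a matrix of the form $B^\top B$ and test the eigenvalue and quadratic-form conditions numerically:

```python
import numpy as np

# Build a PSD matrix via the characterization A = B^T B.
B = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
A = B.T @ B                                   # 3x3 PSD matrix of rank 2

# All eigenvalues are non-negative (up to numerical tolerance).
print(np.all(np.linalg.eigvalsh(A) >= -1e-9))                  # True

# x^T A x >= 0, checked on random sample vectors.
xs = np.random.randn(1000, 3)
print(np.all(np.einsum('ij,jk,ik->i', xs, A, xs) >= -1e-9))    # True

# The smallest k with A = B^T B for some k x n matrix B equals the rank of A.
print(np.linalg.matrix_rank(A))                                # 2
```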
Semidefinite Programming
Let $C, A_1, \ldots, A_m \in \mathbb{S}^n$ be symmetric matrices and let $b \in \mathbb{R}^m$. For symmetric matrices we use the trace inner product $\langle A, X \rangle := \mathrm{tr}(A X) = \sum_{i,j} A_{ij} X_{ij}$.

A semidefinite program (SDP) is an optimization problem of the form
$$\min_{X \in \mathbb{S}^n} \ \langle C, X \rangle \quad \text{subject to} \quad \langle A_i, X \rangle = b_i \ \ (i = 1, \ldots, m), \qquad X \succeq 0.$$

We can write an SDP in a way similar to a linear program as follows: a linear program in standard form is
$$\min \ c^\top x \quad \text{subject to} \quad a_i^\top x = b_i \ \ (i = 1, \ldots, m), \qquad x \geq 0,$$
and an SDP is obtained by replacing the non-negative vector variable $x$ with a positive semidefinite matrix variable $X$, and the inner products $c^\top x$, $a_i^\top x$ with the trace inner products $\langle C, X \rangle$, $\langle A_i, X \rangle$.

If the matrices $C, A_1, \ldots, A_m$ are all diagonal, then only the diagonal entries of $X$ appear in the objective and the constraints, and the SDP reduces to a linear program in those entries.
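To make this concrete, here is a minimal sketch of how an SDP in this form could be stated and solved with the CVXPY modeling library (CVXPY and the specific matrices below are illustrative choices, not part of the notes):

```python
import cvxpy as cp
import numpy as np

n, m = 3, 2
C = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
A_mats = [np.eye(n), np.diag([1.0, 2.0, 3.0])]   # constraint matrices A_1, A_2
b = np.array([1.0, 2.0])

X = cp.Variable((n, n), symmetric=True)          # the matrix variable X
constraints = [X >> 0]                           # X is positive semidefinite
constraints += [cp.trace(A_mats[i] @ X) == b[i] for i in range(m)]

problem = cp.Problem(cp.Minimize(cp.trace(C @ X)), constraints)
problem.solve()                                  # hands the SDP to an SDP-capable solver

print(problem.value)    # optimal value of <C, X>
print(X.value)          # an optimal PSD matrix
```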
As with linear programs, the following are important structural and algorithmic questions for SDPs:
- When is a given SDP feasible? That is, is there a solution to the constraints at all?
- When is a given SDP bounded? Is there a minimum? Is it achievable? If so, how can we find it?
- Can we characterize optimality?
- How can we know that a given solution is optimal?
- Do the optimal solutions have a nice description?
- Do the solutions have small bit complexity?
- How can we solve SDPs efficiently?
To better understand these questions and the structure of SDPs, we will need to learn a bit about convex algebraic geometry.
Convex Algebraic Geometry
Spectrahedra
To understand the geometry of SDPs, we will need to understand their feasible regions, which are called spectrahedra and are described by Linear Matrix Inequalities (LMIs).
Definition 1 (Linear Matrix Inequality (LMI)): An LMI is an inequality of the form
$$A_0 + x_1 A_1 + \cdots + x_n A_n \succeq 0,$$
where $A_0, A_1, \ldots, A_n$ are given symmetric matrices of the same size and $x = (x_1, \ldots, x_n) \in \mathbb{R}^n$ is the vector of variables.
Definition 2 (Spectrahedron): A spectrahedron is a set of the form
$$S = \{ x \in \mathbb{R}^n : A_0 + x_1 A_1 + \cdots + x_n A_n \succeq 0 \},$$
that is, the solution set of an LMI.
Note that spectrahedra are convex sets, since they are defined by LMIs, which are convex constraints. Moreover, several important convex sets are spectrahedra, including all polyhedra, circles/spheres, hyperbolas, and (sections of) elliptic curves, among others.
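For instance, a standard way to see the closed unit disk in $\mathbb{R}^2$ as a spectrahedron is
$$\left\{ (x, y) \in \mathbb{R}^2 \;:\; \begin{pmatrix} 1 + x & y \\ y & 1 - x \end{pmatrix} \succeq 0 \right\},$$
since, by the principal-minor characterization above, this matrix is PSD exactly when $1 + x \geq 0$, $1 - x \geq 0$, and $(1+x)(1-x) - y^2 = 1 - x^2 - y^2 \geq 0$, i.e., exactly when $x^2 + y^2 \leq 1$.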
When considering SDPs, it is often useful to work with a more general class of convex sets, which we call spectrahedral shadows. Spectrahedral shadows are simply projections of spectrahedra onto lower-dimensional spaces.
Testing Membership in Spectrahedra
To be able to solve SDPs efficiently, a first step is to be able to test membership in spectrahedra efficiently.
That is, given a spectrahedron $S = \{x \in \mathbb{R}^n : A_0 + x_1 A_1 + \cdots + x_n A_n \succeq 0\}$ and a point $y \in \mathbb{R}^n$, we want to decide whether $y \in S$. Plugging $y$ into the LMI, this amounts to deciding whether the single symmetric matrix $A_0 + y_1 A_1 + \cdots + y_n A_n$ is positive semidefinite.
More succinctly, we have the following decision problem:
- Input: a symmetric matrix $A \in \mathbb{S}^n$ (say, with rational entries).
- Output: YES if $A \succeq 0$, NO otherwise.
An efficient algorithm for this problem is the symmetric Gaussian elimination algorithm, which runs in time $O(n^3)$ (counting arithmetic operations): it repeatedly uses the current top-left diagonal entry as a pivot to clear out its row and column, performing the same operations on rows and on columns so that symmetry is preserved.

As the product of lower (or upper) unitriangular matrices is again a lower (or upper) unitriangular matrix, we can see that the symmetric Gaussian elimination algorithm will always output a diagonal matrix $D$, together with a lower unitriangular matrix $L$, such that $A = L D L^\top$.

To see that $A$ is positive semidefinite if and only if all diagonal entries of $D$ are non-negative, note that $x^\top A x = (L^\top x)^\top D (L^\top x)$ for every $x \in \mathbb{R}^n$; since $L$ is invertible, $y = L^\top x$ ranges over all of $\mathbb{R}^n$ as $x$ does, so $x^\top A x \geq 0$ for all $x$ exactly when $y^\top D y = \sum_i D_{ii} y_i^2 \geq 0$ for all $y$, that is, exactly when every $D_{ii}$ is non-negative.

The above proves that our algorithm is correct, and the running time is $O(n^3)$, since each of the $n$ elimination steps updates $O(n^2)$ entries.
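The following is a minimal sketch of this test in Python/NumPy (the function name, tolerance, and handling of (near-)zero pivots are implementation choices, not taken from the notes): it eliminates one row/column at a time symmetrically and answers NO as soon as it sees a negative pivot, or a zero pivot whose row is not entirely zero.

```python
import numpy as np

def is_psd(A, tol=1e-9):
    """Decide whether the symmetric matrix A is positive semidefinite,
    using symmetric Gaussian elimination (an LDL^T-style factorization)."""
    A = np.array(A, dtype=float, copy=True)
    n = A.shape[0]
    for k in range(n):
        pivot = A[k, k]
        if pivot < -tol:
            return False        # negative diagonal entry: A is not PSD
        if pivot <= tol:
            # (Near-)zero pivot: a PSD matrix must have a zero row/column here.
            if np.any(np.abs(A[k, k + 1:]) > tol):
                return False
            continue
        # Symmetric elimination step: replace the trailing block by its
        # Schur complement A' = A_22 - (1/pivot) * a a^T.
        a = A[k + 1:, k].copy()
        A[k + 1:, k + 1:] -= np.outer(a, a) / pivot
        A[k + 1:, k] = 0.0
        A[k, k + 1:] = 0.0
    return True

# B^T B is always PSD; shifting it down by 10*I makes it indefinite here,
# because a 2 x 4 matrix B gives B^T B rank at most 2.
B = np.random.randn(2, 4)
print(is_psd(B.T @ B))                    # True
print(is_psd(B.T @ B - 10 * np.eye(4)))   # False
```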
Application: Control Theory
(Not required material; to be written here later. Please see the slides and references for this part.)
SDPs are used in many areas of mathematics and engineering, including control theory, combinatorial optimization, and quantum information theory. Today we will see an application of SDPs to control theory, in particular to the problem of stabilizing a linear, discrete-time dynamical system.
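As a preview of the kind of SDP that shows up there, one standard formulation (sketched here with CVXPY and arbitrary illustrative data; this is not necessarily the exact formulation from the slides) is Lyapunov stability for $x_{t+1} = A x_t$: the system is stable if there exists $P \succ 0$ with $A^\top P A - P \prec 0$, and searching for such a $P$ is an SDP feasibility problem.

```python
import cvxpy as cp
import numpy as np

# A discrete-time system x_{t+1} = A x_t (illustrative data with spectral radius < 1).
A = np.array([[0.5, 0.2],
              [0.1, 0.3]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)

# Lyapunov LMI feasibility: P >= I and A^T P A - P <= -I.
# (The identity offsets replace the strict inequalities P > 0 and A^T P A - P < 0.)
constraints = [P >> np.eye(n), A.T @ P @ A - P << -np.eye(n)]
problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve()

print(problem.status)   # 'optimal' here means a Lyapunov certificate P was found
print(P.value)
```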