Most combinatorial optimization problems have a natural formulation as integer linear programs. Unfortunately, such programs are hard to solve in general, so we settle for less: we compute a **fractional** solution and then round it to an integer one. This process can give a solution that is far from optimal.
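As a toy illustration of relax-and-round (a standard textbook example, not taken from this course text), here is a minimal sketch for Minimum Vertex Cover using `scipy.optimize.linprog` as the LP solver; the graph and variable names are illustrative assumptions. Rounding every variable with fractional value at least 1/2 up to 1 yields a feasible cover of cost at most twice the LP optimum.

```python
# Sketch: LP relaxation + rounding for Minimum Vertex Cover.
# ILP: minimize sum_v x_v  s.t.  x_u + x_v >= 1 for every edge, x_v in {0,1}.
# We solve the fractional relaxation (x_v in [0,1]) and round x_v >= 1/2 up.
import numpy as np
from scipy.optimize import linprog

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]  # a triangle plus a pendant edge
n = 4

c = np.ones(n)                       # objective: minimize sum of x_v
A_ub = np.zeros((len(edges), n))     # linprog uses A_ub @ x <= b_ub,
for i, (u, v) in enumerate(edges):   # so encode x_u + x_v >= 1 as
    A_ub[i, u] = A_ub[i, v] = -1.0   # -x_u - x_v <= -1
b_ub = -np.ones(len(edges))

lp = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * n)
frac = lp.x                                          # fractional optimum
cover = [v for v in range(n) if frac[v] >= 0.5 - 1e-9]  # round up

print("LP value:", lp.fun)
print("rounded cover:", cover)
```

Every edge constraint forces at least one endpoint to value >= 1/2 in any feasible fractional solution, so the rounded set is always a vertex cover, and rounding at most doubles each variable, hence the factor-2 guarantee.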

Mathematicians and computer scientists have developed ways to improve the quality of such solutions: they introduce new constraints that are valid for all integer solutions but may exclude some fractional ones.
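A classic instance of such a new constraint (a standard Chvátal–Gomory cut, added here as an illustrative aside) arises from a triangle in the Vertex Cover relaxation:

```latex
% Chvatal--Gomory cut for Vertex Cover on a triangle {1,2,3}:
\[
\begin{aligned}
  & x_1 + x_2 \ge 1, \quad x_2 + x_3 \ge 1, \quad x_1 + x_3 \ge 1
  && \text{(edge constraints)} \\
  \Longrightarrow\;& x_1 + x_2 + x_3 \ge \tfrac{3}{2}
  && \text{(sum the three and divide by 2)} \\
  \Longrightarrow\;& x_1 + x_2 + x_3 \ge 2
  && \text{(round up: valid for every integer solution)}
\end{aligned}
\]
```

The rounded constraint cuts off the fractional point $(1/2, 1/2, 1/2)$, which satisfies all the original edge constraints, while excluding no 0/1 solution.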

This process is formalized by the sums-of-squares inference system, a proof system for deducing such new constraints. It is the strongest such system known in the current literature and subsumes many successful techniques in approximation algorithms.
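To give a flavor of a sums-of-squares deduction (a standard degree-2 example over Boolean variables, not taken from the course text): from the Boolean axioms $x_i^2 = x_i$ and a single square, one derives a constraint valid for all 0/1 points.

```latex
% A degree-2 sums-of-squares deduction over the Boolean axioms x_i^2 = x_i:
\[
  0 \le (x_1 + x_2 - 1)^2
    = x_1^2 + x_2^2 + 2x_1x_2 - 2x_1 - 2x_2 + 1
    = 1 - x_1 - x_2 + 2x_1x_2,
\]
% using x_i^2 = x_i in the last step; rearranging gives the valid constraint
\[
  x_1 + x_2 - 2x_1x_2 \le 1.
\]
% Over 0/1 values, x_1 + x_2 - 2 x_1 x_2 is the XOR of the two variables,
% so the derived constraint holds at every integer point, yet it is not a
% consequence of the linear axioms alone.
```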

In the course we will discuss the sums-of-squares system, some of its subsystems, and the performance of certain systematic ways of producing the new constraints: for some problems this yields good approximation algorithms, while for others the systematic approach is too expensive. In particular we will focus on **rank lower bounds**, which measure how hard a problem is for sums-of-squares.

The course consists of two parts:

I. In the first part we introduce the sums-of-squares system and its many subsystems. It is important to spend some time on the subsystems because most lower bounds in the literature are proved for them. We will also briefly introduce semidefinite programming and semidefinite relaxations.

II. In the second part of the course we will study rank lower bounds for optimization problems such as $k$-SAT, $k$-XOR, Knapsack, Planted Clique, and Densest $k$-subgraph.