### A new KTH course on Sum of squares

It is a badly kept secret that the higher you climb the academic ladder, the more reckless you become with deadlines and duties. You find yourself with many tasks in the queue, and your scheduling rationale becomes "closest deadline first".

How do you find time for learning new stuff and reading up on papers that you don't have to review? A common approach is to teach a course about the topic you want to learn, so that anxiety and fear of miserable embarrassment will keep you on track.

For this reason, next semester I will teach a course on "sum of squares", which is the subject of exciting recent research. My main research area is proof complexity¹, and in the last couple of years researchers in approximation algorithms have realized that there is a nice proof-theoretic interpretation of several techniques they employ. Here at KTH there are strong groups in both Proof Complexity and in Approximation and Inapproximability², so it seems ideal to try to bridge the gap and learn new exciting stuff.

#### Description of the course

Most combinatorial optimization problems have a natural formulation as integer linear programs. Unfortunately, solving such programs is NP-hard in general, so we settle for less: we solve the fractional relaxation and then round the fractional solution to an integer one. This process often yields a solution far from the optimum.
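To make the relax-and-round idea concrete, here is a minimal sketch on a hypothetical toy instance (not from the course): vertex cover on a triangle. The LP relaxation of vertex cover has the known optimal fractional solution $x_v = 1/2$ for every vertex; the standard rounding rule picks every vertex with $x_v \geq 1/2$, which here costs 3 while the integer optimum is 2.

```python
from itertools import product

# Toy instance: vertex cover on a triangle (an assumption for illustration).
# The LP relaxation  min sum x_v  s.t.  x_u + x_v >= 1 per edge, 0 <= x <= 1
# is optimized by x_v = 1/2 for every vertex (value 1.5).
edges = [(0, 1), (1, 2), (0, 2)]
frac = {0: 0.5, 1: 0.5, 2: 0.5}  # optimal fractional solution

# Standard rounding: take every vertex with x_v >= 1/2.
rounded = {v for v, x in frac.items() if x >= 0.5}
assert all(u in rounded or v in rounded for u, v in edges)  # a valid cover

# Brute-force the integer optimum for comparison.
best = min(
    sum(s) for s in product([0, 1], repeat=3)
    if all(s[u] or s[v] for u, v in edges)
)
print(len(rounded), best)  # → 3 2: rounding pays 3, the optimum is 2
```

The gap between the rounded and the optimal solution is exactly the loss that the extra constraints discussed next are meant to reduce.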

Mathematicians and computer scientists have developed ways to improve the quality of solutions: they introduce new constraints that are valid for all integer solutions, but may exclude some fractional ones.
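As a sketch of such a valid constraint, consider again the hypothetical triangle instance of vertex cover: the odd-cycle inequality $x_0 + x_1 + x_2 \geq 2$ holds for every integer cover, yet it cuts off the all-$1/2$ fractional optimum.

```python
from itertools import product

# Triangle vertex cover again (illustrative assumption, not from the course).
edges = [(0, 1), (1, 2), (0, 2)]

# The new constraint x_0 + x_1 + x_2 >= 2 is valid: every 0/1 solution
# satisfying the edge constraints also satisfies it ...
for s in product([0, 1], repeat=3):
    if all(s[u] or s[v] for u, v in edges):
        assert sum(s) >= 2

# ... but the optimal fractional solution violates it.
frac = (0.5, 0.5, 0.5)
print(sum(frac) >= 2)  # → False: the cut excludes this fractional point
```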

This process is formalized by the sum of squares inference system, which is a way to deduce such new constraints. This is the strongest such system known in the current literature, and it subsumes many successful techniques in approximation.

In the course we will discuss sum of squares and some of its subsystems, and the performance of certain systematic ways to produce the new constraints: for some problems these yield good approximation algorithms, while for others the systematic approach is too expensive. In particular, we will focus on rank lower bounds, which give a way to measure how hard a problem is for sum of squares.

The course is divided into two parts:

In the first part we introduce sum of squares and its many subsystems. It is important to spend some time on the subsystems because most lower bounds in the literature are proved for them. We will also briefly introduce semidefinite programming and semidefinite relaxations.
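The link to semidefinite programming, in a minimal sketch under illustrative assumptions: a degree-2 polynomial $p(x) = c + bx + ax^2$ is a sum of squares iff $p = z^\top Q z$ for the monomial vector $z = (1, x)$ and some positive semidefinite Gram matrix $Q$, so searching for an SOS certificate is a semidefinite feasibility problem.

```python
# Illustrative example: for p(x) = 2x^2 - 2x + 1 (same hypothetical
# polynomial as before), one Gram matrix with p = z^T Q z, z = (1, x), is:
Q = [[1.0, -1.0],
     [-1.0, 2.0]]

# z^T Q z = Q[0][0] + 2*Q[0][1]*x + Q[1][1]*x^2 reproduces p's coefficients.
c, b, a = Q[0][0], 2 * Q[0][1], Q[1][1]
assert (c, b, a) == (1.0, -2.0, 2.0)

# A 2x2 symmetric matrix is PSD iff its diagonal entries and determinant
# are nonnegative; a Cholesky step on Q then recovers explicit squares.
det = Q[0][0] * Q[1][1] - Q[0][1] * Q[1][0]
print(Q[0][0] >= 0 and Q[1][1] >= 0 and det >= 0)  # → True: Q is PSD
```

An SDP solver automates exactly this search for a PSD Gram matrix, which is why semidefinite relaxations appear throughout the course.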

In the second part of the course we will study rank lower bounds for optimization problems such as $3$-SAT, $3$-XOR, Knapsack, Planted Clique, and Densest $k$-Subgraph.