Hard Theorems

'Cause when the proof gets tough… the tough get provin' (…proof complexity lower bounds!)

Sums-of-squares integer programming relaxation: Lecture 1

Lecture 1: Linear relaxations of integer programs.

Scribe: Massimo Lauria

Abstract

We can express combinatorial problems as integer programs and, since we cannot solve those efficiently, we consider relaxed programs in which the integrality constraints are replaced by linear ones. We then round fractional solutions to integer ones. We consider hierarchies of linear programs and discuss the quality of the corresponding solutions.

Link: [Lecture notes 01]


Sums-of-squares integer programming relaxation (winter 2014)

In the warm winter of 2014, I gave a course on Sums-of-squares integer programming relaxations. The course had the usual “teach it to learn it” attitude that students often have to endure, but it was very useful for getting the PhD students of the proof complexity group at KTH up to speed with Sums-of-squares. Furthermore, I was blessed with guest lectures from Per Austrin, Johan Håstad and David Steurer.

Course URL: http://www.csc.kth.se/~lauria/sos14/

I will slowly upload the lecture notes here as I review them, but you can already find them on the course webpage.

Description of the course

Most combinatorial optimization problems have a natural formulation as integer linear programs. Unfortunately it is hard to solve such programs, so we settle for less: we look for a fractional solution and then we round it to an integer one. This process usually gives a solution which is far from optimal.
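As a concrete toy illustration of this relax-and-round scheme (my own sketch, not taken from the lecture notes): for the vertex cover problem the LP relaxation is known to have half-integral optimal solutions, so on a tiny made-up graph we can find the LP optimum by brute force over values {0, 1/2, 1} and then apply the standard threshold rounding, which yields a cover at most twice the LP optimum.

```python
# Vertex cover LP relaxation on a small made-up graph (a 4-cycle with
# a chord), followed by threshold rounding x_v >= 1/2 -> 1.
from itertools import product

n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]

def feasible(x):
    # Every edge must be (fractionally) covered: x_u + x_v >= 1.
    return all(x[u] + x[v] >= 1 for u, v in edges)

# The vertex cover LP has a half-integral optimal solution, so on this
# toy instance brute force over {0, 1/2, 1} finds the LP optimum.
frac = min((x for x in product([0, 0.5, 1], repeat=n) if feasible(x)),
           key=sum)

# Round: every variable of value at least 1/2 enters the cover.
cover = [v for v in range(n) if frac[v] >= 0.5]

# The rounded set covers every edge, and costs at most twice the LP
# optimum, hence at most twice the integer optimum.
assert all(u in cover or v in cover for u, v in edges)
assert len(cover) <= 2 * sum(frac)
print("LP optimum:", frac, "-> rounded cover:", cover)
```

On this instance the LP optimum has value 2 while the rounded cover may use all four vertices, showing how the rounded solution can indeed be far from optimal, as discussed above.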

Mathematicians and computer scientists have developed ways to improve the quality of solutions: they introduce new constraints that are valid for all integer solutions, but may exclude some fractional ones.

This process is formalized by the sum of squares inference system, which is a way to deduce such new constraints. This is the strongest system known in current literature and subsumes many successful techniques in approximation.

In the course we will discuss the sums-of-squares system, some of its subsystems, and the performance of certain systematic ways to produce the new constraints: for some problems we can get good approximation algorithms, while for others the systematic approach is too expensive. In particular we will focus on rank lower bounds, which give a way to measure how hard a problem is for sums-of-squares.

The course is made of two parts:

I. In the first part we introduce sum of squares and its many subsystems. It is important to spend some time on the subsystems because most lower bounds in the literature are for them. We will also briefly introduce semidefinite programming and semidefinite relaxations.

II. In the second part of the course we will study rank for optimization problems like 3-SAT, 3-XOR, Knapsack, Planted Clique, Densest k-subgraph.

My Banff 2014 talk “Narrow proofs may be maximally long”

My talk at the Banff workshop Theoretical Foundations of Applied SAT Solving is now online. Please get your rotten tomatoes ready and click on the link to the video below.

Enjoy!

Talk: Narrow proofs may be maximally long (video)

“Type around the clock” or “wherever I lay my butt, that’s my office”

Morning workplace:

https://massimolauria.files.wordpress.com/2014/01/wpid-workplace_morning.jpeg

Afternoon workplace:

https://massimolauria.files.wordpress.com/2014/01/wpid-workplace_afternoon.jpeg

Evening workplace:

https://massimolauria.files.wordpress.com/2014/01/wpid-workplace_evening.jpeg

Night workplace:

https://massimolauria.files.wordpress.com/2014/01/wpid-workplace_night.jpeg

A sarcastic take on implicit type casting

I suggest that everyone in computer programming watch this video about implicit type casting:

https://www.destroyallsoftware.com/talks/wat

Visiting PhD student positions at KTH

From the KTH webpage at this url:

The Theory Group at KTH Royal Institute of Technology invites
applications for visiting PhD student positions during 2014. These
positions are intended for PhD students with a strong research record
who would like to spend part of their PhD program in another research
environment. The students will be hosted by Per Austrin, Johan Håstad,
and/or Jakob Nordström depending on profile and research interests.

PhD students working on complexity theory and/or algorithms should really take the chance to work with, or at least come in contact with, such remarkable researchers.

Furthermore, if people come to visit at the right time, there are a couple of additional treats:

  • A PhD-level course on Lasserre hierarchies and sum of squares proofs, from the end of January to March. The webpage is still under construction, but you get the idea.
  • The Theory Group at KTH will be organizing the Swedish Summer School in Computer Science, June 30 – July 4, 2014, with Boaz Barak and Ryan O’Donnell as lecturers. There is no webpage yet, but I’ll post more information later.

A new KTH course on Sum of squares

It is a badly kept secret that the higher you climb the academic ladder, the more reckless you are with deadlines and duties. You find yourself with many tasks in the queue and the rationale of your schedule is “the closest deadline first”.

How do you find time for learning new stuff and reading papers that you don’t have to review? A common approach is to teach a course about the topic you want to learn, so that anxiety and fear of miserable embarrassment keep you on track.

For this reason, next semester I will teach a course on “sum of squares”, which is the subject of exciting recent research. My main research area is proof complexity1, and in the last couple of years researchers in approximation algorithms have realized that there is a nice proof-theoretic interpretation of several techniques they employ. Here at KTH there are strong groups in both Proof Complexity and in Approximation and Inapproximability2, so it seems ideal to try to bridge the gap and learn exciting new stuff.

Description of the course

Most combinatorial optimization problems have a natural formulation as integer linear programs. Unfortunately it is hard to solve such programs, so we settle for less: we look for a fractional solution and then we round it to an integer one. This process usually gives a solution far from optimal.

Mathematicians and computer scientists have developed ways to improve the quality of solutions: they introduce new constraints that are valid for all integer solutions, but may exclude some fractional ones.
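A minimal example of such a constraint (a toy illustration of my own, not part of the course material): for vertex cover on a triangle, every edge constraint x_u + x_v ≥ 1 is satisfied by the fractional point (1/2, 1/2, 1/2), yet every integer cover of a triangle needs at least two vertices, so x_0 + x_1 + x_2 ≥ 2 is a valid new constraint that cuts the fractional point off. A brute-force check:

```python
# Vertex cover on a triangle: the cut x0 + x1 + x2 >= 2 is valid for
# all 0/1 solutions but excludes a feasible fractional point.
from itertools import product

edges = [(0, 1), (1, 2), (0, 2)]  # a triangle

def feasible(x):
    # The original edge constraints of the relaxation.
    return all(x[u] + x[v] >= 1 for u, v in edges)

# Every 0/1 solution satisfies the new constraint ...
for x in product([0, 1], repeat=3):
    if feasible(x):
        assert sum(x) >= 2

# ... while the fractional point (1/2, 1/2, 1/2) satisfies all edge
# constraints yet violates it.
half = (0.5, 0.5, 0.5)
assert feasible(half)
assert sum(half) < 2
print("x0 + x1 + x2 >= 2 holds for integer solutions, fails at (1/2, 1/2, 1/2)")
```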

This process is formalized by the sum of squares inference system, which is a way to deduce such new constraints. This is the strongest system known in current literature and subsumes many successful techniques in approximation.

In the course we will discuss sum of squares and some of its subsystems, and the performance of certain systematic ways to produce the new constraints: for some problems we can get good approximation algorithms, while for others the systematic approach is too expensive. In particular we will focus on rank lower bounds, which give a way to measure how hard a problem is for sum of squares.

The course is divided into two parts:

In the first part we introduce sum of squares and its many subsystems. It is important to spend some time on the subsystems because most lower bounds in the literature are for them. We will also briefly introduce semidefinite programming and semidefinite relaxations.

In the second part of the course we will study rank for optimization problems like 3-SAT, 3-XOR, Knapsack, Planted Clique, Densest k-subgraph.

More information on: http://www.csc.kth.se/~lauria/sos14/

Footnotes:

1 See this blog’s description.

2 Wikipedia is your friend.

John McCarthy (1927-2011)

Creator of LISP, he coined the term Artificial Intelligence and believed that computers could pass the Turing Test.

(cons "John"
    (cons "McCarthy"
        (cons 1927
            (cons 2011 ()))))

Sad month for computer science…

AI class at Stanford

Maybe you heard that Peter Norvig and Sebastian Thrun, professors at Stanford University, are giving an online course on Artificial Intelligence1. They are real experts in the field, and the course has tens of thousands of people enrolled in the “online version”, which runs in parallel with the actual class at Stanford.

The quality of the experiment is amazing: it is not just a bunch of lecture videos. The videos are made specifically for self-study, with integrated exercises and everything.

But one brilliant thing about the course, which I guess even the professors didn’t foresee, is the amazing series of “lecture notes” being produced at http://larvecode.tumblr.com/.

Seriously, even if you don’t care about AI go and watch what these people (or this person) are doing on a daily basis.

PS: seriously, go now!

Footnotes:

Dennis Ritchie (1941-2011)

#include <stdio.h>

obituary();

main()
{
  register x;
  x=obituary("Dennis Ritchie",1941,2011);
  return x;
}

obituary(name,birth,death)
         char *name;
         int birth, death;  /* int, not char: the years don't fit in a char */
{
  auto out;
  out=fprintf(stdout,"%s (%d-%d)\n",name,birth,death);
  return out;
}