Monday, January 11, 2010

Guest Post on ICS 2010 (1 of 3)
Posted by GASARCH
Innovations in Computer Science 2010 (post #1)

Guest post by Aaron Sterling

This is the first of three posts about ICS 2010, the much-discussed "concept conference," which took place at the Institute for Theoretical Computer Science (ITCS), Tsinghua University, Beijing, from January 5th-7th. I will provide my impressions in this post and one other, and Rahul Santhanam plans to contribute something as well.

First, I need to say that this was the best-run conference I have ever attended, and one of the best-organized events of any kind that I have ever participated in. The level of financial support for students and authors, the quality of food and lodging, and the remarkable closing ceremony (which included several music and dance acts, a Kung Fu demonstration, and -- my favorite -- a Face-Off performance) set a high bar for any other conference in the world. Local Arrangements Committee members don't often get mentioned in posts like these, but I believe the entire TCS community owes a debt of gratitude not just to PC Chair Andrew Yao, but also to Local Arrangements Chair Amy Yuexuan Wang, Conference Secretary Yuying Chang, and to everyone else who made this event happen. This feels like a turning point in the history of the field.

In the US, I have often gotten the impression that computer science departments and funding sources consider TCS to be of secondary importance. What a difference in Beijing! As a silly-yet-telling example, Sanjeev Arora told me that, for a conference in 2009, ITCS printed a sign on which the phrase "Theoretical Computer Science" appeared in the largest font he had ever seen used for it. I believe the investment in theory on the part of the Chinese government and academia, contrasted with the malaise of departments in the United States, speaks volumes about the future, unless the United States changes direction significantly. I'll leave that topic to be discussed on a political blog, though. Suffice it to say, I think everyone was pleased to be treated like a first-class scientist, instead of like someone doing "impractical" things that are less worthy of support.

Perhaps the highlight of the technical program was the "derivatives paper," already covered at length by Richard Lipton and other bloggers, so I won't discuss it here. Many of the accepted papers were in algorithmic game theory, and I will limit myself to mentioning the two papers in that area I found most exciting: "Bounding Rationality by Discounting Time" by Fortnow and Santhanam, and "Game Theory with Costly Computation: Formulation and Application to Protocol Security" by Halpern and Pass. Essentially, Halpern and Pass define a class of games with complexity functions attached, so it is possible to reason about concepts like equilibrium with respect to a particular measure of complexity. The Fortnow/Santhanam model embeds into this approach, as it considers one particular type of complexity function. On the other hand, the complexity function defined by Fortnow and Santhanam seems especially natural, and they are able to obtain more specific results than Halpern and Pass because they start from a less general model.
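
To give a concrete flavor of the "costly computation" idea, here is a tiny sketch of my own (not the formalism of either paper): each strategy is charged for the computation it uses, with its raw payoff discounted per computation step. The strategies, payoffs, and discount factor below are invented purely for illustration.

```python
# Toy illustration (not the authors' formalism): a strategy's payoff is
# discounted by the number of computation steps it uses, so a cleverer but
# slower strategy can end up worse off than a fast heuristic.

def discounted_utility(raw_payoff, steps, discount=0.99):
    """Charge for computation: raw_payoff * discount**steps."""
    return raw_payoff * (discount ** steps)

def fast_heuristic(history):
    # One step of "thinking": always play the same action.
    return "cooperate", 1

def slow_optimizer(history):
    # Pretend this strategy searches a large game tree: many steps.
    return "best_response", 500

for name, strategy, raw_payoff in [("fast heuristic", fast_heuristic, 1.0),
                                   ("slow optimizer", slow_optimizer, 1.2)]:
    action, steps = strategy([])
    print(name, action, round(discounted_utility(raw_payoff, steps), 3))
```

In this made-up example the slow optimizer earns a higher raw payoff (1.2 versus 1.0) but, after discounting for its 500 steps of computation, ends up far behind the one-step heuristic.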

The conference started off with a bang: Benny Applebaum gave an excellent talk about cryptography obtained by using only local computation. This was "Cryptography by Cellular Automata or How Fast Can Complexity Emerge in Nature?" co-authored with Ishai and Kushilevitz. They constructed, for example, one-way functions computable by a single step of a cellular automaton (i.e., after one step, it is computationally hard to recover the original state of the system). Since the cells of a cellular automaton communicate only with their immediate neighbors, this has bearing on the parallel complexity of cryptography. One point that came up in discussion is that, unlike one-way functions, signatures cannot be obtained by local computation alone, because changing a single bit of the input must be able to cause a global change to the output.
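
To unpack what "one step of a cellular automaton" means, here is a minimal sketch with a made-up local rule; it is not the one-way function construction from the paper, only an illustration of the locality constraint that each output bit depends on a constant-size neighborhood of input bits.

```python
# A minimal sketch of "one parallel step of a cellular automaton": each output
# bit depends only on a constant-size neighborhood of input bits. This toy rule
# is NOT the construction from the Applebaum-Ishai-Kushilevitz paper; it just
# illustrates the locality constraint.

def ca_step(bits):
    """One synchronous update; output bit i reads only bits i-1, i, i+1 (cyclically)."""
    n = len(bits)
    return [bits[(i - 1) % n] ^ (bits[i] & bits[(i + 1) % n]) for i in range(n)]

state = [1, 0, 1, 1, 0, 0, 1, 0]
print(ca_step(state))   # computed in one local step; inverting it is the hard part
```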

The "Best Impromptu" Award goes to Avrim Blum, who, on three hours' notice, gave one of the most stimulating talks of the conference when he presented "A New Approach to Strongly Polynomial Linear Programming" by Barasz and Vempala, after the authors had a problem with their trip. The Barasz/Vempala concept is a hybrid of the Simplex Algorithm and the Interior Point Method for solving LP's. Rather than just trace the edges, or just go through the interior of the polytope, they take the weighted average of the "useful" edges near the current location, and follow the obtained "averaged" line until they hit another face in the polytope. It is unknown in general whether their algorithm runs in polynomial time, but it seems very interesting, because they have shown that, for each case for which Simplex runs in exponential time, their algorithm can solve that "hard case" in polynomial time. This is because their solution method is invariant under affine transformations of the problem statement, so it is robust even when the angles of the polytope are narrow, i.e., the constraints are very close to one another.

I will conclude this post by mentioning Bernard Chazelle's "Analytical Tools for Natural Algorithms." (Please see a previous guest post of mine, and comment 3 on that post by Chazelle, for some background.) His main philosophical message was: "Use algorithms to analyze algorithms" -- meaning that if one is trying to analyze the behavior of a nonlinear multi-agent system like ABC...Dx, where A, B, C, ..., D are matrices whose identity depends on time and on some kind of feedback loop, it is not helpful to consider the problem "just mathematically," by analyzing the operator ABC...D independently of x. Rather, one should consider the problem in the form A(B(C(Dx))) and design an algorithm to reason about this nested behavior. That algorithm can then (hopefully) be tweaked to prove similar results about related nonlinear multi-agent systems. To quote from his paper: "Theorems often have proofs that look like algorithms. But theorems are hard to generalize whereas algorithms are easy to modify. Therefore, if a complex system is too ill-structured to satisfy the requirements of a specific theorem, why not algorithmicize its proof and retool it as a suitable analytical device?"
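
To make the "operator depends on x" point concrete, here is a small simulation sketch of a bounded-confidence averaging system (my own choice of toy example, not taken from the paper): the mixing matrix at every step is rebuilt from the current state, so the dynamics has the nested form A_t(...A_2(A_1 x)...) rather than a fixed matrix product that could be studied independently of x.

```python
import numpy as np

# Toy multi-agent system: each agent moves to the average of the agents within
# distance `radius` of it. The averaging matrix is recomputed from the current
# state at every step, so there is no single fixed operator to analyze.

def state_dependent_matrix(x, radius=1.0):
    """Row-stochastic matrix: agent i averages over agents within `radius` of it."""
    n = len(x)
    M = np.array([[1.0 if abs(x[i] - x[j]) <= radius else 0.0 for j in range(n)]
                  for i in range(n)])
    return M / M.sum(axis=1, keepdims=True)

def run(x, steps=20):
    for _ in range(steps):
        x = state_dependent_matrix(x) @ x   # the matrix changes with x every step
    return x

print(run(np.array([0.0, 0.8, 1.5, 5.0, 5.6])))   # opinions merge into two clusters
```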

In my next post, I'll sketch results from a few more papers, try to give some flavor of the discussion session at the end of the conference, and offer a few suggestions for the future. My apologies in advance to all the authors whose work I will be leaving out. Several attendees commented on how accessible and well-presented the talks were -- and I noticed this too. (I think this was due in large part to the "call for concepts." Certainly when I prepared my own presentation, I felt motivated to communicate "high" concepts as well as I could, and I felt less pressure to include the Mandatory Scary Formula Slide(tm) to demonstrate my ability to perform rigorous research.) In any case, there is far more great material than I could possibly cover in two posts -- which is a very good problem for a new conference to have!