Combinatorial Optimization Research Center


December 5, 2005

9:00 AM, C. P. Davis Auditorium, Schapiro Bldg.


Pablo Parrilo

SOS/SDP methods: from optimization to games

In the last few years, techniques based on sum of squares (SOS) decompositions of multivariate polynomials, semidefinite programming (SDP), and results from real algebraic geometry have proved extremely useful in the formulation of hierarchies of convex relaxations for difficult polynomial optimization problems. In this talk, after reviewing the basic results, we show how these can be extended to a game theoretic setting. In particular, we discuss a class of zero-sum two-person games with an infinite number of pure strategies, where the payoff function is a polynomial expression of the actions of the players.

We show that the value of the game, and the corresponding optimal strategies, can be computed by solving a single semidefinite program. In addition, we show how the results can be applied, with suitable modifications, to a general class of semialgebraic games and problems with two quantifiers.
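To fix ideas, here is a minimal numerical sketch of the basic SOS certificate: a polynomial p(x) is a sum of squares exactly when it can be written as z^T Q z for a monomial vector z and a positive semidefinite Gram matrix Q. The polynomial and matrix below are our own toy example (in the full SOS/SDP approach described in the talk, Q would be a decision variable of a semidefinite program, not a fixed matrix).

```python
import numpy as np

# Toy example: certify that p(x) = x^4 + 2x^2 + 1 is a sum of squares by
# exhibiting p(x) = z^T Q z with z = [1, x, x^2] and Q symmetric PSD.
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])

# PSD <=> all eigenvalues nonnegative; this is the SOS certificate.
eigs = np.linalg.eigvalsh(Q)
assert eigs.min() >= -1e-12

# Sanity check: z^T Q z reproduces p at a few sample points.
for x in [-2.0, 0.5, 3.0]:
    z = np.array([1.0, x, x**2])
    assert abs(z @ Q @ z - (x**4 + 2 * x**2 + 1)) < 1e-9
```

Here Q has eigenvalues {0, 0, 2}, matching the explicit decomposition p(x) = (x^2 + 1)^2; searching over all valid Gram matrices Q subject to Q being PSD is precisely a semidefinite program.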

Catherine McGeoch
Amherst College

What matters most to performance? Case studies in algorithm engineering

Algorithm engineering describes a systematic process for transforming abstract algorithms into software products, with an emphasis on efficiency. This talk will address a question of critical interest to algorithm engineers: Where do speedups come from?

The notion of an algorithm design hierarchy was first articulated by Newell and Reddy and further developed by Bentley. The hierarchy is divided into four levels: Systems, Algorithms and Data Structures, Code Tuning, and Platforms. Drawing on several case studies, I will show how techniques drawn from various levels of this hierarchy can combine to produce program speedups by factors ranging from two to six quadrillion.
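As a tiny, self-contained illustration of where such speedup factors can come from (our own example, not one of the talk's case studies): a change at the Algorithms and Data Structures level, such as adding memoization, can turn an exponential-time computation into a linear-time one.

```python
# Illustrative only: an algorithm-level change (memoization) dwarfs what any
# code tuning of the naive version could achieve.
from functools import lru_cache

def fib_naive(n):
    # exponential time: recomputes the same subproblems repeatedly
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    # linear time: each subproblem solved once and cached
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

assert fib_naive(20) == fib_memo(20) == 6765
```

The naive version makes on the order of 1.6^n recursive calls, the memoized one about n, so the gap between levels of the hierarchy grows without bound as inputs scale.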

Sven Leyffer

The Return of the Filter

Nonlinear optimization techniques enforce global convergence by combining optimality and feasibility into a single merit or penalty function. These functions require the algorithm to choose a penalty parameter, which may be problematic in practice.

In this talk, we review an alternative idea to promote global convergence, namely filter methods. A filter borrows ideas from multi-objective optimization and accepts a trial point whenever the objective or the constraint violation is improved compared to previous iterates.
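The acceptance rule described above can be sketched in a few lines. The function names and the sufficient-decrease margin gamma below are our own illustrative choices; real filter implementations add further safeguards (e.g. restoration phases) not shown here.

```python
# Illustrative filter sketch: the filter stores (h, f) pairs, where h is the
# constraint violation and f the objective value of a previous iterate.

def acceptable(trial, filter_entries, gamma=1e-5):
    """A trial point is acceptable if, against EVERY filter entry, it
    sufficiently improves either the violation h or the objective f."""
    h_t, f_t = trial
    return all(h_t <= (1 - gamma) * h or f_t <= f - gamma * h
               for h, f in filter_entries)

def add_to_filter(entry, filter_entries):
    """Insert an accepted point, discarding entries it dominates."""
    h_e, f_e = entry
    kept = [(h, f) for h, f in filter_entries
            if not (h_e <= h and f_e <= f)]
    kept.append(entry)
    return kept

filt = [(2.0, 5.0), (0.5, 8.0)]
assert acceptable((0.3, 9.0), filt)      # better feasibility than all entries
assert not acceptable((2.5, 9.0), filt)  # dominated: worse h and worse f
```

No penalty parameter appears anywhere: the filter trades off objective and infeasibility combinatorially rather than through a weighted sum, which is exactly the appeal described in the abstract.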

We will survey different nonlinear optimization techniques where filter methods have been used successfully, ranging from derivative-free optimization over bundle methods to interior point methods. We will outline the mechanisms that underpin the convergence of filter methods, and show some recent numerical experience with inexact filter methods.

Edward Rothberg

It's a Beautiful Day in the Neighborhood --- Local Search in Mixed Integer Programming

Mixed integer programming and local search have experienced only a few chance meetings over the past 20 years, despite a clear overlap in the problem classes to which they are applied. This situation has changed recently, with notions from local search being successfully applied in the solution of a fairly broad class of MIP models. This talk will consider several approaches to integrating the technologies, presenting computational results for these approaches on a set of practical problems.
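To make the local-search side of this pairing concrete, here is a toy one-flip neighborhood search on a 0-1 knapsack (itself a small MIP). The instance and the neighborhood are our own illustration; the talk's actual techniques operate inside a full MIP solver rather than on a standalone heuristic like this.

```python
# Toy 0-1 knapsack: improve a feasible incumbent by single-variable flips.
values  = [10, 13, 7, 8, 6]
weights = [5, 6, 4, 3, 2]
capacity = 10

def objective(x):
    return sum(v * xi for v, xi in zip(values, x))

def feasible(x):
    return sum(w * xi for w, xi in zip(weights, x)) <= capacity

def one_flip_search(x):
    """Repeatedly move to the first improving feasible neighbor (one bit
    flipped) until the incumbent is locally optimal."""
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            y = list(x)
            y[i] = 1 - y[i]
            if feasible(y) and objective(y) > objective(x):
                x, improved = y, True
                break
    return x

incumbent = [1, 0, 0, 0, 0]        # feasible start, value 10
best = one_flip_search(incumbent)  # local optimum, value 17
```

Note that the local optimum found here (value 17) is not globally optimal (items 2 and 4 give 21), which is precisely why embedding such neighborhood searches inside an exact MIP framework, as the talk discusses, is attractive.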

Jeff Linderoth

Using a Computational Grid for Optimization

We relate our experiences in using loosely coupled, heterogeneous, non-dedicated computing resources to solve optimization problems. These "computational grids" can be very powerful, but they are often difficult to use effectively.

We will discuss some software mechanisms that make building and using grids easier, and we will demonstrate that the CPU power of large-scale computational grids can be effectively harnessed by optimization algorithms.
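A common pattern for using such loosely coupled resources is master-worker task farming: the master partitions the problem into independent subproblems and combines the workers' results. The sketch below is a local stand-in for that pattern, with threads playing the role of grid workers and a trivial maximization standing in for an expensive optimization subproblem; it is not the specific software discussed in the talk.

```python
# Master-worker sketch: partition a search space, farm out the pieces,
# and combine the partial results. Threads stand in for grid workers.
from concurrent.futures import ThreadPoolExecutor, as_completed

def solve_subproblem(task):
    lo, hi = task
    # stand-in for an expensive subproblem: best value of x*(100-x) in [lo, hi)
    return max(x * (100 - x) for x in range(lo, hi))

tasks = [(i, i + 25) for i in range(0, 100, 25)]  # partition the search space

with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(solve_subproblem, t) for t in tasks]
    best = max(f.result() for f in as_completed(futures))

assert best == 2500  # attained at x = 50
```

The key property that makes optimization a good fit for grids is visible even here: subproblems are independent, results arrive in any order, and a straggling or lost worker only delays (rather than corrupts) the final answer.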