Combinatorial Optimization Research Center

1st COLUMBIA OPTIMIZATION DAY

Wednesday, November 28, 2001

9:30 AM, Schapiro 412



ABSTRACTS





       Francisco Barahona
       IBM Research

       Branch & Cut based on the Volume algorithm

   We present a Branch-and-Cut algorithm where the Volume Algorithm is applied to the linear programming relaxations arising at each node of the search tree. This means we use fast approximate solutions to these linear programs instead of the exact but slower solutions given by the traditionally used simplex method. We present computational results for the Max-Cut and Steiner Tree problems. We show evidence that one can solve these problems much faster with the Volume Algorithm-based Branch-and-Cut code than with one based on the dual simplex method. This is joint work with Laszlo Ladanyi.
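
To illustrate why an approximate relaxation solver is enough inside branch and bound, the sketch below (a generic illustration, not the authors' code; the knapsack setting and both bound functions are assumptions chosen for simplicity) prunes a node whenever some valid bound on its subproblem cannot beat the incumbent. Correctness only requires that the bound be valid, not that it equal the exact LP optimum, which is what lets a fast approximate bound, such as one produced by the Volume Algorithm, stand in for an exact simplex solve.

    # Illustrative branch-and-bound skeleton (not the authors' code).  Any
    # function that never underestimates the best completion of a node can be
    # plugged in as the bound; a faster, looser bound trades pruning strength
    # for speed per node, the same trade-off exploited in the talk.

    def lp_bound(values, weights, capacity, i, value):
        """Fractional (LP) knapsack bound: fill greedily by value/weight."""
        bound, cap = value, capacity
        for v, w in sorted(zip(values[i:], weights[i:]), key=lambda t: -t[0] / t[1]):
            if w <= cap:
                bound, cap = bound + v, cap - w
            else:
                return bound + v * cap / w   # take a fraction of the last item
        return bound

    def crude_bound(values, weights, capacity, i, value):
        """Looser but still valid: ignore the weights entirely."""
        return value + sum(values[i:])

    def branch_and_bound(values, weights, capacity, bound):
        best = 0
        stack = [(0, 0, 0)]              # (next item index, value so far, weight so far)
        while stack:
            i, value, weight = stack.pop()
            best = max(best, value)
            if i == len(values) or bound(values, weights, capacity - weight, i, value) <= best:
                continue                 # prune: no completion can beat the incumbent
            if weight + weights[i] <= capacity:
                stack.append((i + 1, value + values[i], weight + weights[i]))
            stack.append((i + 1, value, weight))
        return best

    values, weights, capacity = [60, 100, 120], [10, 20, 30], 50
    print(branch_and_bound(values, weights, capacity, lp_bound))     # 220
    print(branch_and_bound(values, weights, capacity, crude_bound))  # 220, weaker pruning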




       Robert Bixby
       Rice University and ILOG Cplex

       MIP: A Progress Report

We will review the current computational state of the art for general-purpose mixed integer programming (MIP) codes, starting with a brief, introductory look at linear programming (LP). LP is the computational kernel of the standard branch-and-cut approach to MIP, and it has made steady algorithmic progress over the last decade. Indeed, as we will illustrate, this progress has arguably exceeded that in computing machinery over the same period. The story for MIP is a bit different. While the overall change has, in our view, been even more substantial, it has not come in a steady stream, but in what was almost a single, concentrated leap occurring over the last two to three years. The key has been the judicious combination of a whole collection of ideas that have accumulated over the last 20-30 years of research in combinatorial optimization and integer programming.




       Sebastian Ceria
       Axioma

       Solving integer programs in practice: the Axioma perspective

  Solving integer programming problems in practice involves trade-offs between quality and speed. Practitioners are usually not interested in "provable" optimality, but rather in predictable behavior and robustness in the solution process. For this reason, most practical implementations involve a combination of "optimal" methods and efficient heuristics. We will illustrate some of the concepts we apply when solving integer programs for our clients by using a concrete example from the financial services industry.




       Vasek Chvatal
       Rutgers University

       TSP cuts that do not follow the template paradigm

   The first computer implementation of the Dantzig-Fulkerson-Johnson cutting-plane method for solving the traveling salesman problem, written by Martin, used subtour inequalities as well as cutting planes of Gomory's type. The practice of looking for and using cuts that match prescribed templates in conjunction with Gomory cuts was continued in computer codes of Miliotis, Land, and Fleischmann.
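For concreteness (a textbook statement, not part of the original abstract), the subtour inequalities mentioned above are the classical example of such a template: with a variable $x_e \in [0,1]$ for each edge $e$ and $\delta(S)$ denoting the set of edges with exactly one endpoint in the vertex set $S$, they require
\[
\sum_{e \in \delta(S)} x_e \;\ge\; 2 \qquad \text{for all } S \text{ with } \emptyset \neq S \subsetneq V,
\]
which cuts off solutions consisting of two or more disjoint subtours.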
Grötschel, Padberg, and Hong advocated a different policy, where the template paradigm is the only source of cuts; furthermore, they argued for drawing the templates exclusively from the set of linear inequalities that induce facets of the TSP polytope.
These policies were adopted in the work of Crowder and Padberg, in the work of Grötschel and Holland, and in the work of Padberg and Rinaldi; their computer codes produced the most impressive computational TSP successes of the nineteen eighties. Eventually, the template paradigm became the standard frame of reference for cutting planes in the TSP.
I will outline a technique for finding cuts that disdains all understanding of the TSP polytope and bashes on regardless of all prescribed templates. Combining this technique with the traditional template approach in Concorde -- the computer code written by David Applegate, Bob Bixby, Bill Cook, and myself -- was a crucial step in our solution of a 13,509-city TSP instance and a 15,112-city TSP instance.





       William Cook
       Princeton University

       Optimization via Branch Decomposition

  Robertson and Seymour introduced branch-width as a new connectivity invariant of graphs in their proof of the Wagner conjecture. Decompositions based on this invariant provide a natural framework for implementing dynamic programming algorithms to solve graph optimization problems. In earlier work on the traveling salesman problem we used this framework in a heuristic algorithm to obtain near-optimal solutions to large-scale instances.
In this talk we will discuss the computational issues involved in using branch-width as a general tool in discrete optimization. We will present applications to Euclidean Steiner tree problems, graph bipartition, maximum cut problems, and maximum stable set problems.
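
As a much simplified illustration of this dynamic-programming pattern (shown on an ordinary tree rather than a general branch decomposition, and not the authors' implementation), the sketch below computes a maximum-weight stable set by combining, at each vertex, the optimal values of its subtrees under the two local states "vertex in the set" and "vertex not in the set"; branch-decomposition algorithms follow the same bottom-up scheme, with state tables indexed by the middle sets of the decomposition instead of a single vertex.

    # Simplified illustration only: dynamic programming for a maximum-weight
    # stable (independent) set on a tree, processing vertices bottom-up.

    def max_stable_set_on_tree(adj, weight, root=0):
        include, exclude, parent = {}, {}, {root: None}
        order, stack = [], [root]
        while stack:                      # iterative DFS to get a bottom-up order
            v = stack.pop()
            order.append(v)
            for u in adj[v]:
                if u != parent[v]:
                    parent[u] = v
                    stack.append(u)
        for v in reversed(order):         # combine the children's tables at each vertex
            children = [u for u in adj[v] if u != parent[v]]
            include[v] = weight[v] + sum(exclude[u] for u in children)
            exclude[v] = sum(max(include[u], exclude[u]) for u in children)
        return max(include[root], exclude[root])

    # a path 0-1-2-3 with weights 3, 5, 4, 6: the optimum is {1, 3} with weight 11
    adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    print(max_stable_set_on_tree(adj, weight={0: 3, 1: 5, 2: 4, 3: 6}))  # 11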




       David Johnson
       AT&T Research

       Heuristics for the Traveling Salesman Problem: The State of the Art

   During the last two years, I helped organize a "DIMACS Implementation Challenge" on heuristics for the Traveling Salesman Problem. Participants came from all over the world and included representatives of all the top research groups currently active in the field. They were asked to run their codes on a collection of test instances of various types, ranging in size from 1,000 to 1,000,000 cities, and to report their results, along with times for a benchmark code so that we could infer something about the relative speeds of their machines. In this talk I will describe the Challenge and identify the algorithmic approaches that worked best for various choices of how much running time one is willing to spend as a function of instance size. The range runs from heuristics that take little more time than that needed to read an instance and still get within 50% of optimum, to those that get within a few percent of optimum for a 100,000-city instance in seconds, to those that get within a fraction of a percent of optimum for instances of this size in a few hours. For more information, see the Challenge website at http://www.research.att.com/~dsj/chtsp/.
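
To make the fastest end of that spectrum concrete, the sketch below shows a nearest-neighbor construction heuristic, a standard textbook baseline rather than one of the Challenge codes: it builds a tour by repeatedly moving to the closest unvisited city, taking O(n^2) time for n cities in this naive form.

    # Nearest-neighbor construction heuristic for the symmetric Euclidean TSP
    # (a generic baseline, not one of the Challenge codes).
    import math

    def nearest_neighbor_tour(points, start=0):
        unvisited = set(range(len(points))) - {start}
        tour = [start]
        while unvisited:
            last = points[tour[-1]]
            nxt = min(unvisited, key=lambda j: math.dist(last, points[j]))
            tour.append(nxt)
            unvisited.remove(nxt)
        return tour

    def tour_length(points, tour):
        return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    points = [(0, 0), (0, 3), (4, 3), (4, 0), (2, 1)]
    tour = nearest_neighbor_tour(points)
    print(tour, round(tour_length(points, tour), 2))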




       George Nemhauser
       Georgia Tech

       A Polyhedral Study of the Cardinality Constrained Knapsack Problem


  A cardinality constrained knapsack problem is a continuous knapsack problem in which no more than a specified number of nonnegative variables are allowed to be positive. This structure occurs, for example, in areas such as finance, location, and scheduling. Traditionally, cardinality constraints are modeled by introducing auxiliary 0-1 variables and additional constraints that relate the continuous and the 0-1 variables. We use an alternative approach, in which we keep in the model only the continuous variables, and we enforce the cardinality constraint through a specialized branching scheme and the use of strong inequalities valid for the convex hull of the feasible set in the space of the continuous variables. To derive the valid inequalities, we extend the concepts of cover and cover inequality, commonly used in 0-1 programming, to this class of problems, and we show how cover inequalities can be lifted to derive facet-defining inequalities. We present three families of non-trivial facet-defining inequalities that are lifted cover inequalities. Finally, we report computational results that demonstrate the effectiveness of lifted cover inequalities and the superiority of the approach of not introducing auxiliary 0-1 variables over the traditional MIP approach for this class of problems.
This is joint work with Ismael de Farias.
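
For concreteness (an illustrative formulation, not taken verbatim from the paper), the problem can be written with continuous variables $0 \le x_j \le u_j$ as
\[
\max \Big\{ \sum_{j=1}^n c_j x_j \;:\; \sum_{j=1}^n a_j x_j \le b, \quad |\{\, j : x_j > 0 \,\}| \le K, \quad 0 \le x_j \le u_j \Big\},
\]
and the traditional model enforces the cardinality restriction with auxiliary 0-1 variables $y_j$ through the linking constraints $x_j \le u_j y_j$ and $\sum_{j=1}^n y_j \le K$, $y_j \in \{0,1\}$. The approach described above instead works in the $x$-space alone, replacing the $y_j$ with a specialized branching rule and with lifted cover inequalities.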




       Bruce Shepherd
       Lucent

       Revisiting the Lucchesi-Younger Theorem

   The Lucchesi-Younger Theorem gives a min-max formula for the problem of finding a minimum-cost set of arcs in a digraph whose contraction (equivalently, reversal) results in a strongly connected digraph.
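In its unweighted form (stated here as standard background rather than as part of the abstract), the theorem says that for a weakly connected digraph the minimum number of arcs needed to meet every directed cut equals the maximum number of pairwise disjoint directed cuts, where a directed cut consists of the arcs entering a nonempty proper vertex set that no arc leaves:
\[
\min \{\, |J| \;:\; J \cap C \neq \emptyset \ \text{for every directed cut } C \,\}
\;=\;
\max \{\, k \;:\; \text{there exist } k \text{ pairwise disjoint directed cuts} \,\}.
\]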
This result, together with standard blocking theory, implies that such a minimum-cost set of arcs can be found in polynomial time via linear programming. In 1981, Andras Frank gave a combinatorial, algorithmic proof of the result, with running time $O(n^5)$.
Gabow subsequently gave a different algorithm with running time $O(n^2m)$.
We recast Frank's algorithm to achieve this faster running time while using only standard tools of the trade: ear decompositions, maximal arborescences, and negative cycle detection.
This is joint work with Adrian Vetta.




       Optimization Roundtable
       

       The next 10 years in Discrete Optimization

  During the last ten years we have seen steady fundamental development in Discrete Optimization, paralleled by growing interest in the effective implementation of advanced algorithmic techniques and by a rapidly growing impact on practical applications. Topics that we will discuss include: What are the major theoretical and computational advances likely to be made during the next 10 years? Will optimization continue to assume a growing importance in the business world? Which areas of (discrete) optimization, and which real-world applications, are likely to receive the most attention?