Abstract

In this work, we propose an algorithmic approach to improve mixed-integer models that are originally formulated as convex generalized disjunctive programs (GDPs). The algorithm seeks to obtain an improved continuous relaxation of the mixed-integer linear and mixed-integer nonlinear programming (MILP/MINLP) reformulation of the GDP while limiting the growth in problem size. The algorithm is built on three main stages. The first is a presolve, a consequence of the logical nature of GDP, which allows us to reduce the problem size, find good relaxation bounds, and identify properties that help determine where to apply a basic step. The second stage is the iterative application of basic steps, selecting where to apply them and monitoring the improvement of the formulation. Finally, we use a hybrid reformulation of GDP that seeks to combine the advantages of the two common GDP-to-MILP/MINLP transformations, the Big-M and the Hull reformulations. We illustrate the application of this algorithm with several examples. The results show improved formulations: the generated models have tighter relaxed solutions with relatively small growth in the number of continuous variables and constraints. The algorithm generally leads to a reduction in solution times.
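The trade-off between the two transformations mentioned in the abstract can be illustrated on a toy disjunction. The sketch below (an assumption for illustration, not the paper's algorithm or examples) relaxes the binary indicator of a two-term disjunction, [0 ≤ x, y ≤ 1] ∨ [2 ≤ x, y ≤ 3], and checks numerically that the Hull relaxation cuts off a fractional point that a loose Big-M relaxation admits:

```python
# Illustrative comparison of Big-M vs. Hull continuous relaxations of the
# two-term disjunction  [0 <= x,y <= 1]  OR  [2 <= x,y <= 3],
# with the indicator of the second disjunct relaxed to lam in [0, 1].
# This is a hand-built sketch; constants and names are hypothetical.

def bigm_feasible(x, y, lam, M=4.0):
    """Big-M relaxation: each disjunct's constraints are relaxed by M
    when its indicator is (fractionally) inactive."""
    return (x <= 1 + M * lam and y <= 1 + M * lam and            # disjunct 1
            x >= 2 - M * (1 - lam) and y >= 2 - M * (1 - lam))   # disjunct 2

def hull_feasible(x, y, lam, tol=1e-9):
    """Hull relaxation: disaggregating x = x1 + x2 (and y likewise) with
    0 <= x1 <= 1*(1-lam) and 2*lam <= x2 <= 3*lam implies
    x, y in [2*lam, (1-lam) + 3*lam]."""
    lo, hi = 2 * lam, (1 - lam) + 3 * lam
    return lo - tol <= x <= hi + tol and lo - tol <= y <= hi + tol

def in_relaxation(feas, x, y, n=1000):
    """Check whether some lam on a grid over [0, 1] makes (x, y) feasible."""
    return any(feas(x, y, i / n) for i in range(n + 1))

# (0, 2) lies outside the convex hull of the two boxes: the Hull
# relaxation excludes it, while the loose Big-M relaxation does not.
print(in_relaxation(bigm_feasible, 0.0, 2.0))  # True
print(in_relaxation(hull_feasible, 0.0, 2.0))  # False
```

The tighter Hull relaxation comes at the cost of disaggregated variables and extra constraints, which is exactly the size-versus-tightness tension the proposed hybrid reformulation tries to balance.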

  • Publication date: 2015