Hello,

I am curious about something I ran into during my studies. I was working with a mixed-integer model. Without adding any new variables, I added a new constraint to the model, expecting that the optimal point would be found more easily now that the new cuts restrict the same feasible region. However, the model with the added constraint takes much longer to solve than the original one (15 minutes for the original model versus 8 hours with the constraint added). What is the reason for this? Why is the smaller feasible region so much harder to solve over than the larger one? Are there techniques for identifying which kinds of constraints increase or decrease a model's solution time?

I know I have asked a lot of questions, but I really want to understand the issue. Thanks in advance.

asked 12 Jul '12, 15:34

Pelin

@Pelin: Maybe if you share your problem's name/description, we could give you more specific insights.

(12 Jul '12, 16:44) Ehsan ♦

It would be good if we could aggregate a database of such metrics.

(13 Jul '12, 04:18) Geoffrey De ... ♦

@Ehsan, the problem does not have a well-known name, but it looks like a covering problem with additional constraints. @Geoffrey, I totally agree with the database idea; it would be helpful. How could we do that? Could we build it on this website? Any suggestions for the database?

(13 Jul '12, 05:48) Pelin

Could you please provide the log output from both runs? Otherwise this thread is not really useful; with MIPs you just end up guessing in the dark. Also, if possible, try to explain what the extra constraint is doing.

(13 Jul '12, 05:52) Bo Jensen ♦

Also, try to solve it with some other optimizers, so you can rule out randomness effects. You can use the NEOS server to upload and test your problem on optimizers for which you don't own a license.

(13 Jul '12, 05:54) Bo Jensen ♦

The original set covering problem is known as an integer-friendly problem (i.e., its LP relaxation either yields integer solutions, or B&B can convert its solution to an optimal integer solution in a few iterations). Even its very large instances (more than 1,000 nodes) are relatively easy to solve with good MIP solvers. Based on what other constraints you might have, I'd consider this normal behavior.

(13 Jul '12, 05:57) Ehsan ♦

Firstly, sorry for the delayed reply. @Bo Jensen, I actually asked this question for general purposes: for my future studies, I wanted to keep in mind the methods and theories offered here. Thank you for your answer.

(16 Jul '12, 16:53) Pelin

I can offer up at least four possible answers (not mutually exclusive) that I think are not covered above. (Apologies if I'm repeating anything previously said.)

  • The new constraint adds more work to every LP pivot. If it does not reduce the node count enough to repay that, solution time will increase.
  • If the original matrix was sparse but the new cut is dense, it will drive up pivot time even further.
  • It's possible (though perhaps not likely) that the new constraint introduces some numerical instability not previously present, which can force the solver to refactor the basis more often or introduce code that defends against the instability but slows execution.
  • The new constraint will alter both the structure of the search tree and the path taken through it. Sometimes this results in a longer path (more nodes) even if the LP hull is smaller. Although it's not exactly the same thing, picture the sequence of extreme point solutions when you solve an LP. Now add a constraint that cuts off one of the suboptimal corners visited. The LP hull has shrunk (without losing the optimal solution), but the number of basic solutions increases by at least one (the corner you cut off is replaced by two new corners).
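To see the first two points empirically, one cheap experiment is to time the identical model with and without the extra constraint. Below is a minimal sketch using PuLP and its bundled CBC solver; the knapsack instance and the cardinality cut are invented purely for illustration (on a toy this small the timings will barely differ), but the same harness works on a real model.

```python
# Hypothetical experiment: time a small knapsack MIP with and
# without one extra valid inequality. Instance data is made up.
import random
import time

import pulp

random.seed(7)
n = 60
value = [random.randint(10, 100) for _ in range(n)]
weight = [random.randint(5, 50) for _ in range(n)]
capacity = sum(weight) // 3

def solve(extra_cut: bool) -> float:
    prob = pulp.LpProblem("knapsack", pulp.LpMaximize)
    x = [pulp.LpVariable(f"x{i}", cat="Binary") for i in range(n)]
    prob += pulp.lpSum(value[i] * x[i] for i in range(n))
    prob += pulp.lpSum(weight[i] * x[i] for i in range(n)) <= capacity
    if extra_cut:
        # Valid but weak: no feasible solution can pack more items
        # than capacity // min(weight) of them.
        prob += pulp.lpSum(x) <= capacity // min(weight)
    start = time.perf_counter()
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return time.perf_counter() - start

print(f"without cut: {solve(False):.3f}s")
print(f"with cut:    {solve(True):.3f}s")
```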

answered 13 Jul '12, 17:53

Paul Rubin ♦♦

@Professor Rubin, thank you very much for your answer, and sorry for my delayed reply. Your points address exactly what I was asking about the reasons for the increase in solution time.

(16 Jul '12, 16:59) Pelin

In theory, you can solve a MIP model efficiently if you have a formulation that defines a polytope whose extreme points are all integer, so that you never have to worry about the integrality constraints (the first chapters of Wolsey's book "Integer Programming" describe such a situation as the ideal formulation of the problem; it occurs in cases such as network flow models). However, finding such a polytope is not easy: most often we have a polytope defined by the relaxed model plus integrality constraints, and the solving process consists of adding constraints as cuts to rule out parts of the polytope that contain no integer solutions.

Thus, it may happen that you start with a formulation that is not hard to solve and then add a constraint that makes the extreme points with the best objective values fractional. That is the case of the Traveling Salesman Problem compared to the Assignment Problem: the latter can be solved by polynomial-time algorithms, but the additional constraints of the former make the resulting problem hard to solve.
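To make the assignment-problem end of that comparison concrete: its constraint matrix is totally unimodular, so every extreme point of the LP relaxation is already integral, and a plain LP solve returns a 0-1 assignment with no branching needed. A small sketch with SciPy, using a random cost matrix invented for the example:

```python
# LP relaxation of the assignment problem: extreme points are
# integral, so a simplex-based LP solve returns a 0-1 solution.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n = 5
cost = rng.integers(1, 20, size=(n, n))

# x[i, j] flattened row-major; every row and column must sum to 1.
A_eq = np.zeros((2 * n, n * n))
for i in range(n):
    A_eq[i, i * n:(i + 1) * n] = 1  # worker i does exactly one job
    A_eq[n + i, i::n] = 1           # job i is done by exactly one worker
b_eq = np.ones(2 * n)

# Dual simplex guarantees we land on a vertex of the polytope.
res = linprog(cost.ravel(), A_eq=A_eq, b_eq=b_eq,
              bounds=(0, 1), method="highs-ds")
print(res.x.reshape(n, n).round(6))  # all entries come out 0.0 or 1.0
```

Add TSP subtour-elimination constraints on top of this structure and the relaxation's extreme points become fractional, which is exactly the jump in difficulty described above.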


answered 12 Jul '12, 16:36

Thiago Serra


As a general rule of thumb, you cannot guess the solution time of a MILP model just by looking at its variables and constraints. Similar to @Thiago's example, the VRP is a very challenging optimization problem, but finding feasible solutions for it is straightforward (for example, with the nearest-neighbor heuristic). However, once you add time-window constraints, finding feasible solutions becomes harder (we have fewer efficient heuristics for that case).

(12 Jul '12, 16:43) Ehsan ♦
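For readers unfamiliar with it, the nearest-neighbor construction @Ehsan mentions fits in a few lines: from the current stop, always move to the closest unvisited customer. A sketch with invented coordinates (a single uncapacitated route, so effectively a TSP tour):

```python
# Nearest-neighbor route construction. Coordinates are made up.
import math
import random

random.seed(1)
points = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(10)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

tour = [0]                       # start at the depot, index 0
unvisited = set(range(1, len(points)))
while unvisited:
    last = tour[-1]
    nearest = min(unvisited, key=lambda j: dist(points[last], points[j]))
    tour.append(nearest)
    unvisited.remove(nearest)
tour.append(0)                   # return to the depot

length = sum(dist(points[a], points[b]) for a, b in zip(tour, tour[1:]))
print(tour, round(length, 1))
```

With time windows, the same greedy move can easily paint itself into a corner (the nearest customer's window may already be closed), which is Ehsan's point about feasibility getting harder.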

So, for a given problem and formulation, suppose I want to solve the problem faster by writing some efficient constraints that retain the optimal solution. How can I be sure that my cut is an efficient one, or do I just need to try every formulation I write? I think there could be methods for finding efficient cuts. (I don't mean cutting-plane algorithms; a stronger formulation of a problem could be an example.)

(12 Jul '12, 16:59) Sertalp Bilal

Cutting-plane algorithms somehow automate that process for the general case: they are usually based on inference rules that work for a number of problems. For a specific problem, it might be the case that you need to try as many models as you can conceive. A good indicator would be that your new constraint dominates your former one and also rules out a bit more of the relaxed polytope (in fact, that is the only measure that matters).

(12 Jul '12, 17:06) Thiago Serra
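One inexpensive way to apply the indicator Thiago describes is to compare the optimal value of the LP relaxation with and without the candidate cut: if the bound does not move, the cut removed no part of the polytope that matters. A toy sketch using PuLP, with a model invented for illustration:

```python
# Compare the LP-relaxation bound with and without a candidate cut.
import pulp

def lp_bound(with_cut: bool) -> float:
    prob = pulp.LpProblem("toy", pulp.LpMaximize)
    # Continuous variables play the role of the relaxed integers.
    x = pulp.LpVariable("x", lowBound=0)
    y = pulp.LpVariable("y", lowBound=0)
    prob += 3 * x + 2 * y            # objective
    prob += x + y <= 4
    prob += x <= 3
    if with_cut:
        prob += 2 * x + y <= 6       # candidate cut under evaluation
    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return pulp.value(prob.objective)

print("bound without cut:", lp_bound(False))  # 11.0, at (3, 1)
print("bound with cut:   ", lp_bound(True))   # 10.0, at (2, 2)
```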

Firstly, thank you all for your answers. To be clear and specific, my question is about understanding what makes a given constraint increase or decrease the solution time. Are there clues or theories that help characterize such constraints?

(12 Jul '12, 17:11) Pelin

That is one of the main concerns of polyhedral combinatorics. You can start with the first chapters of Wolsey's book. For wider coverage (though perhaps dated), you can check his book with Nemhauser.

(12 Jul '12, 17:17) Thiago Serra

Hi again. Actually, I have finished Wolsey's book (Integer Programming) in one of our courses, so I know exactly which chapters you mean, but as far as I know the answer to my question is not there. I do not know the book with Nemhauser, though; maybe I should check it. Thank you all for your answers.

(13 Jul '12, 05:16) Pelin

Wolsey's is a very dense book. I thought I had understood it completely after one course with it, but I realized years later that I had to read it over and over to figure out how to apply some of the methods.

Besides, there are other answers to your question. You may look into adaptive search methods for an empirical attempt to select the best algorithm or heuristic for a given problem (Kate Smith-Miles's survey and the work of Leyton-Brown and others at UBC are a good start), and also into the theoretical and empirical work of Carla Gomes and others on backdoor variables and easy-hard-easy phase transitions.

(13 Jul '12, 07:41) Thiago Serra

@Thiago Serra, sorry for my delayed reply, and thank you for your answer. I will look up these studies =)

(16 Jul '12, 16:55) Pelin