Hello, I am curious about a topic I encountered during my studies. I was working with a mixed-integer model. Even though I did not add any new variables, when I added a new constraint to the model I expected the optimal point to be found easily, since the new cut shrinks the same feasible region. However, the solution time of the model with the added constraint is much longer than that of the original model (the original model solves in 15 minutes, while the constraint-added one takes 8 hours). What is the reason for this? Why is it so much harder to find the solution over the smaller feasible region than over the larger one? Are there any techniques for understanding which types of constraints increase or decrease the solution time of a model? I know I have asked many questions, but I really want to understand this issue. Thanks in advance.

asked 12 Jul '12, 15:34 Pelin

I can offer up at least four possible answers (not mutually exclusive) that I think are not covered above. (Apologies if I'm repeating anything previously said.)
answered 13 Jul '12, 17:53 Paul Rubin ♦♦

@Professor Rubin, thank you very much for your answer, and sorry for my delayed reply. Your points directly address the reasons for the increase in solution time that I asked about.
(16 Jul '12, 16:59)
Pelin

In theory, you can efficiently solve a MIP model if you have a formulation that defines a polytope whose extreme points are always integer solutions, so that you do not have to worry about integrality constraints (the first and/or second chapter of Wolsey's book "Integer Programming" describes such a situation as the best formulation of the problem; it happens in cases such as network flow models). However, finding such a polytope is not easy: most often we have a polytope defined by the relaxed model plus integrality constraints, and the solving process consists of adding constraints as cuts to rule out parts of the polytope that do not contain integer solutions. Thus, it might be the case that you start with a problem formulation that is not hard to solve, and then add a certain constraint that causes the extreme points with the best objective value to be fractional. That is the case with the Traveling Salesman Problem compared to the Assignment Problem: the latter can be solved by polynomial-time algorithms, but the additional constraints of the former make the resulting problem hard to solve.

answered 12 Jul '12, 16:36 Thiago Serra
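(Editorial aside, not from the original thread.) The effect described above can be reproduced on a toy two-variable model. The sketch below, in pure Python with made-up data, enumerates the vertices of a small polytope: the relaxation of a plain box has an integral optimum, while adding one extra constraint that shrinks the region (as in the question) makes the relaxed optimum fractional, so branching becomes necessary.

```python
# Toy illustration: one added constraint can turn an integral relaxation
# optimum into a fractional one. Constraints are triples (a1, a2, b)
# meaning a1*x + a2*y <= b. All data here is invented for illustration.
from itertools import combinations

def vertices(constraints, tol=1e-9):
    """All feasible intersection points of pairs of constraint boundaries."""
    pts = []
    for (a1, a2, b), (c1, c2, d) in combinations(constraints, 2):
        det = a1 * c2 - a2 * c1
        if abs(det) < tol:
            continue  # parallel boundaries: no unique intersection
        x = (b * c2 - a2 * d) / det
        y = (a1 * d - b * c1) / det
        if all(p * x + q * y <= r + tol for p, q, r in constraints):
            pts.append((x, y))
    return pts

def lp_max_sum(constraints):
    """Maximize x + y over the polytope; an optimum lies at some vertex."""
    return max(vertices(constraints), key=lambda v: v[0] + v[1])

box = [(1, 0, 1), (0, 1, 1), (-1, 0, 0), (0, -1, 0)]  # 0 <= x, y <= 1
print(lp_max_sum(box))                  # (1.0, 1.0): integral, no branching needed
print(lp_max_sum(box + [(1, 1, 1.5)]))  # fractional: one coordinate is 0.5
```

The added constraint x + y <= 1.5 only shrinks the feasible region, yet the best extreme point of the new polytope is fractional, which is exactly the situation where branch-and-bound has extra work to do.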
As a general rule of thumb, you cannot predict the solution time of a MILP model just by looking at its variables and constraints. Similar to @Thiago's example, the VRP is a very challenging optimization problem, but finding feasible solutions for it is straightforward (for example, with the nearest-neighbor heuristic). However, when you add time-window constraints, finding feasible solutions becomes harder (we have fewer efficient heuristics for finding feasible solutions in that case).
(12 Jul '12, 16:43)
Ehsan ♦
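(Editorial aside.) The nearest-neighbor heuristic mentioned above can be sketched in a few lines; the coordinates below are made up for illustration. Without time windows the greedy tour is always feasible; with time windows one would additionally have to check arrival times against each customer's window, which such a tour can easily violate.

```python
# Nearest-neighbor construction heuristic on invented coordinates.
import math

def nearest_neighbor_tour(coords, start=0):
    """Greedy tour: always visit the closest unvisited city next."""
    unvisited = set(range(len(coords))) - {start}
    tour = [start]
    while unvisited:
        last = coords[tour[-1]]
        nxt = min(unvisited, key=lambda j: math.dist(last, coords[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

cities = [(0, 0), (1, 0), (4, 0), (4, 3)]
print(nearest_neighbor_tour(cities))  # [0, 1, 2, 3]

# With time windows, this tour would also need an arrival-time check such as
#   all(arrival[i] <= latest[i] for i in tour)
# and the greedy tour frequently fails it, which is why feasibility itself
# becomes harder once time windows are added.
```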
So, for a given problem and formulation, suppose I want to solve the problem faster by writing some efficient constraints that still include the optimal solution. How can I be sure that my cut is an efficient one? Or do I need to simply try every formulation I write? I think there should be methods for finding efficient cuts. (I don't mean cutting-plane algorithms, but perhaps a stronger formulation of a problem could be an example.)
(12 Jul '12, 16:59)
Sertalp Bilal
Cutting-plane algorithms somehow automate that process for the general case: they are usually based on inference rules that work for a number of problems. For a specific problem, you may need to try as many models as you can conceive. A good indicator here would be whether your new constraint dominates your former one and also rules out a bit more of the relaxed polytope (in fact, that is the only measure that matters).
(12 Jul '12, 17:06)
Thiago Serra
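(Editorial aside.) For nonnegative variables and two constraints of the form a·x <= b, there is a simple componentwise sufficient test for the dominance mentioned above: if every coefficient of the first constraint is at least the corresponding coefficient of the second and its right-hand side is no larger, then any x >= 0 satisfying the first also satisfies the second. A minimal sketch (the function name and data are illustrative, and this is only a sufficient test, not a full polyhedral comparison):

```python
# Componentwise dominance test for a.x <= b versus a2.x <= b2, x >= 0:
# if a_i >= a2_i for all i and b <= b2, then a2.x <= a.x <= b <= b2.
def dominates(a, b, a2, b2):
    """True if a.x <= b implies a2.x <= b2 for all x >= 0 (sufficient test)."""
    return all(ai >= aj for ai, aj in zip(a, a2)) and b <= b2

print(dominates([2, 3], 4, [1, 1], 4))  # True: 2x1+3x2 <= 4 is tighter than x1+x2 <= 4
print(dominates([1, 1], 4, [2, 3], 4))  # False: the converse does not hold
```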
First of all, thank you all for your answers. To be clear and specific, my question is about understanding the behavior of a constraint that either increases or decreases the solution time. Are there any clues or theories that help characterize such constraints?
(12 Jul '12, 17:11)
Pelin
That is one of the main concerns of polyhedral combinatorics. You can start with the first chapters of Wolsey's book. For wider coverage (though perhaps outdated), you can check his book with Nemhauser.
(12 Jul '12, 17:17)
Thiago Serra
Hi again. Actually, I have already finished Wolsey's book (Integer Programming) in one of our courses, so I know exactly what you mean by those chapters, but as far as I can tell the answer to my question is not there. I do not know the book with Nemhauser, though; maybe I should check it. Thank you all for your answers.
(13 Jul '12, 05:16)
Pelin
Wolsey's is a very dense book. I thought I had understood it completely after one course with it, but I realized years later that I had to read it over and over to figure out how to apply some of the methods. Besides, there are other answers to your question. You may look into adaptive search methods for an empirical attempt to select the best algorithm or heuristic for a given problem (Kate Smith-Miles's survey and the work of Leyton-Brown and others at UBC are a good start), and also into the theoretical and empirical work of Carla Gomes and others on backdoor variables and easy-hard-easy phase transitions.
(13 Jul '12, 07:41)
Thiago Serra
@Thiago Serra, sorry for my delayed reply, and thank you for your answer. I will look up these studies =)
(16 Jul '12, 16:55)
Pelin

@Pelin: Maybe if you share your problem name/description, we could give you more specific insights.
It would be good if we could aggregate a database of such metrics.
@Ehsan, the problem does not have a specific name, but it looks like a covering problem with additional constraints. @Geoffrey, I totally agree with the database idea; it would be helpful. How could we do that? Could we build it on this web site? Any suggestions about the database?
Could you please provide the log output from both runs? Otherwise this thread is not really useful; with MIPs you just end up guessing in the dark. Also, if possible, try to explain what the extra constraint is doing.
Also try solving it with some other optimizers, so you can rule out the randomness effect. You can use the NEOS server to upload and test your problem on optimizers for which you don't own a license.
The original set covering problem is known to be integer-friendly (i.e., its LP relaxation yields either integer solutions, or B&B with few iterations can convert that solution into an optimal integer solution). Even its very large instances (more than 1000 nodes) are relatively easy to solve using good MIP solvers. Depending on what other constraints you have, I would consider this normal behavior.
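(Editorial aside.) To make the covering structure concrete, here is a tiny, made-up set-covering instance solved by brute force; this is a sketch of the problem only, not how solvers work, since real instances are tackled through the LP relaxation described above.

```python
# Tiny set-covering instance: cover every element of the universe with the
# fewest sets. Data is invented; exhaustive search is fine at this size.
from itertools import combinations

universe = {1, 2, 3, 4, 5}
sets = [{1, 2, 3}, {2, 4}, {3, 4}, {4, 5}, {1, 5}]

def min_cover(universe, sets):
    """Smallest sub-collection whose union is the universe (exhaustive search)."""
    for k in range(1, len(sets) + 1):
        for combo in combinations(range(len(sets)), k):
            if set().union(*(sets[i] for i in combo)) == universe:
                return combo
    return None

print(min_cover(universe, sets))  # (0, 3): {1,2,3} and {4,5} cover everything
```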
First of all, sorry for the delayed reply. @Bo Jensen, I actually asked this question for general purposes: for my future studies, I wanted to keep in mind the methods and theories provided in this thread. Thank you for your answer.