Hi, I work with MIP problems whose variables have very different magnitudes. For example, some variables represent flows between 0 and 100 m³/s, heights are usually between 0 and 100 m, while volume variables may take values up to 10^6 m³. The problems involve many storage facilities with different capacities (between 100 and 10^6 m³). What is the best approach to dealing with such variables in a MIP problem? Is it better to have variables of the same magnitude, i.e., to normalize them? Normalizing the volume variables, or even changing their unit, may create very large or very small coefficients in the constraint matrix, making the problem harder for a MIP solver. Any ideas? Thanks in advance. Francois

asked 11 Sep '17, 10:29
Francois
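Before deciding whether to rescale, it helps to quantify the spread of the nonzero coefficients, since that (rather than the variable values themselves) is what tends to hurt solvers. A minimal sketch with a purely illustrative toy matrix (the values below are made up, not from the actual model):

```python
import numpy as np

# Hypothetical constraint rows mixing flow (m^3/s), height (m),
# and volume (m^3) variables; the numbers are illustrative only.
A = np.array([
    [1.0,  0.0, 1e-4],   # e.g., flow balance with a volume/timestep term
    [0.0,  1.0, 1e-6],   # e.g., height vs. volume relation
    [50.0, 0.0, 1.0],    # e.g., capacity coupling
])

nonzero = np.abs(A[A != 0])
spread = np.log10(nonzero.max() / nonzero.min())
print(f"coefficient range: [{nonzero.min():.1e}, {nonzero.max():.1e}]")
print(f"orders of magnitude: {spread:.1f}")
```

If a proposed normalization *increases* this spread, that is a warning sign; the same check can be rerun on the rescaled matrix.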
I'm not sure there is a one-size-fits-all answer, but my best guess is that the priority should be avoiding a large spread in the magnitudes of the constraint coefficients (including the constant terms). I don't think the sizes of the variable values are all that critical. So if the coefficients in the original formulation vary by, say, five orders of magnitude (not too bad), and normalizing the variables would make that 10 or 12 orders of magnitude (worrisome), I would avoid normalizing. Note that bounds on the variables are a bit different. If you have to say \(x\le 10^8\) for some variable, putting that in as a bound (rather than as a constraint) should be fairly harmless. Regarding the daisy chaining of additional variables, I've never seen that advocated, and I've never tried it myself, so I don't know how well it would work. If your solver does initial presolving, it's possible the presolving might eliminate the intermediate variables and take you back to where you started. You might keep that arrow in your quiver as an option in case you run into numerical stability problems that cannot be mitigated otherwise.

answered 11 Sep '17, 16:03
Paul Rubin ♦♦

@PaulRubin Thank you very much for your answer. Actually, the constraint that gives me headaches is the bathymetry definition using a piecewise-linear formulation: V = V1 a1 + V2 a2 + ... + Vn an, where V1, V2, ..., Vn are the possible volumes from a table, V is the volume variable, and a1, a2, ..., an are the interpolation weight variables. I saw a real-life example where the Vi's were strictly increasing between 50 and 400,000. Could a nested formulation like V = V1 a1 + V2 b2, b2 = a2 + (V3/V2) b3, ..., b(n-1) = a(n-1) + (Vn/V(n-1)) an be a good alternative?
(11 Sep '17, 16:37)
Francois
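The nested chain above is algebraically equivalent to the direct convex combination: substituting each b back in telescopes to V = V1 a1 + ... + Vn an, while each individual constraint only carries the ratio of *consecutive* breakpoints as a coefficient. A quick numeric sanity check (plain arithmetic, not a solver model; breakpoints and weights below are illustrative):

```python
# Check that b_k = a_k + (V_{k+1}/V_k) * b_{k+1}, b_n = a_n,
# together with V = V1*a1 + V2*b2, reproduces V = sum_i Vi*ai.
Vs = [50.0, 500.0, 5000.0, 400000.0]   # illustrative breakpoint volumes
a  = [0.0, 0.5, 0.5, 0.0]              # SOS2-style weights summing to 1

direct = sum(V * w for V, w in zip(Vs, a))

b = a[-1]                              # b_n = a_n
for k in range(len(Vs) - 2, 0, -1):    # unwind b_{n-1}, ..., b_2
    b = a[k] + (Vs[k + 1] / Vs[k]) * b
nested = Vs[0] * a[0] + Vs[1] * b

ratios = [Vs[k + 1] / Vs[k] for k in range(len(Vs) - 1)]
print(direct, nested)   # both print 2750.0
print(max(ratios))      # largest chain coefficient: 80.0 (vs. 400000/50 = 8000)
```

Note the trade-off: the coefficient range per constraint shrinks from max(Vi)/min(Vi) to the largest consecutive ratio, at the cost of extra variables that presolve may aggregate back out.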
Four orders of magnitude difference among the coefficients is unlikely to upset a good-quality solver. I would just enter it as-is. Have you solved such an instance and seen evidence of numerical instability?
(11 Sep '17, 17:23)
Paul Rubin ♦♦
@PaulRubin The problem I want to solve has up to 20,000 constraints, the bathymetry constraint among them. In the other constraints, the smallest coefficient is generally close to 0.01. We use Gurobi to solve these problems. Everything worked well until Gurobi 7.5, whose presolve makes some problems infeasible. I was also told to reduce the range of the coefficients to improve stability. This is why I'm looking for alternative formulations.
(12 Sep '17, 09:04)
Francois
I don't have any experience with Gurobi, but if it has parameters for controlling what presolve does, you might try fiddling with them. I once ran into an issue with CPLEX presolve that was overcome by dialing down symmetry reduction. (They've since fixed whatever the cause was.) You can certainly try the nested constraints idea, but there's no guarantee it fixes the issue ... and no guarantee that coefficient magnitudes are actually contributing to the problem. Any idea what changed in Gurobi 7.5 that would explain it?
(12 Sep '17, 11:23)
Paul Rubin ♦♦
@PaulRubin According to Gurobi, while previous versions were able to solve the model, Gurobi 7.5 has difficulty with it, even converging to "optimal" solutions where some variables were out of bounds (negative values for nonnegative variables). Turning Aggregate off lets Gurobi solve it again, but slowly. According to Gurobi support, the model's numerics are pretty bad, which causes both the misbehaving presolve and the out-of-bounds solutions. I'm looking for ways to improve that part.
(12 Sep '17, 11:49)
Francois
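For reference, the workarounds discussed in this thread map to Gurobi parameters roughly as follows. This is a sketch only, assuming gurobipy and a licensed Gurobi installation; "model.lp" is a placeholder file name, and parameter defaults may differ across versions:

```python
import gurobipy as gp

m = gp.read("model.lp")          # placeholder: load the troublesome model

# Disable the presolve aggregator (the workaround mentioned above).
m.Params.Aggregate = 0
# Ask Gurobi to trade speed for numerical accuracy (0=auto .. 3=max).
m.Params.NumericFocus = 3
# Optionally experiment with the matrix scaling strategy.
m.Params.ScaleFlag = 2

m.optimize()
```

No test is included since running this requires a Gurobi license; the parameter names (Aggregate, NumericFocus, ScaleFlag) are documented in the Gurobi parameter reference.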
You mentioned coefficients ranging between 50 and 400,000. If some variables have consistently small coefficients across all constraints, and some have consistently large ones, you can try rescaling the variables (e.g., change units of one variable from cubic meters to hundreds of cubic meters). I suspect you've already considered that. I suggest you try your nesting idea and, if it fails, come back and let us know. Without seeing the model, it may be hard to come up with other suggestions.
(12 Sep '17, 14:32)
Paul Rubin ♦♦
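The unit-rescaling idea above can be sanity-checked with plain arithmetic before touching the model: substituting volume = s * volume' multiplies that variable's column of the constraint matrix by s (and divides its bounds by s). A minimal sketch with toy numbers, not from the actual model:

```python
import numpy as np

# Toy constraint row: 0.01*flow + 1e-6*volume <= rhs, volume in m^3.
# Changing the volume unit so that volume = 1e4 * volume' multiplies
# the volume column by 1e4; a bound volume <= 1e6 becomes volume' <= 100.
A = np.array([[0.01, 1e-6]])
scale = 1e4                      # illustrative unit change for column 1
A_scaled = A.copy()
A_scaled[:, 1] *= scale

def spread(M):
    """Ratio of largest to smallest nonzero absolute coefficient."""
    nz = np.abs(M[M != 0])
    return nz.max() / nz.min()

print(spread(A))          # spread before rescaling
print(spread(A_scaled))   # spread after rescaling
```

This only helps when a variable's coefficients are consistently small (or large) across all constraints, as noted above; otherwise rescaling shrinks some entries while inflating others.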
Also, keep in mind that bad numerics can occur for reasons other than scaling of coefficients. Someone once showed me a constraint matrix composed entirely of zeros and ones that had quite bad numerics.
(12 Sep '17, 14:33)
Paul Rubin ♦♦
@PaulRubin Thanks for your comments. I'll try to track down the causes of the bad numerics. Regards.
(12 Sep '17, 14:57)
Francois

Would it be a good idea to replace a constraint like y <= 2000 x with the following chain of constraints: x1 = 10 x, x2 = 10 x1, x3 = 10 x2, and y <= 2 x3?
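A quick substitution check (plain arithmetic, not a solver model) confirms the chain is equivalent: x3 = 10 * 10 * 10 * x = 1000 x, so y <= 2 x3 is exactly y <= 2000 x, while the largest coefficient appearing in any single constraint drops from 2000 to 10.

```python
def chain(x):
    # x1 = 10x, x2 = 10x1, x3 = 10x2, then y <= 2*x3
    x1 = 10 * x
    x2 = 10 * x1
    x3 = 10 * x2
    return 2 * x3        # effective bound on y

# The chained bound matches the original y <= 2000*x for any x.
for x in (0, 1, 3.5, 100):
    assert chain(x) == 2000 * x
print("chained bound matches y <= 2000*x")
```

As the answer above cautions, presolve may simply substitute the intermediate variables back out, so this is best kept as a fallback for stubborn numerical trouble.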