I'm trying to solve a basic QP problem, but I came across some really weird behaviour that I can't understand. The problem has the basic form \[\min \frac{1}{2} x^T Q x \quad \text{s.t.} \quad \sum_{i=1}^N x_i = n,\] where Q is a randomly generated positive matrix of size N×N. I can easily solve the problem when n is between 1 and 5. When n gets larger it gets harder to solve, which is logical since the number of possible combinations grows. However, when n = N/2 I can easily solve it again, and the same goes for n = (N/2)+3. Shouldn't these be the hardest cases? For a Q matrix of size 64×64 I can solve to optimality in 1 second if n = 32, while for n = 16 the problem seems nearly intractable. I have tried the standard quadratic form in CPLEX as well as an SDP formulation that accounts for the diagonal.
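For small N the problem is easy to reproduce by brute force, which also makes the puzzle concrete: naive enumeration visits C(N, n) supports, and C(64, 32) ≈ 1.8×10^18 is far larger than C(64, 16) ≈ 4.9×10^14, so by that count n = 32 "should" be harder, not easier. A minimal Python sketch of the problem, assuming the x_i are binary (as confirmed in the comments) and building a positive definite Q as A·Aᵀ plus a diagonal shift; all names here are illustrative, not from any solver API:

```python
import itertools

import numpy as np

def brute_force_min(Q, n):
    """Enumerate every binary x with exactly n ones and return the
    minimum of (1/2) x^T Q x and a minimizer.  Feasible only for
    small N, since it visits C(N, n) supports."""
    N = Q.shape[0]
    best_val, best_x = float("inf"), None
    for support in itertools.combinations(range(N), n):
        x = np.zeros(N)
        x[list(support)] = 1.0
        val = 0.5 * x @ Q @ x
        if val < best_val:
            best_val, best_x = val, x
    return best_val, best_x

# Illustrative instance: A @ A.T is positive semidefinite, and the
# diagonal shift makes it strictly positive definite.
rng = np.random.default_rng(0)
N, n = 8, 4
A = rng.random((N, N))
Q = A @ A.T + N * np.eye(N)
val, x = brute_force_min(Q, n)
```

This only illustrates the model at toy sizes; the question itself is about why branch-and-bound solvers deviate so sharply from this naive enumeration count.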
I suspect that in the QP relaxation of this MIQP model you will get solution values where the x_i take on values close to n/N. So perhaps the difference between n = 16 and n = 32 is that with n = 16 the smaller relaxation values tend to cause the optimizer to branch down first, while with n = 32 the optimizer branches up more often. More generally, on models where a sum of binaries is set equal to a positive integer, branching up tends to be more powerful than branching down.

Also, since the variables are binary, you might try experimenting with the equivalent, linearized version of the model, which is an MILP. Specifically, you could introduce variables z_ij = x_i*x_j by adding, for all i > j, the constraints z_ij <= x_i, z_ij <= x_j, and z_ij >= x_i + x_j - 1. Note that for i = j, x_i = x_i^2, so you don't need to introduce z_ii, and for i < j, z_ij = z_ji. Once you do this, the fact that exactly n of the x_i variables equal 1 implies that exactly n*(n-1)/2 of the z_ij variables equal 1; you could try adding that constraint and see if it tightens the formulation. I haven't tried it on this particular model, but I had some success using that approach on a similar type of QP.

Now, this approach doesn't scale all that well, as you are introducing N(N-1)/2 new variables and 3N(N-1)/2 additional constraints. But for values of N <= 200 the size of the model should be manageable, as long as it makes the model significantly easier for branch and cut to solve.
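The algebra behind this linearization can be checked mechanically. A small Python sketch (names are illustrative, not from any solver API) that builds the linear objective coefficients and verifies on a tiny instance that the MILP objective agrees with (1/2) x^T Q x at every binary point with n ones:

```python
import itertools

import numpy as np

def linearize(Q):
    """Data for the MILP form of min (1/2) x^T Q x over binary x.
    Variables: x_1..x_N plus z_ij (i < j) standing for x_i * x_j.
    Each pair would get three linking constraints in the MILP:
        z_ij <= x_i,  z_ij <= x_j,  z_ij >= x_i + x_j - 1,
    i.e. N(N-1)/2 new variables and 3N(N-1)/2 new constraints."""
    N = Q.shape[0]
    pairs = list(itertools.combinations(range(N), 2))
    c_x = 0.5 * np.diag(Q)  # x_i^2 = x_i for binary x, so Q_ii goes on x_i
    c_z = np.array([0.5 * (Q[i, j] + Q[j, i]) for i, j in pairs])
    return pairs, c_x, c_z, 3 * len(pairs)

def milp_objective(x, pairs, c_x, c_z):
    """Linear objective evaluated at a binary point, with z_ij = x_i*x_j."""
    z = np.array([x[i] * x[j] for i, j in pairs])
    return c_x @ x + c_z @ z

# Tiny check: the linear objective matches the quadratic objective on
# every binary vector with exactly n ones.
rng = np.random.default_rng(1)
N, n = 6, 3
A = rng.random((N, N))
Q = A @ A.T
pairs, c_x, c_z, n_linking = linearize(Q)
ok = True
for s in itertools.combinations(range(N), n):
    x = np.zeros(N)
    x[list(s)] = 1.0
    if abs(milp_objective(x, pairs, c_x, c_z) - 0.5 * x @ Q @ x) > 1e-9:
        ok = False
```

In a real model the z_ij would be declared as solver variables with those three linking constraints per pair; the sketch only confirms the algebra. The extra cut follows the same way: a binary x with exactly n ones turns on exactly n(n-1)/2 of the pairs, so sum_ij z_ij = n(n-1)/2 is valid.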
answered 28 Jun '11, 14:37 Ed Klotz 
While it's possible that there's something structural about n = N/2 and n = (N/2)+3, there are several things to assess before drawing any conclusions. First, how exhaustive has your sample of pseudorandom coefficients been? Second, which part is difficult: solving the continuous QP relaxations or traversing the MIP tree? Third, have you tuned the solver settings? Fourth, have you tested other MIQP solvers?

answered 10 Mar '11, 14:47 Greg Glockner

I have tested Gurobi as well, but I have not really tuned any parameters. I have even written the problem in MILP form. The MIP tree is the difficult part. I have also tried symmetric matrices with integer entries apart from the pseudorandom ones.
(10 Mar '11, 17:09)
Buxley
There's a lot that can be done to tune the search of the MIP tree. It might be worth investigating what can be done to tune the MILP performance first, then moving on to the MIQP.
(10 Mar '11, 17:32)
Greg Glockner
@Greg! I did not notice your comment before... My experience with tuning is rather bad: I usually solve huge problems and have never really noticed any real difference from the parameters, except for the usual weird MIP behaviour (sometimes it works, sometimes it doesn't). Usually I think Gurobi and CPLEX automatically choose pretty good parameters, so I really don't know which ones to tune...
(13 Apr '11, 02:10)
Buxley
@Buxley: I don't know about Gurobi, but CPLEX has an automagical tuning feature (so probably Gurobi does too). You can feed it a sample problem (preferably one of modest dimensions), set some parameters like a time limit and an emphasis (optimality/feasibility/balance), and let it run on autopilot. It will come up with what it thinks are better-than-default parameter settings. That takes you off the hook for figuring out which parameters to tweak (which is an NP-annoying problem unto itself).
(14 Apr '11, 10:43)
Paul Rubin ♦♦
@Paul, I have used the tuning in CPLEX, but I didn't know that you can use it on models that will not be solved quickly. Does the time limit mean that I can, e.g., set 2 hours as the time limit, leave it on overnight, and it will test which parameters give the best gap in that time? That seems pretty useful (and I agree on the NP-annoying part!).
(15 Apr '11, 02:59)
Buxley
@Buxley: You can simultaneously set an overall time limit (for the total tuning run) and a per-replication time limit; so you can set 2 hours per replication and 24 hours total and get approximately 12 replications. (This of course assumes you set the parameters correctly. The last time I did a tuning run, I buggered something and it ran for literally a week and a half.)
(15 Apr '11, 17:03)
Paul Rubin ♦♦

Just to make sure: are your x_i binary variables, and by positive do you mean positive definite?
yes in both cases!