Speeding up zero-objective binary LP
I have a binary LP problem with about 9,100 binary variables and 8,500 constraints after presolving (set to "Aggressive"). I have no objective function, and there is a decent chance that the model is infeasible (although I hope to find a solution). The latest instance has been running for 11 days without finding a feasible solution, currently with 3,415,602 explored and 454,289 unexplored nodes.
I was wondering if there are any settings that could help speed up this computation. Tuning is unable to suggest any improvements, since no feasible solutions are found in any trial. Currently I have:
model.setParam('MIPFocus', 1)
model.setParam('Symmetry', 2)
model.setParam('Presolve', 2)
model.setParam('Heuristics', .25)
model.setParam('Threads', 12)
I turned up Heuristics after running for a few weeks at the default without finding anything. I have also tried changing some of the branching priorities for the variables, which fall into 4 main types (I can give more details on this if it's relevant).
I have seen some general information saying it may help to use variable probing and node presolve, but I cannot find information on these settings in Gurobi.

Do you have smaller versions of the problem that you are able to solve? If so, the tuning tool applied on those smaller versions may identify parameters that are useful also for the larger problem.
I have some problems that are... well, semi-related, and MUCH smaller (most solve in 2,000 seconds or less, with around 350 variables and 280 constraints). They seem to have highly variable run times, so I've set the tuning tool to do five runs starting from mostly default settings (only setting Presolve=2, PreDepRow=PreSparsify=1).
One of the suggested speedups came from setting NumericFocus to 2. Another came from setting Heuristics to 0, although that did not speed up every related model.
I think that a lot of the difficulty of the model comes from the fact that I have no objective function, and all constraints are equality constraints. I'm not sure if there are some settings for Heuristics, or for the types of cuts to use, that will help with this sort of problem.
I'm running on a machine with 24 cores and plenty of RAM, and I'm happy to let it run for months if I have some sort of idea that it will eventually finish. But I don't entirely know how to predict this.
Sometimes it helps to use the feasRelax model; see https://www.gurobi.com/documentation/8.1/refman/py_model_feasrelaxs.html#pythonmethod:Model.feasRelaxS
By calling
m.feasRelaxS(0, False, False, True)
you turn your model 'm' into a feasRelax model in which the constraints can be violated and the objective is to minimize the sum of the violations. It will then be very easy for Gurobi to find an initial feasible solution (with lots of constraint violations, and hence a high objective penalty). During the optimization process, you will hopefully see new incumbents with smaller violation penalties. If you are lucky, this will find a solution with violation penalty 0, which corresponds to a feasible solution of your original model. Another lucky case would be if the dual bound becomes positive, which proves that your original model is infeasible (no feasRelax solution with penalty 0 exists).
Regards,
Tobias