Optimal solution found early but long time to close gap
Hello,
I have a class of MILP problems for which I know the solution a priori. Because of this, I can see that Gurobi finds the optimal solution relatively early, but the problems are large, so a long time is spent proving optimality.
I was thinking of writing a callback that checks whether the incumbent has changed over the last N nodes and/or T seconds (based on a rule of thumb I have). If the best found solution is unchanged after a large enough N or T, I would assume that it is optimal or near optimal, and at this point, I can terminate the solving process. But instead of terminating, are there parameters that I can change or strategies that I can exploit so that Gurobi stops focusing on finding a better solution and proves optimality faster? A rough sketch of the kind of callback I have in mind is below.
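Something like this minimal sketch, assuming gurobipy (the file name and the value of T are placeholders for my actual setup):

import gurobipy as gp
from gurobipy import GRB

T = 300.0  # seconds without a new incumbent before giving up (placeholder rule-of-thumb value)

def stagnation_callback(model, where):
    if where == GRB.Callback.MIPSOL:
        # a new incumbent was found; remember when
        model._last_improvement = model.cbGet(GRB.Callback.RUNTIME)
    elif where == GRB.Callback.MIP:
        # no new incumbent for more than T seconds: stop the solve
        if model.cbGet(GRB.Callback.RUNTIME) - model._last_improvement > T:
            model.terminate()

m = gp.read("instance.mps")  # placeholder file name
m._last_improvement = 0.0
m.optimize(stagnation_callback)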
Kind regards,
Marta
Hi Marta,
"At this point, I can terminate the solving process. But instead of terminating, are there parameters that I can change or strategies that I can exploit so that Gurobi stops focusing on finding a better solution and proves optimality faster?"
You can tell Gurobi to focus more on proving optimality by setting the MIPFocus parameter to 2, or even better, 3. Setting the Heuristics parameter to 0 turns off all heuristics searching for feasible points. Other parameters which might drive Gurobi to a better best bound are Presolve and Cuts. I would also recommend running the Gurobi parameter tuning tool for at least a day on a smaller model (one that ideally takes a couple of minutes to solve) to see whether it can find a set of parameters that improves convergence.
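For illustration, a minimal sketch of these settings in gurobipy (the file name and the tuning time are placeholders):

import gurobipy as gp

m = gp.read("instance.mps")  # placeholder model file

# shift effort from finding solutions towards proving optimality
m.setParam("MIPFocus", 3)
m.setParam("Heuristics", 0)

# parameters worth experimenting with for a better best bound
m.setParam("Presolve", 2)
m.setParam("Cuts", 2)

m.optimize()

# alternatively, let the tuning tool search for a parameter set, e.g., for one day
# m.setParam("TuneTimeLimit", 86400)
# m.tune()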
If you know the best solution a priori, you should definitely provide it to Gurobi as a MIP start, which may also improve performance.
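A minimal sketch of passing such a solution as a MIP start in gurobipy (the file name and the known_solution dictionary mapping variable names to values are placeholders):

import gurobipy as gp

m = gp.read("instance.mps")                   # placeholder model file
known_solution = {"x[0]": 1.0, "x[1]": 0.0}   # placeholder a priori solution

for v in m.getVars():
    if v.VarName in known_solution:
        v.Start = known_solution[v.VarName]

m.optimize()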
Out of curiosity, could you share one such model together with its optimal solution point? How to share files in the forum is described in Posting to the Community Forum.
Best regards,
Jaromił
Hi Jaromił,
thank you for your reply.
I did run the tuning tool, and it suggested setting MIPFocus to 2 and Cuts to 2, which made solving a few of my instances considerably faster. However, I would like to reduce the computational time further, especially on the larger instances.
I am reluctant to pass the actual solution as a MIP start, because I am currently in a testing/validating phase, and I will soon move to a case where I do not know the solution a priori.
I have a couple of questions still:
1) I found this discussion on Stack Exchange (https://or.stackexchange.com/questions/4062/settings-for-a-faster-solution-of-a-milp-gurobi-python) where a user seems to have a similar problem. Another user suggests "to examine the Log and see if the problem is primarily solved with Cuts or Branching or both". How do I infer this from the log? (If I look at my log, the solution always seems to be found by heuristics, but I am not sure how to tell whether it is the cuts or the branching that helps the most with the bound.)
2) Would it make sense to write a callback that changes the parameters over the course of the optimization, instead of sticking to the parameters found by the tuning tool? To give a more concrete example: since the heuristics find a very good solution very early, I would set Heuristics to 0.8 or 0.9 for, e.g., the first third of the TimeLimit, then terminate, change it to 0, and restart a new optimization from there, hoping that the BestBd converges faster in the second part of the optimization. A rough sketch of what I mean is shown below.
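Roughly like this, assuming gurobipy (the file name and time budget are placeholders, and I pass the phase-1 incumbent explicitly as a MIP start to be safe):

import gurobipy as gp

TIME_LIMIT = 3600.0  # overall time budget in seconds (placeholder)

m = gp.read("instance.mps")  # placeholder model file

# phase 1: spend the first third of the budget hunting for a good incumbent
m.setParam("Heuristics", 0.9)
m.setParam("TimeLimit", TIME_LIMIT / 3)
m.optimize()

# phase 2: keep the phase-1 incumbent as a MIP start and focus on the bound
if m.SolCount > 0:
    for v in m.getVars():
        v.Start = v.X
m.setParam("Heuristics", 0)
m.setParam("MIPFocus", 3)
m.setParam("TimeLimit", 2 * TIME_LIMIT / 3)
m.optimize()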
I uploaded the files as requested.
Kind regards,
Marta
Hello Jaromil,
I'm in a very similar situation to Marta's. I have three instances of the same problem:
A. ~300 variables and ~600 constraints (due to the physics involved)
B. ~600 variables and ~1200 constraints
C. ~1000 variables and ~2000 constraints
When I run my MINLP problem for A, it solves in seconds. For B, it runs for more than 3 hours and the gap is still 0.5% (independent of whether I use MATLAB or Python as the scripting language).
I have not tried C, as it runs slowly even during initialization, and I do not know which methods would help. I have used GAMS, and the answers are obtained in seconds for A and B. For C, the model is too large for the trial version.
So the question still holds:
"to examine the Log and see if the problem is primarily solved with Cuts or Branching or both". How do I infer this from the log?
regards,
MarioCG
Hi Mario,
Regarding the restricted license, please refer to How do I resolve a "Model too large for size-limited Gurobi license" error?
Could you share the statistics of your models, i.e., the first ~20-30 lines of the log file? Please note that a problem's size is not a good indicator of its complexity, especially not for MI(N)LPs.
"to examine the Log and see if the problem is primarily solved with Cuts or Branching or both". How do I infer this from the log?
When you look at the log of model A, does the optimization spend more time finding the optimal solution or proving the BestBd? You can check this by following the Incumbent and BestBd columns in the log file. You can also share model B with the community as described in Posting to the Community Forum.
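If it helps, here is a minimal sketch, assuming gurobipy, of a callback that prints a line whenever the incumbent or the best bound changes, which makes it easy to see which of the two is moving (the file name is a placeholder):

import gurobipy as gp
from gurobipy import GRB

def progress_callback(model, where):
    if where == GRB.Callback.MIP:
        incumbent = model.cbGet(GRB.Callback.MIP_OBJBST)  # best objective found so far
        bound = model.cbGet(GRB.Callback.MIP_OBJBND)      # current best bound
        if (incumbent, bound) != model._last:
            runtime = model.cbGet(GRB.Callback.RUNTIME)
            print(f"{runtime:8.1f}s  Incumbent {incumbent:.6g}  BestBd {bound:.6g}")
            model._last = (incumbent, bound)

m = gp.read("model_A.mps")  # placeholder file name
m._last = (None, None)
m.optimize(progress_callback)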
Best regards,
Jaromił
Hello Jaromil,
Thank you for the quick response. I'm running instances similar to these benchmarks:
MILP_HP_EX1, MILP_HP_EX2 and MILP_HP_EX3
best regards,
MarioCG
Hi Mario,
The 3 models are from water-energy modelling and are known to be complex and hard to solve; so far, no solver has been able to prove true optimality (0% gap) on these models, at least not in anywhere near acceptable time. I recommend aiming for a MIPGap of 1% so that you do not spend too much time waiting for the proof of optimality. If you have any idea of how to improve the performance on those models and actually achieve true optimality (0% gap), I would be greatly interested.
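For example, a minimal sketch in gurobipy (the file name is a placeholder for one of the benchmark instances):

import gurobipy as gp

m = gp.read("MILP_HP_EX1.mps")  # placeholder file name
m.setParam("MIPGap", 0.01)      # stop once the relative gap drops below 1%
m.optimize()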
Best regards,
Jaromił
Hello,
After further reading, I saw that the main issue is the nonconvex constraints, as discussed in the literature on the subject. As you mention, a gap remains even for known solutions, software, and methods.
Nice practical suggestion, though. I'll just set the MIPGap to 1%; that should dramatically reduce the running time. Thank you very much for your help.
MarioCG