User cuts cut off too much (solver finishes after one cut) - problem with the presolver
Hello,
I am currently working on a project where we solve a bilinear program, i.e., a non-convex nonlinear optimization problem. We want to reduce the solution time by creating cutting planes that cut off node relaxations.
We use a callback function that creates cuts with "cbCut". The cuts cannot cut off feasible solutions; they are only meant to tighten the bound of the problem.
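For illustration, here is a minimal, self-contained sketch of the callback pattern we follow. The tiny model, the cut, and the separation test are placeholders, not our actual project code:

```python
import gurobipy as gp
from gurobipy import GRB

def cut_callback(model, where):
    # User cuts can be added at MIP nodes once the node relaxation is solved.
    if where == GRB.Callback.MIPNODE and \
            model.cbGet(GRB.Callback.MIPNODE_STATUS) == GRB.OPTIMAL:
        x_val = model.cbGetNodeRel(model._x)
        y_val = model.cbGetNodeRel(model._y)
        z_val = model.cbGetNodeRel(model._z)
        # Placeholder separation: a simple over-estimator of z = x*y that is
        # valid because x <= 10 and y >= 0 (our project uses its own planes).
        if z_val > 10 * y_val + 1e-6:
            model.cbCut(model._z <= 10 * model._y)

m = gp.Model("bilinear_sketch")
x = m.addVar(ub=10, name="x")
y = m.addVar(ub=10, name="y")
z = m.addVar(ub=100, name="z")
m._x, m._y, m._z = x, y, z
m.addConstr(z == x * y, name="bilinear")   # non-convex bilinear equality
m.addConstr(x + 2 * y <= 14, name="c0")
m.setObjective(z, GRB.MAXIMIZE)
m.Params.NonConvex = 2                     # required for non-convex quadratics
m.optimize(cut_callback)
```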
Now I ran into the following problem: as soon as I create the first cut, the solver stops and claims the last found incumbent solution is optimal. In the log below, the "------------" line marks exactly where that one cut was created (it is a maximization problem).
The reported optimal solution is not correct, since running the same model without cutting planes gives a better solution (in the example below the actual optimum is 48367.6685). Our first thought was that our cut cuts off too much.
But the same issue appeared for nearly all (though not all) of the roughly 20 instances we tried: after creating one cutting plane, the solver declared the incumbent solution optimal.
After double- and triple-checking the logic of our cuts, we tried a few different things and observed the following behavior (a sketch of these experiments follows the list):
- Setting the parameter model.Params.Presolve = 0: the same cuts no longer cut off too much and the solver converges to the correct optimal solution.
- Manually adding the computed cutting planes to the model before solving (addConstr(...)): this confirmed that the cuts do not cut off any feasible solutions.
- Setting the parameter model.Params.LazyConstraints = 1, even though we do not use any lazy cuts or constraints anywhere: the same cuts no longer cut off too much and the solver converges to the correct optimal solution.
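For reference, a sketch of how the three experiments were set up. The tiny model and the added plane are placeholders; only the parameter names Presolve and LazyConstraints and the addConstr call mirror what we actually did:

```python
import gurobipy as gp
from gurobipy import GRB

def solve(presolve=None, lazy=None, add_planes_up_front=False):
    """Run one experiment; each keyword mirrors one of the bullets above."""
    m = gp.Model("workaround_test")          # stands in for our real model
    x = m.addVar(ub=10, name="x")
    y = m.addVar(ub=10, name="y")
    m.setObjective(x * y, GRB.MAXIMIZE)      # non-convex bilinear objective
    m.Params.NonConvex = 2

    if presolve is not None:
        m.Params.Presolve = presolve         # bullet 1: Presolve = 0
    if lazy is not None:
        m.Params.LazyConstraints = lazy      # bullet 3: LazyConstraints = 1
    if add_planes_up_front:
        # bullet 2: placeholder for the pre-computed planes; trivially valid
        # here, our own coefficients in the project
        m.addConstr(x + y <= 20, name="plane_0")

    m.optimize()
    return m.ObjVal

solve(presolve=0)                # experiment 1
solve(add_planes_up_front=True)  # experiment 2
solve(lazy=1)                    # experiment 3
```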
We now suspect that our cuts and the presolver interact in a way that produces a wrong bound, even though our cuts are valid. Since setting LazyConstraints = 1 disables some presolve reductions, all of the observed behavior points to a problem during presolve.
We did not start with a MIP but with a non-convex NLP; the presolver, however, reformulates and solves our model as a MIP. Given the behavior above, we now wonder: is it possible that our generated cuts cut off a node of the internally reformulated MIP, even though the cuts are valid for our NLP and do not remove any feasible solutions from the original solution space of the NLP? Do you know whether that is possible and whether it could be the source of our problem?
Sincerely,
Basti Kreuz
Log:
Nodes | Current Node | Objective Bounds | Work
Expl Unexpl | Obj Depth IntInf | Incumbent BestBd Gap | It/Node Time
0 0 48367.6685 0 38 33780.1583 48367.6685 43.2% - 0s
H 0 0 38069.126861 48367.6685 27.1% - 0s
-------------------------
0 0 48367.6685 0 38 38069.1269 48367.6685 27.1% - 4s
Cutting planes:
User: 1
RLT: 30
Explored 1 nodes (949 simplex iterations) in 4.51 seconds
Thread count was 1 (of 4 available processors)
Solution count 3: 38069.1 33780.2 -0
Optimal solution found (tolerance 1.00e-04)
Best objective 3.806912686091e+04, best bound 3.806912686091e+04, gap 0.0000%
User-callback calls 276, time in user-callback 4.07 sec
Hi Basti!
Did you set the parameter PreCrush to 1 as explained in the documentation about Model.cbCut()? This is important to make your cuts behave as intended.
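For reference, in gurobipy that is a single parameter change; a minimal sketch (where my_cut_callback stands in for your existing callback function):

```python
import gurobipy as gp

model = gp.Model()             # your existing bilinear model goes here
model.Params.PreCrush = 1      # lets presolve translate user cuts formulated on
                               # the original model onto the presolved model
# model.optimize(my_cut_callback)   # my_cut_callback = your existing callback
```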
Cheers,
Matthias
Hi Matthias,
First of all, thank you for the quick response!
Since the documentation says the consequence would be "your cut being silently ignored", and our issue is not that cuts are ignored but that they are applied incorrectly, I dismissed that at first.
Later I tried setting it to 1, but it did not change anything; the issue still appeared.
Best regards,
Basti