Early Branching in Branch-and-Cut
Hi,
I have implemented a branch-and-cut algorithm using Gurobi. To improve its performance, I want to detect when no significant change happens during the (LP <-> separation) loop and force the algorithm to branch instead of trying to add more cutting planes. I guess this is called "early branching", but I'm not sure...
Currently I'm comparing the LP value obtained in the current iteration of the loop with the one from the previous iteration. But sometimes strange things happen. My problem is a maximization one, yet sometimes, after adding a cutting plane, the LP value gets higher! Nevertheless, the branch-and-cut seems to be OK, since it always finds an optimal solution on all my instances.
Thanks,
Matheus
-
Hi Matheus,
Do you have a log file that shows a few iterations of the initial cutting plane algorithm where this happens? Thanks!
Eli
-
Hi Eli,
Thanks for the fast response! This is the log presented by Gurobi:
Optimize a model with 26 rows, 50 columns and 100 nonzeros
Variable types: 0 continuous, 50 integer (50 binary)
Coefficient statistics:
Matrix range [1e+00, 4e+02]
Objective range [1e+01, 4e+02]
Bounds range [1e+00, 1e+00]
RHS range [1e+00, 1e+00]
Presolve removed 25 rows and 25 columns
Presolve time: 0.00s
Presolved: 1 rows, 25 columns, 25 nonzeros
Variable types: 0 continuous, 25 integer (25 binary)
Found heuristic solution: objective 2812.0000000

Root relaxation: objective 2.942000e+03, 5 iterations, 0.00 seconds
Nodes | Current Node | Objective Bounds | Work
Expl Unexpl | Obj Depth IntInf | Incumbent BestBd Gap | It/Node Time

0 0 2942.00000 0 1 2812.00000 2942.00000 4.62% - 0s
0 0 2942.00000 0 1 2812.00000 2942.00000 4.62% - 0s
H 0 2 2909.0000000 2942.00000 1.13% - 0s
0 2 2942.00000 0 1 2909.00000 2942.00000 1.13% - 0s
H 8 12 2920.0000000 2942.00000 0.75% 7.4 0s
H 18 20 2929.0000000 2942.00000 0.44% 7.2 0s
H 47 31 2937.0000000 2942.00000 0.17% 7.5 0s
H 78 43 2939.0000000 2942.00000 0.10% 7.1 0s
* 151 43 12 2941.0000000 2942.00000 0.03% 6.7 0s
* 441 91 23 2942.0000000 2942.00000 0.00% 8.1 1s

Cutting planes:
User: 320
Lazy constraints: 184

Explored 523 nodes (4186 simplex iterations) in 1.51 seconds
Thread count was 4 (of 8 available processors)

Solution count 8: 2942 2941 2939 ... 2812
Optimal solution found (tolerance 1.00e-04)
Best objective 2.942000000000e+03, best bound 2.942000000000e+03, gap 0.0000%

But I guess this log doesn't really show the behavior I am describing. Since it might be the case that I'm getting the information wrong, let me first describe how I'm obtaining some of the parameters:
- In order to know the current node in the branch-and-bound tree, I'm using getDoubleInfo(GRB_CB_MIPNODE_NODCNT) (or the MIPSOL equivalent) in the callback.
- To get the objective value, I just iterate through the values of the "current optimal" fractional solution and compute the objective function.
- To count the number of cuts added, I use a counter that I increment each time I call the "addLazy" method. Gurobi probably ignores some of the constraints I add this way.
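Putting these three things together, the bookkeeping in my callback looks roughly like the sketch below. This is simplified and not my exact code: vars_, objCoef_ and numLazyAdded_ are placeholder member names, and the objective coefficients are cached in the constructor because the model being optimized cannot be queried from inside the callback.

#include "gurobi_c++.h"
#include <iostream>
#include <vector>

// Simplified sketch of the bookkeeping described above (placeholder names).
class CutsCallback : public GRBCallback {
public:
    explicit CutsCallback(const std::vector<GRBVar>& vars)
        : vars_(vars), numLazyAdded_(0) {
        // Cache the objective coefficients up front; the model being
        // optimized must not be queried from inside callback().
        for (const GRBVar& v : vars_)
            objCoef_.push_back(v.get(GRB_DoubleAttr_Obj));
    }

protected:
    void callback() override {
        double nodeCount = 0.0;
        bool isFrac = false;

        if (where == GRB_CB_MIPSOL) {                          // integer candidate
            nodeCount = getDoubleInfo(GRB_CB_MIPSOL_NODCNT);
        } else if (where == GRB_CB_MIPNODE &&
                   getIntInfo(GRB_CB_MIPNODE_STATUS) == GRB_OPTIMAL) {
            nodeCount = getDoubleInfo(GRB_CB_MIPNODE_NODCNT);  // fractional node
            isFrac = true;
        } else {
            return;
        }

        // "curr solution cost": evaluate the objective at the queried point.
        double objVal = 0.0;
        for (size_t i = 0; i < vars_.size(); ++i) {
            double xi = isFrac ? getNodeRel(vars_[i]) : getSolution(vars_[i]);
            objVal += objCoef_[i] * xi;
        }

        std::cout << "node: " << nodeCount << "\n"
                  << "curr solution cost " << objVal << "\n"
                  << "cuts added " << numLazyAdded_ << std::endl;

        // The separation routine runs here; every addLazy(...) call is
        // paired with ++numLazyAdded_.
    }

private:
    std::vector<GRBVar> vars_;
    std::vector<double> objCoef_;
    int numLazyAdded_;
};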
Then, in the GRBCallback I'm printing these parameters. The following is the output for the same instance.
Optimize a model with 26 rows, 50 columns and 100 nonzeros
Variable types: 0 continuous, 50 integer (50 binary)
Coefficient statistics:
Matrix range [1e+00, 4e+02]
Objective range [1e+01, 4e+02]
Bounds range [1e+00, 1e+00]
RHS range [1e+00, 1e+00]
node: 0
curr solution cost 1391
cuts added 0
Presolve removed 25 rows and 25 columns
Presolve time: 0.00s
Presolved: 1 rows, 25 columns, 25 nonzeros
Variable types: 0 continuous, 25 integer (25 binary)
node: 0
curr solution cost 2812
cuts added 12
Found heuristic solution: objective 2812.0000000

Root relaxation: objective 2.942000e+03, 5 iterations, 0.00 seconds
Nodes | Current Node | Objective Bounds | Work
Expl Unexpl | Obj Depth IntInf | Incumbent BestBd Gap | It/Node Time

0 0 2942.00000 0 1 2812.00000 2942.00000 4.62% - 0s
node: 0
curr solution cost 2938
cuts added 12
node: 0
curr solution cost 2942
cuts added 29
0 0 2942.00000 0 1 2812.00000 2942.00000 4.62% - 0s

Recall that my problem is a maximization one, so it's strange to me that the "curr solution cost" got higher after adding cuts.
-
Hi Matheus,
What does your callback function look like? In particular, I'm curious about the values of "where" that you're handling in the callback. Note that the callback is invoked with "where == GRB_CB_MIPSOL" when a new MIP incumbent solution is found, and with "where == GRB_CB_MIPNODE" while Gurobi is exploring a MIP node (and only when a relaxation solution is available). Based on this:
node: 0
curr solution cost 2812
cuts added 12
Found heuristic solution: objective 2812.0000000

it looks like you are adding lazy constraints at "where == GRB_CB_MIPSOL" (e.g., when a new heuristic solution is found). Are you also using "where == GRB_CB_MIPNODE" to add lazy constraints?
Also, what does the code for recovering the "current optimal fractional solution" look like?
Thanks!
Eli
-
Hi Eli,
The beginning of my code is the following:
// integer solution
if (where == GRB_CB_MIPSOL){
    solution_value = &CutsCallback::getSolution;
    nodeCount = getDoubleInfo(GRB_CB_MIPSOL_NODCNT);
    //cout << "integer" << endl;
}
// fractional solution
else if ((where == GRB_CB_MIPNODE) && (getIntInfo(GRB_CB_MIPNODE_STATUS) == GRB_OPTIMAL)){
    solution_value = &CutsCallback::getNodeRel;
    isFrac = true;
    nodeCount = getDoubleInfo(GRB_CB_MIPNODE_NODCNT);
    //cout << "frac" << endl;
}
else{
    return;
}

Indeed, I'm adding lazy constraints at integer solutions, since my model has an exponential number of constraints and I add them through the separation routine.
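To give a bit more context, the MIPSOL part of the separation (continuing the sketch from my earlier message) boils down to something like the following. This is only an illustration: findViolatedSet() stands in for my actual separation routine, and the inequality shown is just a generic example, not the real constraint family of my model.

// Sketch of the lazy-constraint side (MIPSOL case); findViolatedSet() and the
// inequality sum_{i in S} x_i <= |S| - 1 are placeholders for illustration.
if (where == GRB_CB_MIPSOL) {
    std::vector<double> x(vars_.size());
    for (size_t i = 0; i < vars_.size(); ++i)
        x[i] = getSolution(vars_[i]);

    std::vector<int> S = findViolatedSet(x);      // empty if nothing is violated
    if (!S.empty()) {
        GRBLinExpr lhs = 0;
        for (int i : S)
            lhs += vars_[i];
        addLazy(lhs <= (double)S.size() - 1.0);   // cuts off this candidate
        ++numLazyAdded_;
    }
    // Note: model.set(GRB_IntParam_LazyConstraints, 1) has to be set before
    // optimize() so that addLazy() is allowed.
}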
Thanks,
Matheus
-
Hi Matheus,
We will work with you directly through our support portal to resolve this. Thanks!
Eli
-
Hi Matheus,
This behavior is expected.
- At the root node, Gurobi finds a potential new heuristic solution with objective value 2938. This calls your "where == GRB_CB_MIPSOL" callback.
- In your callback, you add 12 lazy constraints to the problem. The new solution is cut off by these constraints, meaning it is not feasible for the problem. As such, the objective value 2938 isn't printed in the log as a new incumbent solution. Instead, Gurobi sticks with the incumbent objective value of 2812 it found after presolve.
- At the root node, Gurobi also calls your "where == GRB_CB_MIPNODE" callback. 2942 is just the objective value corresponding to the relaxation solution at the root node.
Your original concern was that the "LP value gets higher" after adding inequalities, despite the fact that your problem is a maximization problem. The upper bound of 2942 stays the same throughout the entire solve. The value of 2938 you print corresponds to a heuristic solution that Gurobi finds that is subsequently cut off by your lazy constraints; this is not an upper bound on the optimal objective value and thus shouldn't be compared to 2942.
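If you still want to implement the tail-off check from your first message, the practical consequence is: only compare quantities gathered at "where == GRB_CB_MIPNODE" with each other (the relaxation value you compute from getNodeRel, or the best bound from getDoubleInfo(GRB_CB_MIPNODE_OBJBND)), and never mix in the objective of a MIPSOL candidate. Here is a rough sketch of what such a check could look like; it is illustrative only, and prevNode_, prevRelObj_, tol_, relaxationObjective() and the two separation calls are placeholders, not part of your code or of the Gurobi API.

// Illustrative tail-off check: MIPNODE values are only compared with MIPNODE
// values.  Placeholder members and helpers; not a drop-in implementation.
void callback() override {
    if (where == GRB_CB_MIPNODE &&
        getIntInfo(GRB_CB_MIPNODE_STATUS) == GRB_OPTIMAL) {
        double node   = getDoubleInfo(GRB_CB_MIPNODE_NODCNT);
        double relObj = relaxationObjective();     // computed from getNodeRel()
        // Maximization: at a fixed node, adding valid cuts should only push
        // the relaxation value down, so the difference below is the improvement.
        if (node == prevNode_ && prevRelObj_ - relObj < tol_) {
            prevRelObj_ = relObj;
            return;     // tailing off: stop separating; Gurobi will branch later
        }
        prevNode_   = node;
        prevRelObj_ = relObj;
        separateUserCuts();                        // addCut()/addLazy() here
    } else if (where == GRB_CB_MIPSOL) {
        // Lazy constraints must still be separated here for correctness, but
        // the candidate's objective (GRB_CB_MIPSOL_OBJ) plays no role in the
        // tail-off test.
        separateLazyConstraints();
    }
}

The exact tolerance, and whether to also cap the number of separation rounds per node, are design choices; the important part is that only MIPNODE quantities enter the comparison.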
I hope this helps. Thanks!
Eli