
Implementation of Benders Decomposition with Callbacks

Answered

Comments

10 comments

  • Official comment
    Simranjit Kaur
Gurobi Staff
This post is more than three years old. Some information may not be up to date. For current information, please check the Gurobi Documentation or Knowledge Base. If you need more help, please create a new post in the community forum. Or why not try our AI Gurobot?
  • Jaromił Najman
Gurobi Staff

    Hi Pramesh,

    You could add a constraint which forces the objective to be above some specific value. This may however negatively affect the performance as described in the Knowledge Base article How do I pass an objective bound to Gurobi?
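    For illustration, such a bound constraint could look like this (a minimal sketch with a made-up two-variable LP; the model, variable names, and the bound value 9 are my own assumptions, not from the original question):

    ```python
    import gurobipy as gp
    from gurobipy import GRB

    m = gp.Model("bound_example")
    m.Params.OutputFlag = 0
    x = m.addVar(name="x", obj=2.0)
    y = m.addVar(name="y", obj=3.0)
    m.addConstr(x + y >= 4, name="demand")

    # Force the objective expression to stay above a known bound.
    # Note this is an ordinary linear constraint on the objective terms,
    # which is exactly what may hurt performance as described in the
    # Knowledge Base article.
    m.addConstr(2.0 * x + 3.0 * y >= 9.0, name="obj_lb")

    m.optimize()
    print(m.ObjVal)  # the bound constraint lifts the optimum from 8 to 9
    ```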

    Best regards,
    Jaromił

  • Jacob Jin

    Hi, I have the same question:

    if where == GRB.Callback.MIPSOL:
        ### Retrieve the current solution of the master problem
        r_1 = np.array(
            [[model.cbGetSolution(model.getVarByName('r_' + str(i) + str(j))) for j in range(N + 1)]
             for i in range(N + 1)])
        z_1 = np.array(
            [[[model.cbGetSolution(model.getVarByName('z_' + str(i) + str(j) + str(l))) for l in range(L)]
              for j in range(N + 1)]
             for i in range(N + 1)])

        ### Retrieve the current value of the master problem as the lower bound
        LB = model.cbGet(GRB.Callback.MIPSOL_OBJ)

        ### Solve the dual subproblem with the current z_1 and return its status and value
        status, value_list, UB = self.sub_problem(z_1)
        if status == 2:
            ### When UB and LB are not equal, add the optimality cut
            if abs(UB - LB) >= 0.01:
                model.cbLazy(model.getVarByName('q') >= lin)
                #### model.write('test.lp')
            else:
                ### Otherwise, terminate the model
                model.terminate()

    I solve the dual subproblem to get an upper bound and take the current master problem's objective value as a lower bound. However, the method converges to a wrong value (larger than the true minimum). When I use model.write('test.lp'), I find that the constraint "model.getVarByName('q') >= lin" does not appear in the file.

     

    I don't know what's wrong...???

  • Jaromił Najman
Gurobi Staff

    Hi Jacob,

    Lazy constraints added in a callback will not be part of the model written via the write method. This is because these lazy constraints are added during the optimization process.

    If you want to check whether your lazy constraint is the correct one, you could print the linear expression \(\texttt{lin}\)

    if abs(UB - LB) >= 0.01:
        print(lin)
        model.cbLazy(model.getVarByName('q') >= lin)

    This should help you in debugging your model.

    Best regards,
    Jaromił

  • Jacob Jin

    Hi, Jaromił! Thank you for your reply. I have a few more questions...

    (1) I have printed the linear expression and it is correct. To verify that the callback function works correctly, I added the same expression as a user cut directly to the master problem. However,

    model.cbLazy(model.getVarByName('q') >= lin)

    yields a value different from the master problem's value after adding that user cut.

    (2) I want to know how to retrieve the current best bound, which gives a valid lower bound for the master problem.

    LB = model.cbGet(GRB.Callback.MIPSOL_OBJ)

    The code above gives the objective value of the newly found feasible solution instead of the overall lower bound. Therefore, when I print LB, its values are not non-decreasing (e.g., -5000, 60.2, 0.6, 0.6, 0.4, ...), and the model converges to a false value if I terminate it when

    abs(UB - LB) <= 0.01

    I had to change the code as below:

    if status == 2:
        ### add optimality cut
        ### When UB and LB are not equal, add the optimality cut
        if abs(UB - LB) >= 0.01:
            model.cbLazy(model.getVarByName('q') >= lin)
            #### model.write('test.lp')
        else:
            ### If not, do nothing instead of terminating
            ### model.terminate()
            pass
  • Jacob Jin

    I don't know why...? Can you give me some hints? Thank you so much~

  • Jaromił Najman
Gurobi Staff

    Hi Jacob,

    (1) If you know what the correct solution of the model with the added lazy constraint should (or should not) be, one way to investigate is to construct a copy of the model before starting the optimization, add each lazy constraint to the copy as an ordinary constraint, and finally write the copied model to a file for closer inspection. You can achieve this as follows:

    # in your callback function
    if abs(UB - LB) >= 0.01:
        model.cbLazy(model.getVarByName('q') >= lin)
        # add the same constraint to the copy as an ordinary constraint;
        # this is possible because we are not currently optimizing model._copy
        model._copy.addConstr(model._copy.getVarByName('q') >= lin_copy)
        model._copy.write("copyLP.lp")


    # [...] before calling optimize:
    model._copy = model.copy()
    model._copy_vars = model._copy.getVars()
    model.optimize(mycallback)

    The tricky part is that you have to construct a \(\texttt{lin\_copy}\) object using the variables from the copied model \(\texttt{model.\_copy}\). This should be a valid way to better debug your model.
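    One way to build such a \(\texttt{lin\_copy}\) object, assuming the cut is kept as a gurobipy LinExpr over the original variables and that variable names are unique and identical in the copy, is to translate the expression term by term. The helper name translate_expr and the one-variable toy model below are my own illustration, not code from this thread:

    ```python
    import gurobipy as gp

    def translate_expr(lin, target_model):
        # Rebuild LinExpr `lin` over the variables of `target_model`,
        # matching variables by (assumed unique and identical) names.
        lin_copy = gp.LinExpr(lin.getConstant())
        for i in range(lin.size()):
            lin_copy.addTerms(lin.getCoeff(i),
                              target_model.getVarByName(lin.getVar(i).VarName))
        return lin_copy

    # Toy demonstration on a one-variable model:
    m = gp.Model("orig")
    m.Params.OutputFlag = 0
    x = m.addVar(name="x")
    m.update()                    # make the variable name queryable
    lin = 2.0 * x + 1.0           # stand-in for a Benders cut expression
    m._copy = m.copy()
    lin_copy = translate_expr(lin, m._copy)
    print(lin_copy)
    ```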

    (2) You can get the current best bound value via MIPSOL_OBJBND

    LB = model.cbGet(GRB.Callback.MIPSOL_OBJBND)

    The current best feasible point value can be obtained via MIPSOL_OBJBST.
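    The difference between the three MIPSOL queries can be seen in a small sketch (the three-binary-variable model is my own minimal example, not from this thread):

    ```python
    import gurobipy as gp
    from gurobipy import GRB

    trace = []  # (new incumbent obj, best bound, best incumbent) per MIPSOL

    def cb(model, where):
        if where == GRB.Callback.MIPSOL:
            obj_new = model.cbGet(GRB.Callback.MIPSOL_OBJ)    # this incumbent only
            bound = model.cbGet(GRB.Callback.MIPSOL_OBJBND)   # global lower bound
            best = model.cbGet(GRB.Callback.MIPSOL_OBJBST)    # best incumbent so far
            trace.append((obj_new, bound, best))

    m = gp.Model("cb_example")
    m.Params.OutputFlag = 0
    x = m.addVars(3, vtype=GRB.BINARY, obj=[2.0, 3.0, 4.0])  # minimized by default
    m.addConstr(x[0] + x[1] + x[2] >= 2)
    m.optimize(cb)
    ```

    MIPSOL_OBJ can jump up and down as new incumbents are found, while MIPSOL_OBJBND is the monotone bound suitable for a Benders convergence test.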

    Best regards,
    Jaromił

  • Jacob Jin

    Hi, Jaromił!

    It works, but I don't know why my former code also appeared to work, since it apparently retrieves the wrong LB... Is there any detailed explanation of how the callback function works internally? (I have already read https://www.gurobi.com/documentation/9.5/refman/py_cb_s.html)

    You are so cool!!! How can we become a Gurobi expert like you 😆 

  • Jaromił Najman
Gurobi Staff

    Hi Jacob,

    Happy to hear that your code works now. There is not much to add about callbacks beyond what is already stated in the documentation. There really is no "magic" behind the work process of a callback function. Whenever the algorithm reaches a given state, a callback query is performed and Gurobi checks whether the user asked for any data in their callback function. In a given state, e.g., MIPSOL, you can then access specific information that the solver needs itself in any case. Of course, one can make a callback function arbitrarily complex, but the process behind it, i.e., how Gurobi works with it, remains the same. You might be interested in our recent webinar on advanced heuristics implemented via callbacks.
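    Putting the pieces of this thread together, a generic Benders-style MIPSOL callback has roughly the following shape. This is a hedged sketch, not the poster's model: the attributes \(\texttt{\_y}\) and \(\texttt{\_q}\), the placeholder routine \(\texttt{\_solve\_sub}\), and the 1e-6 tolerance are all my own illustration. Note that \(\texttt{cbLazy}\) is only legal when the LazyConstraints parameter is set:

    ```python
    import gurobipy as gp
    from gurobipy import GRB

    def benders_callback(model, where):
        # Gurobi invokes this in many states; we only react to new incumbents.
        if where == GRB.Callback.MIPSOL:
            y_val = model.cbGetSolution(model._y)   # master variable(s)
            q_val = model.cbGetSolution(model._q)   # value-function proxy
            # model._solve_sub is a user-supplied routine (placeholder here)
            # returning the subproblem value and the cut expression.
            sub_val, cut_expr = model._solve_sub(y_val)
            if q_val < sub_val - 1e-6:              # cut is violated
                model.cbLazy(model._q >= cut_expr)

    master = gp.Model("master")
    master.Params.OutputFlag = 0
    # ... build the master problem and store handles on the model object
    # so they are reachable inside the callback ...
    master._q = master.addVar(lb=-GRB.INFINITY, name="q")
    # cbLazy is only legal when this parameter is set:
    master.Params.LazyConstraints = 1
    # master.optimize(benders_callback)   # run once the model is complete
    ```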

    You are so cool!!! How can we become a Gurobi expert like you 😆

    Thank you :) I really don't want to sound like one of my old Professors but it indeed helps to play around with optimization problems/models/techniques. This is the easiest way to learn what does work and what does not. With some practice, you can then begin to think of the actual reasons why something does work and why something else does not.

    Best regards,
    Jaromił

     

  • Jacob Jin

    Hi, Jaromił!

    When I retrieved the wrong LB with the code below,

    if status == 2:
        ### add optimality cut
        ### When UB and LB are not equal, add the optimality cut
        if abs(UB - LB) >= 0.01:
            model.cbLazy(model.getVarByName('q') >= lin)
            #### model.write('test.lp')
        else:
            ### If not, do nothing
            ### model.terminate()
            pass

    if abs(UB - LB) <= 0.01, I do nothing instead of terminating the model, and it converges to the correct value. I said it's "magic" because I don't know whether the callback function can compare the true LB with UB automatically. I printed the callback function's detailed working process (every true LB, false LB, and UB) and found that the right code and the wrong code gave the same trace (the same number of callbacks, the same converged value). It puzzles me so much...


Post is closed for comments.