
LP Warm Start (Python)


Comments

4 comments

  • Nicholas Parham

    My temporary fix has been to add a single binary variable to the model, forcing Gurobi to use the variables' Start attributes since the model is now technically a MIP. I am still hoping for a better resolution to this issue.
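
    For reference, a minimal sketch of that workaround (not the original code; the model and the Start values here are made up):

    import gurobipy as gp
    from gurobipy import GRB

    model = gp.Model()
    xs = [model.addVar(lb=0, ub=10) for _ in range(3)]
    model.addVar(vtype=GRB.BINARY)  # dummy binary: the model is now technically a MIP

    # Hypothetical values from a previous solve, supplied as a MIP start
    for v, val in zip(xs, [4.0, 10.0, 0.0]):
        v.Start = val

    model.setObjective(gp.quicksum(xs), GRB.MAXIMIZE)
    model.optimize()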

  • Nicholas Parham
    import random

    import gurobipy as gp
    import numpy as np

    random.seed(2023)

    # Parameters as in the original post: output off, presolve off, LPWarmStart=2
    env = gp.Env(empty=True)
    env.setParam('OutputFlag', 0)
    env.setParam('Presolve', 0)
    env.setParam('LPWarmStart', 2)
    env.start()

    model = gp.Model(env=env)

    X = np.array([model.addVar(vtype=gp.GRB.CONTINUOUS, lb=0, ub=10) for _ in range(10)])
    x = np.array([random.randint(5, 10) for _ in range(10)])
    # model.addVar(vtype=gp.GRB.BINARY)  # uncomment to turn the LP into a MIP

    model.setObjective(np.sum(x * X), gp.GRB.MAXIMIZE)

    model.addConstr(X[0] <= X[3] - 4)
    model.addConstr(X[2] <= X[3] - 2)
    model.addConstr(X[5] <= X[0] - 4)
    model.addConstr(X[9] <= X[8] - 6)
    model.addConstr(X[6] <= X[4] - 1)

    # First solve: cold start
    model.optimize()
    print('Iterations:', int(model.IterCount))
    print('Objective:', model.getObjective().getValue())
    print()

    # Second solve of the unchanged model: warm started, so zero (or very few) iterations
    model.optimize()
    print('Iterations:', int(model.IterCount))
    print('Objective:', model.getObjective().getValue())
    print()

    # Remove all constraints, then re-add mathematically identical copies;
    # the warm start is lost even though the model is the same on paper
    model.remove(model.getConstrs())
    model.update()
    model.addConstr(X[0] <= X[3] - 4)
    model.addConstr(X[2] <= X[3] - 2)
    model.addConstr(X[5] <= X[0] - 4)
    model.addConstr(X[9] <= X[8] - 6)
    model.addConstr(X[6] <= X[4] - 1)
    model.optimize()
    print('Iterations:', int(model.IterCount))
    print('Objective:', model.getObjective().getValue())
    print()
  • Riley Clement
    Gurobi Staff

    Hi Nicholas,

    My understanding is that if you modify a model so that the basic solution remains basic, then resuming the optimization can take advantage of a warm start. If you delete a constraint that is active at the solution, then I expect the solver to ignore the solution and start from scratch. The solver doesn't try to figure out whether a constraint you deleted is later added back to recreate the original model; from the solver's perspective these constraints are not the same object, even though they are mathematically identical.

    If you wish to modify the LHS of a constraint, i.e. the coefficient matrix, then the Model.chgCoeff method would be the best approach, and it makes it more likely that the solver can take advantage of warm starting. I would suggest adding all variables you may possibly need to the original model, even if they do not participate in any constraints.
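
    As an illustration, a minimal sketch (not from this thread; the model is made up) of changing a coefficient in place rather than deleting and re-adding the constraint:

    import gurobipy as gp

    model = gp.Model()
    x = model.addVar(ub=10.0, name='x')
    y = model.addVar(ub=10.0, name='y')
    c = model.addConstr(x - y <= 4.0, name='c')
    model.setObjective(x + y, gp.GRB.MAXIMIZE)
    model.optimize()

    # Change y's coefficient in c from -1 to -2 in place; the constraint
    # object survives, so simplex can resume from the previous basis
    model.chgCoeff(c, y, -2.0)
    model.optimize()
    print('Iterations after change:', int(model.IterCount))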

    Just a couple of questions/comments based on your first post:

    "I have the same exact LP and, in fact, the previous optimal solution is still stored in the model start attribute."

    Can you clarify what you mean by this?  The Start attribute is only valid for MIP models, as you point out later in the post.

    "Or at least combine the features of MIP and LP warm starts so they are not separate starting vectors? Why are they even separated?"

    A MIP solution is in general not a valid solution for the LP relaxation, and a solution to the LP relaxation is in general not a valid solution for a MIP, so I can't see a way that they could be combined.
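
    (For completeness: the LP analogue of a MIP start is a simplex basis rather than a solution vector. A rough sketch, with a made-up model, of saving and restoring it via the VBasis/CBasis attributes:)

    import gurobipy as gp

    model = gp.Model()
    x = model.addVar(ub=10.0)
    y = model.addVar(ub=10.0)
    model.addConstr(x + y <= 12.0)
    model.setObjective(x + 2 * y, gp.GRB.MAXIMIZE)
    model.optimize()

    # Save the simplex basis (the LP warm-start information)
    vbasis = [v.VBasis for v in model.getVars()]
    cbasis = [con.CBasis for con in model.getConstrs()]

    # ... modify the model here ...

    # Restore the basis so simplex can resume from it
    for v, b in zip(model.getVars(), vbasis):
        v.VBasis = b
    for con, b in zip(model.getConstrs(), cbasis):
        con.CBasis = b
    model.optimize()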

    Hope this helps, and we'll see if anyone else has some comments, as I think there could be more to say.

    - Riley

    0
  • Nicholas Parham

    Riley Clement, thanks for the response; it definitely helps. It makes sense now how the solver handles things internally.

    For the LP warm start with the Start attribute, I was adding a dummy binary variable, as Maliheh Aramon mentions here. It did seem to reduce the number of simplex iterations, but the MIP overhead was enough to negate any runtime improvement. I'll try interacting with the LP the way it was intended, which so far seems to be working for me.

