How to solve a large LP
Hi,
I want to solve a large LP problem with 1e5 variables, 1e9 constraints, and 1e12 non-zeros. I hope to solve this problem optimally, but any relaxed solution could also be helpful. I checked Gurobi's lazy constraint feature; it requires generating the full model before any optimization starts, which is impossible for my problem. I was hoping to find a similar callback option for adding these constraints as lazy constraints, like what you have for MIP.
I wondered if you have any suggestions better than the approach explained here. If not, do you also have any suggestions on how to resolve the issue explained here (other than his approach)?
Thanks
-
Hi Ramin,
Do I understand correctly: Your model is very large but it is not possible to generate the whole model to begin with. You have to start with some smaller model and keep adding constraints depending on the current solution point. Is this correct?
As you already noticed, there is no lazy constraint feature for LPs.
I wondered if you have any suggestions better than the approach explained here.
Could you tell us why the approach explained in "Is warm start implicit in resolving an LP" does not work for you?
You can solve your starting model and keep adding constraints and re-solving the model using Gurobi's warm start. Note that you can set the LPWarmStart parameter to always work with the (hopefully smaller) presolved model.
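To make this concrete, here is a minimal row-generation sketch in gurobipy. The data, the objective, and the find_violated_constraints separation routine are placeholders rather than parts of your actual model; the point is only the pattern of solving, adding the violated constraints, and re-solving so that Gurobi can warm start from the previous basis.
import gurobipy as gp
from gurobipy import GRB

# Minimal row-generation sketch (hypothetical data and separation routine).
# Solve a small restricted LP, add the constraints violated by the current
# solution, and re-solve; Gurobi warm starts each re-solve from the last basis.
model = gp.Model("restricted_lp")
model.setParam("LPWarmStart", 2)            # warm start on the presolved model

x = model.addVars(100, lb=0.0, name="x")    # placeholder variables
model.setObjective(x.sum(), GRB.MINIMIZE)   # placeholder objective

def find_violated_constraints(vals):
    # Hypothetical separation routine: return a list of (coeffs, rhs) pairs
    # with coeffs a dict {index: coefficient}, or [] if nothing is violated.
    return []

while True:
    model.optimize()
    vals = model.getAttr("X", x)            # current solution values
    cuts = find_violated_constraints(vals)
    if not cuts:
        break                               # feasible for the full problem
    for coeffs, rhs in cuts:
        model.addConstr(gp.quicksum(c * x[i] for i, c in coeffs.items()) >= rhs)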
If not, do you also have any suggestions on how to resolve the issue explained here (other than his approach)?
The answer is already given in the post. When you copy a model, you cannot work with the old variable objects anymore. Thus, you would need to maintain a name-to-variable mapping and work with the getVarByName method. However, this method is quite slow for big models, so I would avoid it and try the first approach.
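For illustration, a minimal sketch of that name-based lookup on a copied model (toy model, hypothetical names); variable objects belong to the model that created them, so after copy() you have to fetch their counterparts in the copy by name:
import gurobipy as gp
from gurobipy import GRB

# Toy model only; illustrates why the name lookup is needed after model.copy().
model = gp.Model()
x = model.addVars(5, vtype=GRB.BINARY, name="x")
model.update()                              # make the variable names queryable

copy = model.copy()
# x[0] belongs to the original model; fetch its counterpart in the copy by name
x0_copy = copy.getVarByName(x[0].VarName)
copy.addConstr(x0_copy <= 1)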
Best regards,
Jaromił
-
Hi Jaromił,
Thanks for your response. My intuition was that re-solving the model would not be as efficient as adding extra constraints through a callback, but I realized that this is, in fact, what I was looking for.
There is only one approach in this post, so I did not get what you meant by the first approach. By the way, I am not copying my model. Here is what I am doing:
x = {index: model.addVar(vtype=GRB.BINARY, name="x(%i)" % index) for index in my_list}
# Adding some constraints
model.optimize()
model.addConstr(quicksum(model.getVarByName(x[index].VarName) for index in my_list) <= 1)
model.optimize()
Could you explain to me what your suggestion was here?
Thanks,
Ramin
-
Hi Ramin,
In your code, you don't have to call getVarByName; you can directly access the variable via x[index] as long as you work with the same model. You can also use the addVars method to let Gurobi generate a tupledict automatically:
x = model.addVars(my_list, vtype=GRB.BINARY, name="x")
# Adding some constraints
model.optimize()
model.addConstr(quicksum(x[index] for index in my_list) <= 1)
model.optimize()
Could you explain to me what your suggestion was here?
You should experiment with the LPWarmStart parameter. You can set the parameter before your first optimization via
model.setParam("LPWarmStart", X)  # X is some value in {0, 1, 2}
I would think that setting LPWarmStart=2 should work best for your approach but you should definitely test all 3 values to be sure.
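As a starting point, here is a small self-contained timing sketch for comparing the three values on a toy LP with a single re-solve; replace the toy model and the added constraint with your own:
import gurobipy as gp
from gurobipy import GRB

for value in (0, 1, 2):
    model = gp.Model()
    model.setParam("OutputFlag", 0)
    model.setParam("LPWarmStart", value)
    x = model.addVars(1000, lb=0.0, ub=1.0, name="x")   # toy LP variables
    model.setObjective(x.sum(), GRB.MAXIMIZE)
    model.optimize()
    total = model.Runtime
    model.addConstr(x.sum() <= 10)                      # mimic one added row
    model.optimize()                                    # re-solve with warm start
    total += model.Runtime
    print("LPWarmStart=%d: total solve time %.2f s" % (value, total))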
Best regards,
Jaromił
-
Hi Jaromił,
Thanks for the information. I tried this:
model.addVars(my_list, vtype=GRB.BINARY, name="x")
I got this error after adding a new constraint to the optimized model, as described above.
gurobipy.GurobiError: Variable not in model
I should add that my_list is a list of tuples. Do you have any idea what the potential issue could be?
Thanks,
Ramin
-
Hi Ramin,
Could you please share a minimal working example resulting in the error?
Best regards,
Jaromił