Disable MIP Presolve but preserve other Presolve

Comments

3 comments

  • Jaromił Najman
Gurobi Staff

    This is currently not possible.

    Best regards,
    Jaromił

  • ce jekl
Curious Conversationalist

    Thanks.

    Another question:

    1. Is it possible to let Gurobi parallelize primal heuristics after the root LP? My problem is often solved at the root node by primal heuristics, which takes a long time, and I wonder whether I can accelerate the process through parallelism. Is there any other advice for this situation? (By the way, lowering the Heuristics value worsens the performance.)

    2. I tried solving the same problem from an LP/MPS file with both the Python API and gurobi_cl, and the solve times differ consistently across many problems. The solve time of gurobi_cl is about 5% less than that of the Python API for all of the problems, which doesn't make sense to me. I can understand that gurobi_cl may be faster than the Python API in the reading or model-building step, but in my opinion they should have the same solve time.

  • Jaromił Najman
Gurobi Staff

    1. Is it possible to let Gurobi parallelize primal heuristics after the root LP? My problem is often solved at the root node by primal heuristics, which takes a long time, and I wonder whether I can accelerate the process through parallelism. Is there any other advice for this situation? (By the way, lowering the Heuristics value worsens the performance.)

    Gurobi runs primal heuristics in parallel by default. Since your model depends on primal heuristics finding a good solution, it makes sense that lowering the Heuristics value worsens the performance, because the Heuristics parameter controls the amount of time spent on primal heuristics. You should try experimenting with the NoRelHeurTime parameter to run the no-relaxation heuristic before the root node relaxation is solved. You should also try setting MIPFocus to 1 and turning off Cuts to shift the focus to finding feasible solutions. You could also increase the value of the Heuristics parameter.
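    The parameter settings above could be combined in a short gurobipy sketch like the following; the file name "model.mps" and the specific values (60 seconds for NoRelHeurTime, 0.2 for Heuristics) are placeholder assumptions to be tuned for your model, not recommendations from this thread.

    ```python
    import gurobipy as gp

    # Read the model from file; "model.mps" is a placeholder name.
    m = gp.read("model.mps")

    # Run the no-relaxation heuristic for up to 60 seconds
    # before the root relaxation is solved.
    m.setParam("NoRelHeurTime", 60)

    # Shift the search focus toward finding feasible solutions.
    m.setParam("MIPFocus", 1)

    # Turn off cutting planes.
    m.setParam("Cuts", 0)

    # Spend a larger fraction of the runtime on primal heuristics
    # (the default value is 0.05).
    m.setParam("Heuristics", 0.2)

    m.optimize()
    ```

    It is usually best to change one parameter at a time and compare logs, so you can see which setting actually helps.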

    2. I tried solving the same problem from an LP/MPS file with both the Python API and gurobi_cl, and the solve times differ consistently across many problems. The solve time of gurobi_cl is about 5% less than that of the Python API for all of the problems, which doesn't make sense to me. I can understand that gurobi_cl may be faster than the Python API in the reading or model-building step, but in my opinion they should have the same solve time.

    In the Python API, do you only read in the model and solve it, or do you also construct the model in Python? Did you try running one of these problematic models multiple times, e.g., 100-1000 times? Could you share one of the models where you observe the performance discrepancy? See Posting to the Community Forum for details on sharing files in the Forum.
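    One way to check whether the 5% gap is just performance variability is to solve the same file repeatedly with different values of the Seed parameter and compare the Runtime attribute with the time reported by gurobi_cl. A minimal sketch, assuming a placeholder file "model.mps":

    ```python
    import gurobipy as gp

    # Solve the same model several times with different random seeds
    # to gauge performance variability; "model.mps" is a placeholder.
    for seed in range(5):
        m = gp.read("model.mps")
        m.setParam("Seed", seed)
        m.optimize()
        print(f"seed={seed}  runtime={m.Runtime:.2f}s")
    ```

    If the spread across seeds is larger than the gap between gurobi_cl and the Python API, the discrepancy is likely ordinary run-to-run variability rather than an API difference.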

    Best regards,
    Jaromił
