• Gurobi Staff

Hi Tushar,

The size of a model is not a good indicator of its complexity. There are small MIP instances in MIPLIB 2017 which are still not solved.

Your model has many SOS constraints; in fact, it looks like it has only SOS constraints, correct? From the log, it is hard to tell whether it is the primal or the dual bound that is holding back convergence. Did you try not providing a MIP start? You could also try the NoRelHeurTime parameter to run the no-relaxation heuristic before the first LP solve, which may find a good feasible point early in the run. Experimenting with the BranchDir and VarBranch parameters might also help.
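As a reference point, here is a minimal gurobipy sketch of the parameter experiments suggested above. The filename "model.mps" is a placeholder, and the specific parameter values are illustrative, not a recommendation; this is a configuration sketch, not a tuned setup:

```python
import gurobipy as gp

# Read an existing model; "model.mps" is a placeholder filename.
m = gp.read("model.mps")

# Spend up to 60 seconds in the no-relaxation heuristic
# before the first LP relaxation is solved.
m.Params.NoRelHeurTime = 60

# Branching direction: 1 = always branch up first,
# -1 = always down first, 0 = automatic (default).
m.Params.BranchDir = 1

# Variable selection rule: 3 = strong branching,
# -1 = automatic (default).
m.Params.VarBranch = 3

m.optimize()
```

To compare against a run without a MIP start, simply skip loading the start values (or call `m.reset()` before optimizing again).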

If it's fine for you, you could also share the model, and someone might have a closer look. It is not possible to upload files in the Community Forum, but we discuss a workaround for this in Posting to the Community Forum.

Best regards,
Jaromił

Hi Jaromił,

1. My model does have constraints other than the SOS constraints too. Somehow, the line showing the model size didn't get copied correctly; here it is:

Optimize a model with 5286 rows, 6286 columns and 25069 nonzeros

Also, what exactly does 25069 nonzeros mean? Does it indicate the number of nonzero elements in the coefficient matrix? I looked here, but it's not explicitly mentioned.

2. Yes, I tried without a MIP start (that's what I tried before experimenting with other parameters). However, the performance was slightly worse or about the same without the MIP start.

3. I modified my model and saw the performance improve slightly. I had a constant term in the objective function, which led to slower convergence since it indirectly inflates the reported gap.

4. I tried NoRelHeurTime, and it seems to help the most in speeding up convergence by finding a good feasible solution early; however, the models still take a long time to converge or do not converge at all to the set MIPGap. They usually get stuck around a ~5% gap. Also, the VarBranch parameter seems to make no difference.

5. The model I am trying to solve is a bilevel model, which I reformulate using a package in the Julia language that eventually feeds it to Gurobi. I will see if I can somehow export the model in MPS format.

With the new model size data shown in point 1 above, do you still think it is not uncommon for models of this size to struggle to converge even when there are no numerical issues in the data? Please let me know if you have any other suggestions for such a scenario.

Thanks for the help!

• Gurobi Staff

Hi Tushar,

Also, what exactly does 25069 nonzeros mean? Does it indicate the number of nonzero elements in the coefficient matrix? I looked here, but it's not explicitly mentioned.

Correct, it's the number of nonzeros in the coefficient matrix.
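To make the counting concrete, here is a small pure-Python sketch (the matrix values are made up for illustration) that counts nonzero entries in a constraint matrix the same way the statistic in the Gurobi log does:

```python
# A tiny made-up constraint matrix with 2 rows (constraints)
# and 3 columns (variables).
A = [
    [1.0, 0.0, -2.0],   # row 1: two nonzero coefficients
    [0.0, 3.5,  0.0],   # row 2: one nonzero coefficient
]

rows = len(A)
cols = len(A[0])
# Count every coefficient that is not exactly zero.
nonzeros = sum(1 for row in A for coeff in row if coeff != 0.0)

print(f"{rows} rows, {cols} columns and {nonzeros} nonzeros")
# → 2 rows, 3 columns and 3 nonzeros
```

In practice the matrix is stored sparsely, so only the nonzero coefficients exist in memory; the count is what the log reports.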

With the new model size data shown in point 1 above, do you still think it is not uncommon for models of this size to struggle to converge even when there are no numerical issues in the data?

One cannot say anything about the complexity of a model just from its model statistics. This market-split model from MIPLIB 2017 is much smaller than yours and has been unsolved for several years.

If running NoRelHeurTime improves the primal bound faster than the default settings do, then I would stick with it and now focus on the dual bound. You could try experimenting with the BranchDir, Presolve, PreSparsify, and MIPFocus=2/3 parameters.
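A possible gurobipy sketch of this dual-bound-focused setup (again with "model.mps" as a placeholder filename, and the exact values chosen only for illustration; this is a configuration sketch, not a tuned recommendation):

```python
import gurobipy as gp

m = gp.read("model.mps")  # placeholder filename

# Keep the no-relaxation heuristic that already helps the primal side.
m.Params.NoRelHeurTime = 60

# 2 = focus on proving optimality (moving the dual bound),
# 3 = focus on improving the best bound.
m.Params.MIPFocus = 3

# 2 = aggressive presolve; 1 = enable the sparsify reduction.
m.Params.Presolve = 2
m.Params.PreSparsify = 1

m.optimize()
```

Changing one parameter at a time and comparing the dual-bound column of the log makes it easier to see which setting actually helps.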

Maybe our recent Tech Talk on Converting Weak to Strong MIP Formulations holds something useful for your case.

Best regards,
Jaromił

Thanks Jaromił. I will check the parameters you mentioned and the talk too.