• Gurobi Staff

HI Lukas,

A node is postponed if numerical trouble was encountered during the solution of the relaxation of this node. This may be due to very large or very small variable bounds, or very large or very small coefficients. Instead of trying to do computations with possibly wrong numerical values, Gurobi postpones this particular node in the hope that it can later prune the node or resolve the numerical problem through information gathered in the B&B tree.

Could you post the first lines of your LOG file showing the model statistics, i.e., the number of rows, columns, and nonzeros together with the coefficient ranges?

Best regards,
Jaromił

Hi Jaromił,

This is the size of my model:

Optimize a model with 663497 rows, 331973 columns and 2427819 nonzeros
Model fingerprint: 0xb4fd3671
Model has 296 SOS constraints
Variable types: 111748 continuous, 220225 integer (220224 binary)

I realize that this is a rather large model. I'm giving Gurobi 128 threads to optimize this model, though it looks like most of the time it's doing something single-threaded. The coefficients don't look too bad:

Coefficient statistics:
  Matrix range     [1e+00, 3e+03]
  Objective range  [1e+00, 5e+00]
  Bounds range     [1e+00, 3e+03]
  RHS range        [1e+00, 3e+03]

I did have some numerical trouble earlier with coefficients of up to 10^8. After reformulating parts of my model, at least the root relaxation no longer gives me the "numerical trouble encountered" message.

Is the model just too large?

Thanks, Lukas

• Gurobi Staff

Hi Lukas,

The size of the model should not be the reason for the postponed nodes. You are saying that you reformulated parts of your model. What exactly did you do to perform the reformulation? Note that reformulating a constraint

$x = 10^8 y$

as

\begin{align} x &= 10^4 x' \\ x' &= 10^4 y \end{align}

does not help: you are not really scaling the coefficient, only hiding it by spreading the large magnitude over two constraints.
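A toy illustration of this point, in plain Python rather than gurobipy: the "split" reformulation still leaves a wide coefficient range, while genuinely rescaling the variable (i.e., substituting x_s = x / 10^8 everywhere, which assumes you can change the units of x throughout the model) brings the range down to 1.

```python
def coeff_range(rows):
    """Return (min, max) absolute value over all nonzero coefficients."""
    coeffs = [abs(c) for row in rows for c in row if c != 0]
    return min(coeffs), max(coeffs)

# Original constraint: x - 1e8*y = 0, columns (x, y).
original = [[1.0, -1e8]]

# "Split" reformulation: x - 1e4*x' = 0 and x' - 1e4*y = 0,
# columns (x, x', y) -- the large magnitudes are merely spread out.
split = [[1.0, -1e4, 0.0],
         [0.0, 1.0, -1e4]]

# Proper rescaling: substitute x_s = x / 1e8, giving x_s - y = 0
# (the units of x_s must be adjusted in all other constraints too).
rescaled = [[1.0, -1.0]]

print(coeff_range(original))  # (1.0, 100000000.0)
print(coeff_range(split))     # (1.0, 10000.0) -- still wide
print(coeff_range(rescaled))  # (1.0, 1.0)
```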

It may also happen that your coefficient matrix is almost singular, which causes numerical issues.

Our Numerical Guide might catch your interest.

Best regards,
Jaromił

Thanks again! Indeed, presolving seems to introduce some large variable bounds:

gurobi> m = read("gurobi.mps")
Read MPS format model from file gurobi.mps
Reading time = 1.51 seconds
: 663497 rows, 331973 columns, 2427819 nonzeros
gurobi> m.printStats()
Statistics for model Unnamed :
  Linear constraint matrix    : 663497 Constrs, 331973 Vars, 2427819 NZs
  Variable types              : 111748 Continuous, 220225 Integer (220224 Binary)
  SOS constraints             : 296
  Matrix coefficient range    : [ 1, 2830.51 ]
  Objective coefficient range : [ 1, 5 ]
  Variable bound range        : [ 1, 2830.51 ]
  RHS coefficient range       : [ 1, 2830.51 ]

And then, after presolving:

gurobi> m = m.presolve()
Presolve removed 591 rows and 148 columns (presolve time = 5s) ...
Presolve added 0 rows and 146 columns
Presolve removed 15 rows and 0 columns
Presolve time: 6.37s
gurobi> m.printStats()
Statistics for model _pre :
  Linear constraint matrix    : 663482 Constrs, 332119 Vars, 2323938 NZs
  Variable types              : 111748 Continuous, 220371 Integer (220370 Binary)
  Matrix coefficient range    : [ 1, 2830.51 ]
  Objective coefficient range : [ 1, 5 ]
  Variable bound range        : [ 1, 9.97035e+07 ]
  RHS coefficient range       : [ 1, 2830.51 ]

My best guess is that for some of the variables for which I did not supply a bound, presolving infers some (very large) upper bound. Two questions related to this:

• Is this even bad? Are auto-inferred variable bounds added to the constraint matrix? If so, this would of course deteriorate matrix conditioning in this case.
• Can I somehow prevent Gurobi from doing that? You can see that presolving doesn't do much for this model, so completely disabling presolve might be a good idea here, but it would be great if one could simply forbid Gurobi from worsening the matrix conditioning in this way.
• Gurobi Staff

Hi Lukas,

Auto-inferred bounds are added to the presolved model. You can certainly try turning off presolve by setting the Presolve parameter to 0, and you could also try setting the Aggregate parameter to 0. Do you provide bounds for all variables, especially the discrete ones? If not, try providing bounds for those to see whether this helps.
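For reference, a minimal gurobipy sketch of these suggestions; the file name and the bound value 3000 are placeholders that must be replaced with values valid for your own model:

```python
import gurobipy as gp
from gurobipy import GRB

m = gp.read("gurobi.mps")  # placeholder file name

# Turn off presolve and variable aggregation entirely.
m.Params.Presolve = 0
m.Params.Aggregate = 0

# Give every unbounded variable an explicit, problem-specific upper
# bound instead of letting presolve infer a very large one.
for v in m.getVars():
    if v.UB >= GRB.INFINITY:
        v.UB = 3000  # placeholder: must be a valid bound for your model

m.optimize()
```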

Best regards,
Jaromił

Hi Jaromił,

Thanks. The variables for which bounds were inferred were indeed variables of my model. Putting reasonable upper bounds on them solved the problem with the huge bounds after presolving. I still can't quite get this model to behave, but I'll open a separate question for the other problems.

Thanks again,

Lukas