Reduce memory usage of gurobipy Model
Hello,
I am working with quite large linear models and am running into problems with memory usage. While building the model, I observe a huge increase in memory consumption when I use model.addMVar. After I finish building the model (before solving it), I use guppy3 to check what is driving the memory usage and get the following results:
Partition of a set of 196740289 objects. Total size = 12602398603 bytes.
Index Count % Size % Cumulative % Kind (class / dict of class)
0 45998000 23 4783792000 38 4783792000 38 dict of gurobipy.Var
1 45998000 23 2207904000 18 6991696000 55 gurobipy.Var
2 19216000 10 1998464000 16 8990160000 71 dict of gurobipy.Constr
3 65288898 33 1828137928 15 10818297928 86 int
4 19216000 10 922368000 7 11740665928 93 gurobipy.Constr
5 9826 0 529945320 4 12270611248 97 list
6 65983 0 200351768 2 12470963016 99 numpy.ndarray
7 267297 0 35609664 0 12506572680 99 str
8 61452 0 21082168 0 12527654848 99 dict (no owner)
9 200914 0 14452120 0 12542106968 100 tuple
<2420 more rows. Type e.g. '_.more' to view.>
As I understand it, gurobipy internally creates a huge number of dictionaries, and that is where most of the memory is allocated. Is there any way to decrease the amount of memory the model itself uses? When I solve the model, I do not observe a memory increase anywhere near the one I see while building it.
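For context, the heap snapshots in this post come from the usual guppy3 pattern; a minimal sketch, assuming guppy3 is installed (pip install guppy3):

import guppy

h = guppy.hpy()
# ... build the gurobipy model here ...
print(h.heap())  # per-type breakdown of all live Python objects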
Hi Christian,
Could you please provide a minimal reproducible example? Which Gurobi version are you using (please use the latest version if you are not already using it)?
Best regards,
Jaromił
Hi Jaromil,
Sure! I was working with Gurobi 9.5.2. Updating to 10.0, however, only slightly increased the memory usage.
Here is an example. It definitely does not make any sense, but it shows the memory allocation.

import numpy as np
import gurobipy as gp
from gurobipy import GRB
import scipy.sparse as sp
import guppy
n_var = 90000000
n_constraints = 40000000

model = gp.Model()

# Add all variables in a single call; this is where the memory spike occurs
lb = np.zeros(n_var, dtype=float)
ub = np.full(n_var, 100, dtype=float)
obj = np.ones(n_var, dtype=float)
vtypes = np.full(n_var, GRB.CONTINUOUS)
gp_vars = model.addMVar(n_var, lb, ub, obj, vtypes)

# Sparse constraint matrix with a single nonzero per row
rows = np.arange(n_constraints)
cols = np.arange(n_constraints)
val = np.ones(n_constraints, dtype=float)
A = sp.csr_matrix((val, (rows, cols)), shape=(n_constraints, n_var))
sense = np.full(n_constraints, '>')
rhs = np.ones(n_constraints, dtype=float)
model.addMConstr(A, gp_vars, sense, rhs)

# Inspect the Python heap after the model is built
h = guppy.hpy()
print(h.heap())

Output:
Partition of a set of 390325818 objects. Total size = 28287024970 bytes.
Index Count % Size % Cumulative % Kind (class / dict of class)
0 90000000 23 9360000000 33 9360000000 33 dict of gurobipy.Var
1 90000000 23 4320000000 15 13680000000 48 gurobipy.Var
2 40000000 10 4160000000 15 17840000000 63 dict of gurobipy.Constr
3 81 0 3800008386 13 21640008386 77 numpy.ndarray
4 130009159 33 3640271656 13 25280280042 89 int
5 40000000 10 1920000000 7 27200280042 96 gurobipy.Constr
6 3960 0 1045994480 4 28246274522 100 list
7 103196 0 13057412 0 28259331934 100 str
8 74405 0 5307456 0 28264639390 100 tuple
9 21552 0 3825727 0 28268465117 100 types.CodeType
<1004 more rows. Type e.g. '_.more' to view.>

The increase in memory happens when the variables are created (model.addMVar), and the dictionaries of Gurobi variables account for most (~33%) of the storage.
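For reference, the overhead can also be seen on a single variable. A minimal sketch (exact sizes vary with the Python build, and it assumes gurobipy.Var exposes its attribute dictionary, as the heap dumps indicate):

import sys
import gurobipy as gp

m = gp.Model()
x = m.addVar(lb=0.0, ub=100.0, obj=1.0)
m.update()

# Wrapper object plus its attribute dictionary; together these roughly
# match the ~150 bytes per variable implied by the heap dumps above.
print(sys.getsizeof(x))
print(sys.getsizeof(x.__dict__))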
Hi Christian,
The memory consumption you see is currently expected. While our APIs are designed to be only a thin layer on top of the C code, special objects such as MVars require additional handling on the Python layer. These objects are fairly heavy because each one already carries a dictionary of its attributes, and we also have to store additional pointers to the Vars contained in the MVar object. All of this comes on top of the actual values, variable types, lower and upper bounds, and names of each variable.
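To illustrate, the entries of an MVar are full Var objects on the Python side. Here is a minimal sketch (assuming a gurobipy version in which MVar.tolist() is available):

import gurobipy as gp

m = gp.Model()
mv = m.addMVar(3)
m.update()

# The MVar acts like a NumPy array, but each entry is still backed by a
# full gurobipy.Var object carrying its own attribute dictionary.
print(type(mv))              # gurobipy.MVar
print(type(mv.tolist()[0]))  # gurobipy.Var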
We are actively working on reducing the memory requirements in these cases.
Best regards,
Jaromił
Hi Jaromil,
That's what I expected. I guess that's another downside of using Python. Thank you very much for your explanation!
Best regards,
Christian