RAM overload when setting the constraints
I am trying to solve a bipartite tree matching problem with Gurobi. I have solved many of these so far; the constraints and the code as a whole are set up properly and everything works. The issue arises when I set a large number of constraints (several thousand). On my own machine (a Mac with 8 GB of RAM) the code freezes; on Google Colab it fails while setting the constraints because it hits the 12.7 GB RAM limit that Colab provides free of charge. I have a request pending for Gurobi Cloud, as I understand I could potentially offload the resource use to the cloud if Gurobi gives me a trial (I am using an academic licence, as all of this work is for my thesis).
I did try to offload memory use by writing it to the hard drive, and this did not work. I understand that the constraints could, in theory, be written more compactly so that they consume less memory, although I am not sure where to even start.
Is there a conventional or straightforward solution to the RAM issue that does not require digging into the math and rewriting the constraints?
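To illustrate the kind of rewrite I imagine, though this is only a guess on my part: the model, index sets, and sizes below are made up, but they show moving from term-by-term Python expression building to gurobipy's quicksum and addConstrs.

```python
import gurobipy as gp
from gurobipy import GRB

# Illustrative model only -- names and sizes are placeholders,
# not my actual thesis model.
m = gp.Model("matching")
edges = [(i, j) for i in range(2000) for j in range(400)]
x = m.addVars(edges, vtype=GRB.BINARY, name="x")

# Pattern I suspect is wasteful: Python's built-in sum creates
# many intermediate LinExpr objects, one per added term.
# for i in range(2000):
#     m.addConstr(sum(x[i, j] for j in range(400)) <= 1)

# Possibly leaner pattern: a generator passed to addConstrs,
# with gp.quicksum building each expression in a single pass.
m.addConstrs(
    (gp.quicksum(x[i, j] for j in range(400)) <= 1 for i in range(2000)),
    name="row",
)
```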
-
Hi Nikola,
If I understand correctly, the out-of-memory error occurs during model building. In that case, using Gurobi Cloud will not help, since the model is still built on your local machine and only the solving is offloaded.
I think there is no way around identifying what exactly causes the large memory consumption.
Several thousand constraints do not sound huge. What is the exact number of variables and constraints?
You could use a memory profiler to analyze which lines in your code cause the issue, and share them.
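For example, with the memory_profiler package a minimal setup could look like the following; build_model and the sizes are placeholders for your own code:

```python
# pip install memory-profiler
from memory_profiler import profile

@profile  # prints line-by-line memory usage after the function returns
def build_model():
    import gurobipy as gp
    m = gp.Model()
    x = m.addVars(1000, 1000)  # placeholder for your own variables
    m.addConstrs(x.sum(i, "*") <= 1 for i in range(1000))
    return m

if __name__ == "__main__":
    build_model()
```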
Best regards,
Marika
-
Hi Marika, the RAM use grows progressively as the constraints are being set. There are 941,738 variables and two sets of constraints, of 4,042 and 3,462 constraints respectively. Setting the first set of 4,042 constraints eats up all 12 GB of RAM; this is what happens when I run the problem in Google Colab.

When I run it on my local machine, RAM is not the issue, as both sets of constraints are built in a reasonable time (each one takes about 20-30 minutes). After that, solving does not start: no output is created, and after a while Python issues a kill signal and the program stops.

Unfortunately, the memory profiler suffocates the execution of the code, prolonging the constraint-setting process to 15 hours or so per set, so I could not use it.

In short: when running the code on Colab, RAM is eaten up while the first set of constraints is being set. When running the code locally, both sets of constraints are built, and the code breaks when presolve is supposed to start.
-
How many variables have a non-zero coefficient in one constraint (on average)?
Could you reduce the problem so that you can write an MPS file? Then, compare the memory consumption when building the (reduced) model within your API to the memory consumption when reading the MPS file with Gurobi. Are both similar in size?
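As a rough sketch of that comparison (the file name is arbitrary, and resource.getrusage is Unix-only; ru_maxrss is reported in kilobytes on Linux and bytes on macOS):

```python
import resource
import gurobipy as gp

# Step 1 (separate run): build the reduced model in your API and write it:
#     model.write("reduced.mps")

# Step 2 (fresh process): read the MPS file back with Gurobi
m = gp.read("reduced.mps")

# Peak resident memory of this process so far
peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print("peak memory after reading MPS:", peak)
```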