Efficiency in building a large number of constraints
Hello, I am building a coverage model where, for medium instances, I have to generate more than two million constraints.
I have a region (an urban area) that is divided into 'lots' (small squares). Each lot has a demand for 4G and 5G technologies, and a set of nearby antennas has to provide that service.
Since there can be one million lots and two technologies, I have to impose a constraint for each lot and technology so that the demand is met.
Adding the constraints looks similar to the following:
m.addConstrs((gp.quicksum(vTrafficOfCell[a, t, l] for a in antennas
                          if a in antenna_lighting_lot_technology[l, t])
              >= demand[l, t] for l in lots for t in technologies),
             "DemandFulfilment")
vTrafficOfCell[a, t, l] is the demand of lot l for technology t that is met by antenna a.
antenna_lighting_lot_technology[l, t] is a list of lists where, for every lot l and every technology t, we have a list of the antennas that can provide the corresponding demand.
I had never built a model with such a large number of constraints, and I do not know whether building over two million constraints must take that long or whether there are more efficient ways to do it.
Many thanks in advance.
Álvaro.
That seems like the best way to do it using the Python API. Have you already tried this?
Alternatively, you could also try using the Python Matrix API (see an example and some relevant articles). Depending on the formulation, with several million constraints, it may be the case that the model building is actually the bottleneck.
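For illustration, here is a minimal sketch of how such a demand constraint block could be expressed with the matrix-friendly API. The names and toy data (coverage, demand, the antenna/lot labels) are placeholders, not your actual inputs:

import gurobipy as gp
import numpy as np
import scipy.sparse as sp

# Toy placeholder data: which antennas can serve each (lot, technology) pair,
# and how much demand each pair has.
coverage = {("lot1", "4G"): ["ant1", "ant2"], ("lot1", "5G"): ["ant2"]}
demand = {("lot1", "4G"): 10.0, ("lot1", "5G"): 5.0}

m = gp.Model()

# One flat variable per (antenna, technology, lot) triple that can carry traffic.
triples = [(a, t, l) for (l, t), ants in coverage.items() for a in ants]
col = {trip: j for j, trip in enumerate(triples)}
x = m.addMVar(len(triples), lb=0.0, name="vTrafficOfCell")

# Sparse incidence matrix: row i selects the variables that can serve pair i.
rows, cols, rhs = [], [], []
for i, ((l, t), ants) in enumerate(coverage.items()):
    rhs.append(demand[l, t])
    for a in ants:
        rows.append(i)
        cols.append(col[a, t, l])
A = sp.csr_matrix((np.ones(len(rows)), (rows, cols)),
                  shape=(len(rhs), len(triples)))

# All demand-fulfilment constraints added as one matrix constraint.
m.addConstr(A @ x >= np.array(rhs), name="DemandFulfilment")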
You may also find some useful tips in the article: How do I improve the time to build my model?
Cheers,
David
Hi Álvaro,
The slow model building is most likely caused not by Gurobi itself but by the inner Python for-loop. The embedded if-clause is probably pretty expensive, and to speed up the whole process you should try simplifying or preparing your data structures so that this if-clause is avoided.
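For example, reusing the names from your snippet (and assuming antenna_lighting_lot_technology[l, t] already holds exactly the antennas that can cover lot l with technology t), the membership test over all antennas can be dropped entirely:

m.addConstrs(
    (gp.quicksum(vTrafficOfCell[a, t, l]
                 for a in antenna_lighting_lot_technology[l, t])
     >= demand[l, t]
     for l in lots for t in technologies),
    "DemandFulfilment")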
Cheers,
Matthias
Many thanks, David and Matthias.
I will take a look at what David mentions.
I will first try to prepare the data so that the if-clause inside the addConstrs call can be avoided, to confirm that.
I assume that several million constraints should not be an issue. Am I right?
Regards,
Álvaro.
Well, model instances with a few million constraints are pretty large, but constructing such models should not be an issue. To benchmark your model construction, you could also write the model out as an MPS file and compare how much time is needed to read that file back in.
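A rough sketch of that comparison, assuming m is the model you just constructed (the file name is arbitrary):

import time
import gurobipy as gp

m.write("coverage_model.mps")  # write the constructed model to disk

start = time.perf_counter()
m2 = gp.read("coverage_model.mps")  # reading it back gives a rough reference point for the build time
print(f"Read time: {time.perf_counter() - start:.1f} seconds")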
Cheers,
Matthias
Hi,
I removed the if-clause from the quicksum as Matthias suggested, and the building time has improved.
Now the problem is preprocessing the data, but that is not related to Gurobi. Once that data is processed efficiently, I will see how fast I can build the model.
Many thanks,
Álvaro.