To make things clearer: the code runs fine if I remove these constraints.

m.addConstr(bat_SOC <= c_bat_up) #Ensures that our SoC never goes above max limit.
m.addConstr(c_bat_down <= bat_SOC) #Ensures that our SoC never goes below min limit

However, if I do this, the battery isn't functioning like a battery anymore, as it can use energy it doesn't have. For every time step there is some part that charges the battery:

bat_SOC += (bat_del_in * in_eff - bat_del_own) / 3600 #For regulating battery levels

Here, for example, the battery state of charge is updated depending on how much energy was charged/discharged.

• Gurobi Staff

Can you say how many iterations each loop in your code has? How many value pairs does big.iterrows() return? As far as I can see, for each value pair you generate 3600 * [2 to 5] constraints, depending on the case the "freq" value puts you in. So this is roughly 10000 constraints for each value pair in big.iterrows(). If there are many of those, then it is not a surprise that you run out of memory.
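For concreteness, the arithmetic behind that estimate looks like this (the 24-pairs figure is an assumption, one pair per hour of a day; the other numbers are the ones quoted here):

```python
seconds_per_pair = 3600
constrs_per_second = (2, 5)      # "2 to 5" constraints per second, case-dependent
low, high = (seconds_per_pair * c for c in constrs_per_second)
print(low, high)                 # 7200 18000 -> "roughly 10000" per value pair

value_pairs = 24                 # assumed: one value pair per hour
print(value_pairs * low, value_pairs * high)  # 172800 432000 constraints in total
```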

First off: thank you for your reply, Tobias! I really need help with this.

Let's see. The first for loop runs 24 times in total, and the second for loop runs 24 * 3600 = 86400 times.
So one hour is 1 value in the first loop, and the second loop then runs 3600 times.

bid.iterrows() returns four values every time it's run.

Your analysis is correct: the number of constraints turns out exactly as you say.
However, when I remove the battery constraints mentioned above, the model solves in about 5 seconds.

What would be your course of action? Is there something I can do, or is the solution simply to get a stronger machine?

Regards, Bill Edwall.

PS: Thank you for taking the time to help me!

• Gurobi Staff

I did not look too closely at your model, but my assumption is that the battery constraints link the time steps with each other. Could it be that without those constraints you pretty much get independent problems for each second? In this case, the model would be really easy to solve, because it can be solved as 86400 independent tiny problems. So, if this is true, then the model is clearly much more difficult with the linking constraints.

What I don't quite understand: you are getting an out-of-memory error while building the model? So, you do not even reach the optimize() call?

Maybe it helps to call m.update() once in a while, for example directly before the "x in range(3600)" loop. Without an update() call, the data are cached in an intermediate storage, which may consume slightly more memory than the final storage into which they are transferred when you perform an update() (or call optimize()).

• Gurobi Staff

Moreover, I would try what happens if you don't do this for a whole day, but only for, say, 6 hours or so.

That makes so much sense! What you're saying is very true: the battery constraint is the glue of the model, and without it the model can be solved as the small independent problems you mentioned.

You're correct. I never reach the optimize part (at least I don't see the output that the model usually prints when it has finished optimizing).

I'll try the m.update() suggestion right away! I had no idea it worked like that.

I think solving the problem for a couple of hours makes a lot of sense. It shouldn't even be that hard to change my code to see what happens if I try this.

Thank you so much for your help!! I really tried to solve these things myself, but I'm a little out of my depth, this being my first time using Python and Gurobi.

I got this message after 10 min

Do you think I should leave it in? I put it right above the for x in range(3600) loop, like you suggested.
Normally the program ran for 40 min to 1 h before crashing, so I don't know which is better.

Regards, Bill.

EDIT:
Update: lowering the hours looked at to 5 did the trick, and the program runs! Here is the log.

Does this log tell you anything of value? Either way, I'd just like to say thank you so much for your help!

• Gurobi Staff

Wow, okay, this makes sense. Check the size of the model! With 5 hours, you already have 647 million non-zero coefficients in your model. Every non-zero already consumes 12 bytes, and this is only for representing the model. When the solving process starts, we need to copy the model into other data structures and then represent it in memory at least three times. This is just way too huge for your available memory!

The good thing is that, from a combinatorial point of view, your model seems really trivial to solve. The initial LP relaxation already provides an integral solution, which means that this is an optimal solution. But you have to do something to reduce the number of non-zero coefficients. 647 million non-zeros for just 90000 rows is very uncommon; this means more than 7000 non-zeros per row.
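The arithmetic behind that warning, using the figures quoted above:

```python
nonzeros = 647_000_000   # non-zero coefficients reported for the 5-hour model
bytes_per_nz = 12        # per-coefficient cost quoted above
copies = 3               # the model is held at least three times while solving
rows = 90_000

model_gb = nonzeros * bytes_per_nz / 1e9
print(round(model_gb, 1))             # 7.8 GB just to store the model
print(round(copies * model_gb, 1))    # 23.3 GB while solving
print(nonzeros // rows)               # 7188 non-zeros per row on average
```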

As far as I understand, for every second you extend the bat_SOC linear expression. I guess this means that for each second you calculate the current amount of energy as the sum of changes over all seconds prior to the current time. This will of course grow very quickly.

Instead, you could introduce a new variable for each second and then just say that the energy stored in the current second is equal to the energy of the previous second plus the delta in the current second. This would make these constraints have only three non-zero coefficients, of course with the downside of introducing one more variable per second. But this should be far better than these very, very dense constraints.

Regards,

Tobias

Thank you so much Tobias!
The model now solves in around 2 minutes. Quite the improvement from my previous 45 min!

EDIT: I made a new post, and I believe I solved the issue I had. Look here if you're curious:
https://support.gurobi.com/hc/en-us/community/posts/360058627091-Reduce-expression-into-a-single-value

Hello again, Tobias. It appears that the implementation I tried did not work.
Setting the value to the variable simply made the variable change its own value to accommodate staying within the constraints.
I'll post an almost-functioning test code below so that you can see how I've currently implemented the solution. (I explain how to make it slow but solvable later on.)
The problem is that I don't know how to execute what you said. How do I make the variable simply look at the value of that one second? The way I'm doing it right now makes the full expression extend every second. When I try using things like gp.quicksum() I get this problem: TypeError: 'gurobipy.LinExpr' object is not iterable

To make this simple code run, I change bat_SOC_test to bat_SOC1 in the last constraint.
However, this makes the constraint extremely dense when looking at many timesteps, and therefore my final model is overburdened. Help on how to get around this problem would be extremely useful to me.

All the best, Bill.