Maximum of a variable has to be reached for a certain number of intervals
Hi everyone,
I am currently trying to optimize the capacity of a battery storage system, which is charged by renewable energy sources and should be used to smooth their output.
State of Charge = SOC; b[i] = binary for decision of charging
I now want to add a constraint stating that the maximum SOC (which is 0.9 * cap) should be reached for a certain number of intervals in the time series.
To achieve this, I introduced a binary variable that should be 1 if max_soc is reached and 0 otherwise. Then I want to sum this variable over all intervals and enforce a minimum.
I tried to model it via the big-M method, but this mostly results in a TypeError.
The other way I tried doesn't give the expected result, as it always reports is_soc_90 = 1.
I normally get the following TypeError:
Is there another way to model this? I usually get errors about unsupported operand types for TempConstr and tupledict/int, or tupledict and int.
Thanks in advance for your help.
-
Hi Maximilian,
The error:
unsupported operand types for TempConstr and tupledict/int or tupledict and int
means that an object's type does not match what the method expects. In such cases, it is recommended to check the types of the objects passed to the method just before the line where the error occurs. For example, if the following method call produces an error:
m.addConstr(soc[i] >= cap * 0.89 + eps - M * (1 - is_soc_90), name="bigM_constr1")
print and verify the types of the objects it contains:
print(type(soc[i]))
print(type(cap))
# For other objects as well
# ...
m.addConstr(soc[i] >= cap * 0.89 + eps - M * (1 - is_soc_90), name="bigM_constr1")
You will probably find an object that is a tupledict. (I think is_soc_90 is the suspect, because it is referenced as is_soc_90[i] in other parts.)
Thanks,
Ryuta
-
Thank you for the quick reply. I did forget the [i] at the end of is_soc_90. Either way, that doesn't solve the whole problem, as is_soc_90 is 1 for every interval, even though that shouldn't be the case.
eps = 0.0001
M = 10 + eps
m.addConstr(szb[i] >= cap * 0.9 + eps - M * (1 - is_soc_90[i]))
m.addConstr(szb[i] <= cap * 0.9 + M * is_soc_90[i])
Is there anything wrong with the formulation, or is there maybe a way to model the same thing via m.addGenConstrIndicator?
-
Hi,
The szb[i] is a new variable. Is this different from soc[i]? Also, what is the value of szb[i] (soc[i]?) where is_soc_90 is 1? It might not be a problem that is_soc_90 is set to 1 if the following condition is satisfied:
szb[i] >= cap * 0.9 + eps
Are there any other factors that would cause is_soc_90 to be 0?
Thanks,
Ryuta
-
Sorry, that's on me. szb[i] should be soc[i]. The thing is that is_soc_90 = 1 for every single value of soc[i]. But of course that should not be the case, since:
soc[i] >= cap * 0.9 + eps
which the big-M constraints should model, does not hold most of the time. E.g. for i = 1: soc = 1.25 and cap * 0.9 = 11.25, but is_soc_90 is still 1 (for this case it should definitely be 0).
is_soc_90 = m.addVars(T, vtype=GRB.BINARY)
m.addConstr(soc[i] >= cap * 0.9 + eps - M * (1 - is_soc_90[i]))
m.addConstr(soc[i] <= cap * 0.9 + M * is_soc_90[i])
m.addConstr(gp.quicksum(is_soc_90[i] for i in range(T)) >= full_load_hours)
These are the only places is_soc_90[i] is used in the code, so I don't think it is affected by anything else. But still, the proposed way of setting the binary variable to 1 if soc[i] == cap * 0.9 and to 0 otherwise doesn't work. This is the article I'm referring to: https://support.gurobi.com/hc/en-us/articles/4414392016529-How-do-I-model-conditional-statements-in-Gurobi
-
Hi,
That's a little strange. Could you please share the full code? Note that uploading files directly in the Community Forum is not possible, but we discuss an alternative in Posting to the Community Forum.
Thanks,
Ryuta
-
Here is the full optimization code. The time series passed to the optimization are all arrays of the same length and don't pose any problems.
-
Hi,
Thank you for sharing the code! Unfortunately, I cannot run it on my side because some input data is missing. If it is difficult to share reproducible code (code with data) here, could you share the MPS file? Please see: How do I export a model from Gurobi?
Thanks,
Ryuta
-
Since I'm not allowed to share the exact data, I created some random data that's similar to the data I'm using.
https://www.filemail.com/d/rsphnksqpruklry
The data is being retrieved from this excel file by using:
file_path = r'\file_path'
df_5min = pd.read_excel(file_path, parse_dates=['period_end'], index_col='period_end')
a = df_5min['5_min_energy']
x = df_5min['hourly_avg_power_repeated']
c = df_5min['hourly_prices_repeated']
-
As additional information: The Packages I'm using generally are numpy, pandas and gurobipy
-
Hi,
Thank you for sharing the random data! It took some code modifications (resolving a SyntaxError), but it worked here too.
Since you have the condition soc[i] <= 0.9 * cap, the eps seems better placed in the second constraint:
m.addConstr(soc[i] >= cap * 0.9 - M * (1 - is_soc_90[i]))
m.addConstr(soc[i] <= cap * 0.9 - eps + M * is_soc_90[i])
After optimization, I print the values like this:
m.optimize()
for i in range(1, T):
print(i, cap.X, cap.X * 0.9, soc[i].X, is_soc_90[i].X)
Here are the first few lines resulting from this:
1 12.500124999999999 11.250112499999998 1.2500125 0.0
2 12.500124999999999 11.250112499999998 1.2500125 0.0
3 12.500124999999999 11.250112499999998 1.2500125 0.0
4 12.500124999999999 11.250112499999998 1.2500125 0.0
5 12.500124999999999 11.250112499999998 3.3188087599379172 0.0
6 12.500124999999999 11.250112499999998 5.718215272059207 0.0
7 12.500124999999999 11.250112499999998 8.03982343583327 0.0
8 12.500124999999999 11.250112499999998 9.543406888595491 0.0
9 12.500124999999999 11.250112499999998 9.718422794913664 0.0
10 12.500124999999999 11.250112499999998 11.250112499999998 1.0
11 12.500124999999999 11.250112499999998 11.250112499999998 1.0
12 12.500124999999999 11.250112499999998 11.250112499999998 1.0
13 12.500124999999999 11.250112499999998 11.250112499999998 1.0
14 12.500124999999999 11.250112499999998 11.250112499999998 1.0
15 12.500124999999999 11.250112499999998 11.250112499999998 1.0
16 12.500124999999999 11.250112499999998 11.250112499999998 1.0
17 12.500124999999999 11.250112499999998 8.08795736590396 0.0
18 12.500124999999999 11.250112499999998 5.820744117953488 0.0
19 12.500124999999999 11.250112499999998 3.67501530633216 0.0
20 12.500124999999999 11.250112499999998 2.230677477885841 0.0
According to this, when soc[i] == cap * 0.9 is satisfied, is_soc_90[i] = 1; otherwise is_soc_90[i] = 0. Is this the behavior you want?
Thanks,
Ryuta
-
I don't know why, but for some reason it works for the code I sent to you, even though it didn't seem to work with my original code. Anyway, the problem is fixed now; it does work. Thank you very much for your help.
I do however have another question which arises from the results I got with the condition working.
My aim was to optimize the capacity with a constraint stating that the storage has to be used at full capacity for a certain amount of time. I assumed that if I raised the number of hours the storage should be used, the capacity would end up lower.
But if I change the amount of full_load_hours in the code, the capacity stays the same in all cases, while the other variables (e.g. r and deviation), which shouldn't change, do change their values. Even if I change full_load_hours to 0, the capacity stays the same. The only way to change the capacity now is to delete the big-M constraints.
Does this maybe have to do with there being too many constraints related to cap? Or is it perhaps not possible to optimize cap for the whole problem in hindsight, because it is only updated for every interval [i]?
-
Hi,
On my side, the cap value is 12.50012500 when H = 4, and it becomes 12.159504406436357 after I set H = 24. (Not sure if this is the correct way to set it up. And for H greater than 25, the problem becomes infeasible.)
H = 24
full_load_hours = H * 12  # 12 five-minute intervals per hour
So, the impact of full_load_hours may be small in this data. A more extreme data set might show a larger impact.
Thanks,
Ryuta
-
In addition, when H = 23, cap = 12.500125 is obtained, at which point the objective value is 309.77262. As we restrict cap to smaller values, the objective worsens: 310.31921 for cap = 12 and 312.71921 for cap = 11. From this it appears that a smaller cap leads to worse objective values.
I assumed that if I raised the number of hours the storage should be used, the capacity would just be lower in the end.
I probably don't understand all of your optimization problem. However, it might be a good idea to reconsider the rationale for this assumption as well.
Thanks,
Ryuta
-
I solved the problem now. The value of M was too low and kept soc[i] from getting any bigger than ~12.5. I assigned a higher value to M, and it now works the way I initially wanted.
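For reference, a safe M can be derived from the variable bounds rather than guessed. A minimal sketch; the bounds soc_ub, cap_lb, and cap_ub below are illustrative, not the original data:

```python
# Illustrative bounds; replace with the real bounds of soc[i] and cap.
soc_ub = 50.0
cap_lb, cap_ub = 0.0, 50.0
eps = 1e-4

# Constraint 1: soc[i] >= 0.9*cap - M1*(1 - b[i])
# With b[i] = 0 it must not bind, even when soc[i] is at its minimum (0)
# and 0.9*cap is at its maximum:
M1 = 0.9 * cap_ub - 0.0

# Constraint 2: soc[i] <= 0.9*cap - eps + M2*b[i]
# With b[i] = 1 it must not bind, even when soc[i] is at its maximum
# and 0.9*cap is at its minimum:
M2 = soc_ub - (0.9 * cap_lb - eps)

M = max(M1, M2)
print(M)  # ~50.0001 with these bounds
```

A too-small M silently cuts off feasible solutions (exactly the capped soc behavior described above), while an unnecessarily large M can cause numerical trouble, so deriving it from the actual bounds is the usual compromise.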
Thank you very much for your help.