Ban a specific combination of variables
Hi Gurobi Community,
I'm currently working on an optimization problem where I need to ban specific combinations of variables. The scenario involves a set of decision variables representing quantities of different sizes in a production plan. I have a set of markers (combinations of these sizes) in a database that I want to exclude from the solution space.
Here's a simplified version of my code:
import gurobipy as gp
from gurobipy import GRB
# Define the total pieces
total_pieces = {'XS': 0, 'S': 5, 'M': 5, 'L': 0, 'XL': 0, 'XXL': 0}
# Define the markers in the database
filtered_markers = {
(0, 5, 5, 0, 0, 0): 1, # Example database marker
}
avg_cons = 0.7
# Create a new model
model = gp.Model("test_model")
# Add decision variables for marker x
x_XS = model.addVar(vtype=GRB.INTEGER, name="x_XS", lb=0)
x_S = model.addVar(vtype=GRB.INTEGER, name="x_S", lb=0)
x_M = model.addVar(vtype=GRB.INTEGER, name="x_M", lb=0)
x_L = model.addVar(vtype=GRB.INTEGER, name="x_L", lb=0)
x_XL = model.addVar(vtype=GRB.INTEGER, name="x_XL", lb=0)
x_XXL = model.addVar(vtype=GRB.INTEGER, name="x_XXL", lb=0)
xplies = model.addVar(vtype=GRB.INTEGER, name="xplies", lb=0)
# Length and consumption variables
length_marker_x = model.addVar(vtype=GRB.CONTINUOUS, name="length_marker_x", lb=0)
total_fabric_consumption = model.addVar(vtype=GRB.CONTINUOUS, name="total_fabric_consumption")
average_consumption = model.addVar(vtype=GRB.CONTINUOUS, name="average_consumption")
total_pieces_var = model.addVar(vtype=GRB.CONTINUOUS, name="total_pieces_var")
# Set the objective to minimize average fabric consumption
model.setObjective(average_consumption, GRB.MINIMIZE)
# Constraints for the marker length based on total pieces and average consumption
model.addConstr((x_XS + x_S + x_M + x_L + x_XL + x_XXL) * avg_cons == length_marker_x)
# Constraints to match pieces and not exceed 12 meters per marker
model.addConstr(length_marker_x <= 12, "fabric_per_layer_x")
model.addConstr((x_XS * xplies) >= 0.95 * total_pieces['XS'], "xs_min")
model.addConstr((x_XS * xplies) <= 1.05 * total_pieces['XS'], "xs_max")
model.addConstr((x_S * xplies) >= 0.95 * total_pieces['S'], "s_min")
model.addConstr((x_S * xplies) <= 1.05 * total_pieces['S'], "s_max")
model.addConstr((x_M * xplies) >= 0.95 * total_pieces['M'], "m_min")
model.addConstr((x_M * xplies) <= 1.05 * total_pieces['M'], "m_max")
model.addConstr((x_L * xplies) >= 0.95 * total_pieces['L'], "l_min")
model.addConstr((x_L * xplies) <= 1.05 * total_pieces['L'], "l_max")
model.addConstr((x_XL * xplies) >= 0.95 * total_pieces['XL'], "xl_min")
model.addConstr((x_XL * xplies) <= 1.05 * total_pieces['XL'], "xl_max")
model.addConstr((x_XXL * xplies) >= 0.95 * total_pieces['XXL'], "xxl_min")
model.addConstr((x_XXL * xplies) <= 1.05 * total_pieces['XXL'], "xxl_max")
# Total number of pieces and total fabric consumption constraints
model.addConstr(total_pieces_var ==
(x_XS * xplies) + (x_S * xplies) + (x_M * xplies) + (x_L * xplies) + (x_XL * xplies) + (x_XXL * xplies), "total_pieces_var")
model.addConstr(total_pieces_var >= 1 * sum(total_pieces.values()), "total_order_min")
model.addConstr(total_pieces_var <= 1.05 * sum(total_pieces.values()), "total_order_max")
# Constraint to approximate the average consumption
model.addConstr(total_fabric_consumption == average_consumption * total_pieces_var, "average_consumption")
# Total fabric consumption
model.addConstr(total_fabric_consumption == (length_marker_x * xplies) + 0.06 * xplies, "total_fabric_consumption")
# Part that I'm struggling with: Exclude markers in the database
for key in filtered_markers.keys():
    match_indicator = model.addVar(vtype=GRB.BINARY, name=f"match_indicator_{key}")
    # Constraints to set match_indicator to 1 if all conditions match
    model.addConstr((x_XS == key[0]) >> (match_indicator == 1), name=f"XS_match_{key}")
    model.addConstr((x_S == key[1]) >> (match_indicator == 1), name=f"S_match_{key}")
    model.addConstr((x_M == key[2]) >> (match_indicator == 1), name=f"M_match_{key}")
    model.addConstr((x_L == key[3]) >> (match_indicator == 1), name=f"L_match_{key}")
    model.addConstr((x_XL == key[4]) >> (match_indicator == 1), name=f"XL_match_{key}")
    model.addConstr((x_XXL == key[5]) >> (match_indicator == 1), name=f"XXL_match_{key}")
    # Ensure match_indicator remains 0, effectively excluding these markers
    model.addConstr(match_indicator == 0, name=f"exclude_marker_{key}")
# Optimize the model
model.optimize()
# Print the results
if model.status == GRB.OPTIMAL:
    print("Optimal solution found:")
    print(f"XS: {x_XS.X}")
    print(f"S: {x_S.X}")
    print(f"M: {x_M.X}")
    print(f"L: {x_L.X}")
    print(f"XL: {x_XL.X}")
    print(f"XXL: {x_XXL.X}")
    print(f"Total pieces: {total_pieces_var.X}")
    print(f"Total fabric consumption: {total_fabric_consumption.X}")
    print(f"Average consumption: {average_consumption.X}")
    print(f"Length of the marker: {length_marker_x.X} m")
    print(f"Xplies: {xplies.X}")
else:
    print("No optimal solution found.")
Problem Description:
I need to exclude the specific combination (0, 5, 5, 0, 0, 0) from being used in the optimization. I've attempted to do this by using binary match_indicator variables and Gurobi's indicator constraints to set the match indicator to 1 when the values match the database entry. However, I'm encountering issues with this approach, and the solver does not seem to exclude the combination effectively. I know the example constraints using >> are not correct, but I wanted to illustrate the concept I tried.
Key Questions:
- Is there a better way to exclude a specific combination of integer variables in Gurobi?
- Are there specific issues or limitations with using indicator constraints in this context?
- How can I ensure that the banned combination (0, 5, 5, 0, 0, 0) is never chosen?
Any guidance or suggestions would be greatly appreciated. Thank you!
Hi Leonardo,
One solution is to reformulate your x variables as binary variables that encode the value of the original integer variable as an additional index, just like it's done in our Sudoku example: sudoku.py. Instead of assigning an integer variable ranging from 1 to 9 to every field, you assign one binary variable per field and possible value. While this may seem counterintuitive, the solver often has an easier time with such a model despite the increased number of variables.
Then, adding the constraint that disallows a certain combination is straightforward:
(1-x_XS_0) + x_S_5 + x_M_5 + (1-x_L_0) + (1-x_XL_0) + (1-x_XXL_0) <= 5
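As a rough sketch of the encoding itself (variable names here are only illustrative, not taken from your model): one binary per size and candidate value, exactly one value selected per size, and the original integer quantity recovered as a weighted sum.
import gurobipy as gp
from gurobipy import GRB

m = gp.Model("binary_encoding_sketch")
sizes = ["XS", "S", "M", "L", "XL", "XXL"]
values = range(6)  # assuming quantities can only take values 0..5

# x[s, v] == 1 means size s is assigned quantity v
x = m.addVars(sizes, values, vtype=GRB.BINARY, name="x")

# Exactly one value is selected per size (one-hot encoding)
m.addConstrs((x.sum(s, "*") == 1 for s in sizes), name="one_value")

# The integer quantity of size s can then be used wherever the original
# integer variable appeared, as the weighted sum of its binaries
qty = {s: gp.quicksum(v * x[s, v] for v in values) for s in sizes}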
You could even implement this in a dynamic way and disallow solutions only after they have been found, by adding a callback as demonstrated in our TSP example: tsp.py. This might not be necessary, though, as long as the number of forbidden combinations is not huge.
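As a rough sketch of that dynamic variant (assuming the binaries x[s, v] from the encoding above and a set forbidden of banned quantity tuples, which are not part of your model), a MIPSOL callback could cut off a forbidden incumbent with a lazy no-good cut:
def reject_forbidden(model, where):
    if where == GRB.Callback.MIPSOL:
        vals = model.cbGetSolution(model._x)
        # Reconstruct which value is currently chosen for each size
        chosen = {s: v for (s, v) in model._x.keys() if vals[s, v] > 0.5}
        if tuple(chosen[s] for s in sizes) in forbidden:  # forbidden: assumed set of banned tuples
            # No-good cut: at least one of the currently selected binaries must flip
            model.cbLazy(gp.quicksum(model._x[s, chosen[s]] for s in sizes)
                         <= len(sizes) - 1)

m._x = x
m.Params.LazyConstraints = 1
m.optimize(reject_forbidden)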
Cheers,
Matthias
Hi Matthias,
Thank you for the quick response and the suggestion. I implemented the method you recommended, where each potential value for the sizes is represented by a binary variable. Additionally, I used the constraint
(1 - x_XS_0) + x_S_5 + x_M_5 + (1 - x_L_0) + (1 - x_XL_0) + (1 - x_XXL_0) <= 5
to ban the specific combination. Here's the updated version of the model:
import gurobipy as gp
from gurobipy import GRB
# Define the total pieces
total_pieces = {'XS': 0, 'S': 5, 'M': 5, 'L': 0, 'XL': 0, 'XXL': 0}
# Define the banned combination of markers
banned_combination = {'XS': 0, 'S': 5, 'M': 5, 'L': 0, 'XL': 0, 'XXL': 0}
avg_cons = 0.75
possible_values = range(6) # Assuming values range from 0 to 5
# Create a new model
model = gp.Model("test_model")
# Define binary variables for each size and each possible value
x_XS = {val: model.addVar(vtype=GRB.BINARY, name=f"x_XS_{val}") for val in possible_values}
x_S = {val: model.addVar(vtype=GRB.BINARY, name=f"x_S_{val}") for val in possible_values}
x_M = {val: model.addVar(vtype=GRB.BINARY, name=f"x_M_{val}") for val in possible_values}
x_L = {val: model.addVar(vtype=GRB.BINARY, name=f"x_L_{val}") for val in possible_values}
x_XL = {val: model.addVar(vtype=GRB.BINARY, name=f"x_XL_{val}") for val in possible_values}
x_XXL = {val: model.addVar(vtype=GRB.BINARY, name=f"x_XXL_{val}") for val in possible_values}
xplies = model.addVar(vtype=GRB.INTEGER, name="xplies", lb=0)
# Length and consumption variables
length_marker_x = model.addVar(vtype=GRB.CONTINUOUS, name="length_marker_x", lb=0)
total_fabric_consumption = model.addVar(vtype=GRB.CONTINUOUS, name="total_fabric_consumption")
average_consumption = model.addVar(vtype=GRB.CONTINUOUS, name="average_consumption")
total_pieces_var = model.addVar(vtype=GRB.CONTINUOUS, name="total_pieces_var")
# Set the objective to minimize average fabric consumption
model.setObjective(average_consumption, GRB.MINIMIZE)
model.addConstr(
    (
        gp.quicksum(val * x_XS[val] for val in possible_values) +
        gp.quicksum(val * x_S[val] for val in possible_values) +
        gp.quicksum(val * x_M[val] for val in possible_values) +
        gp.quicksum(val * x_L[val] for val in possible_values) +
        gp.quicksum(val * x_XL[val] for val in possible_values) +
        gp.quicksum(val * x_XXL[val] for val in possible_values)
    ) * avg_cons == length_marker_x
)
# Constraints to match pieces and not exceed 12 meters per marker
model.addConstr(length_marker_x <= 12, "fabric_per_layer_x")
model.addConstr(
    gp.quicksum(val * x_XS[val] for val in possible_values) * xplies >= 0.95 * total_pieces['XS'], "xs_min")
model.addConstr(
    gp.quicksum(val * x_XS[val] for val in possible_values) * xplies <= 1.05 * total_pieces['XS'], "xs_max")
model.addConstr(
    gp.quicksum(val * x_S[val] for val in possible_values) * xplies >= 0.95 * total_pieces['S'], "s_min")
model.addConstr(
    gp.quicksum(val * x_S[val] for val in possible_values) * xplies <= 1.05 * total_pieces['S'], "s_max")
model.addConstr(
    gp.quicksum(val * x_M[val] for val in possible_values) * xplies >= 0.95 * total_pieces['M'], "m_min")
model.addConstr(
    gp.quicksum(val * x_M[val] for val in possible_values) * xplies <= 1.05 * total_pieces['M'], "m_max")
model.addConstr(
    gp.quicksum(val * x_L[val] for val in possible_values) * xplies >= 0.95 * total_pieces['L'], "l_min")
model.addConstr(
    gp.quicksum(val * x_L[val] for val in possible_values) * xplies <= 1.05 * total_pieces['L'], "l_max")
model.addConstr(
    gp.quicksum(val * x_XL[val] for val in possible_values) * xplies >= 0.95 * total_pieces['XL'], "xl_min")
model.addConstr(
    gp.quicksum(val * x_XL[val] for val in possible_values) * xplies <= 1.05 * total_pieces['XL'], "xl_max")
model.addConstr(
    gp.quicksum(val * x_XXL[val] for val in possible_values) * xplies >= 0.95 * total_pieces['XXL'], "xxl_min")
model.addConstr(
    gp.quicksum(val * x_XXL[val] for val in possible_values) * xplies <= 1.05 * total_pieces['XXL'], "xxl_max")
# Total number of pieces and total fabric consumption constraints
model.addConstr(total_pieces_var ==
                (gp.quicksum(val * x_XS[val] for val in possible_values) * xplies) +
                (gp.quicksum(val * x_S[val] for val in possible_values) * xplies) +
                (gp.quicksum(val * x_M[val] for val in possible_values) * xplies) +
                (gp.quicksum(val * x_L[val] for val in possible_values) * xplies) +
                (gp.quicksum(val * x_XL[val] for val in possible_values) * xplies) +
                (gp.quicksum(val * x_XXL[val] for val in possible_values) * xplies), "total_pieces_var")
model.addConstr(total_pieces_var >= 1 * sum(total_pieces.values()), "total_order_min")
model.addConstr(total_pieces_var <= 1.05 * sum(total_pieces.values()), "total_order_max")
# Constraint to approximate the average consumption
model.addConstr(total_fabric_consumption == average_consumption * total_pieces_var, "average_consumption")
# Total fabric consumption
model.addConstr(total_fabric_consumption == (length_marker_x * xplies) + (0.06 * xplies), "total_fabric_consumption")
# Constraints to ensure only one value is selected per size
model.addConstr(gp.quicksum(x_XS.values()) == 1)
model.addConstr(gp.quicksum(x_S.values()) == 1)
model.addConstr(gp.quicksum(x_M.values()) == 1)
model.addConstr(gp.quicksum(x_L.values()) == 1)
model.addConstr(gp.quicksum(x_XL.values()) == 1)
model.addConstr(gp.quicksum(x_XXL.values()) == 1)
model.addConstr(
    (1 - x_XS[0]) + x_S[5] + x_M[5] + (1 - x_L[0]) + (1 - x_XL[0]) + (1 - x_XXL[0]) <= 5,
    name="exclude_banned_combination"
)
# Optimize the model
model.optimize()
# Print the results
if model.status == GRB.OPTIMAL:
    print("Optimal solution found:")
    result = {"XS": 0, "S": 0, "M": 0, "L": 0, "XL": 0, "XXL": 0}
    for val in possible_values:
        if x_XS[val].X > 0.5:
            result["XS"] = val
        if x_S[val].X > 0.5:
            result["S"] = val
        if x_M[val].X > 0.5:
            result["M"] = val
        if x_L[val].X > 0.5:
            result["L"] = val
        if x_XL[val].X > 0.5:
            result["XL"] = val
        if x_XXL[val].X > 0.5:
            result["XXL"] = val
    for size in ["XS", "S", "M", "L", "XL", "XXL"]:
        print(f"{size}: {result[size]}")
    print(f"Total pieces: {total_pieces_var.X}")
    print(f"Total fabric consumption: {total_fabric_consumption.X}")
    print(f"Average consumption: {average_consumption.X}")
    print(f"Length of the marker: {length_marker_x.X} m")
    print(f"Xplies: {xplies.X}")
else:
    print("No optimal solution found.")

Despite this implementation, the model still ends up selecting the banned solution. Any insights or suggestions on what might be going wrong would be greatly appreciated.
Thanks again for your help!
As mentioned above, the model still ended up selecting the banned solution with that constraint. However, I found a way to formulate the exclusion constraint so that it adapts to every combination in the database. I used the following:
for key in filtered_markers.keys():
    for marker in ['x', 'y', 'z', 'w']:
        # Construct the exclusion expression dynamically for each marker
        exclusion_expr = (
            binary_vars[marker]['XS'][key[0]] +
            binary_vars[marker]['S'][key[1]] +
            binary_vars[marker]['M'][key[2]] +
            binary_vars[marker]['L'][key[3]] +
            binary_vars[marker]['XL'][key[4]] +
            binary_vars[marker]['XXL'][key[5]]
        )
        # Add the constraint to exclude this specific combination for the current marker
        model.addConstr(exclusion_expr <= 5, name=f"exclude_marker_{marker}_{key}")

With this approach, I could dynamically create constraints for each combination, ensuring that banned solutions are not selected. If you have any additional suggestions or insights, they would be greatly appreciated.
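As a quick post-solve sanity check (just a sketch, assuming the binary_vars structure above and an optimal solution), the chosen combination of each marker can be reconstructed and compared against the database:
sizes = ['XS', 'S', 'M', 'L', 'XL', 'XXL']
for marker in ['x', 'y', 'z', 'w']:
    # Reconstruct the chosen quantity tuple for this marker from its binaries
    chosen = tuple(
        next(v for v, var in binary_vars[marker][s].items() if var.X > 0.5)
        for s in sizes
    )
    assert chosen not in filtered_markers, \
        f"banned combination {chosen} selected for marker {marker}"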
Thanks again for your help!
Hi Leonardo!
Please excuse the confusion! You need to write the exclusion constraint as
x_XS_0 + x_S_5 + x_M_5 + x_L_0 + x_XL_0 + x_XXL_0 <= 5
to properly catch the forbidden combination, just as you did in your modified example. I had mistakenly interpreted the x_*_0 variables as being 0 when the quantity 0 is selected; they are, of course, set to 1 to signal that the value "0" is active.
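Generated programmatically for an arbitrary banned tuple, and using the variable names from your updated model, the cut could look roughly like this:
banned = (0, 5, 5, 0, 0, 0)
size_vars = [x_XS, x_S, x_M, x_L, x_XL, x_XXL]  # each maps value -> binary variable
# All six "matching" binaries can never be 1 at the same time,
# which cuts off exactly the banned combination
model.addConstr(
    gp.quicksum(size_vars[i][banned[i]] for i in range(6)) <= 5,
    name=f"exclude_{banned}"
)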
Cheers,
Matthias
Hi Matthias,
Thank you for the clarification! Your explanation helped a lot, and I was able to solve the problem successfully. I appreciate your assistance and prompt response.
Best regards,
Leonardo