Different objective function placement in code yields different results: which one is correct?
Hi,
I am new to Python and Gurobi, and this is my first question here. I hope the format is okay.
I am having trouble verifying whether my code is correct, because it yields different results depending on where I place the objective function.
The formulation I need to solve is:
$$
\begin{aligned}
\min_x \; z = f(x) \quad \text{s.t.} \quad
& \sum_{i=1}^{n} \delta_{ij} \geq 1, \quad j = 1, \dots, m \\
& \sum_{j=1}^{m} \delta_{ij} = 1, \quad i = 1, \dots, n
\end{aligned}
$$
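(Here, based on the cost matrix computed in the code below, $f(x) = \sum_{i=1}^{n}\sum_{j=1}^{m} \theta_{ij}\,\delta_{ij}$ with $\theta_{ij} = (B_j - A_i)/B_j$.)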
Here is my code; version 1 looks like this:
import numpy as np
import gurobipy as gp
from gurobipy import GRB

# Set maximum iteration for solving the problem
MAX_ITER_TIMES = 10

# Initial condition
A = [18., 15., 11., 9., 14., 13., 9., 14., 13.]
B = [60., 54.50, 64.50]
N = len(A)  # number of items
M = len(B)  # number of available rolls

# Create the assignment matrix
assignment_matrix = np.zeros((N, M))

def updateObj(self):
    self.setObjective(gp.quicksum(cost_matrix[i][j] * delta_ij[i, j] for i in range(N) for j in range(M)))

def solve(self, flag=0):
    self.Params.OutputFlag = 1
    self.optimize()

def update_constraints(self):
    self.addConstrs((gp.quicksum(delta_ij[i, j] for i in range(N)) >= 1) for j in range(M))
    self.addConstrs((gp.quicksum(delta_ij[i, j] for j in range(M)) == 1) for i in range(N))

# Calculate the cost matrix
cost_matrix = np.zeros((N, M))
for i in range(N):
    for j in range(M):
        cost_matrix[i, j] = (B[j] - A[i]) / B[j]

# Create the objective function theta_ij * delta_ij
model.setObjective(gp.quicksum(cost_matrix[i][j] * delta_ij[i, j] for i in range(N) for j in range(M)))

model = gp.Model("General_Assignment_Problem")

# Create the decision variable delta_ij
delta_ij = model.addVars(N, M, vtype=GRB.BINARY, name='delta_ij')

# Create constraints
model.addConstrs((gp.quicksum(delta_ij[i, j] for i in range(N)) >= 1) for j in range(M))
model.addConstrs((gp.quicksum(delta_ij[i, j] for j in range(M)) == 1) for i in range(N))

# Find the optimal solution
model.optimize()

# Fill the assignment matrix with the solution
assignment_matrix = np.zeros((N, M))
for i in range(N):
    for j in range(M):
        if delta_ij[i, j].x == 1:
            assignment_matrix[i, j] = delta_ij[i, j].x
print(assignment_matrix)
With version 1, where model.setObjective is placed as shown above, I get this result:
[[0. 1. 0.]
 [0. 0. 1.]
 [1. 0. 0.]
 [0. 1. 0.]
 [1. 0. 0.]
 [0. 0. 1.]
 [0. 0. 1.]
 [0. 1. 0.]
 [0. 0. 1.]]
Optimal solution found
However, when the objective function is moved after the model and variable definitions, as in the snippet below:
model = gp.Model("General_Assignment_Problem")
# Create the decision variable the delta_ij
delta_ij = model.addVars(N, M, vtype = GRB.BINARY, name = 'delta_ij')
# Create the objective function theta_ij * delta_ij
model.setObjective(gp.quicksum(cost_matrix[i][j] * delta_ij[i, j] for i in range(N) for j in range(M)))
The result I got is like this:
[[0. 1. 0.]
 [0. 1. 0.]
 [0. 1. 0.]
 [1. 0. 0.]
 [0. 1. 0.]
 [0. 1. 0.]
 [0. 0. 1.]
 [0. 1. 0.]
 [0. 1. 0.]]
Optimal solution found.
To be honest, I prefer the result from version 1. However, I am not sure I am doing this correctly. Could you advise whether my approach is correct?
Thank you
In your code (version 1), model.setObjective() is called before the model is defined with model = gp.Model(). If you do not get an error, the model object (as well as the variables delta_ij) is probably defined earlier in your session. The subsequent call model = gp.Model() then overwrites it, i.e., an empty model is created.
So my guess is that in the first version the objective is 0, while in the second version the objective is actually used. To troubleshoot, I would recommend writing your model to an LP file with model.write("Test.lp") directly before calling model.optimize(). This way you can check which model is actually optimized, and you can compare the LP files (name them differently) for both of your versions.
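For illustration, here is a minimal sketch of the corrected ordering, using the data from your question; only the placement of setObjective changes, plus a model.write() call for inspection:

import gurobipy as gp
from gurobipy import GRB

A = [18., 15., 11., 9., 14., 13., 9., 14., 13.]
B = [60., 54.50, 64.50]
N, M = len(A), len(B)

# Define the model first, then the variables, then the objective
model = gp.Model("General_Assignment_Problem")
delta_ij = model.addVars(N, M, vtype=GRB.BINARY, name='delta_ij')
model.setObjective(gp.quicksum((B[j] - A[i]) / B[j] * delta_ij[i, j]
                               for i in range(N) for j in range(M)))
model.addConstrs(gp.quicksum(delta_ij[i, j] for i in range(N)) >= 1 for j in range(M))
model.addConstrs(gp.quicksum(delta_ij[i, j] for j in range(M)) == 1 for i in range(N))

# Inspect what is actually being optimized before solving
model.write("Test.lp")
model.optimize()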
Hi, thank you for your reply and insight.
I have solved the problem after troubleshooting again. Thank you once again for the direction!