
Infeasibility -> Constraint Relaxation

Answered

10 comments

  • Marika Karbstein
    Gurobi Staff

    Hi Cedric,

    If I understand correctly, your model with equality constraints is infeasible, but you think it will become feasible if you replace the equalities with greater-than-or-equal inequalities. You then want to know which of these constraints are not satisfied with equality.
    This is not directly possible, but I think you can do the following:

    if gurobi_model.status == GRB.INFEASIBLE:
        # constraints to relax (assuming part_constraint is a dict of constraints)
        torelax = list(part_constraint.values())
        # change the Sense from "=" to "<="
        for con in torelax:
            con.Sense = '<'
        conpens = [1] * len(torelax)
        gurobi_model.feasRelax(relaxobjtype=2, minrelax=False, vars=None, lbpen=None, ubpen=None, constrs=torelax, rhspen=conpens)
        # change the Sense back to "="
        for con in torelax:
            con.Sense = '='
        gurobi_model.optimize(my_callback)

    In this way, e.g., \(ax = b\) is replaced by \(ax - s = b\) with slack \(s \geq 0\).

    Using relaxobjtype=2 of Model.feasRelax() inserts continuous slack variables for each constraint and binary variables to count the number of violated constraints. The names of these binary variables are "ArtNB_" followed by the name of the original constraint if they are introduced for less-equal constraints, and "ArtPB_" followed by the name of the constraint if they are introduced for greater-equal constraints. So you could check which of these variables equal 1 in the optimal solution of the feasibility relaxation; they point to the violated constraints. The variables named "ArtN_" (or "ArtP_") followed by the name of the constraint give you the amount of violation. Some details can also be found in How do I change variable and/or constraint bounds to make an infeasible model feasible using feasRelax?
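
    For example, a minimal sketch of this check (assuming the relaxed model has already been optimized as in the snippet above, and using the ArtNB_/ArtPB_/ArtN_/ArtP_ naming described here):

    # after gurobi_model.optimize(...) on the feasibility relaxation
    for v in gurobi_model.getVars():
        if v.VarName.startswith(("ArtNB_", "ArtPB_")) and v.X > 0.5:
            # indicator is 1 -> the original constraint named after the prefix is violated
            print("violated constraint:", v.VarName)
        elif v.VarName.startswith(("ArtN_", "ArtP_")) and v.X > 1e-6:
            print("violation amount for", v.VarName, "=", v.X)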

    I hope this helps,
    Marika

  • CEDRIC JUSTIN

    Thank you Marika.

    Why are you changing con.Sense twice? I understand going from "=" to "<", but I am not sure why you replace it with equality again later. What is the purpose?

    For ArtN and ArtP, I had already seen the suggested reference and was able to get it to work. That said, given my intent, I should only be looking at ArtP, since ArtN would correspond to elements not covered (i.e., < 1).

  • Marika Karbstein
    Gurobi Staff

    If the sense of the constraint is not changed back to '=', we would have \(ax - s \leq b\) with \(s\geq 0\) being the slack variable. This would also allow solutions with \(ax < b\) and \(s=0\). But I thought you only want to relax to a greater-than-or-equal inequality? With \(ax-s=b\), only solutions with \(ax \geq b\) and \(s \geq 0\) are allowed.

    The model should then only contain ArtN variables.

  • CEDRIC JUSTIN

    Yes, typo on my part. I want to switch from "=" equality constraints to ">=" inequality constraints, i.e., going from a set-partitioning problem to a set-covering problem.

    So if I understand correctly, the lines below are just there to force Gurobi to introduce the slack variables, and then you restore the equality to ensure we only get ArtN variables? I think I need to experiment. Thanks!

    for con in torelax:
        con.Sense = '<'
  • Marika Karbstein
    Gurobi Staff

    Yes. For '\(\leq\)' constraints, Gurobi introduces ArtN variables that act as a "negative slack" on the LHS, ensuring that the constraints can always be satisfied; for '\(\geq\)' constraints, Gurobi introduces ArtP variables that act as a "positive slack" on the LHS for the same purpose. For '=' constraints, both the '\(\leq\)' and the '\(\geq\)' direction are considered.
    When changing '=' to '\(\leq\)' and using the feasibility relaxation, only ArtN variables are introduced by Gurobi, and we only count violations of '\(\leq\)', i.e., a slack variable needs to be positive whenever the original constraint would require '>'. No slack is introduced for the '<' direction. But since we do not want to allow '<' in the original constraint, the Sense needs to be changed back to '='.
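    To summarize the three cases compactly in the notation from above (my own restatement, so please double-check the exact signs against the documentation): \(ax \leq b\) is relaxed to \(ax - \mathrm{ArtN} \leq b\) with \(\mathrm{ArtN} \geq 0\); \(ax \geq b\) is relaxed to \(ax + \mathrm{ArtP} \geq b\) with \(\mathrm{ArtP} \geq 0\); and \(ax = b\) gets both slacks, i.e., \(ax - \mathrm{ArtN} + \mathrm{ArtP} = b\).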

  • CEDRIC JUSTIN

    This makes sense, thanks Marika. I have another related question, about minrelax:

    gurobi_model.feasRelax(relaxobjtype=2, minrelax=False, vars=None, lbpen=None, ubpen=None, constrs=torelax, rhspen=conpens)

    If I set minrelax=False, then Gurobi solves a problem that minimizes the cost of the violation. However, if I want to solve the original problem, I need to set minrelax=True, as per the Gurobi documentation.

    Now, the interesting thing is that Gurobi is able to find a solution when minrelax=False, basically violating one equality constraint of the original problem. However, with minrelax=True, it states that the relaxed model is infeasible. How can that be, since we know from the minrelax=False run that at least one solution to the relaxed problem exists?

    In fact, when I look at the .lp file that is being written, it shows the following:

    Minimize
      249.080625 Delta_pairing[10064_10065]
       + 253.9410416666666 Delta_pairing[10064_10202]

    Subject To
     Set_Partitioning_Constraint[10064]: Delta_pairing[10064_10065]
       + Delta_pairing[10064_10202] - ArtN_Set_Partitioning_Constraint[10064]
       = 1


     Set_Partitioning_Constraint[10065]: Delta_pairing[10064_10065]
       - ArtN_Set_Partitioning_Constraint[10065] = 1

     Set_Partitioning_Constraint[10202]: Delta_pairing[10064_10202]
       - ArtN_Set_Partitioning_Constraint[10202] = 1

     feasobj: ArtN_Set_Partitioning_Constraint[10064]
       + ArtN_Set_Partitioning_Constraint[10065]
       + ArtN_Set_Partitioning_Constraint[10202] <= 0

    Bounds
    Binaries
     Delta_pairing[10064_10065] Delta_pairing[10064_10202]
    End

    So, why does setting minrelax=True introduce this new feasobj constraint (it is not there if I set minrelax=False)? I thought the ArtN slack variables could take any value greater than or equal to 0, but this last constraint forces them all to zero:

    feasobj: ArtN_Set_Partitioning_Constraint[10064]
       + ArtN_Set_Partitioning_Constraint[10065]
       + ArtN_Set_Partitioning_Constraint[10202] <= 0

    Without this additional constraint, the relaxed problem would be feasible with:

    ArtN_Set_Partitioning_Constraint[10064] = 1

    ArtN_Set_Partitioning_Constraint[10065] = 0

    ArtN_Set_Partitioning_Constraint[10202] = 0

    which in turn indicates that the first equality constraint of the original problem is violated (but satisfied in the relaxed problem), while the other two constraints are violated in neither the original nor the relaxed problem.

  • Marika Karbstein
    Gurobi Staff

    From your description, I would also assume the RHS of the feasobj constraint should be 1.
    However, since you have relaxobjtype=2, the feasobj constraint should involve the artificial binary variables ArtNB_*.
    Next to the continuous variables
    ArtN_Set_Partitioning_Constraint[10064]
    ArtN_Set_Partitioning_Constraint[10065]
    ArtN_Set_Partitioning_Constraint[10202] 
    the LP should also contain the binary variables
    ArtNB_Set_Partitioning_Constraint[10064]
    ArtNB_Set_Partitioning_Constraint[10065]
    ArtNB_Set_Partitioning_Constraint[10202] 
    and the latter three should be part of the feasobj-constraint.

    Which Gurobi version do you use?
    Could you share the log file and/or the code?
    Note that uploading files in the Community Forum is not possible but we discuss an alternative in Posting to the Community Forum.

     

  • CEDRIC JUSTIN

    Marika,

    Thanks for your feedback. I have uploaded some of the code.

    I am using Gurobi 10, as shown in the console output below:

    Gurobi Optimizer version 10.0.0 build v10.0.0rc2 (win64)

    CPU model: 11th Gen Intel(R) Core(TM) i7-1195G7 @ 2.90GHz, instruction set [SSE2|AVX|AVX2|AVX512]
    Thread count: 4 physical cores, 8 logical processors, using up to 8 threads

    Optimize a model with 5 rows, 5 columns and 14 nonzeros
    Model fingerprint: 0x0f314cf1
    Variable types: 0 continuous, 5 integer (5 binary)
    Coefficient statistics:
      Matrix range     [1e+00, 1e+00]
      Objective range  [2e+02, 3e+02]
      Bounds range     [1e+00, 1e+00]
      RHS range        [1e+00, 1e+00]
    Presolve removed 2 rows and 2 columns
    Presolve time: 0.00s

    Explored 0 nodes (0 simplex iterations) in 0.00 seconds (0.00 work units)
    Thread count was 1 (of 8 available processors)

    Solution count 0

    Model is infeasible
    Best objective -, best bound -, gap -

    User-callback calls 46, time in user-callback 0.00 sec

    Solve phase I feasrelax model to determine minimal relaxation

    Optimize a model with 5 rows, 10 columns and 19 nonzeros
    Model fingerprint: 0x8e351c88
    Variable types: 5 continuous, 5 integer (5 binary)
    Coefficient statistics:
      Matrix range     [1e+00, 1e+00]
      Objective range  [1e+00, 1e+00]
      Bounds range     [1e+00, 1e+00]
      RHS range        [1e+00, 1e+00]
    Found heuristic solution: objective 0.0000000

    Explored 0 nodes (0 simplex iterations) in 0.00 seconds (0.00 work units)
    Thread count was 1 (of 8 available processors)

    Solution count 1: 0 

    Optimal solution found (tolerance 3.00e-02)
    Best objective 0.000000000000e+00, best bound 0.000000000000e+00, gap 0.0000%
    Warning: variable name "Delta_pairing[10038 10039]" has a space
    Warning: to let Gurobi read it back, use rlp format
    Gurobi Optimizer version 10.0.0 build v10.0.0rc2 (win64)

    CPU model: 11th Gen Intel(R) Core(TM) i7-1195G7 @ 2.90GHz, instruction set [SSE2|AVX|AVX2|AVX512]
    Thread count: 4 physical cores, 8 logical processors, using up to 8 threads

    Optimize a model with 6 rows, 10 columns and 24 nonzeros
    Model fingerprint: 0x815ab6a8
    Variable types: 5 continuous, 5 integer (5 binary)
    Coefficient statistics:
      Matrix range     [1e+00, 1e+00]
      Objective range  [2e+02, 3e+02]
      Bounds range     [1e+00, 1e+00]
      RHS range        [1e+00, 1e+00]

    User MIP start did not produce a new incumbent solution
    User MIP start violates constraint Set_Partitioning_Constraint[10038] by 1.000000000
    MIP start from previous solve did not produce a new incumbent solution
    MIP start from previous solve violates constraint Set_Partitioning_Constraint[10038] by 1.000000000

    Presolve removed 0 rows and 7 columns
    Presolve time: 0.00s

    Explored 0 nodes (0 simplex iterations) in 0.00 seconds (0.00 work units)
    Thread count was 1 (of 8 available processors)

    Solution count 0

    Model is infeasible
    Best objective -, best bound -, gap -

     

    As you can see, the original problem is infeasible. So I am trying to change all the equality 'partitioning constraints' into greater-or-equal 'coverage constraints', following your technique and my understanding of the Gurobi documentation.

    I can use either relaxobjtype=0 or relaxobjtype=2; both lead to the same issue down the road. relaxobjtype=0 is simpler initially, so this is what I use, while relaxobjtype=2 will help me figure out where the original constraints are violated. For now, I have set it to 0.
     
    The formulation of the original problem then looks like this:
    \ 
    \ LP format - for model browsing. Use MPS format to capture full model detail.
    Minimize
      247.255 Delta_pairing[10040_10041] + 245.005 Delta_pairing[10042_10041]
    Subject To
     Set_Partitioning_Constraint[10040]: Delta_pairing[10040_10041] = 1
     Set_Partitioning_Constraint[10041]: Delta_pairing[10040_10041]
       + Delta_pairing[10042_10041] = 1
     Set_Partitioning_Constraint[10042]: Delta_pairing[10042_10041] = 1
    Bounds
    Binaries
     Delta_pairing[10040_10041] Delta_pairing[10042_10041]
    End

    As we can see, the model is infeasible because of the second constraint: both Delta variables need to be one, but their sum must be one as well.

     Set_Partitioning_Constraint[10041]: Delta_pairing[10040_10041]
       + Delta_pairing[10042_10041] = 1

     

    Hence we need to relax the problem. The relaxed problem, with the inequalities in lieu of the equalities, introduces the ArtN slack variables and looks like this:

    \
    \ LP format - for model browsing. Use MPS format to capture full model detail.
    Minimize
      247.255 Delta_pairing[10040_10041] + 245.005 Delta_pairing[10042_10041]
    Subject To
     Set_Partitioning_Constraint[10040]: Delta_pairing[10040_10041]
       - ArtN_Set_Partitioning_Constraint[10040] = 1
     Set_Partitioning_Constraint[10041]: Delta_pairing[10040_10041]
       + Delta_pairing[10042_10041] - ArtN_Set_Partitioning_Constraint[10041]
       = 1
     Set_Partitioning_Constraint[10042]: Delta_pairing[10042_10041]
       - ArtN_Set_Partitioning_Constraint[10042] = 1
     feasobj: ArtN_Set_Partitioning_Constraint[10040]
       + ArtN_Set_Partitioning_Constraint[10041]
       + ArtN_Set_Partitioning_Constraint[10042] <= 0
    Bounds
    Binaries
     Delta_pairing[10040_10041] Delta_pairing[10042_10041]
    End
    The formulation of the model constraints seems fine as well, until we reach the new feasobj constraint below, which leads to infeasibility. This constraint says that the sum of all the slack variables must be zero or less; basically, it forces all ArtN slack variables to zero. The slack variables become useless, as this reverts to the original non-relaxed problem, hence the infeasibility. And this is what I don't understand: from the Gurobi documentation I read, I don't see why this new constraint is added.
    feasobj: ArtN_Set_Partitioning_Constraint[10040]
       + ArtN_Set_Partitioning_Constraint[10041]
       + ArtN_Set_Partitioning_Constraint[10042] <= 0
     
    I am setting minrelax=True because I want to relax the problem and find a solution to my original problem, with the original objective function, while minimizing the constraint violations.
     
    If I change minrelax from True to False, the call and the resulting model look like this:
    gurobi_model.feasRelax(relaxobjtype=0, minrelax=False, vars=None, lbpen=None, ubpen=None, constrs=constraint_to_relax, rhspen=constraint_penalty)
    \ LP format - for model browsing. Use MPS format to capture full model detail.
    Minimize
      ArtN_Set_Partitioning_Constraint[10040]
       + ArtN_Set_Partitioning_Constraint[10041]
       + ArtN_Set_Partitioning_Constraint[10042]
    Subject To
     Set_Partitioning_Constraint[10040]: Delta_pairing[10040_10041]
       - ArtN_Set_Partitioning_Constraint[10040] = 1
     Set_Partitioning_Constraint[10041]: Delta_pairing[10040_10041]
       + Delta_pairing[10042_10041] - ArtN_Set_Partitioning_Constraint[10041]
       = 1
     Set_Partitioning_Constraint[10042]: Delta_pairing[10042_10041]
       - ArtN_Set_Partitioning_Constraint[10042] = 1
    Bounds
    Binaries
     Delta_pairing[10040_10041] Delta_pairing[10042_10041]
    End
    And then the relaxed problem becomes feasible with the following solution:
    Delta_pairing[10040_10041] = 1
    Delta_pairing[10042_10041] = 1
    ArtN_Set_Partitioning_Constraint[10041] = 1
    and thus:
    Delta_pairing[10040_10041] + Delta_pairing[10042_10041] - ArtN_Set_Partitioning_Constraint[10041] = 1
    Gurobi finds a solution as expected. However, even though the solution is fine, this is not the optimization I want, because the objective it minimizes is the total violation (it returns 1). I want it to minimize the original objective function (hence why I want minrelax=True).
     
    Any pointer would be very helpful; thanks for your help so far!
  • Marika Karbstein
    Gurobi Staff

    Hi Cedric,

    Oh, yes, unfortunately feasRelax works a bit differently when minrelax=True.
    With minrelax=True, the first-phase optimization is already performed during the feasRelax call. This is different from minrelax=False, where no optimization is started when calling feasRelax.
    That means that, with our modification of the constraints, this first-phase solve happens right after the Sense of the constraints has been changed from '=' to '\(\leq\)'. At that point, setting all variables (original and slack) to 0 is a valid solution, so the minimal relaxation value is 0. Hence, the constraint

    feasobj: ArtN_Set_Partitioning_Constraint[10040]
       + ArtN_Set_Partitioning_Constraint[10041]
       + ArtN_Set_Partitioning_Constraint[10042] <= 0

    is added, and after that the constraints are changed back to '=', which yields an infeasible model.

    So, I think in your case you need to do the whole chain manually. What might help here is the multi-objective feature of Gurobi: add the slack variables to the constraints manually, then define the first (higher-priority) objective to minimize the slack variables and the second objective to minimize the original objective of the model.
    See also Model.setObjectiveN() and workforce5.py for an example.
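
    For illustration, a minimal sketch of this idea with made-up pairings, costs, and coverage data (not your actual model) could look like this:

    import gurobipy as gp
    from gurobipy import GRB

    m = gp.Model("relaxed_set_partitioning")

    # hypothetical pairing variables and costs (placeholders)
    x = m.addVars(3, vtype=GRB.BINARY, name="Delta")
    cost = [247.255, 245.005, 250.0]

    # hypothetical coverage structure: element -> pairings covering it
    cover = {0: [x[0]], 1: [x[0], x[1]], 2: [x[1], x[2]]}

    # one slack per former "= 1" constraint; coverage - slack == 1 allows coverage >= 1
    s = m.addVars(3, lb=0.0, name="slack")
    for e, cols in cover.items():
        m.addConstr(gp.quicksum(cols) - s[e] == 1, name=f"Cover_{e}")

    # hierarchical objectives: first minimize the total slack (violation of the
    # partitioning), then minimize the original cost among minimal-slack solutions
    m.setObjectiveN(s.sum(), index=0, priority=2, name="min_violation")
    m.setObjectiveN(gp.quicksum(cost[i] * x[i] for i in range(3)), index=1, priority=1, name="min_cost")

    m.optimize()

    With the higher priority on the slack objective, Gurobi first minimizes the total violation and only then optimizes the original cost over the solutions achieving that minimal violation.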

    I hope this helps!
    Marika

  • CEDRIC JUSTIN

    Thanks Marika for all your insights. I had started doing the slack-variable relaxation by hand before you replied, which allowed me more customization. I'll indeed use a hierarchical multi-objective optimization.

     

