
Different solutions from the pool

Answered

Comments

3 comments

  • Eli Towle
    Gurobi Staff

    Hi Mishelle,

    Here are a few ideas:

    • Can you add any constraints to prevent multiple nonzero values of \( z \) at an optimal solution?
    • Set PoolSearchMode to 2. Gurobi will find the \( n \) best solutions to your problem, where \( n \) is determined by the PoolSolutions parameter.
    • Let \( y^* \) be the optimal \( y \) vector obtained after solving, and let \( S = \{i \colon y^*_i = 1\} \). You can add a disjunctive cut to cut off all solutions with the same \( y \) variable values as \( y^* \), then re-solve the problem (a gurobipy sketch of this loop follows the example below). In general, the cut you would add is:

    $$\begin{align*}\sum_{i \in S} (1 - y_i) + \sum_{i \notin S} y_i &\geq 1.\end{align*}$$

    For example, if your solution has \( y^*_0 = 1 \), \( y^*_1 = 0 \), and \( y^*_2 = 1 \), you have \( S = \{0, 2\} \). Thus, you add the inequality \( (1 - y_0) + y_1 + (1 - y_2) \geq 1 \) to the model. This inequality removes all solutions with \( y = (1, 0, 1) \), but does not affect any other solutions. When you re-solve the problem, the new solution will have different \( y \) values (if such a solution exists). You can repeat this process as many times as you want to find a set of solutions with unique \( y \) values.
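
    For concreteness, here is a minimal gurobipy sketch of the last two ideas. It assumes an already-built model `m` with binary variables `y` (e.g., created via `m.addVars(..., vtype=GRB.BINARY)`); the names, parameter values, and loop limit are illustrative, not taken from your model.

    ```python
    import gurobipy as gp

    # Idea 2: have Gurobi collect a pool of the n best solutions.
    m.Params.PoolSearchMode = 2   # systematic search for the n best solutions
    m.Params.PoolSolutions = 10   # n = 10 (illustrative value)

    # Idea 3 (an alternative to the pool): repeatedly cut off the incumbent's
    # y-pattern and re-solve.
    distinct_y = []
    for _ in range(5):                        # collect up to 5 distinct y vectors
        m.optimize()
        if m.SolCount == 0:
            break                             # no solution with a new y-pattern exists
        y_star = {i: round(y[i].X) for i in y}
        distinct_y.append(y_star)
        S = [i for i in y if y_star[i] == 1]
        # sum_{i in S} (1 - y_i) + sum_{i not in S} y_i >= 1
        m.addConstr(
            gp.quicksum(1 - y[i] for i in S)
            + gp.quicksum(y[i] for i in y if i not in S)
            >= 1
        )
    ```

    If you use the solution pool instead, the pooled solutions can be read after the solve by setting the SolutionNumber parameter and querying the Xn attribute of the variables.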

    Thanks,

    Eli

  • Markus Groissböck

    Hi Eli,

    Thanks for this detailed description of cut implementation.

    On top of this, is there also a way to retrieve something like "the most different" result with respect to a predefined KPI?

    Thanks,
    Markus

  • Eli Towle
    Gurobi Staff

    I guess this depends on the KPI. "Most different" suggests the KPI should be incorporated into the objective function. As an example: instead of adding the constraint

    $$\begin{align*}\sum_{i \in S} (1 - y_i) + \sum_{i \notin S} y_i \geq 1,\end{align*}$$

    we could maximize the expression

    $$\begin{align*}\sum_{i \in S} (1 - y_i) + \sum_{i \notin S} y_i\end{align*}$$

    to find the solution that is farthest from \( y^* \) (with respect to the L1 norm). To maximize the L1 distance from \( y^* \) over the set of optimal solutions to your model, you can use the above expression as the second objective in a hierarchical multi-objective optimization model.
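
    As a rough sketch (again assuming a minimization model `m`, binary variables `y`, an original objective expression `orig_obj`, and stored values `y_star` from a previous solve; all names are placeholders), the hierarchical setup could be written with setObjectiveN:

    ```python
    import gurobipy as gp
    from gurobipy import GRB

    # L1 distance from y_star: sum_{i in S} (1 - y_i) + sum_{i not in S} y_i
    S = [i for i in y if round(y_star[i]) == 1]
    distance = (gp.quicksum(1 - y[i] for i in S)
                + gp.quicksum(y[i] for i in y if i not in S))

    # Hierarchical multi-objective: the original objective has higher priority, so
    # Gurobi optimizes it first and only then pushes the distance as far as
    # possible without degrading the first objective (up to the default tolerances).
    # All objectives share the model's sense (MINIMIZE here), so the distance is
    # maximized by giving it a negative weight.
    m.ModelSense = GRB.MINIMIZE
    m.setObjectiveN(orig_obj, index=0, priority=2, name="original")
    m.setObjectiveN(distance, index=1, priority=1, weight=-1.0, name="L1_distance")

    m.optimize()
    ```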

