Gurobi takes too much time to converge on the non-linear constraints in an MIQCP problem

Answered

4 comments

  • Jaromił Najman

    Hi Onam,

Could you elaborate on how exactly the two models differ? If I understood correctly, you are adding \(x^8\) terms in the second model, currently without bounds on \(x\) or the respective auxiliary variable, is this correct?

Note that tight variable bounds are essential when working with non-linear functions, especially with high-degree polynomials. This is because Gurobi uses a piecewise-linear approximation of the nonlinear terms, which is highly dependent on the variable bounds.
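
    To illustrate the effect, here is a plain-Python sketch (not Gurobi's actual algorithm; the uniform breakpoint grid and its size are my own assumptions): with a fixed number of breakpoints, the gap between \(x^8\) and its piecewise-linear interpolant blows up as the bounds widen.

    ```python
    # Sketch: why tight bounds matter for piecewise-linear approximation.
    # Assumptions: a uniform breakpoint grid; the midpoint of each segment
    # is used as a representative interior point to sample the gap.
    def pwl_max_error(f, lb, ub, n_breakpoints=100):
        """Sampled max gap between f and its piecewise-linear interpolant on [lb, ub]."""
        xs = [lb + (ub - lb) * i / (n_breakpoints - 1) for i in range(n_breakpoints)]
        worst = 0.0
        for i in range(n_breakpoints - 1):
            x0, x1 = xs[i], xs[i + 1]
            xm = 0.5 * (x0 + x1)                # segment midpoint
            interp = 0.5 * (f(x0) + f(x1))      # interpolant value at the midpoint
            worst = max(worst, abs(interp - f(xm)))
        return worst

    f = lambda x: x**8
    print(pwl_max_error(f, -1.0, 1.0))    # tight bounds: small gap
    print(pwl_max_error(f, -10.0, 10.0))  # loose bounds: gap larger by orders of magnitude
    ```

    With the same breakpoint budget, widening the bounds from [-1, 1] to [-10, 10] turns a sub-0.01 gap into one in the hundreds of thousands, which is why unbounded or loosely bounded \(x^8\) terms are so hard to approximate.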

    Best regards,
    Jaromił

  • Onam Sinha

Hi Jaromił Najman,

Thank you very much for your reply. Both models are actually the same; both have up to x^8 terms in their objective function. The only difference is that the second model has a variable-ratio term which isn't present in the first. Also, I am keeping bounds on all the decision variables.

Actually, I am creating a few intermediate variables which are quadratic in nature. In the main problem I need to keep bounds on those intermediate variables, but currently I am not doing so; I am creating them only to check whether creating those variables takes more time and is consequently increasing the convergence time. What I found is that if all the intermediate variables involve only quadratic (or bilinear) terms, the code converges in no time, but if I create a variable-ratio term the code takes too much time (it does not converge even within an hour). So the first log is the result without the variable-ratio term, and the second is the output I got when I used the variable-ratio term. I want to reduce the convergence time of the second model.

  • Jaromił Najman

    Hi Onam,

Could you explicitly state the variable-ratio terms that you are adding, or, in the best case, all terms that differ between the two problems? You can also provide the MPS/LP files as described in Posting to the Community Forum in order to make the issue reproducible for the Community.

It is also possible that your problem is prone to optimization path variability. You can check this by running the model with different values of the Seed parameter and seeing whether the first model is always solved quickly and whether the second model is always solved slowly.
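
    A minimal sketch of such a seed sweep (assuming gurobipy is available; the tiny bilinear model below is only a placeholder — in practice you would gp.read(...) your own MPS/LP file):

    ```python
    import gurobipy as gp
    from gurobipy import GRB

    # Re-solve the same model under several Seed values to gauge
    # optimization path variability.
    for seed in (0, 1, 2):
        m = gp.Model("seed-sweep")
        m.Params.OutputFlag = 0
        m.Params.Seed = seed           # different seed => possibly different search path
        m.Params.NonConvex = 2         # allow the non-convex bilinear constraint below
        x = m.addVar(lb=0.0, ub=10.0, name="x")
        y = m.addVar(lb=0.0, ub=10.0, name="y")
        m.setObjective(x + y, GRB.MAXIMIZE)
        m.addConstr(x * y <= 20, name="bilinear")
        m.optimize()
        print(seed, m.Runtime, m.ObjVal)
    ```

    If the runtimes of the slow model swing wildly across seeds while the fast model's stay stable, performance variability rather than the ratio term itself may be the culprit.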

    Best regards,
    Jaromił

  • Onam Sinha

Let me give you a brief introduction to the problem. I have three different ways of selling any product; their corresponding prices are what needs to be determined by optimizing the objective. Let us name them x, y, z.

f(x) .. a linear function of x.
f(e^x) .. an exponential function of x, which I have expanded.
c1, c2, c3 .. constant values given from the file.

    Net Revenue(NR)  =  f(e^x)*(52-y-z)  +  f(e^x) * y * (1-f(x))  +  f(e^x)*z*(1-f(x))
    Gross Margin(GM) =  f(e^x)* (52-y-z)*(1-c1) +  f(e^x) * y * (1-f(x))*(1-c2)  +  f(e^x)*z*(1-f(x))*(1-c3)

I have other intermediate variables as well, which I keep in both models. But GM/NR, which is a ratio of variables, is created only in the second model. My question is: why is creating this variable taking so much time, and is there any way to make the second model run faster? My objective is to maximize NR.
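
    For context, the usual way to add a ratio like GM/NR is an auxiliary variable r together with the bilinear constraint r * NR == GM, and such a term becomes very hard for the solver when r and NR lack tight bounds (or NR can approach zero). A minimal sketch with made-up placeholder bounds — 0 ≤ r ≤ 1 assumes GM ≤ NR, and the "toy_link" constraint stands in for the real GM/NR relationship; adjust both to your data:

    ```python
    import gurobipy as gp
    from gurobipy import GRB

    m = gp.Model("ratio-sketch")
    m.Params.OutputFlag = 0
    m.Params.NonConvex = 2             # r * nr is a non-convex bilinear term

    # Placeholder bounds: in the real model, derive them from the data (c1..c3 etc.)
    nr = m.addVar(lb=1.0, ub=100.0, name="NR")        # keep NR bounded away from zero!
    gm = m.addVar(lb=0.0, ub=100.0, name="GM")
    r  = m.addVar(lb=0.0, ub=1.0,   name="GM_over_NR")  # tight bounds on the ratio

    m.addConstr(r * nr == gm, name="ratio_def")   # encodes r = GM / NR
    m.addConstr(gm == 0.6 * nr, name="toy_link")  # stand-in for the real GM/NR link
    m.setObjective(nr, GRB.MAXIMIZE)
    m.optimize()
    print(r.X)  # -> 0.6 in this toy instance
    ```

    The key points are the lower bound keeping NR away from zero (the ratio is undefined there) and the tight [0, 1] interval on r; without them the bilinear constraint's relaxation is very weak, which matches the slowdown described above.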


    Best regards,
    Onam

