
Best way to obtain a good solution in a limited time?

Answered

Comments

5 comments

  • Thomas Opfer
    Collaborator

    I have no real answer for your question, but please keep in mind that, in general, for dual simplex the first feasible solution you find is already optimal, so it makes no sense to stop dual simplex early. (I am not sure whether feasible solutions could appear during phase 1.)

    Comparing barrier and primal simplex, I would say it is not unexpected that barrier is faster for huge problems. If primal simplex does not yield any solution, it has not even completed phase 1. Do you know of any feasible basis that you could provide?

    I am not an expert on barrier, so unfortunately I have no idea how to make it faster.
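    If you do have such a basis, for example from a previous run or a closely related model, a rough gurobipy sketch for passing it to primal simplex could look like the following (the file names are only placeholders):

    import gurobipy as gp
    from gurobipy import GRB

    m = gp.read("large_model.mps")   # placeholder model file
    m.Params.Method = 0              # 0 = primal simplex

    # A basis saved earlier with m.write("start.bas") can be loaded as a start.
    m.read("start.bas")              # placeholder basis file

    m.optimize()

    # After a successful solve, save the basis for future warm starts.
    if m.Status == GRB.OPTIMAL:
        m.write("warmstart.bas")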

  • Jaromił Najman
    Gurobi Staff

    Hi Saied,

    Could you post the first ~30 lines of the LOG file showing model statistics?

    In general, for models of huge size, we strongly recommend using a powerful machine. Note that the Barrier algorithm scales well with more available Threads.

    While Barrier is faster for huge models in most cases, the primal and dual simplex may still outperform the Barrier algorithm in some cases, so they might be worth a try.
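    As a rough illustration, the algorithm and thread count can be selected via parameters in gurobipy (the model file name below is only a placeholder):

    import gurobipy as gp

    m = gp.read("large_model.mps")   # placeholder file name

    m.Params.Threads = 32    # Barrier scales well with additional cores
    m.Params.Method = 2      # 2 = barrier; 0 = primal simplex, 1 = dual simplex

    m.optimize()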

    What exactly do you mean by the Barrier algorithm returning a solution that is quite far from optimal? Did you hit the time limit?

    Best regards,
    Jaromił

  • Saied Samiedaluie

    Thanks very much, Thomas and Jaromił, for your replies.

    Below is the output of the model for a large problem that I am solving. I excluded some of the lines to save space.

    I understand that Presolve removes about half of the rows and columns. I am not sure what Ordering does; it takes about 12 to 13 hours. This is one of the large problems I am running, and after a few days I still have not obtained any solution. For a smaller problem, where I could find the optimal solution in less than a day, I also ran the model with TimeLimit=7200. The objective value of the solution from the time-limited run has an optimality gap of 80%. This is what I meant by the solution with a time limit being far from the optimal solution.

     

    Optimize a model with 22964736 rows, 17133360 columns and 294956952 nonzeros
    Model fingerprint: 0x4dcabd77
    Coefficient statistics:
    Matrix range [1e-03, 1e+00]
    Objective range [4e-01, 1e+02]
    Bounds range [0e+00, 0e+00]
    RHS range [1e+00, 1e+00]
    Presolve removed 0 rows and 0 columns (presolve time = 15s) ...
    Presolve removed 3274568 rows and 1928 columns (presolve time = 17s) ...
    Presolve removed 7104872 rows and 3617560 columns (presolve time = 32s) ...
    ...
    ...
    Presolve removed 11141444 rows and 8612216 columns (presolve time = 220s) ...
    Presolve removed 11917740 rows and 8672040 columns (presolve time = 230s) ...
    Presolve removed 11917740 rows and 8672040 columns
    Presolve time: 231.55s
    Presolved: 11046996 rows, 8461320 columns, 100264887 nonzeros
    Elapsed ordering time = 5s
    Elapsed ordering time = 10s
    ...
    ...
    Elapsed ordering time = 5950s
    Elapsed ordering time = 5955s
    Ordering time: 21074.18s
    Elapsed ordering time = 21202s
    Elapsed ordering time = 21205s
    ...
    ...
    Elapsed ordering time = 43650s
    Elapsed ordering time = 43657s
    Ordering time: 45443.30s

    Barrier statistics:
    AA' NZ : 5.224e+08
    Factor NZ : 1.736e+10 (roughly 150.0 GBytes of memory)
    Factor Ops : 4.224e+13 (roughly 2000 seconds per iteration)
    Threads : 32

    Iter   Primal Objective   Dual Objective   Primal Residual   Dual Residual   Compl   Time
    0 4.21027885e+06 4.22046698e+04 1.20e+02 1.09e+02 1.49e+03 54350s
    1 4.21752321e+06 2.25356303e+07 1.15e+02 1.51e+03 1.36e+03 57129s
    2 3.08841467e+06 8.81020972e+07 8.56e+01 4.56e+02 1.04e+03 60091s
    3 6.30770137e+05 1.09856440e+08 1.64e+01 2.02e+01 2.04e+02 63265s
    4 4.33075722e+05 1.06769647e+08 1.09e+01 1.29e+01 1.38e+02 66676s
    5 3.29436078e+05 1.00745914e+08 8.02e+00 7.35e+00 1.01e+02 69867s
    ...
    ...
    54 6.80502839e+04 9.83146751e+04 7.32e-05 3.25e-05 1.89e-03 226726s
    55 6.84800378e+04 9.74003585e+04 6.92e-05 3.00e-05 1.80e-03 230159s


     

  • Jaromił Najman
    Gurobi Staff

    Hi Saied,

    Even after presolve, your model has over 100 million nonzeros, which is a massive amount, so it is expected to take an extremely long time to solve.

    You could try experimenting with the parameters Presolve, BarOrder, AggFill, and BarCorrectors. My first guess would be to set Presolve to 2 to activate a more aggressive presolve routine. Next, I would experiment with the values of the other parameters by trial and error, as there are no real rules of thumb for these. Providing more Threads might also help, as the Barrier algorithm scales well with the core count.
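    For example, a minimal gurobipy sketch could look like the following (the file name and the non-default values chosen for BarOrder, AggFill, and BarCorrectors are only illustrative):

    import gurobipy as gp

    m = gp.read("large_model.mps")    # placeholder file name

    m.Params.Presolve = 2        # aggressive presolve
    m.Params.BarOrder = 1        # 0 = approximate minimum degree, 1 = nested dissection
    m.Params.AggFill = 10        # illustrative value for allowed presolve aggregation fill
    m.Params.BarCorrectors = 4   # illustrative value for central corrections per iteration
    m.Params.Threads = 32        # Barrier scales well with core count

    m.optimize()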

    Have you considered different approaches for solving this large problem? A decomposition approach might be appropriate.

    Best regards,
    Jaromił

  • Saied Samiedaluie

    Thank you, Jaromił.

    Yes, we are actually considering other approaches to solving this problem, as it does seem to be a huge problem that takes a very long time to solve, even though we are running the jobs on high-performance servers. We just wanted to make sure that we are trying all the computational tricks that might help decrease the solution time.

    I will try changing the parameters to see if they have an impact on the speed of the barrier method. Thank you for your help. I really appreciate it. 

     

