
Large-scale optimization with Gurobi

Answered

Comments

2 comments

  • Official comment
    Simranjit Kaur
    Gurobi Staff
    This post is more than three years old. Some information may not be up to date. For current information, please check the Gurobi Documentation or Knowledge Base. If you need more help, please create a new post in the community forum, or try our AI Gurobot.
  • Jaromił Najman
    Gurobi Staff

    Hi Beatrice,

    I doubt that a problem of this size can be solved on a "normal" machine in acceptable time by any solver due to the number of nonzeros alone.

    You say that you have a \(1,222,515 \times 1,222,515\) dense matrix. This means that 1,494,542,925,225 numbers have to be stored in memory. Let's assume that you need only the upper/lower triangular part, leaving ~7.5e11 numbers to store. An optimization solver most likely stores these numbers as doubles, requiring 8 bytes per number, which totals roughly 6 TB just to store the entries of your matrix. Note that this calculation is very optimistic, since usually more than 8 bytes are used per matrix entry, but it is enough for this example. Now consider the algorithms involved: every solver has to store additional data (not only this one matrix), which can easily require 10x or 100x more memory. The algorithms then have to work through this huge amount of data in every iteration, resulting in very high computation time per iteration.
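    The arithmetic above can be reproduced in a few lines of Python. This is only a back-of-the-envelope sketch using the optimistic 8-bytes-per-double assumption from the answer; real solver memory usage will be higher:

    ```python
    # Rough memory estimate for storing a dense symmetric matrix,
    # mirroring the numbers discussed above (all figures are approximate).

    n = 1_222_515                          # matrix dimension from the question
    total_entries = n * n                  # full dense matrix
    triangular_entries = n * (n + 1) // 2  # upper/lower triangle incl. diagonal

    BYTES_PER_DOUBLE = 8                   # optimistic: solvers often use more
    bytes_needed = triangular_entries * BYTES_PER_DOUBLE
    terabytes = bytes_needed / 1e12

    print(f"total entries:      {total_entries:,}")       # 1,494,542,925,225
    print(f"triangular entries: {triangular_entries:,}")  # ~7.5e11
    print(f"memory needed:      {terabytes:.1f} TB")      # ~6 TB for data alone
    ```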

    I would recommend thinking of a way to decompose your problem into multiple smaller subproblems, if possible, and then applying an SDP or general MILP solver to each of them.
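    One way such a decomposition can sometimes be found (a sketch on a made-up toy matrix, not part of the original answer): if the matrix is block-diagonal, or nearly so, its blocks are exactly the connected components of its sparsity graph, and each block can be handed to a solver as an independent subproblem:

    ```python
    # Hedged sketch: detect independent blocks of a sparse symmetric matrix
    # via connected components of its sparsity graph. Q is a toy 6x6 matrix
    # with two 3x3 blocks, standing in for a huge problem matrix.
    import numpy as np
    from scipy.sparse import csr_matrix
    from scipy.sparse.csgraph import connected_components

    Q = csr_matrix(np.block([
        [np.ones((3, 3)), np.zeros((3, 3))],
        [np.zeros((3, 3)), np.ones((3, 3))],
    ]))

    n_blocks, labels = connected_components(Q, directed=False)
    print(f"{n_blocks} independent subproblems")  # 2 for this toy matrix
    for b in range(n_blocks):
        idx = np.where(labels == b)[0]
        sub_Q = Q[idx][:, idx]  # extract the block; solve it on its own
        print(f"block {b}: variables {idx.tolist()}, shape {sub_Q.shape}")
    ```

    Whether this helps depends entirely on your problem's structure; if the matrix is truly dense, no such split exists and the memory estimate above stands.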

    Best regards,
    Jaromił

