Gurobi needs about 5 hours to solve a MIP model on a 6-core machine running Windows. On a cluster (6 cores, Linux), the same model takes about 33 hours until it's done. However, an easier MIP model finishes within 10 minutes on both operating systems. Is there any comprehensible reason for the significantly different solve times in the first case?