
Michel Soares
Optimization Specialist at BITKA Analytics
- Total activity 113
- Following 0 users
- Followed by 0 users
- Votes 2
- Subscriptions 59
Comments
Recent activity by Michel Soares
Hi Laaziz, First make sure you are using Gurobi 11: there were a lot of changes for non-linear models, and they seem to have an impact on your models. Your first model (opt-model9) is feasible for Gu...
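For reference, a minimal way to check which Gurobi version gurobipy is linked against (nothing here is specific to the opt-model9 model):

    import gurobipy as gp

    # Prints the (major, minor, technical) version of the linked Gurobi library,
    # e.g. (11, 0, 0); the non-linear handling mentioned above changed in 11.x.
    print(gp.gurobi.version())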
-
Hi Laaziz, A .lp model should be fine. No need for the Python code.
-
Hi Nima, From the logs, looking only at possible parameter changes, I would suggest trying Presolve = 2 and MIPFocus = 1. Also, updating your Gurobi to 11.0 might improve your results as well.
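A minimal sketch of how those settings could be applied in gurobipy, assuming the model is read from an LP file (the file name is just a placeholder):

    import gurobipy as gp

    m = gp.read("model.lp")     # placeholder file name
    m.setParam("Presolve", 2)   # aggressive presolve
    m.setParam("MIPFocus", 1)   # focus on finding feasible solutions early
    m.optimize()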
-
Hi Laaziz, I cannot see how that would be possible; I believe it would take an in-depth look to understand it. Can you share the logs and the LP file?
-
MIQCP is a convex optimization problem with integer variables. All convex optimization and integer programming optimization theory still holds, and a lot of the techniques Gurobi uses are directly i...
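As a toy illustration (not the model from the question), a tiny convex MIQCP in gurobipy, i.e. integer variables plus a convex quadratic constraint:

    import gurobipy as gp
    from gurobipy import GRB

    m = gp.Model("tiny_miqcp")
    x = m.addVar(vtype=GRB.INTEGER, lb=0, ub=10, name="x")
    y = m.addVar(lb=0.0, name="y")

    # Convex quadratic (ball-type) constraint keeps the continuous relaxation convex.
    m.addQConstr(x * x + y * y <= 20, name="ball")

    m.setObjective(x + 2 * y, GRB.MAXIMIZE)
    m.optimize()
    print(x.X, y.X)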
-
Hi, Indeed there are additional steps involved in equality constraints. However, I cannot see how it would be better to replace one with two inequalities; it should be worse because it increases the numb...
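To make the comparison concrete, a small sketch with made-up variable names; the single equality is the form being recommended, and the commented-out pair is the equivalent two-inequality version:

    import gurobipy as gp
    from gurobipy import GRB

    m = gp.Model("eq_vs_ineq")
    x = m.addVar(name="x")
    y = m.addVar(name="y")

    # Single equality constraint (the recommended form).
    m.addConstr(x + y == 10, name="balance")

    # Equivalent pair of inequalities: same feasible set, one extra row.
    # m.addConstr(x + y <= 10, name="balance_ub")
    # m.addConstr(x + y >= 10, name="balance_lb")

    m.setObjective(x - y, GRB.MINIMIZE)
    m.optimize()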
-
As far as I understand, Aggregate and AggFill will mostly remove variables. These are sometimes referred to as "Substitute implied free variables". Disabling Aggregate increases your model solve ti...
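A minimal sketch of how these presolve controls are set in gurobipy (the file name is a placeholder, and the values are only for experimentation):

    import gurobipy as gp

    m = gp.read("model.lp")        # placeholder file name
    # Turn off variable aggregation in presolve (default is 1):
    m.setParam("Aggregate", 0)
    # Or, instead, keep aggregation but cap the fill-in it may create:
    # m.setParam("AggFill", 10)
    m.optimize()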
-
Integer models are solved through branch and cut combined with several heuristics. What you described is part of the algorithm; having this gap between a relaxed solution and the best integer fe...
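For reference, the gap can be inspected after the solve (or during it via a callback); a minimal sketch assuming a MIP read from a placeholder file:

    import gurobipy as gp

    m = gp.read("model.lp")        # placeholder file name
    m.optimize()

    # Best integer-feasible objective, best bound from the relaxation and cuts,
    # and the relative gap between them.
    print("incumbent :", m.ObjVal)
    print("best bound:", m.ObjBound)
    print("MIP gap   :", m.MIPGap)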
-
Hi, I am not sure if it would help, but it seems like you do not need Bp[edge]; you can use direction_AB and direction_BA instead. It should remove some variables and make your model more dense, wh...
-
Hi, You can get the presolved model this way: reduced_model = model.presolve() reduced_model.write("reduced_model.lp") You may be able to modify the reduced_model directly as you wish, I h...
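The same idea as a self-contained sketch, assuming the original model is read from a placeholder LP file:

    import gurobipy as gp

    model = gp.read("original_model.lp")   # placeholder file name

    # Apply Gurobi's presolve and write the reduced model out for inspection.
    reduced_model = model.presolve()
    reduced_model.write("reduced_model.lp")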