How to effectively model "stability of a continuous variable" without causing long runtimes
Hi everyone,
I'm trying to build a model that stabilizes a ratio variable over time by minimizing the absolute value of its differences between consecutive timesteps. However, even on a small toy case the solve takes a very long time and struggles to close the MIP gap.
As an example, I set up the following MVars (a gurobipy sketch of this setup follows the lists below):
- numerator (non-negative integer variable) with dimension: 5 types * 10 days
- ratio (non-negative continuous variable) with dimension: 5 types * 10 days
- ratio_difference (non-negative continuous variable) with dimension: 5 types * 9 days
The denominator is already known, as a constant:
- denominator (non-negative integer constant) with dimension: 5 types * 10 days
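Roughly, the setup looks like this in gurobipy (the denominator values are just random placeholders here; in the real model they're known input data, and addMVar's default lower bound of 0 gives the non-negativity):

```python
import numpy as np
import gurobipy as gp
from gurobipy import GRB

# Dimensions from the description: 5 types, 10 days
n_types, n_days = 5, 10

m = gp.Model("ratio_stability")

# The denominator is known input data; random placeholder values here
denominator = np.random.randint(1, 20, size=(n_types, n_days))

# Non-negative integer numerator (addMVar's default lower bound is 0)
numerator = m.addMVar((n_types, n_days), vtype=GRB.INTEGER, name="numerator")

# Non-negative continuous ratio
ratio = m.addMVar((n_types, n_days), name="ratio")

# Non-negative continuous day-to-day ratio difference
ratio_difference = m.addMVar((n_types, n_days - 1), name="ratio_difference")
```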
I then write the constraints as follows:
- Defining the ratios: numerator == ratio * denominator
- Defining the absolute value of the ratio difference day-by-day:
- ratio_difference >= ratio[:, 1:] - ratio[:, :-1]
- ratio_difference >= ratio[:, :-1] - ratio[:, 1:]
- There are other constraints on the numerator, but I've omitted them here.
The objective minimizes the sum of ratio_difference. A sketch of the constraints and objective follows.
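Continuing the toy model above, this is roughly how I'm writing it (the constraint names are just illustrative, and the elementwise MVar arithmetic assumes a reasonably recent gurobipy with the matrix-friendly API, 9.5 or later):

```python
# Define the ratios: numerator == ratio * denominator
# (denominator is a constant array, so this is a linear constraint)
m.addConstr(numerator == ratio * denominator, name="define_ratio")

# Absolute value of the day-by-day ratio difference via two inequalities
m.addConstr(ratio_difference >= ratio[:, 1:] - ratio[:, :-1], name="diff_pos")
m.addConstr(ratio_difference >= ratio[:, :-1] - ratio[:, 1:], name="diff_neg")

# (other constraints on the numerator omitted)

# Minimize the total day-to-day change in the ratios
m.setObjective(ratio_difference.sum(), GRB.MINIMIZE)

m.optimize()
```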
With this setup, the model really struggles to close the MIP gap, producing a log like the one below.
I would appreciate any suggestions on how to model this kind of "stability" more efficiently, so the model is less likely to run into long runtimes. Thanks so much in advance!