[Utilities] Drop MOI.RawParameters when resetting a CachingOptimizer #1229
Conversation
This is currently a mess. We have a documented option (MathOptInterface.jl/src/MathOptInterface.jl, lines 110 to 114 in f71ed83) which is not implemented: optimizer attributes are in fact not passed in `copy_to`, but are instead passed by the `CachingOptimizer` outside of `copy_to`. Note that we do have a warning for starting values, though (MathOptInterface.jl/src/Utilities/copy.jl, lines 284 to 291 in f71ed83).
I would probably add this warning option to the `CachingOptimizer` instead of `copy_to`. Then you check `supports` before calling `set`, and throw a warning if the attribute is not supported and the user did not pass the option to ignore this.
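As a rough sketch of that idea (the function name and the `warn_unsupported` keyword below are made up for illustration, not part of MOI), the forwarding step could look something like this:

```julia
using MathOptInterface
const MOI = MathOptInterface

# Hypothetical helper: forward cached optimizer attributes to a newly attached
# optimizer, warning (rather than erroring) when an attribute is unsupported.
function pass_optimizer_attributes(cache, optimizer; warn_unsupported::Bool = true)
    for attr in MOI.get(cache, MOI.ListOfOptimizerAttributesSet())
        if MOI.supports(optimizer, attr)
            MOI.set(optimizer, attr, MOI.get(cache, attr))
        elseif warn_unsupported
            @warn("Optimizer attribute $attr is not supported by the new optimizer; skipping it.")
        end
    end
    return
end
```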
So

```julia
model = Model(Cbc.Optimizer)
set_silent(model)
set_optimizer(model, Clp.Optimizer)
MOI.get(model, MOI.Silent())  # returns ...
```

versus

```julia
model = Model(Cbc.Optimizer, warn_option_name=false)  # do we give the option here?
set_silent(model)
set_optimizer(model, Clp.Optimizer, warn_option_name=false)  # or here?
MOI.get(model, MOI.Silent())  # returns ...
```

One issue with this is that many solvers simply do …
Passing …
I know it would be very breaking, but what if `RawParameter` takes a name and a value?
We could do the following:
The issue is that if some solvers define custom optimizer attributes, we have the same issue. So I don't think that …
We have MathOptInterface.jl/src/Utilities/cachingoptimizer.jl, lines 646 to 657 in 1315e31, which we haven't really utilized yet.
Edit: I resolved to copy all …
""" | ||
function reset_optimizer(m::CachingOptimizer, optimizer::MOI.AbstractOptimizer) | ||
@assert MOI.is_empty(optimizer) | ||
m.optimizer = optimizer | ||
m.state = EMPTY_OPTIMIZER | ||
for attr in MOI.get(m.model_cache, MOI.ListOfOptimizerAttributesSet()) | ||
# Skip attributes which don't apply to the new optimizer. | ||
if attr isa MOI.RawParameter |
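The diff embed is truncated at this point. Judging from the PR title and the "second condition" mentioned in the review comments below, the loop body presumably continues along these lines (a sketch under those assumptions, not the literal diff):

```julia
            # RawParameter names are solver-specific, so they are dropped here
            # rather than forwarded to the new optimizer.
            continue
        elseif !MOI.supports(optimizer, attr)
            # The "second condition": skip any attribute the new optimizer
            # does not declare support for.
            continue
        end
        MOI.set(optimizer, attr, MOI.get(m.model_cache, attr))
    end
    return
end
```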
So if I understand correctly, this is a workaround for solvers not implementing `supports` correctly for `RawParameter`? Otherwise, this would be caught by the second condition.
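For context, here is the kind of blanket definition being alluded to (a generic pattern with a made-up optimizer type, not a quote from any particular solver wrapper). If a wrapper claims support for every `RawParameter` regardless of its name, the `supports` check cannot tell a parameter meant for the old solver apart from one the new solver actually understands:

```julia
using MathOptInterface
const MOI = MathOptInterface

# Stand-in for a solver wrapper's optimizer type (hypothetical).
struct SomeOptimizer <: MOI.AbstractOptimizer end

# Claiming support for *any* raw parameter name: with this definition,
# `MOI.supports(opt, MOI.RawParameter("logLevel"))` is true even if "logLevel"
# means nothing to this solver, so a mismatched parameter only fails later,
# when `MOI.set` forwards it to the underlying solver.
MOI.supports(::SomeOptimizer, ::MOI.RawParameter) = true
```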
Or is it because a `RawParameter` with the same name might have a different meaning?
This should be clarified in a comment.
> because a `RawParameter` with the same name might have a different meaning?
This.
Closes #1220.
@blegat: An alternative is to not pass any OptimizerAttributes. If you're passing a new empty optimizer, it seems weird that the parameters would follow.
What do we expect to happen in the following at the JuMP level?
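The code the question refers to did not survive the page extraction; presumably it was along the lines of the Cbc/Clp example above. A reconstruction under that assumption (with `"logLevel"` used as a stand-in Cbc raw parameter):

```julia
using JuMP, Cbc, Clp

model = Model(Cbc.Optimizer)
set_optimizer_attribute(model, "logLevel", 0)  # stored as MOI.RawParameter("logLevel")
set_optimizer(model, Clp.Optimizer)
# The question: is the cached "logLevel" forwarded to Clp (where the name may
# mean something different, or nothing at all), silently dropped, or warned about?
optimize!(model)
```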