[Utilities] Drop MOI.RawParameters when resetting a CachingOptimizer #1229

Merged · 3 commits · Apr 5, 2021

Conversation

@odow (Member) commented Feb 2, 2021

Closes #1220.

@blegat: An alternative is to not pass any OptimizerAttributes. If you're passing a new empty optimizer, it seems weird that the parameters would follow.

What do we expect to happen in the following at the JuMP level?

```julia
model = Model(Cbc.Optimizer)
set_silent(model)
set_optimizer(model, Clp.Optimizer)
MOI.get(model, MOI.Silent())  # true or false?
```

@blegat (Member) commented Feb 2, 2021

This is currently a mess. We have a documented option:

> then an [`UnsupportedAttribute`](@ref) error is thrown. Unsupported *optimizer*
> attributes are treated differently:
> * If `warn_attributes` is `true`, a warning is displayed, otherwise,
> * the attribute is silently ignored.

which is not implemented, because optimizer attributes are in fact not passed through `copy_to`; the `CachingOptimizer` passes them outside of `copy_to`.
Note that we do have a warning for starting values, though:
```julia
if !MOI.supports(dest, attr, supports_args...)
    @warn(
        "$attr is not supported by $(typeof(dest)). This " *
        "information will be discarded."
    )
    continue
end
```

I would probably add this warning option to `CachingOptimizer` instead of `copy_to`. Then you check `supports` before calling `set`, and throw a warning if the attribute is not supported and the user did not pass the option to ignore it.
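The check described above could be sketched roughly as follows; this is an illustration only, and `warn_attributes` as the option name is an assumption, not the actual API:

```julia
# Sketch: how a CachingOptimizer might forward cached optimizer attributes
# with a supports-check before each set. `warn_attributes` is a hypothetical
# option name.
for attr in MOI.get(m.model_cache, MOI.ListOfOptimizerAttributesSet())
    value = MOI.get(m.model_cache, attr)
    if MOI.supports(m.optimizer, attr)
        MOI.set(m.optimizer, attr, value)
    elseif warn_attributes
        @warn("$attr is not supported by the new optimizer and will be discarded.")
    end
end
```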

So

```julia
model = Model(Cbc.Optimizer)
set_silent(model)
set_optimizer(model, Clp.Optimizer)
MOI.get(model, MOI.Silent())
```

returns `true` if Clp supports `MOI.Silent` and throws a warning otherwise.

```julia
model = Model(Cbc.Optimizer, warn_option_name=false) # do we give the option here ?
set_silent(model)
set_optimizer(model, Clp.Optimizer, warn_option_name=false) # or here ?
MOI.get(model, MOI.Silent())
```

returns `true` if Clp supports `MOI.Silent` and silently ignores it otherwise.

One issue with this is that many solvers simply define `MOI.supports(::Optimizer, ::MOI.RawParameter) = true` without checking the parameter name, but I'm in favor of fixing the solvers' `supports` methods.
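A stricter per-solver `supports` method would validate the parameter name instead of returning `true` unconditionally. A sketch; the parameter names below are purely illustrative, not any real solver's list:

```julia
# Sketch: validate RawParameter names against the solver's known parameters.
# The names here are illustrative, not a real solver's parameter list.
const SUPPORTED_RAW_PARAMETERS = Set(["logLevel", "maxIterations"])

function MOI.supports(::Optimizer, attr::MOI.RawParameter)
    return attr.name in SUPPORTED_RAW_PARAMETERS
end
```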

@odow (Member, Author) commented Feb 2, 2021

Passing `RawParameter` is also fraught, though, even if the solver supports the parameter name: one solver might have a `print_level` that takes an integer, and another might have a `print_level` that takes a string or some solver-specific constant.

@joaquimg (Member) commented Feb 2, 2021

I know it would be very breaking, but what if `RawParameter` takes a name and a value?

@blegat (Member) commented Feb 19, 2021

We could do the following:

* If a `RawParameter` is set when no optimizer is attached: throw a warning or an error.
* If a `RawParameter` is set when an optimizer is attached: set it on the optimizer but not in the cache.

The issue is that if some solvers define custom optimizer attributes, we have the same problem, so I don't think `RawParameter` is a well-defined special case. Maybe we could have a subtype `AbstractSolverSpecificOptimizerAttribute` or a `MOI.is_solver_specific` trait.
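The trait floated in this comment might be sketched as follows; note that neither name exists in MOI, both are proposals from the discussion:

```julia
# Sketch of the proposed trait. Neither name is existing MOI API; both come
# from this comment's suggestion.
is_solver_specific(::MOI.AbstractOptimizerAttribute) = false
is_solver_specific(::MOI.RawParameter) = true

# A CachingOptimizer could then skip solver-specific attributes when
# attaching a new optimizer:
#     if is_solver_specific(attr)
#         continue
#     end
```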

@odow (Member, Author) commented Feb 20, 2021

We have

```julia
# Force users to specify whether the attribute should be queried from the
# model_cache or the optimizer. Maybe we could consider a small whitelist of
# attributes to handle automatically.
# These are expert methods to get or set attributes directly in the model_cache
# or optimizer.
struct AttributeFromModelCache{T<:MOI.AnyAttribute}
    attr::T
end

struct AttributeFromOptimizer{T<:MOI.AnyAttribute}
    attr::T
end
```
which we haven't really utilized yet.
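A usage sketch of these wrappers, assuming `MOIU = MOI.Utilities` and that `caching` is a `CachingOptimizer` with an attached optimizer; the attribute choices are illustrative:

```julia
# Sketch: set an attribute directly on the attached optimizer, bypassing
# the cache, via the AttributeFromOptimizer wrapper. "logLevel" is an
# illustrative parameter name.
MOI.set(
    caching,
    MOIU.AttributeFromOptimizer(MOI.RawParameter("logLevel")),
    0,
)

# Read an attribute straight from the model cache instead:
name = MOI.get(caching, MOIU.AttributeFromModelCache(MOI.Name()))
```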

@odow (Member, Author) commented Mar 25, 2021

Discussed on the developer call. Consensus was that we should only copy `MOI.Silent` and `MOI.TimeLimitSec`, if they are set.

Edit: I resolved to copy all copyable optimizer attributes that are supported, excluding `RawParameter`s.

@odow force-pushed the od/rawparameter branch from f180a15 to db848fc on March 25, 2021
@odow requested a review from blegat on March 26, 2021
"""
function reset_optimizer(m::CachingOptimizer, optimizer::MOI.AbstractOptimizer)
@assert MOI.is_empty(optimizer)
m.optimizer = optimizer
m.state = EMPTY_OPTIMIZER
for attr in MOI.get(m.model_cache, MOI.ListOfOptimizerAttributesSet())
# Skip attributes which don't apply to the new optimizer.
if attr isa MOI.RawParameter
@blegat (Member): So if I understand correctly, this is a workaround for solvers not implementing `supports` correctly for `RawParameter`? Otherwise, this would be caught by the second condition.

@blegat (Member): Or is it because a `RawParameter` with the same name might have a different meaning?

@blegat (Member): This should be clarified in a comment.

@odow (Member, Author):

> because RawParameter with the same name might have different meaning?

This.

@odow odow merged commit f585f81 into master Apr 5, 2021
@odow odow deleted the od/rawparameter branch April 5, 2021 20:35
@blegat blegat added this to the v0.9.21 milestone May 22, 2021
Development

Successfully merging this pull request may close these issues.

`reset_optimizer` attempts to pass `RawParameter`s between models
3 participants