Remove temporary docs; apply some fixes

master
Alinson S. Xavier 4 years ago
parent 95b253429b
commit e0055f16f4

@@ -15,247 +15,3 @@
[miplearn]: https://github.com/ANL-CEEESA/MIPLearn
## 1. Usage
### 1.1 Installation
To use MIPLearn.jl, the first step is to [install the Julia programming language on your machine](https://julialang.org/). After Julia is installed, launch the Julia console, type `]` to switch to package manager mode, then run:
```
(@v1.6) pkg> add https://github.com/ANL-CEEESA/MIPLearn.jl.git
```
This command should also automatically install all the required Python dependencies. To test that the package has been correctly installed, run (in package manager mode):
```
(@v1.6) pkg> test MIPLearn
```
If you find any issues installing the package, please do not hesitate to [open an issue](https://github.com/ANL-CEEESA/MIPLearn.jl/issues).
### 1.2 Describing instances
```julia
using JuMP
using MIPLearn
# Create problem data
weights = [1.0, 2.0, 3.0]
prices = [5.0, 6.0, 7.0]
capacity = 3.0
# Create standard JuMP model
model = Model()
n = length(weights)
@variable(model, x[1:n], Bin)
@objective(model, Max, sum(x[i] * prices[i] for i in 1:n))
@constraint(model, c1, sum(x[i] * weights[i] for i in 1:n) <= capacity)
# Add ML information
@feature(model, [5.0])
@feature(c1, [1.0, 2.0, 3.0])
@category(c1, "c1")
for i in 1:n
    @feature(x[i], [weights[i]; prices[i]])
    @category(x[i], "type-$i")
end
instance = JuMPInstance(model)
```
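Roughly speaking, `@feature` attaches a vector of numerical features to the model as a whole, to a constraint, or to a decision variable, while `@category` assigns a label to a constraint or variable; elements that share a category are handled by the same machine-learning model. These annotations are stored in `model.ext[:miplearn]` and travel with the `JuMPInstance`.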
### 1.3 Solving instances and training
```julia
using MIPLearn
using Cbc
# Create training and test instances
training_instances = [...]
test_instances = [...]
# Create solver
solver = LearningSolver(Cbc.Optimizer)
# Solve training instances
for instance in training_instances
    solve!(solver, instance)
end
# Train ML models
fit!(solver, training_instances)
# Solve test instances
for instance in test_instances
    solve!(solver, instance)
end
```
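During the training solves, the solver collects training data from each instance; `fit!` then trains the machine-learning components on this data, and the trained components are used to accelerate the subsequent test solves.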
### 1.4 Saving and loading solver state
```julia
using MIPLearn
using Cbc
# Solve training instances
training_instances = [...]
solver = LearningSolver(Cbc.Optimizer)
for instance in training_instances
    solve!(solver, instance)
end
# Train ML models
fit!(solver, training_instances)
# Save trained solver to disk
save("solver.bin", solver)
# Application restarts...
# Load trained solver from disk
solver = load_solver("solver.bin")
# Solve additional instances
test_instances = [...]
for instance in test_instances
    solve!(solver, instance)
end
```
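Here `save` writes the trained solver state to disk and `load_solver` restores it, so a solver trained in one session can be reused after a restart without repeating the training solves.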
### 1.5 Solving instances from disk
In all examples above, we have assumed that instances are available as `JuMPInstance` objects, stored in memory. When problem instances are very large, or when there are many of them, this approach may require an excessive amount of memory. To reduce memory requirements, MIPLearn.jl can also operate on instances stored on disk, through the `FileInstance` struct, as the next example illustrates.
```julia
using MIPLearn
using JuMP
using Cbc
# Create a large number of problem instances
for i in 1:600
    # Build JuMP model
    model = Model()
    @variable(...)
    @objective(...)
    @constraint(...)

    # Add ML features and categories
    @feature(...)
    @category(...)

    # Save instance to file
    instance = JuMPInstance(model)
    save("instance-$i.h5", instance)
end
# Initialize training and test instances
training_instances = [FileInstance("instance-$i.h5") for i in 1:500]
test_instances = [FileInstance("instance-$i.h5") for i in 501:600]
# Initialize solver
solver = LearningSolver(Cbc.Optimizer)
# Solve training instances. Files are modified in-place, and at most one
# file is loaded to memory at a time.
for instance in training_instances
    solve!(solver, instance)
end
# Train ML models
fit!(solver, training_instances)
# Solve test instances
for instance in test_instances
    solve!(solver, instance)
end
```
### 1.6 Solving training instances in parallel
In many situations, instances can be solved in parallel to accelerate the training process. MIPLearn.jl provides the method `parallel_solve!(solver, instances)` to easily achieve this.
First, launch Julia in multi-process mode:
```
julia --procs 4
```
Then call `parallel_solve!` as follows:
```julia
@everywhere using MIPLearn
@everywhere using Cbc
# Initialize training and test instances
training_instances = [...]
test_instances = [...]
# Initialize the solver
solver = LearningSolver(Cbc.Optimizer)
# Solve training instances in parallel. The number of instances solved
# simultaneously is the same as the `--procs` argument provided to Julia.
parallel_solve!(solver, training_instances)
# Train machine learning models
fit!(solver, training_instances)
# Solve test instances in parallel
parallel_solve!(solver, test_instances)
```
**NOTE:** Only `FileInstance` instances are currently supported.
## 2. Customization
### 2.1 Selecting solver components
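`LearningSolver` accepts a `components` keyword argument that selects which machine-learning components the solver uses. The snippet below restricts the solver to a primal-solution component and an objective-value component (constructor arguments omitted):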
```julia
using MIPLearn
using Cbc

solver = LearningSolver(
    Cbc.Optimizer,
    components=[
        PrimalSolutionComponent(...),
        ObjectiveValueComponent(...),
    ]
)
```
### 2.2 Adjusting component aggressiveness
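Some components accept a threshold that controls how aggressively their predictions are applied. In the example below, `MinPrecisionThreshold(0.95)` is meant to make `PrimalSolutionComponent` act on its predictions only when the underlying classifier reaches at least 95% precision: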
```julia
using MIPLearn
using Cbc

solver = LearningSolver(
    Cbc.Optimizer,
    components=[
        PrimalSolutionComponent(
            threshold=MinPrecisionThreshold(0.95),
        ),
    ]
)
```
## 3. Acknowledgments
* Based upon work supported by **Laboratory Directed Research and Development** (LDRD) funding from Argonne National Laboratory, provided by the Director, Office of Science, of the U.S. Department of Energy under Contract No. DE-AC02-06CH11357.
* Based upon work supported by the **U.S. Department of Energy Advanced Grid Modeling Program** under Grant DE-OE0000875.
## 4. Citing MIPLearn
If you use MIPLearn in your research (either the solver or the included problem generators), we kindly request that you cite the package as follows:
* **Alinson S. Xavier, Feng Qiu.** *MIPLearn: An Extensible Framework for Learning-Enhanced Optimization*. Zenodo (2020). DOI: [10.5281/zenodo.4287567](https://doi.org/10.5281/zenodo.4287567)
If you use MIPLearn in the field of power systems optimization, we kindly request that you cite the reference below, in which the main techniques implemented in MIPLearn were first developed:
* **Alinson S. Xavier, Feng Qiu, Shabbir Ahmed.** *Learning to Solve Large-Scale Unit Commitment Problems.* INFORMS Journal on Computing (2020). DOI: [10.1287/ijoc.2020.0976](https://doi.org/10.1287/ijoc.2020.0976)
## 5. License
Released under the modified BSD license. See `LICENSE` for more details.

@@ -20,6 +20,8 @@ global UserCutsComponent = PyNULL()
 global MemorySample = PyNULL()
 global Hdf5Sample = PyNULL()
+include("solvers/structs.jl")
 include("utils/log.jl")
 include("utils/exceptions.jl")
 include("instance/abstract_instance.jl")

@@ -9,59 +9,86 @@ mutable struct FileInstance <: Instance
     py::Union{Nothing,PyCall.PyObject}
     loaded::Union{Nothing,JuMPInstance}
     filename::AbstractString
-    h5::PyCall.PyObject
+    sample::PyCall.PyObject
     build_model::Function
-    function FileInstance(filename::AbstractString, build_model::Function)::FileInstance
-        instance = new(nothing, nothing, filename, nothing, build_model)
+    mode::String
+    function FileInstance(
+        filename::AbstractString,
+        build_model::Function;
+        mode::String = "a",
+    )::FileInstance
+        instance = new(nothing, nothing, filename, nothing, build_model, mode)
         instance.py = PyFileInstance(instance)
-        instance.h5 = Hdf5Sample("$filename.h5", mode = "a")
+        if mode != "r" || isfile("$filename.h5")
+            instance.sample = Hdf5Sample("$filename.h5", mode = mode)
+        end
         instance.filename = filename
         return instance
     end
 end
-to_model(instance::FileInstance) = to_model(instance.loaded)
-get_instance_features(instance::FileInstance) = get_instance_features(instance.loaded)
-get_variable_features(instance::FileInstance, names) =
-    get_variable_features(instance.loaded, names)
-get_variable_categories(instance::FileInstance, names) =
-    get_variable_categories(instance.loaded, names)
-get_constraint_features(instance::FileInstance, names) =
-    get_constraint_features(instance.loaded, names)
-get_constraint_categories(instance::FileInstance, names) =
-    get_constraint_categories(instance.loaded, names)
-find_violated_lazy_constraints(instance::FileInstance, solver) =
-    find_violated_lazy_constraints(instance.loaded, solver)
-enforce_lazy_constraint(instance::FileInstance, solver, violation) =
-    enforce_lazy_constraint(instance.loaded, solver, violation)
-function get_samples(instance::FileInstance)
-    return [instance.h5]
+function _load!(instance::FileInstance)
+    if instance.loaded === nothing
+        data = load_data(instance.filename)
+        instance.loaded = JuMPInstance(instance.build_model(data))
+    end
+end
+function free(instance::FileInstance)
+    instance.loaded = nothing
+end
+function to_model(instance::FileInstance)
+    _load!(instance)
+    return to_model(instance.loaded)
+end
+function get_instance_features(instance::FileInstance)
+    _load!(instance)
+    return get_instance_features(instance.loaded)
+end
+function get_variable_features(instance::FileInstance, names)
+    _load!(instance)
+    return get_variable_features(instance.loaded, names)
+end
+function get_variable_categories(instance::FileInstance, names)
+    _load!(instance)
+    return get_variable_categories(instance.loaded, names)
+end
+function get_constraint_features(instance::FileInstance, names)
+    _load!(instance)
+    return get_constraint_features(instance.loaded, names)
+end
+function get_constraint_categories(instance::FileInstance, names)
+    _load!(instance)
+    return get_constraint_categories(instance.loaded, names)
+end
+function find_violated_lazy_constraints(instance::FileInstance, solver)
+    _load!(instance)
+    return find_violated_lazy_constraints(instance.loaded, solver)
 end
-function create_sample!(instance::FileInstance)
-    return instance.h5
+function enforce_lazy_constraint(instance::FileInstance, solver, violation)
+    _load!(instance)
+    return enforce_lazy_constraint(instance.loaded, solver, violation)
 end
-function load(instance::FileInstance)
-    if instance.loaded === nothing
-        data = load_data(instance.filename)
-        instance.loaded = JuMPInstance(instance.build_model(data))
-    end
+function get_samples(instance::FileInstance)
+    return [instance.sample]
 end
-function free(instance::FileInstance)
-    instance.loaded.samples = []
-    instance.loaded = nothing
-    GC.gc()
+function create_sample!(instance::FileInstance)
+    if instance.mode == "r"
+        return MemorySample()
+    else
+        return instance.sample
+    end
 end
 function save_data(filename::AbstractString, data)::Nothing
@@ -74,7 +101,49 @@ function load_data(filename::AbstractString)
     end
 end
-function flush(instance::FileInstance) end
+function load(filename::AbstractString, build_model::Function)
+    jldopen(filename, "r") do file
+        return build_model(file["data"])
+    end
+end
+function save(data::AbstractVector, dirname::String)::Nothing
+    mkpath(dirname)
+    for (i, d) in enumerate(data)
+        filename = joinpath(dirname, @sprintf("%06d.jld2", i))
+        jldsave(filename, data = d)
+    end
+end
+function solve!(
+    solver::LearningSolver,
+    filenames::Vector,
+    build_model::Function;
+    tee::Bool = false,
+)
+    for filename in filenames
+        solve!(solver, filename, build_model; tee)
+    end
+end
+function fit!(
+    solver::LearningSolver,
+    filenames::Vector,
+    build_model::Function;
+    tee::Bool = false,
+)
+    instances = [FileInstance(f, build_model) for f in filenames]
+    fit!(solver, instances)
+end
+function solve!(
+    solver::LearningSolver,
+    filename::AbstractString,
+    build_model::Function;
+    tee::Bool = false,
+)
+    solve!(solver, FileInstance(filename, build_model); tee)
+end
 function __init_PyFileInstance__()
     @pydef mutable struct Class <: miplearn.Instance
@@ -87,19 +156,22 @@ function __init_PyFileInstance__()
             get_variable_features(self.jl, from_str_array(names))
         get_variable_categories(self, names) =
             to_str_array(get_variable_categories(self.jl, from_str_array(names)))
-        get_constraint_features(self, names) =
-            get_constraint_features(self.jl, from_str_array(names))
-        get_constraint_categories(self, names) =
-            to_str_array(get_constraint_categories(self.jl, from_str_array(names)))
         get_samples(self) = get_samples(self.jl)
         create_sample(self) = create_sample!(self.jl)
-        load(self) = load(self.jl)
-        free(self) = free(self.jl)
-        flush(self) = flush(self.jl)
         find_violated_lazy_constraints(self, solver, _) =
             find_violated_lazy_constraints(self.jl, solver)
         enforce_lazy_constraint(self, solver, _, violation) =
             enforce_lazy_constraint(self.jl, solver, violation)
+        free(self) = free(self.jl)
+        # FIXME: The two functions below are disabled because they break lazy loading
+        # of FileInstance.
+        # get_constraint_features(self, names) =
+        #     get_constraint_features(self.jl, from_str_array(names))
+        # get_constraint_categories(self, names) =
+        #     to_str_array(get_constraint_categories(self.jl, from_str_array(names)))
     end
     copy!(PyFileInstance, Class)
 end
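Taken together, these changes point to a file-based workflow in which problem data is stored with JLD2 and the JuMP model is rebuilt on demand by a user-supplied `build_model` function. The sketch below shows how the helpers added above (`save`, and `solve!`/`fit!` over filename vectors) might be combined; the builder `build_knapsack`, the random data, and the directory name are hypothetical and only for illustration.
```julia
using Cbc
using JuMP
using MIPLearn

# Hypothetical builder: receives one stored data record, returns a JuMP model.
# (Features and categories could be attached here with @feature/@category.)
function build_knapsack(data)
    n = length(data.weights)
    model = Model()
    @variable(model, x[1:n], Bin)
    @objective(model, Max, sum(data.prices[i] * x[i] for i in 1:n))
    @constraint(model, sum(data.weights[i] * x[i] for i in 1:n) <= data.capacity)
    return model
end

# Illustrative problem data; each element becomes one numbered .jld2 file.
data = [(weights = rand(10), prices = rand(10), capacity = 3.0) for _ in 1:100]
save(data, "instances/train")

# Solve and train directly from the files; each FileInstance is loaded lazily
# and its training data is written to a companion .h5 file.
solver = LearningSolver(Cbc.Optimizer)
filenames = readdir("instances/train", join = true)
solve!(solver, filenames, build_knapsack)
fit!(solver, filenames, build_knapsack)
```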

@@ -5,41 +5,26 @@
 using JuMP
 import JSON
-mutable struct JuMPInstance <: Instance
-    py::Union{Nothing,PyCall.PyObject}
-    model::Union{Nothing,JuMP.Model}
-    mps::Union{Nothing,Vector{UInt8}}
-    ext::AbstractDict
-    samples::Vector{PyCall.PyObject}
+Base.@kwdef mutable struct JuMPInstance <: Instance
+    py::Union{Nothing,PyCall.PyObject} = nothing
+    model::Union{Nothing,JuMP.Model} = nothing
+    samples::Vector{PyCall.PyObject} = []
     function JuMPInstance(model::JuMP.Model)::JuMPInstance
         init_miplearn_ext(model)
-        instance = new(nothing, model, nothing, model.ext[:miplearn], [])
+        instance = new(nothing, model, [])
         py = PyJuMPInstance(instance)
         instance.py = py
         return instance
     end
-    function JuMPInstance(mps::Vector{UInt8}, ext::AbstractDict)
-        "instance_features" in keys(ext) || error("provided ext is not initialized")
-        instance = new(nothing, nothing, mps, ext, [])
-        instance.py = PyJuMPInstance(instance)
-        return instance
-    end
 end
 function to_model(instance::JuMPInstance)::JuMP.Model
-    if instance.model === nothing
-        mps_filename = "$(tempname()).mps.gz"
-        write(mps_filename, instance.mps)
-        instance.model = read_from_file(mps_filename)
-        instance.model.ext[:miplearn] = instance.ext
-    end
     return instance.model
 end
 function get_instance_features(instance::JuMPInstance)::Union{Vector{Float64},Nothing}
-    return instance.ext["instance_features"]
+    return instance.model.ext[:miplearn]["instance_features"]
 end
 function _concat_features(dict, names)::Matrix{Float64}
@@ -58,22 +43,22 @@ function get_variable_features(
     instance::JuMPInstance,
     names::Vector{String},
 )::Matrix{Float64}
-    return _concat_features(instance.ext["variable_features"], names)
+    return _concat_features(instance.model.ext[:miplearn]["variable_features"], names)
 end
 function get_variable_categories(instance::JuMPInstance, names::Vector{String})
-    return _concat_categories(instance.ext["variable_categories"], names)
+    return _concat_categories(instance.model.ext[:miplearn]["variable_categories"], names)
 end
 function get_constraint_features(
     instance::JuMPInstance,
     names::Vector{String},
 )::Matrix{Float64}
-    return _concat_features(instance.ext["constraint_features"], names)
+    return _concat_features(instance.model.ext[:miplearn]["constraint_features"], names)
 end
 function get_constraint_categories(instance::JuMPInstance, names::Vector{String})
-    return _concat_categories(instance.ext["constraint_categories"], names)
+    return _concat_categories(instance.model.ext[:miplearn]["constraint_categories"], names)
 end
 get_samples(instance::JuMPInstance) = instance.samples
@@ -96,6 +81,10 @@ function enforce_lazy_constraint(instance::JuMPInstance, solver, violation::Stri
     instance.model.ext[:miplearn]["lazy_enforce_cb"](instance.model, solver.data, violation)
 end
+function solve!(solver::LearningSolver, model::JuMP.Model; kwargs...)
+    solve!(solver, JuMPInstance(model); kwargs...)
+end
 function __init_PyJuMPInstance__()
     @pydef mutable struct Class <: miplearn.Instance
         function __init__(self, jl)
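The new `solve!(solver, model::JuMP.Model)` method above lets a plain JuMP model be passed directly to the solver; it simply wraps the model in a `JuMPInstance`. A minimal sketch:
```julia
using Cbc
using JuMP
using MIPLearn

solver = LearningSolver(Cbc.Optimizer)

model = Model()
@variable(model, x[1:3], Bin)
@objective(model, Max, sum(x))
@constraint(model, sum(x) <= 2)

# Equivalent to solve!(solver, JuMPInstance(model))
solve!(solver, model)
```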

@@ -83,7 +83,7 @@ function _update_solution!(data::JuMPSolverData)
     try
         data.sensitivity_report = lp_sensitivity_report(data.model)
     catch
-        @warn("Sensitivity analysis is unavailable; ignoring")
+        @warn "Sensitivity analysis is unavailable; ignoring" maxlog=1
     end
     basis_status_supported = true
@@ -99,7 +99,7 @@ function _update_solution!(data::JuMPSolverData)
            data.basis_status[constr] =
                MOI.get(data.model, MOI.ConstraintBasisStatus(), constr)
         catch
-            @warn "Basis status is unavailable; ignoring"
+            @warn "Basis status is unavailable; ignoring" maxlog=1
            basis_status_supported = false
            data.basis_status = Dict()
         end
@@ -240,6 +240,9 @@ function solve(
         wallclock_time += @elapsed begin
             log *= _optimize_and_capture_output!(model, tee = tee)
         end
+        if is_infeasible(data)
+            break
+        end
         if iteration_cb !== nothing
             iteration_cb() || break
         else
@@ -6,12 +6,6 @@ using Distributed
 using JLD2
-struct LearningSolver
-    py::PyCall.PyObject
-    optimizer_factory::Any
-end
 function LearningSolver(
     optimizer_factory;
     components = nothing,
@@ -49,42 +43,12 @@
     )
 end
 function fit!(solver::LearningSolver, instances::Vector{<:Instance})
     @python_call solver.py.fit([instance.py for instance in instances])
     return
 end
-function _solve(solver_filename, instance_filename; discard_output::Bool)
-    @info "solve $instance_filename"
-    solver = load_solver(solver_filename)
-    solver.py._silence_miplearn_logger()
-    stats = solve!(solver, FileInstance(instance_filename), discard_output = discard_output)
-    solver.py._restore_miplearn_logger()
-    GC.gc()
-    @info "solve $instance_filename [done]"
-    return stats
-end
-function parallel_solve!(
-    solver::LearningSolver,
-    instances::Vector{FileInstance};
-    discard_output::Bool = false,
-)
-    instance_filenames = [instance.filename for instance in instances]
-    solver_filename = tempname()
-    save(solver_filename, solver)
-    return pmap(
-        instance_filename ->
-            _solve(solver_filename, instance_filename, discard_output = discard_output),
-        instance_filenames,
-        on_error = identity,
-    )
-end
 function save(filename::AbstractString, solver::LearningSolver)
     internal_solver = solver.py.internal_solver
     internal_solver_prototype = solver.py.internal_solver_prototype
@@ -0,0 +1,8 @@
+# MIPLearn: Extensible Framework for Learning-Enhanced Mixed-Integer Optimization
+# Copyright (C) 2020-2021, UChicago Argonne, LLC. All rights reserved.
+# Released under the modified BSD license. See COPYING.md for more details.
+
+struct LearningSolver
+    py::PyCall.PyObject
+    optimizer_factory::Any
+end

@@ -22,15 +22,18 @@ mutable struct BenchmarkRunner
     end
 end
-function parallel_solve!(
+function solve!(
     runner::BenchmarkRunner,
     instances::Vector{FileInstance};
-    n_trials::Int = 3,
+    n_trials::Int = 1,
 )::Nothing
     instances = repeat(instances, n_trials)
     for (solver_name, solver) in runner.solvers
         @info "benchmark $solver_name"
-        stats = parallel_solve!(solver, instances, discard_output = true)
+        stats = [
+            solve!(solver, instance, discard_output = true, tee = true) for
+            instance in instances
+        ]
         for (i, s) in enumerate(stats)
             s["Solver"] = solver_name
             s["Instance"] = instances[i].filename
@@ -54,4 +57,4 @@ function write_csv!(runner::BenchmarkRunner, filename::AbstractString)::Nothing
     return
 end
-export BenchmarkRunner, parallel_solve!, fit!, write_csv!
+export BenchmarkRunner, solve!, fit!, write_csv!
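With `parallel_solve!` replaced by a sequential `solve!`, a benchmark run might look like the sketch below. The `BenchmarkRunner` constructor call and the `components = []` baseline are assumptions (this diff only shows that the runner iterates over a collection of named solvers); the instance files are illustrative, and `build_knapsack` refers to the hypothetical builder sketched earlier.
```julia
using Cbc
using MIPLearn

# Assumed constructor: a dictionary of named LearningSolver objects.
runner = BenchmarkRunner(
    solvers = Dict(
        "baseline" => LearningSolver(Cbc.Optimizer, components = []),  # assumed: no ML components
        "ml" => LearningSolver(Cbc.Optimizer),
    ),
)

# Illustrative instances, using the new FileInstance(filename, build_model) form.
instances = [FileInstance("instances/test/instance-$i.jld2", build_knapsack) for i in 1:10]

solve!(runner, instances, n_trials = 1)
write_csv!(runner, "benchmark.csv")
```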
