MIPLearn.jl

MIPLearn is an extensible open-source framework for solving discrete optimization problems using a combination of Mixed-Integer Linear Programming (MIP) and Machine Learning (ML). See the main repository for more information. This repository holds an experimental Julia interface for the package.

1. Usage

1.1 Installation

To use MIPLearn.jl, first install the Julia programming language on your machine. Once Julia is installed, launch the Julia console, type ] to switch to package manager mode, then run:

(@v1.6) pkg> add MIPLearn@0.2

This command should also automatically install all required Python dependencies. To verify that the package has been correctly installed, run (in package manager mode):

(@v1.6) pkg> test MIPLearn
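If you prefer to script the installation instead of using the interactive package manager, the same steps can be performed with Julia's standard Pkg API (a sketch, using the package name and version shown above):

```julia
# Non-interactive installation using Julia's built-in Pkg API
using Pkg

# Install MIPLearn, pinned to the 0.2 release series
Pkg.add(name="MIPLearn", version="0.2")

# Run the package test suite to verify the installation
Pkg.test("MIPLearn")
```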

If you run into any issues while installing the package, please do not hesitate to open an issue.

1.2 Describing instances

In MIPLearn, instances are regular JuMP models annotated with ML information: the @feature macro attaches a vector of features to the model, to a variable, or to a constraint, while @category assigns a category name; elements sharing a category also share an internal ML model. For example:

using JuMP
using MIPLearn

# Create problem data
weights = [1.0, 2.0, 3.0]
prices = [5.0, 6.0, 7.0]
capacity = 3.0

# Create standard JuMP model
model = Model()
n = length(weights)
@variable(model, x[1:n], Bin)
@objective(model, Max, sum(x[i] * prices[i] for i in 1:n))
@constraint(model, c1, sum(x[i] * weights[i] for i in 1:n) <= capacity)

# Add ML information
@feature(model, [5.0])
@feature(c1, [1.0, 2.0, 3.0])
@category(c1, "c1")
for i in 1:n
    @feature(x[i], [weights[i]; prices[i]])
    @category(x[i], "type-$i")
end

instance = JuMPInstance(model)

1.3 Solving instances and training

using MIPLearn
using Cbc

# Create training and test instances
training_instances = [...]
test_instances = [...]

# Create solver
solver = LearningSolver(Cbc.Optimizer)

# Solve training instances
for instance in training_instances
    solve!(solver, instance)
end

# Train ML models
fit!(solver, training_instances)

# Solve test instances
for instance in test_instances
    solve!(solver, instance)
end

1.4 Saving and loading solver state

using MIPLearn
using Cbc

# Solve training instances
training_instances = [...]
solver = LearningSolver(Cbc.Optimizer)
for instance in training_instances
    solve!(solver, instance)
end

# Train ML models
fit!(solver, training_instances)

# Save trained solver to disk
save("solver.mls", solver)

# Application restarts...

# Load trained solver from disk
solver = load("solver.mls")

# Solve additional instances
test_instances = [...]
for instance in test_instances
    solve!(solver, instance)
end

1.5 Solving training instances in parallel

using MIPLearn
using Cbc

# Solve training instances in parallel
training_instances = [...]
solver = LearningSolver(Cbc.Optimizer)
parallel_solve!(solver, training_instances, n_jobs=4)
fit!(solver, training_instances)

# Solve test instances in parallel
test_instances = [...]
parallel_solve!(solver, test_instances)

1.6 Solving instances from disk

using MIPLearn
using JuMP
using Cbc

# Create 600 problem instances and save them to files
for i in 1:600
    m = Model()
    @variable(m, x, Bin)
    @objective(m, Min, x)
    @feature(x, [1.0])
    
    instance = JuMPInstance(m)
    save("instance-$i.bin", instance)
end

# Initialize instances and solver
training_instances = [FileInstance("instance-$i.bin") for i in 1:500]
test_instances = [FileInstance("instance-$i.bin") for i in 501:600]
solver = LearningSolver(Cbc.Optimizer)

# Solve training instances
for instance in training_instances
    solve!(solver, instance)
end

# Train ML models
fit!(solver, training_instances)

# Solve test instances
for instance in test_instances
    solve!(solver, instance)
end

2. Customization

2.1 Selecting solver components

using MIPLearn
using Cbc

solver = LearningSolver(
  Cbc.Optimizer,
  components=[
    PrimalSolutionComponent(...),
    ObjectiveValueComponent(...),
  ]
)

2.2 Adjusting component aggressiveness

using MIPLearn
using Cbc

solver = LearningSolver(
  Cbc.Optimizer,
  components=[
    PrimalSolutionComponent(
      threshold=MinPrecisionThreshold(0.95),
    ),
  ]
)

2.3 Evaluating component performance

TODO

2.4 Using customized ML classifiers and regressors

TODO

3. Acknowledgments

  • Based upon work supported by Laboratory Directed Research and Development (LDRD) funding from Argonne National Laboratory, provided by the Director, Office of Science, of the U.S. Department of Energy under Contract No. DE-AC02-06CH11357.
  • Based upon work supported by the U.S. Department of Energy Advanced Grid Modeling Program under Grant DE-OE0000875.

4. Citing MIPLearn

If you use MIPLearn in your research (either the solver or the included problem generators), we kindly request that you cite the package as follows:

  • Alinson S. Xavier, Feng Qiu. MIPLearn: An Extensible Framework for Learning-Enhanced Optimization. Zenodo (2020). DOI: 10.5281/zenodo.4287567

If you use MIPLearn in the field of power systems optimization, we kindly request that you cite the reference below, in which the main techniques implemented in MIPLearn were first developed:

  • Alinson S. Xavier, Feng Qiu, Shabbir Ahmed. Learning to Solve Large-Scale Unit Commitment Problems. INFORMS Journal on Computing (2020). DOI: 10.1287/ijoc.2020.0976

5. License

Released under the modified BSD license. See LICENSE for more details.