Instance: Reformat comments

master
Alinson S. Xavier 5 years ago
parent a1b959755c
commit 372d6eb066

@@ -14,12 +14,14 @@ from miplearn.types import TrainingSample
class Instance(ABC):
    """
    Abstract class holding all the data necessary to generate a concrete model of the
    problem.

    In the knapsack problem, for example, this class could hold the number of items,
    their weights and costs, as well as the size of the knapsack. Objects
    implementing this class are able to convert themselves into a concrete
    optimization model, which can be optimized by a solver, or into arrays of
    features, which can be provided as inputs to machine learning models.
    """

    def __init__(self):
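To make the description above concrete, here is a minimal sketch of how a knapsack instance could implement this interface. The class name, attribute names and the to_model hook are assumptions made for this illustration, not part of the diff above; the Pyomo model shown is just one way such an instance could describe a knapsack problem.

import pyomo.environ as pe


class KnapsackInstance(Instance):  # Instance is the abstract class defined in this file
    # Hypothetical subclass: holds only the raw problem data.
    def __init__(self, weights, prices, capacity):
        super().__init__()
        self.weights = weights
        self.prices = prices
        self.capacity = capacity

    def to_model(self):
        # Converts the stored data into a concrete Pyomo model.
        model = pe.ConcreteModel()
        items = range(len(self.weights))
        model.x = pe.Var(items, domain=pe.Binary)
        model.obj = pe.Objective(
            expr=sum(self.prices[i] * model.x[i] for i in items),
            sense=pe.maximize,
        )
        model.cap = pe.Constraint(
            expr=sum(self.weights[i] * model.x[i] for i in items) <= self.capacity,
        )
        return model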
@@ -34,21 +36,23 @@ class Instance(ABC):
    def get_instance_features(self):
        """
        Returns a 1-dimensional Numpy array of (numerical) features describing the
        entire instance.

        The array is used by LearningSolver to determine how similar two instances
        are. It may also be used to predict, in combination with variable-specific
        features, the values of binary decision variables in the problem.

        There is not necessarily a one-to-one correspondence between models and
        instance features: the features may encode only part of the data necessary to
        generate the complete model. Features may also be statistics computed from
        the original data. For example, in the knapsack problem, an implementation
        may decide to provide as instance features only the average weights, average
        prices, number of items and the size of the knapsack.

        The returned array MUST have the same length for all relevant instances of
        the problem. If two instances map into arrays of different lengths,
        they cannot be solved by the same LearningSolver object.

        By default, returns [0].
        """
@@ -56,20 +60,22 @@ class Instance(ABC):
    def get_variable_features(self, var, index):
        """
        Returns a 1-dimensional array of (numerical) features describing a particular
        decision variable.

        The argument `var` is a pyomo.core.Var object, which represents a collection
        of decision variables. The argument `index` specifies which variable in the
        collection is the relevant one.

        In combination with instance features, variable features are used by
        LearningSolver to predict, among other things, the optimal value of each
        decision variable before the optimization takes place. In the knapsack
        problem, for example, an implementation could provide as variable features
        the weight and the price of a specific item.

        Like instance features, the arrays returned by this method MUST have the same
        length for all variables within the same category, for all relevant instances
        of the problem.

        By default, returns [0].
        """
@@ -77,12 +83,12 @@ class Instance(ABC):
    def get_variable_category(self, var, index):
        """
        Returns the category (a string, an integer or any hashable type) for each
        decision variable.

        If two variables have the same category, LearningSolver will use the same
        internal ML model to predict the values of both variables. If the returned
        category is None, ML models will ignore the variable.

        By default, returns "default".
        """
@@ -107,16 +113,16 @@ class Instance(ABC):
        """
        Returns lazy constraint violations found for the current solution.

        After solving a model, LearningSolver will ask the instance to identify which
        lazy constraints are violated by the current solution. For each identified
        violation, LearningSolver will then call the build_lazy_constraint, add the
        generated Pyomo constraint to the model, then resolve the problem. The
        process repeats until no further lazy constraint violations are found.

        Each "violation" is simply a string, a tuple or any other hashable type which
        allows the instance to identify unambiguously which lazy constraint should be
        generated. In the Traveling Salesman Problem, for example, a subtour
        violation could be a frozen set containing the cities in the subtour.

        For a concrete example, see TravelingSalesmanInstance.
        """
@@ -126,15 +132,17 @@ class Instance(ABC):
        """
        Returns a Pyomo constraint which fixes a given violation.

        This method is typically called immediately after
        find_violated_lazy_constraints. The violation object provided to this method
        is exactly the same object returned earlier by
        find_violated_lazy_constraints. After some training, LearningSolver may
        decide to proactively build some lazy constraints at the beginning of the
        optimization process, before a solution is even available. In this case,
        build_lazy_constraints will be called without a corresponding call to
        find_violated_lazy_constraints.

        The implementation should not directly add the constraint to the model. The
        constraint will be added by LearningSolver after the method returns.

        For a concrete example, see TravelingSalesmanInstance.
        """
