Commit 6a4047c: fix code block in the MCO section (examples.rst)
alvls committed Nov 1, 2024 (1 parent: 209cbd5)
Showing 1 changed file with 84 additions and 3 deletions: docs/source/examples.rst
... we will set the number_of_parallel_points parameter to 12, and also limit oursel ...
Level lines of the objective function, built with the parameter kernel='rbf'
During the experiment an optimal quality metric value of -0.9469 was obtained with the following combination
of parameters: :math:`C = 1.6474 \cdot 10^5`, gamma = 0.0767, kernel = 'rbf'.
Example of solving a multicriteria optimization problem
________________________________________________________
If optimization is performed using several criteria, the solution of the problem becomes more involved.
The difficulty is that the criteria are usually conflicting: decreasing the value of one of them often leads
to an increase in the values of the others. Let's consider how the iOpt framework works when solving a multicriteria problem.
To do this, we modify the problem statement from the section *Tuning support vector machine hyperparameters for a classification problem in machine learning*.
Dataset used

Finding optimal parameters using the iOpt framework

Let's launch the iOpt framework to construct the Pareto set.
We consider two continuous parameters:
#. regularization parameter **C**: [10\ :sup:`1`, 10\ :sup:`6`];
#. kernel coefficient **gamma**: [10\ :sup:`-7`, 10\ :sup:`-3`].
For reference, a plain grid search over the same parameter ranges can be performed with scikit-learn
(here ``x`` and ``y`` are the feature matrix and class labels of the dataset):

.. code-block::
    :caption: Grid search over the SVC hyperparameters with scikit-learn

    import numpy as np
    from sklearn.metrics import f1_score
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    cs = np.logspace(1, 6, 20)
    gamms = np.logspace(-7, -3, 20)
    params = {'C': cs, 'gamma': gamms}
    # 5-fold cross-validated grid search maximizing the F1 score
    search = GridSearchCV(SVC(), cv=5, param_grid=params,
                          scoring=lambda model, x, y: f1_score(y, model.predict(x)))
    search.fit(x, y)

First, we need to formulate the basic problem as a class inherited from Problem.

.. code-block::
    :caption: Statement of the problem of multicriteria optimization using the iOpt framework

    import numpy as np

    from iOpt.trial import Point
    from iOpt.trial import FunctionValue
    from iOpt.trial import Trial
    from iOpt.problem import Problem
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC
    class mco_breast_cancer(Problem):
        def __init__(self, X, y, X_train, y_train):
            """
            Constructor of the mco_breast_cancer problem.

            :param X: feature matrix of the dataset
            :param y: class labels
            :param X_train: training part of the feature matrix
            :param y_train: training part of the labels
            """
            super(mco_breast_cancer, self).__init__()
            self.X = X
            self.y = y
            self.X_train = X_train
            self.y_train = y_train

            self.name = "mco_test1"
            self.dimension = 2
            self.number_of_float_variables = 2
            self.number_of_discrete_variables = 0
            self.number_of_objectives = 2
            self.number_of_constraints = 0

            self.float_variable_names = np.ndarray(shape=(self.number_of_float_variables,),
                                                   dtype=object)
            for i in range(self.number_of_float_variables):
                self.float_variable_names[i] = str(i)

            # The variables are the base-10 exponents of C and gamma:
            # C in [10^1, 10^6], gamma in [10^-7, 10^-3]
            self.lower_bound_of_float_variables = np.array([1, -7], dtype=np.double)
            self.upper_bound_of_float_variables = np.array([6, -3], dtype=np.double)

            self.known_optimum = np.ndarray(shape=(1,), dtype=Trial)

        def calculateAllFunction(self, point: Point,
                                 function_values: np.ndarray(shape=(1), dtype=FunctionValue)) \
                -> np.ndarray(shape=(1), dtype=FunctionValue):
            x = point.float_variables
            svc_c = 10 ** x[0]
            gamma = 10 ** x[1]
            classifier_obj = SVC(C=svc_c, gamma=gamma)
            classifier_obj.fit(self.X_train, self.y_train)

            # Objective 1: maximize cross-validated precision (we minimize its negative)
            function_values[0].value = -cross_val_score(classifier_obj, self.X, self.y,
                                                        n_jobs=4, scoring='precision').mean()
            # Objective 2: maximize cross-validated recall (we minimize its negative)
            function_values[1].value = -cross_val_score(classifier_obj, self.X, self.y,
                                                        n_jobs=4, scoring='recall').mean()
            return function_values
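
The text above says we launch the iOpt framework to construct the Pareto set, but the listing stops at the
problem statement. The sketch below shows one way such a launch could look, reusing the iOpt ``Solver`` and
``SolverParameters`` classes; the dataset loading, the particular parameter values, and the ``best_trials``
attribute used to read the non-dominated trials are assumptions made for illustration, not the documented
multicriteria API.

.. code-block::
    :caption: Launching the solver to approximate the Pareto set (illustrative sketch)

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split

    from iOpt.solver import Solver
    from iOpt.solver_parametrs import SolverParameters

    # Load the breast cancer dataset and split off a training subset
    # (assumption: the same data preparation as in the single-criterion example)
    X, y = load_breast_cancer(return_X_y=True)
    X_train, _, y_train, _ = train_test_split(X, y, train_size=0.6, random_state=42)

    problem = mco_breast_cancer(X, y, X_train, y_train)

    # Reliability parameter r, accuracy eps and the iteration budget are illustrative,
    # not tuned; a multicriteria run may also need MCO-specific solver settings
    params = SolverParameters(r=3.0, eps=0.01, iters_limit=200)
    solver = Solver(problem, parameters=params)

    solver_info = solver.solve()

    # Assumption: the non-dominated trials found by the solver are available in
    # best_trials; each trial holds the point and the two negated metric values
    for trial in solver_info.best_trials:
        c_exp, gamma_exp = trial.point.float_variables
        precision = -trial.function_values[0].value
        recall = -trial.function_values[1].value
        print(f"C=10^{c_exp:.2f}, gamma=10^{gamma_exp:.2f}: "
              f"precision={precision:.4f}, recall={recall:.4f}")

With the two negated objectives defined above, each non-dominated trial corresponds to one precision/recall
trade-off on the Pareto front.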
