hyperparameter_hunter.optimization package

Submodules

hyperparameter_hunter.optimization.protocol_core module

This module defines the base Optimization Protocol classes. The classes defined herein are not intended for direct use; rather, they are parent classes of those defined in hyperparameter_hunter.optimization.backends.skopt.protocols

Module contents

class hyperparameter_hunter.optimization.BayesianOptPro(target_metric=None, iterations=1, verbose=1, read_experiments=True, reporter_parameters=None, warn_on_re_ask=False, base_estimator='GP', n_initial_points=10, acquisition_function='gp_hedge', acquisition_optimizer='auto', random_state=32, acquisition_function_kwargs=None, acquisition_optimizer_kwargs=None, n_random_starts='DEPRECATED', callbacks=None, base_estimator_kwargs=None)

Bases: hyperparameter_hunter.optimization.protocol_core.SKOptPro

Bayesian optimization with Gaussian Processes

Attributes
search_space_size

The number of different hyperparameter permutations possible given the current space

source_script

Methods

forge_experiment(self, model_initializer[, …])

Define hyperparameter search scaffold for building Experiments during optimization

get_ready(self)

Prepare for optimization by finalizing hyperparameter space and identifying similar Experiments.

go(self[, force_ready])

Execute hyperparameter optimization, building an Experiment for each iteration

set_dimensions(self)

Locate given hyperparameters that are space choice declarations and add them to dimensions

set_experiment_guidelines(self, *args, …)

Deprecated since version 3.0.0a2.

source_script = None
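The workflow shared by every OptPro above is the same: forge_experiment declares how to build models and which hyperparameters to search, then go runs one Experiment per iteration and tracks the best result by target_metric. The stdlib-only sketch below illustrates that shape; SimpleOptPro, toy_model, and the scoring scheme are hypothetical stand-ins, not the library's implementation.

```python
import random

class SimpleOptPro:
    """Conceptual stand-in for an OptPro: forge_experiment, then go."""

    def __init__(self, target_metric="score", iterations=10, random_state=32):
        self.target_metric = target_metric
        self.iterations = iterations
        self.rng = random.Random(random_state)
        self.results = []  # (params, score) pairs, one per "Experiment"

    def forge_experiment(self, model_initializer, model_init_params):
        # Record how to build a model and which hyperparameters to search.
        self.model_initializer = model_initializer
        self.space = model_init_params  # dict: name -> list of candidate values

    def go(self):
        # One Experiment per iteration; return the best (params, score) found.
        best = None
        for _ in range(self.iterations):
            params = {k: self.rng.choice(v) for k, v in self.space.items()}
            model = self.model_initializer(**params)
            score = model[self.target_metric]
            self.results.append((params, score))
            if best is None or score > best[1]:
                best = (params, score)
        return best

def toy_model(max_depth, learning_rate):
    # Toy "model": score peaks at max_depth=5, learning_rate=0.1.
    return {"score": -abs(max_depth - 5) - abs(learning_rate - 0.1)}

opt = SimpleOptPro(iterations=25)
opt.forge_experiment(
    toy_model,
    {"max_depth": [3, 4, 5, 6], "learning_rate": [0.01, 0.1, 0.3]},
)
best_params, best_score = opt.go()
```

A real protocol differs mainly in how the next candidate is chosen: BayesianOptPro consults a Gaussian Process surrogate rather than sampling uniformly, but the forge_experiment/go surface is the same.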
class hyperparameter_hunter.optimization.GradientBoostedRegressionTreeOptPro(target_metric=None, iterations=1, verbose=1, read_experiments=True, reporter_parameters=None, warn_on_re_ask=False, base_estimator='GBRT', n_initial_points=10, acquisition_function='EI', acquisition_optimizer='sampling', random_state=32, acquisition_function_kwargs=None, acquisition_optimizer_kwargs=None, n_random_starts='DEPRECATED', callbacks=None, base_estimator_kwargs=None)

Bases: hyperparameter_hunter.optimization.protocol_core.SKOptPro

Sequential optimization with gradient boosted regression trees

Attributes
search_space_size

The number of different hyperparameter permutations possible given the current space

source_script

Methods

forge_experiment(self, model_initializer[, …])

Define hyperparameter search scaffold for building Experiments during optimization

get_ready(self)

Prepare for optimization by finalizing hyperparameter space and identifying similar Experiments.

go(self[, force_ready])

Execute hyperparameter optimization, building an Experiment for each iteration

set_dimensions(self)

Locate given hyperparameters that are space choice declarations and add them to dimensions

set_experiment_guidelines(self, *args, …)

Deprecated since version 3.0.0a2.

source_script = None
hyperparameter_hunter.optimization.GBRT

alias of hyperparameter_hunter.optimization.backends.skopt.protocols.GradientBoostedRegressionTreeOptPro

class hyperparameter_hunter.optimization.RandomForestOptPro(target_metric=None, iterations=1, verbose=1, read_experiments=True, reporter_parameters=None, warn_on_re_ask=False, base_estimator='RF', n_initial_points=10, acquisition_function='EI', acquisition_optimizer='sampling', random_state=32, acquisition_function_kwargs=None, acquisition_optimizer_kwargs=None, n_random_starts='DEPRECATED', callbacks=None, base_estimator_kwargs=None)

Bases: hyperparameter_hunter.optimization.protocol_core.SKOptPro

Sequential optimization with random forest regressors

Attributes
search_space_size

The number of different hyperparameter permutations possible given the current space

source_script

Methods

forge_experiment(self, model_initializer[, …])

Define hyperparameter search scaffold for building Experiments during optimization

get_ready(self)

Prepare for optimization by finalizing hyperparameter space and identifying similar Experiments.

go(self[, force_ready])

Execute hyperparameter optimization, building an Experiment for each iteration

set_dimensions(self)

Locate given hyperparameters that are space choice declarations and add them to dimensions

set_experiment_guidelines(self, *args, …)

Deprecated since version 3.0.0a2.

source_script = None
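GradientBoostedRegressionTreeOptPro, RandomForestOptPro, and ExtraTreesOptPro all follow the same sequential model-based pattern: warm up with a few uniformly sampled points (cf. n_initial_points), fit a cheap surrogate to the (hyperparameters, score) pairs observed so far, and evaluate the candidate the surrogate ranks best. The stdlib-only sketch below swaps the tree-ensemble surrogate for a 1-nearest-neighbor predictor to keep it self-contained; it is a conceptual illustration, not the library's implementation.

```python
import random

def objective(x):
    # Toy expensive objective to maximize: peak at x = 0.3.
    return -(x - 0.3) ** 2

def nn_surrogate(history, x):
    # 1-nearest-neighbor stand-in for the tree-ensemble surrogate:
    # predict the score of the closest previously evaluated point.
    nearest = min(history, key=lambda h: abs(h[0] - x))
    return nearest[1]

def smbo(n_initial_points=5, iterations=20, seed=32):
    rng = random.Random(seed)
    # Warm-up phase: evaluate a few uniformly sampled points.
    history = [(x, objective(x)) for x in (rng.random() for _ in range(n_initial_points))]
    for _ in range(iterations - n_initial_points):
        # "Ask": propose candidates, keep the one the surrogate likes best.
        candidates = [rng.random() for _ in range(50)]
        x = max(candidates, key=lambda c: nn_surrogate(history, c))
        # "Tell": evaluate for real and record the result.
        history.append((x, objective(x)))
    return max(history, key=lambda h: h[1])

best_x, best_score = smbo()
```

The protocols above differ only in which regressor plays the surrogate role (GBRT, random forest, or extra trees) and in the acquisition function that trades exploration against exploitation.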
hyperparameter_hunter.optimization.RF

alias of hyperparameter_hunter.optimization.backends.skopt.protocols.RandomForestOptPro

class hyperparameter_hunter.optimization.ExtraTreesOptPro(target_metric=None, iterations=1, verbose=1, read_experiments=True, reporter_parameters=None, warn_on_re_ask=False, base_estimator='ET', n_initial_points=10, acquisition_function='EI', acquisition_optimizer='sampling', random_state=32, acquisition_function_kwargs=None, acquisition_optimizer_kwargs=None, n_random_starts='DEPRECATED', callbacks=None, base_estimator_kwargs=None)

Bases: hyperparameter_hunter.optimization.protocol_core.SKOptPro

Sequential optimization with extra trees regressors

Attributes
search_space_size

The number of different hyperparameter permutations possible given the current space

source_script

Methods

forge_experiment(self, model_initializer[, …])

Define hyperparameter search scaffold for building Experiments during optimization

get_ready(self)

Prepare for optimization by finalizing hyperparameter space and identifying similar Experiments.

go(self[, force_ready])

Execute hyperparameter optimization, building an Experiment for each iteration

set_dimensions(self)

Locate given hyperparameters that are space choice declarations and add them to dimensions

set_experiment_guidelines(self, *args, …)

Deprecated since version 3.0.0a2.

source_script = None
hyperparameter_hunter.optimization.ET

alias of hyperparameter_hunter.optimization.backends.skopt.protocols.ExtraTreesOptPro

class hyperparameter_hunter.optimization.DummyOptPro(target_metric=None, iterations=1, verbose=1, read_experiments=True, reporter_parameters=None, warn_on_re_ask=False, base_estimator='DUMMY', n_initial_points=10, acquisition_function='EI', acquisition_optimizer='sampling', random_state=32, acquisition_function_kwargs=None, acquisition_optimizer_kwargs=None, n_random_starts='DEPRECATED', callbacks=None, base_estimator_kwargs=None)

Bases: hyperparameter_hunter.optimization.protocol_core.SKOptPro

Random search by uniform sampling

Attributes
search_space_size

The number of different hyperparameter permutations possible given the current space

source_script

Methods

forge_experiment(self, model_initializer[, …])

Define hyperparameter search scaffold for building Experiments during optimization

get_ready(self)

Prepare for optimization by finalizing hyperparameter space and identifying similar Experiments.

go(self[, force_ready])

Execute hyperparameter optimization, building an Experiment for each iteration

set_dimensions(self)

Locate given hyperparameters that are space choice declarations and add them to dimensions

set_experiment_guidelines(self, *args, …)

Deprecated since version 3.0.0a2.

source_script = None
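DummyOptPro's strategy is exactly what its summary says: uniform random sampling. Each iteration draws every hyperparameter independently from its space, with no surrogate model guiding the search. A stdlib-only illustration (the space below is hypothetical, not library code):

```python
import random

rng = random.Random(32)  # cf. the random_state parameter above

# Hypothetical search space: each name maps to its candidate values.
space = {
    "max_depth": [2, 3, 5, 8],
    "subsample": [0.5, 0.7, 1.0],
}

def sample(space, rng):
    # Uniform sampling: every combination is equally likely.
    return {name: rng.choice(values) for name, values in space.items()}

draws = [sample(space, rng) for _ in range(100)]
```

Because no iteration depends on the results of earlier ones, random search parallelizes trivially, which is why it remains a useful baseline against the surrogate-driven protocols above.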
class hyperparameter_hunter.optimization.BayesianOptimization(**kwargs)

Bases: hyperparameter_hunter.optimization.backends.skopt.protocols.BayesianOptPro

Deprecated since version 3.0.0a2: Will be removed in 3.2.0. Renamed to BayesianOptPro

Attributes
search_space_size

The number of different hyperparameter permutations possible given the current space

source_script

Methods

forge_experiment(self, model_initializer[, …])

Define hyperparameter search scaffold for building Experiments during optimization

get_ready(self)

Prepare for optimization by finalizing hyperparameter space and identifying similar Experiments.

go(self[, force_ready])

Execute hyperparameter optimization, building an Experiment for each iteration

set_dimensions(self)

Locate given hyperparameters that are space choice declarations and add them to dimensions

set_experiment_guidelines(self, *args, …)

Deprecated since version 3.0.0a2.

source_script = None
class hyperparameter_hunter.optimization.GradientBoostedRegressionTreeOptimization(**kwargs)

Bases: hyperparameter_hunter.optimization.backends.skopt.protocols.GradientBoostedRegressionTreeOptPro

Deprecated since version 3.0.0a2: Will be removed in 3.2.0. Renamed to GradientBoostedRegressionTreeOptPro

Attributes
search_space_size

The number of different hyperparameter permutations possible given the current space

source_script

Methods

forge_experiment(self, model_initializer[, …])

Define hyperparameter search scaffold for building Experiments during optimization

get_ready(self)

Prepare for optimization by finalizing hyperparameter space and identifying similar Experiments.

go(self[, force_ready])

Execute hyperparameter optimization, building an Experiment for each iteration

set_dimensions(self)

Locate given hyperparameters that are space choice declarations and add them to dimensions

set_experiment_guidelines(self, *args, …)

Deprecated since version 3.0.0a2.

source_script = None
class hyperparameter_hunter.optimization.RandomForestOptimization(**kwargs)

Bases: hyperparameter_hunter.optimization.backends.skopt.protocols.RandomForestOptPro

Deprecated since version 3.0.0a2: Will be removed in 3.2.0. Renamed to RandomForestOptPro

Attributes
search_space_size

The number of different hyperparameter permutations possible given the current space

source_script

Methods

forge_experiment(self, model_initializer[, …])

Define hyperparameter search scaffold for building Experiments during optimization

get_ready(self)

Prepare for optimization by finalizing hyperparameter space and identifying similar Experiments.

go(self[, force_ready])

Execute hyperparameter optimization, building an Experiment for each iteration

set_dimensions(self)

Locate given hyperparameters that are space choice declarations and add them to dimensions

set_experiment_guidelines(self, *args, …)

Deprecated since version 3.0.0a2.

source_script = None
class hyperparameter_hunter.optimization.ExtraTreesOptimization(**kwargs)

Bases: hyperparameter_hunter.optimization.backends.skopt.protocols.ExtraTreesOptPro

Deprecated since version 3.0.0a2: Will be removed in 3.2.0. Renamed to ExtraTreesOptPro

Attributes
search_space_size

The number of different hyperparameter permutations possible given the current space

source_script

Methods

forge_experiment(self, model_initializer[, …])

Define hyperparameter search scaffold for building Experiments during optimization

get_ready(self)

Prepare for optimization by finalizing hyperparameter space and identifying similar Experiments.

go(self[, force_ready])

Execute hyperparameter optimization, building an Experiment for each iteration

set_dimensions(self)

Locate given hyperparameters that are space choice declarations and add them to dimensions

set_experiment_guidelines(self, *args, …)

Deprecated since version 3.0.0a2.

source_script = None
class hyperparameter_hunter.optimization.DummySearch(**kwargs)

Bases: hyperparameter_hunter.optimization.backends.skopt.protocols.DummyOptPro

Deprecated since version 3.0.0a2: Will be removed in 3.2.0. Renamed to DummyOptPro

Attributes
search_space_size

The number of different hyperparameter permutations possible given the current space

source_script

Methods

forge_experiment(self, model_initializer[, …])

Define hyperparameter search scaffold for building Experiments during optimization

get_ready(self)

Prepare for optimization by finalizing hyperparameter space and identifying similar Experiments.

go(self[, force_ready])

Execute hyperparameter optimization, building an Experiment for each iteration

set_dimensions(self)

Locate given hyperparameters that are space choice declarations and add them to dimensions

set_experiment_guidelines(self, *args, …)

Deprecated since version 3.0.0a2.

source_script = None