pyccea.optimizers package
- class pyccea.optimizers.BinaryGeneticAlgorithm(subpop_size: int, n_features: int, conf: dict)[source]
Bases: object
Binary genetic algorithm.
- Attributes:
- mutation_rate: float
Probability of a mutation occurring.
- crossover_rate: float
Probability of a crossover occurring.
- tournament_sample_size: int
Sample size of the subpopulation that will be used in Tournament Selection.
- selection_method: str
Population update method employed in each generation. It can take one of two values: (i) 'generational', where the entire population is replaced in each generation (except for the 'elite_size' best individuals), or (ii) 'steady-state', where only the two least fit individuals are replaced.
- elite_size: int, optional
Number of individuals preserved from the current generation to the next. They are selected according to fitness: if elite_size is N, the N best individuals of the current generation are carried over to the next generation. This parameter is only used when selection_method is 'generational'.
Methods
- evolve(subpop, fitness): Evolve a subpopulation for a single generation.
- evolve(subpop: ndarray, fitness: list)[source]
Evolve a subpopulation for a single generation.
- Parameters:
- subpop: np.ndarray
Individuals of the subpopulation, where each individual is an array of size equal to the number of features.
- fitness: list
Evaluation of all individuals in the subpopulation.
- Returns:
- next_subpop: np.ndarray
Individuals in the subpopulation of the next generation.
- selection_methods = ['generational', 'steady-state']
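A minimal usage sketch follows. It assumes the configuration dictionary is flat and keyed by the attribute names listed above; the layout actually expected by the package may differ (for example, the parameters may be nested under an optimizer section), so treat the conf keys as illustrative.

import numpy as np
from pyccea.optimizers import BinaryGeneticAlgorithm

# Hypothetical flat configuration; the real conf layout may differ.
conf = {
    "mutation_rate": 0.05,
    "crossover_rate": 0.9,
    "tournament_sample_size": 3,
    "selection_method": "generational",
    "elite_size": 2,
}

subpop_size, n_features = 20, 50
ga = BinaryGeneticAlgorithm(subpop_size=subpop_size, n_features=n_features, conf=conf)

# Random binary subpopulation: one row per individual, one column per feature.
subpop = np.random.randint(0, 2, size=(subpop_size, n_features))
# Placeholder fitness values; in a CCEA these would come from evaluating each
# individual's feature subset with the wrapper model.
fitness = list(np.random.rand(subpop_size))

next_subpop = ga.evolve(subpop, fitness)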
- class pyccea.optimizers.DifferentialEvolution(subpop_size: int, n_features: int, conf: dict)[source]
Bases: object
Differential Evolution (DE) algorithm using the rand/1/exp strategy.
Storn, Rainer, and Kenneth Price. "Differential Evolution - A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces." Journal of Global Optimization 11 (1997): 341-359.
- Attributes:
- scaling_factor: float
The mutation constant, also known in the literature as the differential weight and denoted by F. It should be in the range [0, 2].
- crossover_probability: float
The recombination constant, which should be in the range [0, 1]. In the literature this is also known as the crossover probability. Increasing this value allows a larger number of mutants to progress into the next generation, but at the risk of reduced population stability.
- bounds: tuple[float, float]
Bounds for continuous variables (min, max).
Methods
- evolve(subpop, fitness): Evolve a subpopulation for a single generation.
- evolve(subpop: ndarray, fitness: list)[source]
Evolve a subpopulation for a single generation.
- Parameters:
- subpop: np.ndarray
Individuals of the subpopulation, where each individual is an array of size equal to the number of features.
- fitness: list
Evaluation of all individuals in the subpopulation.
- Returns:
- next_subpop: np.ndarray
Individuals in the subpopulation of the next generation.
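A minimal sketch of one DE generation, again assuming a flat configuration dictionary keyed by the attribute names above (the actual conf layout may differ):

import numpy as np
from pyccea.optimizers import DifferentialEvolution

# Hypothetical flat configuration; the real conf layout may differ.
conf = {
    "scaling_factor": 0.8,        # F, the mutation constant in [0, 2]
    "crossover_probability": 0.7,
    "bounds": (0.0, 1.0),         # (min, max) for the continuous variables
}

subpop_size, n_features = 20, 50
de = DifferentialEvolution(subpop_size=subpop_size, n_features=n_features, conf=conf)

# Continuous subpopulation sampled uniformly within the bounds.
subpop = np.random.uniform(0.0, 1.0, size=(subpop_size, n_features))
fitness = list(np.random.rand(subpop_size))  # placeholder evaluations

next_subpop = de.evolve(subpop, fitness)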
Submodules
pyccea.optimizers.differential_evolution module
- class pyccea.optimizers.differential_evolution.DifferentialEvolution(subpop_size: int, n_features: int, conf: dict)[source]
Bases: object
Differential Evolution (DE) algorithm using the rand/1/exp strategy.
Storn, Rainer, and Kenneth Price. "Differential Evolution - A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces." Journal of Global Optimization 11 (1997): 341-359.
- Attributes:
- scaling_factor: float
The mutation constant, also known in the literature as the differential weight and denoted by F. It should be in the range [0, 2].
- crossover_probability: float
The recombination constant, which should be in the range [0, 1]. In the literature this is also known as the crossover probability. Increasing this value allows a larger number of mutants to progress into the next generation, but at the risk of reduced population stability.
- bounds: tuple[float, float]
Bounds for continuous variables (min, max).
Methods
- evolve(subpop, fitness): Evolve a subpopulation for a single generation.
- evolve(subpop: ndarray, fitness: list)[source]
Evolve a subpopulation for a single generation.
- Parameters:
- subpop: np.ndarray
Individuals of the subpopulation, where each individual is an array of size equal to the number of features.
- fitness: list
Evaluation of all individuals in the subpopulation.
- Returns:
- next_subpop: np.ndarray
Individuals in the subpopulation of the next generation.
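Since evolve advances the subpopulation by exactly one generation, a surrounding loop re-evaluates and re-evolves it repeatedly. The sketch below illustrates this; evaluate_subpop is a hypothetical stand-in for the wrapper evaluation performed by the cooperative coevolutionary loop, and the conf keys are assumptions.

import numpy as np
from pyccea.optimizers.differential_evolution import DifferentialEvolution

def evaluate_subpop(subpop):
    # Hypothetical stand-in: score each individual (higher is better).
    # In PyCCEA this evaluation is driven by the cooperative coevolutionary loop.
    return [float(np.sum(ind)) for ind in subpop]

# Hypothetical flat configuration; the real conf layout may differ.
conf = {"scaling_factor": 0.5, "crossover_probability": 0.9, "bounds": (0.0, 1.0)}
de = DifferentialEvolution(subpop_size=10, n_features=30, conf=conf)

subpop = np.random.uniform(0.0, 1.0, size=(10, 30))
for generation in range(50):
    fitness = evaluate_subpop(subpop)
    subpop = de.evolve(subpop, fitness)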
pyccea.optimizers.genetic_algorithm module
- class pyccea.optimizers.genetic_algorithm.BinaryGeneticAlgorithm(subpop_size: int, n_features: int, conf: dict)[source]
Bases: object
Binary genetic algorithm.
- Attributes:
- mutation_rate: float
Probability of a mutation occurring.
- crossover_rate: float
Probability of a crossover occurring.
- tournament_sample_size: int
Sample size of the subpopulation that will be used in Tournament Selection.
- selection_method: str
Population update method employed in each generation. It can take one of two values: (i) 'generational', where the entire population is replaced in each generation (except for the 'elite_size' best individuals), or (ii) 'steady-state', where only the two least fit individuals are replaced.
- elite_size: int, optional
Number of individuals preserved from the current generation to the next. They are selected according to fitness: if elite_size is N, the N best individuals of the current generation are carried over to the next generation. This parameter is only used when selection_method is 'generational'.
Methods
- evolve(subpop, fitness): Evolve a subpopulation for a single generation.
- evolve(subpop: ndarray, fitness: list)[source]
Evolve a subpopulation for a single generation.
- Parameters:
- subpop: np.ndarray
Individuals of the subpopulation, where each individual is an array of size equal to the number of features.
- fitness: list
Evaluation of all individuals in the subpopulation.
- Returns:
- next_subpop: np.ndarray
Individuals in the subpopulation of the next generation.
- selection_methods = ['generational', 'steady-state']
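The two entries in selection_methods lead to different replacement behaviour. The sketch below runs the same evolution loop under each value; the conf keys and the evaluate_subpop helper are assumptions introduced for illustration only.

import numpy as np
from pyccea.optimizers.genetic_algorithm import BinaryGeneticAlgorithm

def evaluate_subpop(subpop):
    # Hypothetical stand-in for the CCEA's wrapper-based evaluation:
    # here, individuals that select fewer features get higher fitness.
    return [float(-np.sum(ind)) for ind in subpop]

for method in BinaryGeneticAlgorithm.selection_methods:  # 'generational', 'steady-state'
    # Hypothetical flat configuration; the real conf layout may differ.
    conf = {
        "mutation_rate": 0.02,
        "crossover_rate": 0.8,
        "tournament_sample_size": 2,
        "selection_method": method,
        "elite_size": 1,  # only relevant for the 'generational' method
    }
    ga = BinaryGeneticAlgorithm(subpop_size=15, n_features=40, conf=conf)
    subpop = np.random.randint(0, 2, size=(15, 40))
    for generation in range(30):
        fitness = evaluate_subpop(subpop)
        subpop = ga.evolve(subpop, fitness)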