Optimization of a Complex Non-Linear Function Using DEAP

Let’s consider an optimization problem that involves a complex mathematical function: a challenging example where we want to minimize a non-linear, multi-variable function that includes trigonometric and exponential components.

Optimization Problem

We aim to minimize the following complex objective function:

f(x, y, z) = e^(x^2 + y^2 + z^2) + sin(5x) + cos(5y) + tan(z)

where x, y, and z are real-valued variables.
This function has multiple local minima and maxima, making it difficult for traditional optimization techniques to solve efficiently.
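
To see this concretely, here is a small illustration using SciPy’s local optimizer (this is not part of the DEAP example that follows, and the starting points are arbitrary, chosen only for illustration): a gradient-based method may converge to different points depending on where it starts, which is exactly the behavior that motivates a global method like a genetic algorithm.

import numpy as np
from scipy.optimize import minimize

def f(v):
    x, y, z = v
    return np.exp(x**2 + y**2 + z**2) + np.sin(5*x) + np.cos(5*y) + np.tan(z)

# A local optimizer can land in different basins depending on the starting point
for start in ([0.5, 0.5, 0.0], [-1.0, 1.0, -0.5], [2.0, -2.0, 0.9]):
    res = minimize(f, start, bounds=[(-5, 5), (-5, 5), (-1, 1)])
    print(f"start={start} -> x*={np.round(res.x, 3)}, f={res.fun:.4f}")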

Problem Setup

  1. Objective: Minimize the function f(x, y, z).
  2. Constraints:
    • x ∈ [-5, 5]
    • y ∈ [-5, 5]
    • z ∈ [-1, 1] (the range of z is constrained because tan(z) diverges at z = ±π/2 ≈ ±1.571)
  3. Method: We will use a Genetic Algorithm (GA) via the DEAP library to solve this problem.

Genetic algorithms are a powerful tool for global optimization, especially for non-convex and multi-modal problems like this one.

DEAP Implementation

We will use DEAP to define the genetic algorithm, where each individual in the population represents a candidate solution (x, y, z).

The steps for the genetic algorithm are as follows:

  1. Initialization: A population of random individuals (candidate (x, y, z) triples) is created.
  2. Selection: Individuals are selected based on their fitness values, which are computed as the value of the objective function f(x, y, z), the quantity we aim to minimize.
  3. Crossover: Pairs of individuals are selected to produce offspring via crossover.
  4. Mutation: Random mutations are applied to some individuals to introduce new genetic material.
  5. Evolution: Over multiple generations, the population evolves toward better solutions (one generation is sketched just below).
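
Before the full script, here is a conceptual sketch of what a single generation looks like when expressed with DEAP’s lower-level helpers. This is illustrative only: the runnable script below delegates the whole loop to algorithms.eaSimple, which performs these same steps internally, and the pop and toolbox arguments are assumed to be the population and toolbox defined in that script.

from deap import algorithms

def one_generation(pop, toolbox, cxpb=0.5, mutpb=0.2):
    """One generational step, mirroring what algorithms.eaSimple does internally."""
    offspring = toolbox.select(pop, len(pop))            # 2. selection
    offspring = algorithms.varAnd(offspring, toolbox,
                                  cxpb, mutpb)           # 3.-4. crossover and mutation
    for ind in offspring:
        if not ind.fitness.valid:                        # re-evaluate modified individuals
            ind.fitness.values = toolbox.evaluate(ind)
    return offspring                                     # 5. the next generation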

Here is how to solve this problem using DEAP in Python:

import random
import numpy as np
from deap import base, creator, tools, algorithms

# Define the objective function.
# DEAP expects fitness values as tuples -- hence the trailing comma.
def objective(individual):
    x, y, z = individual
    return np.exp(x**2 + y**2 + z**2) + np.sin(5*x) + np.cos(5*y) + np.tan(z),

# Per-variable bounds for x, y, z
BOUND_LOW, BOUND_UP = [-5, -5, -1], [5, 5, 1]

# Create the DEAP optimization environment
creator.create("FitnessMin", base.Fitness, weights=(-1.0,))
creator.create("Individual", list, fitness=creator.FitnessMin)

toolbox = base.Toolbox()

# Sample each gene uniformly within its own bounds, so that z starts in [-1, 1]
def random_genes():
    return [random.uniform(low, up) for low, up in zip(BOUND_LOW, BOUND_UP)]

toolbox.register("individual", tools.initIterate, creator.Individual,
                 random_genes)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

toolbox.register("evaluate", objective)
toolbox.register("select", tools.selTournament, tournsize=3)

# Clamp each variable back into its bounds after crossover and mutation
def checkBounds(individual):
    for i in range(len(individual)):
        if individual[i] < BOUND_LOW[i]:
            individual[i] = BOUND_LOW[i]
        elif individual[i] > BOUND_UP[i]:
            individual[i] = BOUND_UP[i]
    return individual

# Wrap the mating and mutation operators to apply the constraints
def mate_and_checkBounds(ind1, ind2):
    tools.cxBlend(ind1, ind2, alpha=0.5)
    checkBounds(ind1)
    checkBounds(ind2)
    return ind1, ind2

def mutate_and_checkBounds(ind):
    tools.mutGaussian(ind, mu=0, sigma=0.1, indpb=0.2)
    checkBounds(ind)
    return ind,

toolbox.register("mate", mate_and_checkBounds)
toolbox.register("mutate", mutate_and_checkBounds)

# Algorithm parameters
population_size = 100
generations = 200
cx_prob = 0.5   # Crossover probability
mut_prob = 0.2  # Mutation probability

# Run the Genetic Algorithm
def main():
    pop = toolbox.population(n=population_size)
    hof = tools.HallOfFame(1)  # Keep track of the best individual

    stats = tools.Statistics(lambda ind: ind.fitness.values)
    stats.register("avg", np.mean)
    stats.register("min", np.min)

    # Run the GA
    pop, log = algorithms.eaSimple(pop, toolbox, cxpb=cx_prob, mutpb=mut_prob,
                                   ngen=generations, stats=stats,
                                   halloffame=hof, verbose=True)

    # Output the best solution
    best_ind = hof[0]
    print(f"Best individual: {best_ind}")
    print(f"Best fitness: {best_ind.fitness.values[0]}")

if __name__ == "__main__":
    main()

Explanation of the Code

  1. Objective Function: The objective function computes the value of f(x, y, z) for a given individual.
    Because FitnessMin is created with a weight of -1.0, DEAP minimizes this value; note that the function returns a one-element tuple, as DEAP requires.
  2. Individual Representation: Each individual is a list of three floating-point numbers representing x, y, and z.
  3. Constraints: The constraints ensure that the variables x, y, and z stay within their respective bounds during the evolutionary process.
    This is achieved by clamping with the checkBounds function after every crossover and mutation.
  4. Evolutionary Operations (a short demo of these operators follows this list):
    • Crossover: We use a blend crossover (cxBlend), which combines the genetic material from two parent individuals.
    • Mutation: Gaussian mutation (mutGaussian) adds small random perturbations to the individuals.
    • Selection: Tournament selection (selTournament) selects individuals for reproduction based on their fitness.
  5. Population Evolution: The algorithm runs for a defined number of generations, during which the population evolves toward better solutions.
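
To see what the crossover and mutation operators actually do, here is a small standalone demo (independent of the script above; the parent values and the random seed are arbitrary, chosen only for illustration). selTournament is omitted here because it requires individuals with attached fitness values.

import random
from deap import tools

random.seed(42)  # arbitrary seed, for a reproducible illustration

# cxBlend modifies both parents in place, blending each pair of genes
parent1, parent2 = [1.0, 2.0, 0.5], [-1.0, 0.0, -0.5]
tools.cxBlend(parent1, parent2, alpha=0.5)
print("after cxBlend:    ", parent1, parent2)

# mutGaussian perturbs each gene with probability indpb using N(mu, sigma) noise
ind = [1.0, 2.0, 0.5]
tools.mutGaussian(ind, mu=0, sigma=0.1, indpb=0.2)
print("after mutGaussian:", ind)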

Running the Algorithm

When you run the algorithm, DEAP will output the progress of the optimization process, including the best fitness found so far.
The final output will be the best individual (i.e., the values of x, y, and z) and the corresponding minimum value of the objective function.

This example demonstrates how DEAP can be used to tackle complex mathematical optimization problems with non-linear, multi-variable functions.

Result

Best individual: [-1.0229844156448218, 1.0256973498197954, 1.5785811632285052]
Best fitness: -28.587191470057036

The result shows the best solution the genetic algorithm found for the given optimization problem.

The best individual contains the values of the variables (x, y, z) that minimize the objective function:

  • x = -1.023
  • y = 1.026
  • z = 1.579

The corresponding best fitness (the minimum value of the objective function f(x, y, z) that the algorithm found) is approximately -28.59.

This is the lowest value found by the algorithm. One caveat is worth noting: the reported z ≈ 1.579 lies outside the stated bound of [-1, 1]. This can happen when every gene is initialized from [-5, 5], as in the run that produced this output, so individuals can start (and remain) out of bounds; the large negative fitness comes almost entirely from tan(z) diverging just past z = π/2 ≈ 1.571. The per-gene initialization in the script above keeps z within [-1, 1] from the start, so a rerun will report a bound-respecting optimum.
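
As a sanity check, we can plug the reported individual back into the objective; this reproduces the printed fitness and shows how the tan(z) term dominates (the coordinates below are copied from the run above):

import numpy as np

x, y, z = -1.0229844156448218, 1.0256973498197954, 1.5785811632285052

terms = {
    "exp(x^2 + y^2 + z^2)": np.exp(x**2 + y**2 + z**2),
    "sin(5x)": np.sin(5*x),
    "cos(5y)": np.cos(5*y),
    "tan(z)": np.tan(z),  # z is just past pi/2, so tan(z) is a large negative number
}
for name, value in terms.items():
    print(f"{name:>22} = {value: .4f}")
print(f"{'f(x, y, z)':>22} = {sum(terms.values()): .4f}")  # approximately -28.587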