Trade-off between Efficiency and Cost in Engineering Design using DEAP

In engineering design, balancing efficiency and cost is a common challenge.

Improving efficiency often leads to increased costs, and minimizing costs may reduce efficiency.

This is a typical multi-objective optimization problem where we aim to optimize both objectives simultaneously.

In this example, we’ll use the DEAP library to solve a simplified engineering design problem where efficiency and cost conflict.

Problem Definition

Consider the design of a mechanical component, such as a turbine blade, where the goals are:

  1. Efficiency: Maximizing the energy output (performance of the blade).
  2. Cost: Minimizing the production cost of the blade.

These two objectives conflict because increasing efficiency may require using more expensive materials, more precise manufacturing, or complex design techniques, which increase costs.

Genetic Algorithm Approach

A multi-objective genetic algorithm (MOGA) can effectively handle this trade-off.
MOGA aims to find a set of solutions called the Pareto front, where no solution can be improved in one objective without worsening the other.
Instead, it provides a set of “compromise” solutions that balance efficiency and cost.
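
To make the idea of a non-dominated solution concrete, here is a minimal, illustrative dominance check for our two objectives (a hypothetical helper, not a DEAP API; NSGA-II performs this comparison internally):

# Minimal sketch: Pareto dominance for (maximize efficiency, minimize cost).
# Hypothetical helper for illustration only; DEAP's NSGA-II does this internally.
def dominates(a, b):
    """Return True if design a = (efficiency, cost) dominates design b."""
    at_least_as_good = a[0] >= b[0] and a[1] <= b[1]
    strictly_better = a[0] > b[0] or a[1] < b[1]
    return at_least_as_good and strictly_better

print(dominates((10.0, 500.0), (8.0, 600.0)))   # True: better on both objectives
print(dominates((10.0, 500.0), (12.0, 400.0)))  # False: the second design is better on both
print(dominates((10.0, 500.0), (12.0, 600.0)))  # False: a genuine trade-off, neither dominates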

Objective Functions

  1. Efficiency (maximize): This could depend on factors such as material properties, shape, and operational parameters.
  2. Cost (minimize): This could include material costs, manufacturing complexity, and maintenance expenses.

DEAP Implementation

We will represent the design as a vector of variables that affect both efficiency and cost, such as material thickness, blade curvature, and surface area.
The fitness function will evaluate both objectives simultaneously.

Example Code using DEAP

import random
import numpy as np
from deap import base, creator, tools, algorithms

# Define the number of design variables
NUM_DESIGN_VARIABLES = 3  # For example, thickness, curvature, surface area

# Create an individual as a list of design variables (continuous values)
def generate_individual():
    return [random.uniform(0.5, 5.0),   # Thickness
            random.uniform(0, 90),      # Curvature (in degrees)
            random.uniform(1.0, 10.0)]  # Surface area

# Efficiency function: a nonlinear combination of the design variables
def efficiency(individual):
    thickness, curvature, surface_area = individual
    return thickness * np.cos(np.radians(curvature)) * surface_area

# Cost function: assume higher efficiency leads to higher costs
def cost(individual):
    thickness, curvature, surface_area = individual
    return 100 * thickness + 50 * curvature + 150 * surface_area

# Fitness function evaluating both objectives at once
def evaluate(individual):
    eff = efficiency(individual)
    cst = cost(individual)
    return eff, cst  # Efficiency is to be maximized, cost minimized

# Set up the DEAP framework for multi-objective optimization
creator.create("FitnessMulti", base.Fitness, weights=(1.0, -1.0))  # Maximize efficiency, minimize cost
creator.create("Individual", list, fitness=creator.FitnessMulti)

toolbox = base.Toolbox()
toolbox.register("individual", tools.initIterate, creator.Individual, generate_individual)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

# Register the genetic operators
toolbox.register("mate", tools.cxBlend, alpha=0.5)  # Blend crossover for real numbers
toolbox.register("mutate", tools.mutGaussian, mu=0, sigma=1, indpb=0.2)  # Gaussian mutation
toolbox.register("select", tools.selNSGA2)  # NSGA-II selection
toolbox.register("evaluate", evaluate)

# Parameters
POPULATION_SIZE = 100
CXPB, MUTPB, NGEN = 0.7, 0.2, 50  # Crossover probability, mutation probability, generations

def main():
    # Create an initial population
    population = toolbox.population(n=POPULATION_SIZE)

    # Use a Pareto-front hall of fame to store the non-dominated individuals
    pareto_front = tools.ParetoFront()

    # Statistics for tracking progress
    stats = tools.Statistics(lambda ind: ind.fitness.values)
    stats.register("avg", np.mean)
    stats.register("std", np.std)
    stats.register("min", np.min)
    stats.register("max", np.max)

    # Run the genetic algorithm
    algorithms.eaMuPlusLambda(population, toolbox, mu=POPULATION_SIZE, lambda_=POPULATION_SIZE,
                              cxpb=CXPB, mutpb=MUTPB, ngen=NGEN, stats=stats,
                              halloffame=pareto_front, verbose=True)

    # Output the Pareto front
    print("Pareto front:")
    for ind in pareto_front:
        print("Efficiency: {:.2f}, Cost: {:.2f}".format(ind.fitness.values[0], ind.fitness.values[1]))
        print("Design Variables: {}".format(ind))

if __name__ == "__main__":
    main()

Explanation of the Code

  1. Design Variables: We model the component using three design variables:

    • Thickness: Influences both efficiency and cost.
    • Curvature: Affects the aerodynamics and energy efficiency.
    • Surface Area: Larger areas can improve performance but also increase production costs.
  2. Efficiency Function: This function models the performance of the design based on the three variables.
    It uses a simplified physical model in which efficiency is the product of thickness, the cosine of the curvature angle, and surface area: $\text{efficiency} = t \cdot \cos(\theta) \cdot A$.

  3. Cost Function: The cost grows linearly with thickness, curvature, and surface area: $\text{cost} = 100t + 50\theta + 150A$.
    This reflects the real-world trade-off where improving efficiency generally incurs higher production costs.

  4. Fitness Function: The fitness function returns two objectives:

    • Maximize efficiency: We aim to maximize this value.
    • Minimize cost: This is the second objective, which we aim to minimize.

    These objectives are optimized simultaneously using NSGA-II (Non-dominated Sorting Genetic Algorithm II), a well-known multi-objective algorithm.

  5. Genetic Operators:

    • Crossover: A blend crossover operator (cxBlend) is used, which mixes the design variables between two parents to produce offspring.
    • Mutation: Gaussian mutation is applied to introduce randomness into the design variables, helping the algorithm explore new solutions.
  6. Pareto Front: The algorithm tracks the Pareto front, the set of non-dominated solutions: no member of the set can be improved in one objective without worsening the other.
    These solutions represent different trade-offs between efficiency and cost, as the sketch below illustrates.
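
As a small illustration of the non-dominated sorting that underlies NSGA-II, the following sketch (which assumes the creator classes and toolbox from the script above are already defined) extracts the first non-dominated front of a random population with tools.sortNondominated:

# Sketch: evaluate a small random population and pull out its first
# non-dominated front. Reuses the toolbox from the main script above.
pop = toolbox.population(n=20)
for ind in pop:
    ind.fitness.values = toolbox.evaluate(ind)

first_front = tools.sortNondominated(pop, k=len(pop), first_front_only=True)[0]
for ind in first_front:
    print("Efficiency: {:.2f}, Cost: {:.2f}".format(*ind.fitness.values))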

Running the Code

When you run the code, the genetic algorithm will evolve a population of design solutions over 50 generations.
At the end of the process, it outputs the Pareto front – a set of designs that offer different trade-offs between efficiency and cost.
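
A scatter plot often makes the trade-off easier to read than the printed list. The helper below is a hypothetical addition, not part of the original script (it assumes matplotlib is installed); it can be called with the ParetoFront hall of fame built in main():

# Hypothetical visualization helper (assumes matplotlib is available).
import matplotlib.pyplot as plt

def plot_pareto(pareto_front):
    costs = [ind.fitness.values[1] for ind in pareto_front]
    effs = [ind.fitness.values[0] for ind in pareto_front]
    plt.scatter(costs, effs)
    plt.xlabel("Cost (minimize)")
    plt.ylabel("Efficiency (maximize)")
    plt.title("Pareto front: efficiency vs. cost")
    plt.show()

# e.g. plot_pareto(pareto_front) at the end of main()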

Output

gen	nevals	avg    	std    	min      	max    
0 100 1694.92 1930.78 0.0220808 6106.97
1 95 1216.6 1379.71 -556.024 4822.95
2 91 989.015 1139.21 -556.024 5160.68
3 93 906.303 1114.18 -627.209 5160.68
4 83 801.275 1008.49 -1452.56 3522.68
5 84 750.042 1039.61 -1452.56 3522.68
6 93 633.209 1023.72 -1759.44 3258.58
7 90 558.814 1130.58 -1806.77 3514.39
8 89 441.986 1128.78 -2248.18 3958.41
9 90 281.681 1066.59 -2322.65 3958.41
10 88 238.438 978.101 -2322.65 3873.91
11 88 176.682 986.951 -2322.65 3873.91
12 91 171.723 926.049 -2322.65 4127.12
13 92 156.898 968.163 -2322.65 3718.86
14 90 155.247 921.496 -2454.63 3464.56
15 87 128.905 791.008 -2454.63 2861.14
16 95 205.523 863.256 -2834.83 3322.04
17 89 249.897 971.868 -2950.3 3322.04
18 84 251.091 1044.93 -2985.4 3322.04
19 88 335.66 1148.03 -2985.4 3806.44
20 92 377.907 1083.54 -3368.94 3806.44
21 93 326.139 1045.74 -3525.31 3808.52
22 92 348.279 1183.64 -4327.56 3806.44
23 91 413.936 1325.83 -4327.56 4587.94
24 90 235.403 1446.65 -5081.42 4587.94
25 88 553.745 1764.08 -6772.13 7445.15
26 96 974.358 1651.57 -6772.13 7445.15
27 91 1035.65 2077.58 -9599.08 7445.15
28 96 1028.19 2584.65 -10369 7445.15
29 89 1340.58 2799.72 -12578.6 9968.12
30 90 1618.4 3141.3 -12578.6 8770.64
31 94 1503.09 3887.99 -20775.7 9332.01
32 93 1284.27 4768.44 -20775.7 10349.3
33 91 430.417 6569.43 -32427.1 11709.2
34 93 -597.205 9104.26 -32427.1 15394.4
35 94 -2437.79 11763 -48140.6 20016.7
36 90 -4753.3 15631.8 -48362.8 30094.8
37 89 -7289.55 19229.7 -53004.1 34858.6
38 84 -7607.14 23784 -63930 52172.4
39 86 -9493.27 28775 -81741.6 53654.7
40 94 -18918.3 30893.5 -114207 58927
41 94 -22416 35691.5 -114207 78662
42 97 -25417.7 42838.6 -147441 136395
43 91 -26824.1 50122.6 -147441 190169
44 92 -32255.8 55770.6 -149595 190169
45 87 -38938.2 62594.7 -166308 190169
46 85 -48886.7 67498.4 -250660 72198.6
47 92 -53531.1 79835.7 -250660 83679.5
48 90 -60682.3 92425.1 -267639 110325
49 95 -66294.6 112981 -330910 199616
50 87 -70360.9 133499 -380538 298117
Pareto front:
Efficiency: 298116.94, Cost: -380538.01
Design Variables: [-162.47771317148818, -40.55933133210618, -2415.0817857811157]

Explanation of the Output

The results displayed represent the output of a genetic algorithm (GA) run for 50 generations.

Here is a breakdown of what each column represents:

  1. gen: The current generation number of the GA.
  2. nevals: The number of evaluations performed in each generation (the number of individuals evaluated).
  3. avg: The average of the recorded fitness values in that generation. Note that, as registered above, the statistics pool both objectives into a single array, so avg, std, min, and max mix the efficiency and cost scales.
  4. std: The standard deviation of the pooled fitness values, indicating how spread out they are.
  5. min: The smallest value recorded across both objectives in that generation.
  6. max: The largest value recorded across both objectives in that generation.
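
If you prefer the columns reported per objective instead of pooled, the numpy reducers can be registered with axis=0 (a pattern shown in the DEAP documentation); each statistic then becomes a pair [efficiency, cost]:

# Per-objective statistics: each logged column becomes [efficiency, cost].
stats = tools.Statistics(lambda ind: ind.fitness.values)
stats.register("avg", np.mean, axis=0)
stats.register("std", np.std, axis=0)
stats.register("min", np.min, axis=0)
stats.register("max", np.max, axis=0)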

Key Insights:

  • Generation 0 starts with a population whose pooled values span a wide range, from 0.0220808 (min) to 6106.97 (max).
  • Over the first 20 generations, both the average and the maximum values decrease.
    Because cost is one of the pooled values and is being minimized, falling numbers here partly reflect selection pressure toward cheaper designs rather than a worsening population.
  • From around generation 25, the spread of values grows sharply in both directions: the unbounded cxBlend and mutGaussian operators let the design variables drift far outside their initial sampling ranges, producing ever more extreme efficiency and cost values.
  • By the final generation (50), the pooled values range from -380538 to 298117 with a very negative average (avg = -70360.9).
    The individual reported on the Pareto front has an efficiency of 298116.94 and a cost of -380538.01.

Pareto Front Result:

  • The algorithm reports a single Pareto-optimal solution with an efficiency of 298116.94 and a highly negative cost (-380538.01). A negative cost is not physically meaningful; it is an artifact of the unconstrained search, and this one extreme individual dominates every other design in both objectives, which is why the printed front contains a single point.
  • The design variables behind this point, [-162.48, -40.56, -2415.08], lie far outside the ranges sampled by generate_individual (thickness 0.5-5.0, curvature 0-90 degrees, surface area 1.0-10.0), confirming that the variation operators pushed the search out of the feasible region; see the bounds-enforcement sketch below.
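
One common fix, following the bounds-checking decorator pattern from the DEAP documentation, is to clamp offspring back into the feasible region after crossover and mutation. The sketch below assumes the ranges used in generate_individual; it was not part of the run that produced the output above:

# Sketch: clamp design variables to their physical ranges after variation.
# LOWER/UPPER mirror the sampling ranges used in generate_individual.
LOWER = [0.5, 0.0, 1.0]
UPPER = [5.0, 90.0, 10.0]

def check_bounds(lower, upper):
    def decorator(func):
        def wrapper(*args, **kwargs):
            offspring = func(*args, **kwargs)
            for child in offspring:
                for i in range(len(child)):
                    child[i] = min(max(child[i], lower[i]), upper[i])
            return offspring
        return wrapper
    return decorator

toolbox.decorate("mate", check_bounds(LOWER, UPPER))
toolbox.decorate("mutate", check_bounds(LOWER, UPPER))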

In summary, the GA has explored a variety of trade-offs between efficiency and cost, but without bound constraints the search ultimately favored physically infeasible extremes.

The reported Pareto solution is the best trade-off found under this unconstrained setup; with the bounds enforcement sketched above, the front would instead contain feasible designs.

Conclusion

This example demonstrates how DEAP and multi-objective genetic algorithms can be applied to solve trade-offs between efficiency and cost in engineering design.

The Pareto front provides a set of optimal solutions that reflect different compromises between the two objectives, allowing decision-makers to select the design that best fits their specific requirements.