## Trade-off between Efficiency and Cost in Engineering Design using DEAP

In engineering design, balancing **efficiency** and **cost** is a common challenge.

Improving efficiency often leads to increased costs, and minimizing costs may reduce efficiency.

This is a typical **multi-objective optimization** problem where we aim to optimize both objectives simultaneously.

In this example, we’ll use the DEAP library to solve a simplified engineering design problem where efficiency and cost conflict.

### Problem Definition

Consider the design of a mechanical component, such as a turbine blade, where the goals are:

- **Efficiency**: maximizing the energy output (performance of the blade).
- **Cost**: minimizing the production cost of the blade.

These two objectives conflict because increasing efficiency may require using more expensive materials, more precise manufacturing, or complex design techniques, which increase costs.

### Genetic Algorithm Approach

A **multi-objective genetic algorithm (MOGA)** can effectively handle this trade-off. A MOGA aims to find a set of solutions called the **Pareto front**, where no single solution is clearly better than the others in all objectives.

Instead, it provides a set of “compromise” solutions that balance efficiency and cost.

### Objective Functions

- **Efficiency (maximize)**: this could depend on factors such as material properties, shape, and operational parameters.
- **Cost (minimize)**: this could include material costs, manufacturing complexity, and maintenance expenses.
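
To make this concrete, the two objectives can be sketched as plain Python functions. The cosine factor mirrors the simplified physical model used later in this example; the specific cost coefficients (1.5, 2.0, 0.5) are assumptions for illustration only:

```python
import math

def efficiency(thickness, curvature, area):
    # Simplified model: performance grows with thickness and surface area,
    # modulated by a cosine factor capturing the effect of curvature.
    return thickness * math.cos(curvature) * area

def cost(thickness, curvature, area):
    # Cost rises with material use (thickness, area) and with manufacturing
    # complexity (curvature); the 1.5/2.0/0.5 weights are made-up assumptions.
    return 1.5 * thickness + 2.0 * abs(curvature) + 0.5 * area

print(efficiency(2.0, 0.0, 3.0))  # -> 6.0
print(cost(2.0, 0.0, 3.0))        # -> 4.5
```

Note the conflict: increasing `thickness` or `area` raises both functions at once, so neither objective can be improved indefinitely without hurting the other.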

### DEAP Implementation

We will represent the design as a vector of variables that affect both efficiency and cost, such as material thickness, blade curvature, and surface area.

The fitness function will evaluate both objectives simultaneously.

### Example Code using DEAP

```python
import random
```

### Explanation of the Code

**Design Variables**: We model the component using three design variables:
- **Thickness**: influences both efficiency and cost.
- **Curvature**: affects the aerodynamics and energy efficiency.
- **Surface Area**: larger areas can improve performance but also increase production costs.

**Efficiency Function**: models the performance of the design based on the three variables, using a simplified physical model in which efficiency is the product of thickness, a cosine factor of the curvature, and the surface area.

**Cost Function**: the cost increases with larger thickness, curvature, and surface area, reflecting the real-world trade-off where improving efficiency generally incurs higher production costs.

**Fitness Function**: returns the two objectives together:
- **Maximize efficiency**: we aim to maximize this value.
- **Minimize cost**: the second objective, which we aim to minimize.

These objectives are optimized simultaneously using **NSGA-II** (Non-dominated Sorting Genetic Algorithm II), a well-known multi-objective algorithm.

**Genetic Operators**:
- **Crossover**: a blend crossover operator (`cxBlend`) mixes the design variables of two parents to produce offspring.
- **Mutation**: Gaussian mutation introduces randomness into the design variables, helping the algorithm explore new solutions.

**Pareto Front**: The algorithm tracks the Pareto front, a set of solutions in which no individual is strictly better than another in both objectives.

These solutions represent different trade-offs between efficiency and cost.

### Running the Code

When you run the code, the genetic algorithm evolves a population of design solutions over 50 generations.

At the end of the process, it outputs the **Pareto front** – a set of designs that offer different trade-offs between efficiency and cost.

### Output

```
gen nevals avg std min max
```

### Explanation of the Output

The results displayed represent the output of a **genetic algorithm (GA)** run for 50 generations.

Here is a breakdown of what each column represents:

- **gen**: the current generation number of the GA.
- **nevals**: the number of evaluations performed in that generation (the number of individuals evaluated).
- **avg**: the average fitness value of the population in that generation.
- **std**: the standard deviation of the fitness values, indicating the variation in fitness among individuals.
- **min**: the minimum fitness value in the population for that generation (the worst-performing individual).
- **max**: the maximum fitness value in the population for that generation (the best-performing individual).

### Key Insights:

- **Generation 0** starts with a population that has a wide range of fitness values, from very low (`min = 0.0220808`) to very high (`max = 6106.97`).
- Over the first 20 generations, both the **average fitness** and the **best fitness (max)** decrease. This indicates that the algorithm is exploring lower-performing areas of the search space, which might be necessary to escape local optima.
- Starting from around generation 25, there is a noticeable increase in the **max fitness**, reaching higher values as the algorithm converges toward more optimal solutions.
- By the final generation (50), the fitness values exhibit a wide range, with a very negative **average fitness** (`avg = -70360.9`), indicating the exploration of regions with both very poor and very high efficiency-cost trade-offs.
- The best-performing individual has an **efficiency of 298116.94** and a **cost of -380538.01**.

### Pareto Front Result:

- The algorithm has identified a Pareto-optimal solution where the **efficiency** is 298116.94 and the **cost** is highly negative (-380538.01), reflecting an extreme trade-off.
- The design variables that produced this efficiency-cost combination are `[-162.48, -40.56, -2415.08]`. These values likely represent an unconventional or extreme design in the context of the problem.

In summary, the GA has found a variety of trade-offs between efficiency and cost, with some solutions providing high efficiency at high costs, and others showing extreme variations.

The Pareto front solution reflects one of the best trade-offs found during the optimization process.

### Conclusion

This example demonstrates how DEAP and multi-objective genetic algorithms can be applied to solve trade-offs between efficiency and cost in engineering design.

The Pareto front provides a set of optimal solutions that reflect different compromises between the two objectives, allowing decision-makers to select the design that best fits their specific requirements.