Interactive demos powered by the DEAP Python library
Evolve a binary string to maximise the count of 1s — the classic GA baseline.
Each cell = one gene: filled = 1, empty = 0.
Minimise the Rastrigin function in 2D — a classic multi-modal benchmark full of local minima.
Each frame shows where individuals are in the search space.
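The function being minimised can be written down directly. A minimal sketch of the 2-D Rastrigin function (the formula is standard; only the helper name here is illustrative):

```python
import math

# 2-D Rastrigin: f(x, y) = 20 + x^2 - 10*cos(2*pi*x) + y^2 - 10*cos(2*pi*y).
# The quadratic bowl pulls toward the origin, but the cosine terms carve a
# grid of local minima around every integer lattice point.
def rastrigin(x, y):
    return (20
            + x * x - 10 * math.cos(2 * math.pi * x)
            + y * y - 10 * math.cos(2 * math.pi * y))

print(rastrigin(0.0, 0.0))            # → 0.0 (the global minimum)
print(round(rastrigin(1.0, 1.0), 6))  # → 2.0 (a nearby local minimum)
```

The many local minima are exactly what makes gradient-free, population-based search attractive here: a hill-climber started near (1, 1) gets stuck, while a GA population can still hold individuals near the true optimum.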
The GA evolves moving-average crossover and RSI exit parameters. Fitness is two-objective, optimised with NSGA-II: maximise the Sharpe ratio and minimise the maximum drawdown. Walk-forward validation separates in-sample (IS) data from out-of-sample (OOS) data.
A large gap between high IS performance and low OOS performance reveals overfitting.
Strategy performance on data the GA never trained on.
Non-dominated solutions from the last window's final generation. Each point is a viable strategy.
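The "non-dominated" idea behind that final-generation front can be sketched without any library: a strategy stays on the front if no other strategy beats it on both objectives at once. The `(sharpe, drawdown)` pairs below are made-up illustrative values, not output from the demo:

```python
# Hypothetical sketch of Pareto dominance for (Sharpe, max drawdown) pairs:
# Sharpe is maximised, drawdown is minimised.
def dominates(a, b):
    """True if a is at least as good as b in both objectives and strictly better in one."""
    sharpe_a, dd_a = a
    sharpe_b, dd_b = b
    return (sharpe_a >= sharpe_b and dd_a <= dd_b) and (sharpe_a > sharpe_b or dd_a < dd_b)

def pareto_front(points):
    """Keep only the points that no other point dominates."""
    return [p for p in points if not any(dominates(q, p) for q in points if q is not p)]

strategies = [(1.2, 0.10), (0.8, 0.05), (1.5, 0.20), (1.0, 0.15)]
print(pareto_front(strategies))  # → [(1.2, 0.1), (0.8, 0.05), (1.5, 0.2)]
```

Here (1.0, 0.15) drops out because (1.2, 0.10) has both a higher Sharpe and a lower drawdown; the three survivors are the trade-off curve the scatter plot shows. DEAP's `tools.selNSGA2` applies this same dominance test (plus crowding distance) during selection.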
Genetic Algorithms (GAs) are search & optimisation algorithms inspired by natural evolution. They work by maintaining a population of candidate solutions and applying selection, crossover, and mutation to improve them over generations.
Better individuals are more likely to reproduce. Tournament selection picks the best out of a random subset of size k.
Two parents swap parts of their genome. Two-point crossover splits each parent at two random points and exchanges the middle segments.
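The swap can be sketched with fixed cut points (DEAP's `tools.cxTwoPoint` does the same thing but chooses the two points at random):

```python
# Two-point crossover: both parents are cut at the same two positions
# and the middle segments are exchanged.
def two_point_crossover(p1, p2, cut1, cut2):
    c1 = p1[:cut1] + p2[cut1:cut2] + p1[cut2:]
    c2 = p2[:cut1] + p1[cut1:cut2] + p2[cut2:]
    return c1, c2

a = [0] * 8
b = [1] * 8
c1, c2 = two_point_crossover(a, b, 2, 5)
print(c1)  # → [0, 0, 1, 1, 1, 0, 0, 0]
print(c2)  # → [1, 1, 0, 0, 0, 1, 1, 1]
```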
Random small changes prevent the population from converging prematurely and help escape local optima. Controlled by mut_prob.
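For binary genomes this is flip-bit mutation, which can be sketched in one line (DEAP's `tools.mutFlipBit` behaves the same way, with `indpb` as the per-gene flip probability):

```python
import random

# Flip-bit mutation: each gene flips independently with probability indpb.
def flip_bit_mutation(ind, indpb=0.05):
    return [1 - g if random.random() < indpb else g for g in ind]

# Edge cases: indpb=0 leaves the genome untouched, indpb=1 flips every bit.
print(flip_bit_mutation([0, 1, 0, 1], indpb=1.0))  # → [1, 0, 1, 0]
```

Note that `indpb` is per gene, while `mut_prob` above is per individual: first the GA decides whether to mutate an individual at all, then each of its genes flips with probability `indpb`.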
| Parameter | Effect | Typical range |
|---|---|---|
| Population size | Diversity vs. compute cost | 50–500 |
| Crossover prob. | How often parents exchange genes | 0.5–0.9 |
| Mutation prob. | Exploration vs. exploitation | 0.01–0.3 |
| Generations | Budget for improvement | 20–500+ |
```python
import random

from deap import base, creator, tools

# 1. Define what "fitness" means
creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

# 2. Register operators in a Toolbox
tb = base.Toolbox()
tb.register("attr_bool", random.randint, 0, 1)
tb.register("individual", tools.initRepeat, creator.Individual, tb.attr_bool, n=50)
tb.register("population", tools.initRepeat, list, tb.individual)

# 3. Evaluation function
def evaluate(ind):
    return (sum(ind),)  # OneMax = count of 1s

tb.register("evaluate", evaluate)
tb.register("mate", tools.cxTwoPoint)
tb.register("mutate", tools.mutFlipBit, indpb=0.05)
tb.register("select", tools.selTournament, tournsize=3)

# 4. Run the evolutionary loop
pop = tb.population(n=100)
for ind in pop:
    ind.fitness.values = tb.evaluate(ind)

for gen in range(40):
    offspring = tb.select(pop, len(pop))
    offspring = list(map(tb.clone, offspring))
    for c1, c2 in zip(offspring[::2], offspring[1::2]):
        if random.random() < 0.7:
            tb.mate(c1, c2)
            del c1.fitness.values, c2.fitness.values
    for mut in offspring:
        if random.random() < 0.2:
            tb.mutate(mut)
            del mut.fitness.values
    # Re-evaluate only the individuals whose fitness was invalidated
    for ind in offspring:
        if not ind.fitness.valid:
            ind.fitness.values = tb.evaluate(ind)
    pop[:] = offspring
```