Example for differential evolutionary regression

Example demonstrating the use of Cartesian genetic programming for a regression task that involves numeric constants. Local gradient-based search is used to determine numeric leaf values of the graph.

References:

  • Topchy, A., & Punch, W. F. (2001). Faster genetic programming based on local gradient search of numeric leaf values. In Proceedings of the genetic and evolutionary computation conference (GECCO-2001) (Vol. 155162). Morgan Kaufmann San Francisco, CA, USA.

  • Izzo, D., Biscani, F., & Mereta, A. (2017). Differentiable genetic programming. In European Conference on Genetic Programming (pp. 35-51). Springer, Cham.

# The docopt str is added explicitly to ensure compatibility with
# sphinx-gallery.
docopt_str = """
  Usage:
    example_differential_evo_regression.py [--max-generations=<N>]

  Options:
    -h --help
    --max-generations=<N>  Maximum number of generations [default: 500]

"""

import functools

import matplotlib.pyplot as plt
import numpy as np
import scipy.constants
import torch
from docopt import docopt

import cgp

args = docopt(docopt_str)

We first define the target function. Note that this function contains numeric constants (1.0 and π) which are initially not available to the search.

def f_target(x):
    return x[:, 0] ** 2 + 1.0 + np.pi
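
As a quick sanity check (an addition to the original example), evaluating the target on a few sample points makes the constant offset explicit: a perfect solution is equivalent to x_0**2 + (1.0 + π) with 1.0 + π ≈ 4.1416, and this constant has to be discovered by the search.

# Added for illustration: the constant hidden in the target function.
x_check = np.array([[0.0], [1.0], [2.0]])
print(f_target(x_check))  # [4.14159265 5.14159265 8.14159265]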

Then we define the differentiable(!) objective function for the evolution. It consists of an inner objective which accepts a torch tensor as input and computes the mean-squared error between the expression represented by a given individual and the target function, evaluated on a set of random points. This inner objective is then used by the actual objective function to update the fitness of the individual.

def inner_objective(f, seed):
    """Return a differentiable loss of the differentiable graph f. Used
    for calculating the fitness of each individual and for the local
    search of numeric leaf values.

    """

    torch.manual_seed(seed)
    batch_size = 500
    x = torch.DoubleTensor(batch_size, 1).uniform_(-5, 5)
    y = f(x)
    return torch.nn.MSELoss()(f_target(x), y[:, 0])


def objective(individual, seed):
    """Objective function of the regression task."""

    if not individual.fitness_is_None():
        return individual

    f = individual.to_torch()
    loss = inner_objective(f, seed)
    individual.fitness = -loss.item()

    return individual
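
As a minimal sanity check (not part of the original example), the inner objective can be exercised with a hand-written torch callable that already matches the target; the resulting loss is zero, corresponding to the maximal fitness of 0. The seed value 1234 is arbitrary here.

# Added for illustration: a callable that exactly reproduces the target.
# inner_objective expects f to return a 2d tensor and reads column 0.
def f_exact(x):
    return f_target(x).unsqueeze(1)


print(inner_objective(f_exact, seed=1234).item())  # 0.0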

Next, we define the parameters for the population, the genome of individuals, the evolutionary algorithm, and the local search.

seed = 1234
genome_params = {"primitives": (cgp.Add, cgp.Sub, cgp.Mul, cgp.Parameter)}

# apply local search only to the top two individuals
ea_params = {"k_local_search": 2}

evolve_params = {"max_generations": int(args["--max-generations"]), "termination_fitness": -1e-8}

# use an uneven number of gradient steps so they can not easily
# average out for clipped values
local_search_params = {"lr": 1e-3, "gradient_steps": 9}
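
For intuition, the following sketch shows what such a local search amounts to for a single numeric leaf value, using plain torch and vanilla SGD with the parameters defined above. This is only an illustration under assumptions; cgp.local_search.gradient_based performs the corresponding updates on the Parameter nodes of each individual and may differ in details such as the clipping mentioned above.

# Illustrative sketch (assumptions, not library internals): a few SGD steps
# on a single constant c in the expression x**2 + c, driven by the same MSE
# loss as inner_objective.
c = torch.zeros(1, dtype=torch.double, requires_grad=True)
optimizer = torch.optim.SGD([c], lr=local_search_params["lr"])
x_local = torch.DoubleTensor(500, 1).uniform_(-5, 5)
for _ in range(local_search_params["gradient_steps"]):
    optimizer.zero_grad()
    loss = torch.nn.MSELoss()(x_local[:, 0] ** 2 + c, f_target(x_local))
    loss.backward()
    optimizer.step()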

We then create a Population instance and instantiate the local search and evolutionary algorithm.

pop = cgp.Population(genome_params=genome_params)

# define the function for local search; parameters such as the
# learning rate and number of gradient steps are fixed via the use
# of `partial`; the `local_search` function should only receive a
# population of individuals as input
local_search = functools.partial(
    cgp.local_search.gradient_based,
    objective=functools.partial(inner_objective, seed=seed),
    **local_search_params,
)

ea = cgp.ea.MuPlusLambda(**ea_params, local_search=local_search)

We define a recording callback closure for bookkeeping of the progression of the evolution.

history = {}
history["champion"] = []
history["fitness_parents"] = []


def recording_callback(pop):
    history["champion"].append(pop.champion)
    history["fitness_parents"].append(pop.fitness_parents())


obj = functools.partial(objective, seed=seed)

Finally, we call the evolve method to perform the evolutionary search.

cgp.evolve(obj, pop, ea, **evolve_params, print_progress=True, callback=recording_callback)

Out:

[2/500] max fitness: -186.49093806669683
[3/500] max fitness: -182.39212868561245
[4/500] max fitness: -178.42834902668957
[5/500] max fitness: -157.45432741168432
[6/500] max fitness: -150.548499777144
[7/500] max fitness: -144.12334156650218
[8/500] max fitness: -138.14539652986417
[9/500] max fitness: -132.58353708772972
[10/500] max fitness: -127.40880224750158
[11/500] max fitness: -122.59424680156346
[12/500] max fitness: -118.11480102169223
[13/500] max fitness: -113.94714011922306
[14/500] max fitness: -110.06956279123735
[15/500] max fitness: -106.46187822035469
[16/500] max fitness: -103.10530093972936
[17/500] max fitness: -99.9823530158055
[18/500] max fitness: -97.07677303949006
[19/500] max fitness: -82.80212012192032
[20/500] max fitness: -80.28021775037736
[21/500] max fitness: -78.01722486172258
[22/500] max fitness: -75.9865606806311
[23/500] max fitness: -74.16437332995879
[24/500] max fitness: -72.52925967016824
[25/500] max fitness: -0.3594506183109534
[26/500] max fitness: -0.3467280508371042
[27/500] max fitness: -0.0011398665748614178
[28/500] max fitness: -0.0010995215909579748
[29/500] max fitness: -0.0010606045967525026
[30/500] max fitness: -0.0010230650492935678
[31/500] max fitness: -0.0009868541945705538
[32/500] max fitness: -0.000951925004195846
[33/500] max fitness: -0.0009182321143273926
[34/500] max fitness: -0.0008857317667523377
[35/500] max fitness: -0.0008543817520571783
[36/500] max fitness: -0.0008241413548085907
[37/500] max fitness: -0.0007949713006747531
[38/500] max fitness: -0.0007668337054186063
[39/500] max fitness: -0.0007396920256956696
[40/500] max fitness: -0.0007135110115941772
[41/500] max fitness: -0.0006882566608545954
[42/500] max fitness: -0.0006638961747099739
[43/500] max fitness: -0.0006403979152883135
[44/500] max fitness: -0.0006177313645235944
[45/500] max fitness: -0.0005958670845209631
[46/500] max fitness: -0.0005747766793245963
[47/500] max fitness: -0.0005544327580386244
[48/500] max fitness: -0.0005348088992537972
[49/500] max fitness: -0.0005158796167327524
[50/500] max fitness: -0.000497620326310329
[51/500] max fitness: -0.00048000731396504
[52/500] max fitness: -0.00046301770502080674
[53/500] max fitness: -0.0004466294344388844
[54/500] max fitness: -0.00043082121816106014
[55/500] max fitness: -0.0004155725254672675
[56/500] max fitness: -0.00040086355231154887
[57/500] max fitness: -0.00038667519560192036
[58/500] max fitness: -0.0003729890283903399
[59/500] max fitness: -0.0003597872759410438
[60/500] max fitness: -0.00034705279264568876
[61/500] max fitness: -0.0003347690397558274
[62/500] max fitness: -0.0003229200639035999
[63/500] max fitness: -0.0003114904763820176
[64/500] max fitness: -0.0003004654331595359
[65/500] max fitness: -0.0002898306156012994
[66/500] max fitness: -0.00027957221187312306
[67/500] max fitness: -0.00026967689900347973
[68/500] max fitness: -0.0002601318255804957
[69/500] max fitness: -0.0002509245950613792
[70/500] max fitness: -0.00024204324967237422
[71/500] max fitness: -0.00023347625487898964
[72/500] max fitness: -0.0002252124844056456
[73/500] max fitness: -0.00021724120578529456
[74/500] max fitness: -0.0002095520664212394
[75/500] max fitness: -0.00020213508014126544
[76/500] max fitness: -0.00019498061422878324
[77/500] max fitness: -0.0001880793769120322
[78/500] max fitness: -0.00018142240529672688
[79/500] max fitness: -0.0001750010537255094
[80/500] max fitness: -0.0001688069825496508
[81/500] max fitness: -0.00016283214729791162
[82/500] max fitness: -0.00015706878822888747
[83/500] max fitness: -0.00015150942025319192
[84/500] max fitness: -0.0001461468232123277
[85/500] max fitness: -0.00014097403250145697
[86/500] max fitness: -0.0001359843300244043
[87/500] max fitness: -0.00013117123546845517
[88/500] max fitness: -0.00012652849788823717
[89/500] max fitness: -0.00012205008758725364
[90/500] max fitness: -0.00011773018828698943
[91/500] max fitness: -0.00011356318957314009
[92/500] max fitness: -0.00010954367960908581
[93/500] max fitness: -0.00010566643810730722
[94/500] max fitness: -0.00010192642954964454
[95/500] max fitness: -9.831879664749041e-05
[96/500] max fitness: -9.483885403347486e-05
[97/500] max fitness: -9.148208217630561e-05
[98/500] max fitness: -8.824412151120398e-05
[99/500] max fitness: -8.512076677786489e-05
[100/500] max fitness: -8.210796155903322e-05
[101/500] max fitness: -7.920179301220828e-05
[102/500] max fitness: -7.63984867878677e-05
[103/500] max fitness: -7.369440212773384e-05
[104/500] max fitness: -7.108602713614342e-05
[105/500] max fitness: -6.856997421921516e-05
[106/500] max fitness: -6.614297568522993e-05
[107/500] max fitness: -6.380187950064741e-05
[108/500] max fitness: -6.154364519651726e-05
[109/500] max fitness: -5.936533991974782e-05
[110/500] max fitness: -5.7264134624047796e-05
[111/500] max fitness: -5.5237300395725526e-05
[112/500] max fitness: -5.328220490955658e-05
[113/500] max fitness: -5.139630901012807e-05
[114/500] max fitness: -4.9577163414114587e-05
[115/500] max fitness: -4.782240552923447e-05
[116/500] max fitness: -4.6129756386022275e-05
[117/500] max fitness: -4.449701767789996e-05
[118/500] max fitness: -4.292206890620874e-05
[119/500] max fitness: -4.140286462624321e-05
[120/500] max fitness: -3.993743179072613e-05
[121/500] max fitness: -3.852386718739269e-05
[122/500] max fitness: -3.71603349671582e-05
[123/500] max fitness: -3.584506425991446e-05
[124/500] max fitness: -3.4576346874506226e-05
[125/500] max fitness: -3.335253508032116e-05
[126/500] max fitness: -3.217203946738959e-05
[127/500] max fitness: -3.1033326882005614e-05
[128/500] max fitness: -2.993491843567391e-05
[129/500] max fitness: -2.8875387584372102e-05
[130/500] max fitness: -2.7853358275865907e-05
[131/500] max fitness: -2.686750316257104e-05
[132/500] max fitness: -2.5916541877698288e-05
[133/500] max fitness: -2.499923937234209e-05
[134/500] max fitness: -2.4114404311559526e-05
[135/500] max fitness: -2.3260887527007786e-05
[136/500] max fitness: -2.2437580524635737e-05
[137/500] max fitness: -2.1643414044912738e-05
[138/500] max fitness: -2.0877356674241503e-05
[139/500] max fitness: -2.013841350532097e-05
[140/500] max fitness: -1.942562484510989e-05
[141/500] max fitness: -1.8738064968394314e-05
[142/500] max fitness: -1.8074840915505483e-05
[143/500] max fitness: -1.7435091332628646e-05
[144/500] max fitness: -1.681798535314479e-05
[145/500] max fitness: -1.6222721518492496e-05
[146/500] max fitness: -1.564852673729193e-05
[147/500] max fitness: -1.5094655281416275e-05
[148/500] max fitness: -1.4560387817332182e-05
[149/500] max fitness: -1.4045030471939493e-05
[150/500] max fitness: -1.3547913931455264e-05
[151/500] max fitness: -1.3068392572061726e-05
[152/500] max fitness: -1.2605843621513193e-05
[153/500] max fitness: -1.2159666350233736e-05
[154/500] max fitness: -1.1729281291138534e-05
[155/500] max fitness: -1.1314129487119508e-05
[156/500] max fitness: -1.0913671765038248e-05
[157/500] max fitness: -1.0527388035525274e-05
[158/500] max fitness: -1.0154776617490184e-05
[159/500] max fitness: -9.795353586579063e-06
[160/500] max fitness: -9.448652146692725e-06
[161/500] max fitness: -9.1142220237525e-06
[162/500] max fitness: -8.791628880873315e-06
[163/500] max fitness: -8.480453754321406e-06
[164/500] max fitness: -8.180292509352224e-06
[165/500] max fitness: -7.89075531535582e-06
[166/500] max fitness: -7.611466139588044e-06
[167/500] max fitness: -7.342062258772618e-06
[168/500] max fitness: -7.082193788042311e-06
[169/500] max fitness: -6.831523226522698e-06
[170/500] max fitness: -6.589725019003962e-06
[171/500] max fitness: -6.356485133134979e-06
[172/500] max fitness: -6.131500651588328e-06
[173/500] max fitness: -5.914479378610696e-06
[174/500] max fitness: -5.7051394606033655e-06
[175/500] max fitness: -5.503209019986504e-06
[176/500] max fitness: -5.308425802173062e-06
[177/500] max fitness: -5.120536834929872e-06
[178/500] max fitness: -4.939298099850169e-06
[179/500] max fitness: -4.764474215436513e-06
[180/500] max fitness: -4.59583813138229e-06
[181/500] max fitness: -4.433170833711874e-06
[182/500] max fitness: -4.276261060349364e-06
[183/500] max fitness: -4.124905026710703e-06
[184/500] max fitness: -3.978906161081043e-06
[185/500] max fitness: -3.8380748492735885e-06
[186/500] max fitness: -3.7022281884151013e-06
[187/500] max fitness: -3.571189749391951e-06
[188/500] max fitness: -3.444789347683948e-06
[189/500] max fitness: -3.322862822383966e-06
[190/500] max fitness: -3.205251822956401e-06
[191/500] max fitness: -3.0918036036170565e-06
[192/500] max fitness: -2.9823708249309897e-06
[193/500] max fitness: -2.8768113624642707e-06
[194/500] max fitness: -2.774988122207174e-06
[195/500] max fitness: -2.676768862519906e-06
[196/500] max fitness: -2.582026022389141e-06
[197/500] max fitness: -2.490636555751111e-06
[198/500] max fitness: -2.4024817716991706e-06
[199/500] max fitness: -2.3174471803303103e-06
[200/500] max fitness: -2.2354223440453133e-06
[201/500] max fitness: -2.1563007341355307e-06
[202/500] max fitness: -2.0799795924059355e-06
[203/500] max fitness: -2.006359797747887e-06
[204/500] max fitness: -1.935345737389024e-06
[205/500] max fitness: -1.8668451827206409e-06
[206/500] max fitness: -1.8007691695203165e-06
[207/500] max fitness: -1.7370318823983063e-06
[208/500] max fitness: -1.675550543372976e-06
[209/500] max fitness: -1.6162453043272206e-06
[210/500] max fitness: -1.5590391433388162e-06
[211/500] max fitness: -1.5038577646335358e-06
[212/500] max fitness: -1.4506295020916678e-06
[213/500] max fitness: -1.3992852261905425e-06
[214/500] max fitness: -1.3497582541993377e-06
[215/500] max fitness: -1.301984263590565e-06
[216/500] max fitness: -1.2559012085098567e-06
[217/500] max fitness: -1.21144923916672e-06
[218/500] max fitness: -1.168570624135276e-06
[219/500] max fitness: -1.1272096753527903e-06
[220/500] max fitness: -1.0873126758159882e-06
[221/500] max fitness: -1.0488278098044492e-06
[222/500] max fitness: -1.0117050955896162e-06
[223/500] max fitness: -9.758963205132251e-07
[224/500] max fitness: -9.413549783846545e-07
[225/500] max fitness: -9.08036209078633e-07
[226/500] max fitness: -8.758967402604833e-07
[227/500] max fitness: -8.448948312059377e-07
[228/500] max fitness: -8.149902185852457e-07
[229/500] max fitness: -7.861440641517546e-07
[230/500] max fitness: -7.583189043362982e-07
[231/500] max fitness: -7.31478601562884e-07
[232/500] max fitness: -7.05588297332947e-07
[233/500] max fitness: -6.80614366938983e-07
[234/500] max fitness: -6.565243758084212e-07
[235/500] max fitness: -6.332870373693061e-07
[236/500] max fitness: -6.108721724252055e-07
[237/500] max fitness: -5.892506699556786e-07
[238/500] max fitness: -5.683944493073574e-07
[239/500] max fitness: -5.482764237289396e-07
[240/500] max fitness: -5.288704651891891e-07
[241/500] max fitness: -5.101513704483518e-07
[242/500] max fitness: -4.92094828320447e-07
[243/500] max fitness: -4.7467738810056813e-07
[244/500] max fitness: -4.5787642910877537e-07
[245/500] max fitness: -4.4167013131081663e-07
[246/500] max fitness: -4.260374469862934e-07
[247/500] max fitness: -4.109580733816884e-07
[248/500] max fitness: -3.9641242635330755e-07
[249/500] max fitness: -3.823816149309154e-07
[250/500] max fitness: -3.688474167713294e-07
[251/500] max fitness: -3.557922545082884e-07
[252/500] max fitness: -3.431991729151782e-07
[253/500] max fitness: -3.3105181688925946e-07
[254/500] max fitness: -3.193344102051384e-07
[255/500] max fitness: -3.0803173502954205e-07
[256/500] max fitness: -2.971291121566563e-07
[257/500] max fitness: -2.866123819424254e-07
[258/500] max fitness: -2.764678859187891e-07
[259/500] max fitness: -2.6668244905015297e-07
[260/500] max fitness: -2.5724336262429516e-07
[261/500] max fitness: -2.481383677479561e-07
[262/500] max fitness: -2.3935563942420305e-07
[263/500] max fitness: -2.3088377119671804e-07
[264/500] max fitness: -2.2271176033313808e-07
[265/500] max fitness: -2.1482899353940853e-07
[266/500] max fitness: -2.0722523317109817e-07
[267/500] max fitness: -1.998906039427969e-07
[268/500] max fitness: -1.928155800962304e-07
[269/500] max fitness: -1.859909730351765e-07
[270/500] max fitness: -1.7940791938714868e-07
[271/500] max fitness: -1.730578694960282e-07
[272/500] max fitness: -1.6693257631427238e-07
[273/500] max fitness: -1.6102408469581527e-07
[274/500] max fitness: -1.5532472106099854e-07
[275/500] max fitness: -1.4982708343509077e-07
[276/500] max fitness: -1.4452403182936986e-07
[277/500] max fitness: -1.3940867897404874e-07
[278/500] max fitness: -1.344743813706512e-07
[279/500] max fitness: -1.2971473066053167e-07
[280/500] max fitness: -1.251235453108721e-07
[281/500] max fitness: -1.206948625740519e-07
[282/500] max fitness: -1.1642293075662726e-07
[283/500] max fitness: -1.1230220174382598e-07
[284/500] max fitness: -1.0832732378680725e-07
[285/500] max fitness: -1.0449313456574766e-07
[286/500] max fitness: -1.0079465447648152e-07
[287/500] max fitness: -9.722708016296129e-08
[288/500] max fitness: -9.378577828500747e-08
[289/500] max fitness: -9.046627949583837e-08
[290/500] max fitness: -8.726427263816023e-08
[291/500] max fitness: -8.417559914667144e-08
[292/500] max fitness: -8.119624764440177e-08
[293/500] max fitness: -7.832234873761663e-08
[294/500] max fitness: -7.555016998596561e-08
[295/500] max fitness: -7.28761110572314e-08
[296/500] max fitness: -7.029669905200073e-08
[297/500] max fitness: -6.780858399128113e-08
[298/500] max fitness: -6.540853446788041e-08
[299/500] max fitness: -6.309343344840829e-08
[300/500] max fitness: -6.086027422269288e-08
[301/500] max fitness: -5.870615650578865e-08
[302/500] max fitness: -5.662828266418225e-08
[303/500] max fitness: -5.462395408512762e-08
[304/500] max fitness: -5.269056767185315e-08
[305/500] max fitness: -5.0825612463984985e-08
[306/500] max fitness: -4.902666637450999e-08
[307/500] max fitness: -4.7291393046342906e-08
[308/500] max fitness: -4.561753881326509e-08
[309/500] max fitness: -4.400292977916426e-08
[310/500] max fitness: -4.244546899140444e-08
[311/500] max fitness: -4.0943133716671093e-08
[312/500] max fitness: -3.949397281738444e-08
[313/500] max fitness: -3.809610421365547e-08
[314/500] max fitness: -3.6747712441896936e-08
[315/500] max fitness: -3.544704629411852e-08
[316/500] max fitness: -3.419241654765119e-08
[317/500] max fitness: -3.2982193767720656e-08
[318/500] max fitness: -3.181480619244281e-08
[319/500] max fitness: -3.068873769305998e-08
[320/500] max fitness: -2.9602525801351234e-08
[321/500] max fitness: -2.855475981373822e-08
[322/500] max fitness: -2.7544078958369145e-08
[323/500] max fitness: -2.6569170625166675e-08
[324/500] max fitness: -2.5628768665122917e-08
[325/500] max fitness: -2.4721651743009356e-08
[326/500] max fitness: -2.3846641751825785e-08
[327/500] max fitness: -2.3002602284145275e-08
[328/500] max fitness: -2.2188437153512728e-08
[329/500] max fitness: -2.1403088973707116e-08
[330/500] max fitness: -2.0645537784073067e-08
[331/500] max fitness: -1.9914799724240227e-08
[332/500] max fitness: -1.9209925757812876e-08
[333/500] max fitness: -1.8530000437686807e-08
[334/500] max fitness: -1.787414072045158e-08
[335/500] max fitness: -1.724149481668558e-08
[336/500] max fitness: -1.6631241085397864e-08
[337/500] max fitness: -1.6042586966868727e-08
[338/500] max fitness: -1.547476795459699e-08
[339/500] max fitness: -1.492704660049747e-08
[340/500] max fitness: -1.439871155866749e-08
[341/500] max fitness: -1.388907666008206e-08
[342/500] max fitness: -1.3397480023367768e-08
[343/500] max fitness: -1.2923283194002717e-08
[344/500] max fitness: -1.2465870314433205e-08
[345/500] max fitness: -1.2024647325399223e-08
[346/500] max fitness: -1.1599041194707715e-08
[347/500] max fitness: -1.1188499171027739e-08
[348/500] max fitness: -1.079248806834865e-08
[349/500] max fitness: -1.0410493572649419e-08
[350/500] max fitness: -1.004201957348784e-08

<cgp.population.Population object at 0x7f7d35cc6340>

After finishing the evolution, we plot the result and log the final evolved expression.

width = 9.0
fig = plt.figure(figsize=(width, width / scipy.constants.golden))

ax_fitness = fig.add_subplot(121)
ax_fitness.set_xlabel("Generation")
ax_fitness.set_ylabel("Fitness")
ax_fitness.set_yscale("symlog")

ax_function = fig.add_subplot(122)
ax_function.set_ylabel(r"$f(x)$")
ax_function.set_xlabel(r"$x$")


print(f"Final expression {pop.champion.to_sympy()} with fitness {pop.champion.fitness}")

history_fitness = np.array(history["fitness_parents"])
ax_fitness.plot(np.max(history_fitness, axis=1), label="Champion")
ax_fitness.plot(np.mean(history_fitness, axis=1), label="Population mean")
ax_fitness.legend()

x = np.linspace(-5.0, 5, 100).reshape(-1, 1)
f = pop.champion.to_func()
y = [f(xi) for xi in x]
ax_function.plot(x, f_target(x), lw=2, label="Target")
ax_function.plot(x, y, lw=1, label="Champion", marker="x")
ax_function.legend()

plt.savefig("example_differential_evo_regression.pdf", dpi=300)
[Figure: fitness over generations (left panel) and the target function together with the champion's prediction (right panel)]

Out:

Final expression x_0**2 + 4.141492443712169 with fitness -1.004201957348784e-08
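
As a check on the result (added here for illustration), the evolved constant can be compared with the constant hidden in the target function: 1.0 + π ≈ 4.141592653589793, while the champion found 4.141492443712169, an absolute error of roughly 1e-4.

# Added for illustration: absolute error of the evolved constant.
print(abs(1.0 + np.pi - 4.141492443712169))  # ~1.0e-04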

Total running time of the script: ( 0 minutes 31.421 seconds)

Gallery generated by Sphinx-Gallery