Liang, Hongqian; Cui, Junzhi. "Genetic Simulated Annealing Algorithm With Selective Generation", August 6-10, 2000

Genetic Simulated Annealing Algorithm With Selective Generation

Hongqian Liang, Junzhi Cui
Institute of Computational Mathematics and Science/Engineering Computing, Chinese Academy
of Sciences

lhgian@lsec.cc.ac.cn

Abstract:

The general Genetic Algorithm has the weaknesses of premature convergence and poor local-searching ability, and general Simulated Annealing does not have fast global searching. Using the conception of selective generation, this paper designs an algorithm combining the Genetic Algorithm and Simulated Annealing, which improves the searching speed and avoids premature convergence. Computing results from optimizing some hard-to-optimize functions show that the method has a faster convergence speed.

Keywords:

Genetic Algorithm, Simulated Annealing, Selective Generation, Reproduction Operator,
Optimization

1 Introduction

The Genetic Algorithm (GA) [1] is an effective method for solving optimization problems. It was introduced by John Holland in 1975. Since then it has been developed into a general computing model for solving optimization problems by simulating the evolutionary process of nature.

Simulated Annealing (SA) [2] was introduced to combinatorial optimization by Kirkpatrick in 1982. SA can effectively solve NP-complete problems.

The GA has good global-searching ability but is poor at local searching. For example, even when the population is near the optimum, the algorithm still wastes much time reaching that optimum value. What is more, the GA easily converges to a local optimum. The SA can quickly find a local optimum; however, its global-searching ability is poor. In some cases the GA and the SA can be combined to solve optimization problems. Such methods are introduced in papers [4] and [5] and solve many problems that cannot be solved well by either algorithm alone. However, when solving a variety of engineering problems with such methods, people find that the results still cannot satisfy the requirements, showing premature convergence, convergence to local optima, and slow global searching. Paper [3] introduces the conception of selective generation, which selects some individuals with fine genetic modes in the current generation for reproduction and, to some degree, avoids the randomness and blindness of reproduction so as to speed up computation. This paper uses the conception of selective generation and, based on the GA and the SA, introduces the genetic simulated annealing algorithm with selective generation, which selects some special individuals for reproduction and improves the searching speed. Meanwhile, it improves the method provided by paper [3] and greatly reduces the randomness and blindness of reproduction to speed up computation.

The method provided by this paper has now been used for load forecasting in electric power systems. It is proved that the method has a high convergence speed of searching.

This paper is subsidized by the special funds for major state basic research projects, Project No. G1998030416.

2 Constructing the Genetic Algorithm with selective generation

The method provided by this paper uses real-coded individuals.
1 Define the fitness function

F(x) = e^((f(x) - f_min(x)) / α)

f(x): the objective function

f_min(x): the minimal value of the objective function over the current population

α: control parameter

This definition of the fitness function guarantees non-negativity and suits maximization. As α
decreases, the probability that individuals with high fitness are selected increases.
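As a minimal sketch of this fitness definition (assuming the reconstruction F(x) = e^((f(x) - f_min)/α); the function name and the list representation of the population are illustrative only):

```python
import math

def exponential_fitness(objective_values, alpha):
    """Map the objective values of the current population to fitness values
    F(x) = exp((f(x) - f_min) / alpha), where f_min is the population minimum."""
    f_min = min(objective_values)
    return [math.exp((f - f_min) / alpha) for f in objective_values]

# a smaller alpha sharpens the advantage of high-objective individuals
loose = exponential_fitness([1.0, 2.0, 3.0], alpha=2.0)
sharp = exponential_fitness([1.0, 2.0, 3.0], alpha=0.5)
```

Every fitness value is at least 1, so non-negativity holds, and the ratio between the best and the worst fitness grows as α shrinks.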

2 Define the operator of selection

The strategy is first to select the minimal individual and the maximal individual into the matching set, then to select the other individuals by binary tournament, which is to randomly select two individuals from the current population and put the larger one into the matching set, repeating this operation until the number of individuals in the matching set reaches the given size.

This definition of the operator of selection guarantees that the current maximal individual unconditionally enters the matching set, and the binary tournament avoids the problems of the Super Number String [6] and of the Close Competence of the Proportional Model [6]. What is more, selecting the minimal individual guarantees the diversity of the population and preserves fine gene modes.
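A sketch of this selection operator (the function name, and the representation of individuals as plain values with a parallel fitness list, are assumptions for illustration):

```python
import random

def select_matching_set(population, fitness, size):
    """Selection with selective generation: the maximal and the minimal
    individuals enter the matching set unconditionally; the remaining
    slots are filled by binary tournaments, keeping the fitter of two
    randomly drawn individuals each time."""
    order = sorted(range(len(population)), key=lambda i: fitness[i])
    matching = [population[order[-1]], population[order[0]]]  # maximal, minimal
    while len(matching) < size:
        i = random.randrange(len(population))
        j = random.randrange(len(population))
        matching.append(population[i] if fitness[i] >= fitness[j] else population[j])
    return matching
```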

3 Define the operator of crossover

Firstly, the maximal individual and the minimal individual perform one-point crossover: randomly select a point in the real-coded string of the individuals, exchange the matching chromosome segments of the two individuals at that point, and put the results of the crossover into the matching set.

Secondly, the other individuals of the matching set perform arithmetic crossover and one-point crossover.

For example, select individuals A and B from the matching set as parent individuals. The arithmetic crossover between the two is the following:

a = θA + (1 - θ)B
b = θB + (1 - θ)A

The parameter θ is a real number in [0, 1] and may vary according to the fitness of A and B. If A is very similar to B, θ could be a large number; otherwise, θ could be a small number. The arithmetic crossover of A and B produces two children, a and b.

Meanwhile, A and B perform one-point crossover, which also produces two child individuals.

Finally, put the four child individuals and their parents, A and B, into the matching set.

Defining the crossover between the maximal individual and the minimal individual as one-point crossover minimizes the probability that fine genetic modes are destroyed by the crossover. Crossing the other individuals of the matching set by both one-point and arithmetic crossover produces more individuals for the next generation, increasing the diversity of individuals and decreasing the probability that similar individuals appear.
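The two crossover variants can be sketched as follows (function names are illustrative; individuals are lists of real genes as in the real coding above):

```python
import random

def arithmetic_crossover(A, B, theta):
    """a = theta*A + (1-theta)*B and b = theta*B + (1-theta)*A, gene by gene."""
    a = [theta * x + (1 - theta) * y for x, y in zip(A, B)]
    b = [theta * y + (1 - theta) * x for x, y in zip(A, B)]
    return a, b

def one_point_crossover(A, B):
    """Exchange the tails of two real-coded strings at a random cut point."""
    point = random.randrange(1, len(A))
    return A[:point] + B[point:], B[:point] + A[point:]
```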

4 Define the operator of mutation

The maximal individual and the minimal individual perform non-uniform mutation [3]: given a parental individual X = (x_1, x_2, ..., x_n), let x_k (k = 1, 2, ..., n) be the point for mutation, with definition field [A_k, B_k]. The new genetic value is decided by the following formula:

x_k' = x_k + Δ(t, B_k - x_k)   if rand(0,1) = 0
x_k' = x_k - Δ(t, x_k - A_k)   if rand(0,1) = 1

Δ(t, y) = y(1 - r^(b(1 - t/T)))

r is a random number uniformly distributed in [0, 1]; t is the number of the current generation; T is the given total number of generations; b is a parameter given in advance which may vary according to the information of the current generation: in the beginning generations b could be a large number, and in the ending generations b could be a small number. The result of rand(0,1) is 0 or 1 with equal probability. Finally, put the mutated individuals and their parent X into the matching set.

Concomitantly, the other individuals of the matching set also perform non-uniform mutation with the given mutation probability, and the mutated individuals and their parents are put into the matching set.

Compulsorily mutating the maximal individual speeds up the search for the local maximal value, while the mutation of the minimal individual increases the diversity of the population to avoid the premature convergence of the Genetic Algorithm (GA). To maintain the randomness of the mutation operator, the parent population should perform non-uniform mutation with the mutation probability, and the mutated individuals should then be put into the matching set.

5 Define the operator of simulated annealing

Firstly, select the maximal individual from the current generation and apply the operator of simulated annealing to it, accepting the new result with probability:

γ = min(1, e^((f(i) - f_0) / T))

f(i) is the value of the fitness function of the new result; f_0 is the value of the fitness function of the maximal individual of the current generation; T, which decreases by the formula T' = λT (0 < λ < 1), is the current temperature of the simulated annealing algorithm.

During the computing process, preserve the optimum of every Markov chain at each value of T. The maximal solution among the preserved optima is the final result.

Finally, put the result of simulated annealing into the matching set.

This definition of the operator of simulated annealing guarantees that the local maximum near the maximal individual is searched first. Because the simulated annealing algorithm has a powerful ability to search for a local maximum, the operator can save much time and soon determine whether the local maximum is the global maximum or not. Meanwhile, preserving the optimum of every Markov chain avoids settling on a non-optimal solution.
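A sketch of the acceptance rule, assuming the reconstruction γ = min(1, e^((f(i) - f_0)/T)); the neighbor-generation step is left abstract because the paper does not specify it:

```python
import math
import random

def accept_probability(f_new, f_best, T):
    """gamma = min(1, exp((f(i) - f0) / T)): an improvement is always
    accepted; a worse move is accepted with a probability that shrinks
    as the temperature T decreases."""
    return min(1.0, math.exp((f_new - f_best) / T))

def anneal_step(x, fitness, neighbor, T):
    """One simulated-annealing move applied to the maximal individual x;
    'neighbor' is an assumed caller-supplied perturbation function."""
    x_new = neighbor(x)
    if random.random() < accept_probability(fitness(x_new), fitness(x), T):
        return x_new
    return x
```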

3 Constructing the Algorithm

By properly organizing the reproduction operators above, the paper obtains an effective algorithm which can solve many hard optimization problems. By preserving individuals with fine gene modes through the genetic stock, the algorithm can effectively avoid plunging into a local optimum and quickly reach the global optimum.

The constructed algorithm is as follows:

1 Initialize the generation counter: t ← 0.

2 Randomly create the original population P(t).

3 Perform the operator of selection: P'(t) = selection[P(t)].

4 Perform the operator of crossover: P''(t) = crossover[P'(t)].

5 Perform the operator of mutation: P'''(t) = mutation[P''(t)].

6 Perform the operator of simulated annealing: a = SimulatedAnnealing[P'''(t)].

7 Perform the operator of selection: P(t+1) = selection[{a} ∪ P'''(t) ∪ P''(t) ∪ P'(t)].

8 Judge whether the method should be stopped; if not, return to 4; if yes, output the optimum and stop the computing process.
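Steps 1-8 above can be organized as the following loop (a sketch; each operator is an assumed caller-supplied function mapping a population, represented as a list, to a population, and `anneal` returns the single annealed individual a):

```python
def genetic_simulated_annealing(create, selection, crossover, mutation, anneal, stop):
    """Main loop of the genetic simulated annealing algorithm with
    selective generation, following steps 1-8."""
    t = 0                                   # step 1: generation counter
    P = create()                            # step 2: original population P(t)
    P1 = selection(P)                       # step 3: P'(t)
    while True:
        P2 = crossover(P1)                  # step 4: P''(t)
        P3 = mutation(P2)                   # step 5: P'''(t)
        a = anneal(P3)                      # step 6: annealed best individual
        P1 = selection([a] + P3 + P2 + P1)  # step 7: P(t+1)
        t += 1
        if stop(t, P1):                     # step 8: stop and output the optimum
            return max(P1)
```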

4 Computing example

Find the minimal value of the Shubert function:

f(x_1, x_2) = { Σ_{i=1}^{5} i·cos[(i+1)x_1 + i] } · { Σ_{i=1}^{5} i·cos[(i+1)x_2 + i] }

The global minimal value of the Shubert function is -186.7309 and the number of points attaining the global minimal value is infinite. The following solves the minimal value of the function by the method provided above and compares it with the Simple Genetic Algorithm (SGA), the Real-coded Genetic Algorithm (RGA), and the method given by paper [3]. In the method given by paper [3] and the method given by this paper, Pc = 0.8, Pm = 0.01, the scale of the population is 100, the maximal number of generations is 50, θ is chosen as 0.4 or 0.6, b is chosen as 0.6 or 0.8, and the definition field of the variables x_1, x_2 is [-10, 10].
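The Shubert function can be evaluated directly; a short sketch (the coordinate pair used below is the minimizer reported for the given method in the paper's computing example):

```python
import math

def shubert(x1, x2):
    """Two-dimensional Shubert function:
    f(x1, x2) = (sum_{i=1}^{5} i*cos((i+1)*x1 + i))
              * (sum_{i=1}^{5} i*cos((i+1)*x2 + i))."""
    s1 = sum(i * math.cos((i + 1) * x1 + i) for i in range(1, 6))
    s2 = sum(i * math.cos((i + 1) * x2 + i) for i in range(1, 6))
    return s1 * s2

# evaluating at one of the reported minimizers approaches -186.7309
value = shubert(-0.80035, 4.85803)
```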

The following table presents the differences among the SGA, the RGA, the method by paper [3], and the method given by this paper.

| Name of Algorithm     | SGA      | RGA       | Method by Paper [3] | The Given Method |
|-----------------------|----------|-----------|---------------------|------------------|
| Number of Generations | 112      | 81        | 50                  | 45               |
| x                     | -0.802   | -7.08313  | -1.42514            | -0.80035         |
| y                     | -7.70874 | -1.42517  | -7.0835             | 4.85803          |
| f(x, y)               | -186.73  | -186.7304 | -186.7304           | -186.7309        |
| time (s)              | 2.34     | 1.88      | 0.67                | 0.63             |

Therefore, the computing time of the method introduced by this paper is much less than that of the SGA, the RGA, and the method given by paper [3], and its precision is better than theirs.

5 Conclusion

The paper deeply analyses the randomness that results in wasted computing time and convergence to local optima in the various reproduction operators (operator of selection, operator of crossover, operator of mutation) of the different Genetic Algorithms (GA), then introduces the conception of selective generation.

1 The operator of selection unconditionally puts the individuals with the maximal fitness and the minimal fitness into the matching set in order to maintain the diversity of individuals.

2 The fitness function is defined by an exponential function to guarantee non-negativity, and the control parameter α is used to control the selection of individuals.

3 Using the conception of selective generation to define special reproduction operators (operator of selection, operator of crossover, operator of mutation), the method compulsorily selects the maximal and minimal individuals from the current population for selection, crossover and mutation, avoiding the phenomenon of premature convergence and speeding up computation.

4 Accompanied by the Simulated Annealing algorithm, the method can quickly search out the optimum near the maximal individual of the current population and then decide whether it is a local optimum or not, which is advantageous for jumping out of local maxima.

5 Computing some engineering examples shows that this method has high performance in both global searching and local searching.

References:

[1] Holland J H. Adaptation in Natural and Artificial Systems. Ann Arbor: The University of Michigan Press, 1975

[2] Kirkpatrick S et al. Optimization by Simulated Annealing. Science, 1983, 220: 671~680

[3] Hongqian Liang et al. Genetic Algorithm With Selective Generation. APGA2000, Hong Kong, 2000

[4] Chen H, Flann N S. Parallel Simulated Annealing and Genetic Algorithm: A Space of Hybrid Methods. In: Parallel Problem Solving from Nature 3, Springer-Verlag, 1994, 428~438

[5] Grefenstette J J. Incorporating Problem Specific Knowledge into Genetic Algorithms. In: Davis L Ed. Genetic Algorithms and Simulated Annealing, Pitman, 1987, 42~60

[6] Han Zhenxiang et al. Optimization Method of Simulating Evolution and Application. Computer Science, 1995, 2: 47~56

