Title: Non-revisiting Genetic Algorithm with Constant Memory

This seminar is part of the Friday Postgraduate Research Seminar Series of the Department of Electronic Engineering, City University of Hong Kong.

Chair: Prof Moshe Zukerman
Venue: Lecture Theatre 16, Floor 4, Academic 1, City University of Hong Kong
Date: 30 October 2015


Abstract:

The continuous non-revisiting genetic algorithm (cNrGA) uses the entire search history and parameter-less adaptive mutation to significantly enhance search performance. Storing the entire search history is natural and costs little when the number of fitness evaluations is small or moderate. However, if the number of evaluations required is substantial, some memory management is desirable. We propose two pruning mechanisms to keep the memory used constant: least recently used (LRU) pruning and random (R) pruning. The basic idea is to prune a unit of memory whenever the memory threshold has been reached and new search information needs to be stored, thus keeping the overall memory used constant. Meanwhile, both pruning strategies naturally form parameter-less adaptive mutation operators. A study is carried out to evaluate the impact on performance caused by the loss of search history information. Experimental results show that both strategies maintain the performance of cNrGA up to an empirical limit at which 90% of the search history is not recorded, and the overhead of both pruning strategies is small. This suggests that cNrGA can be extended to situations where the number of fitness evaluations is much larger than before, with no significant effect on statistical performance. This widens the applicability of cNrGA to more practical problems that require a larger number of fitness evaluations before converging to the global optimum. The new cNrGAs are compared with a real-coded genetic algorithm and show superior performance. The comparison suggests that the cNrGAs can automatically derive suitable parameters from their search history and pruning mechanisms. They are also compared with Standard Particle Swarm Optimization 2011 (SPSO 2011), the current "standard" PSO. The results show that the cNrGAs are better at solving multimodal and composition problems, but are inferior on unimodal problems. These results suggest that the new cNrGAs would give better performance than PSO when the problems have complex landscapes.
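To illustrate the constant-memory idea described above, the following is a minimal sketch (not the authors' implementation, which stores the history in a BSP tree): when an archive of visited solutions reaches a fixed threshold, one stored entry is pruned before a new one is recorded, either the least recently used entry (LRU pruning) or a randomly chosen entry (R pruning). The class and parameter names are illustrative assumptions.

```python
# Hedged sketch of constant-memory pruning, assuming a flat archive of
# visited solutions; the actual cNrGA uses a BSP-tree search history.
from collections import OrderedDict
import random


class ConstantMemoryArchive:
    def __init__(self, capacity, strategy="LRU"):
        self.capacity = capacity       # fixed memory threshold
        self.strategy = strategy       # "LRU" or "R" (random pruning)
        self.entries = OrderedDict()   # solution -> fitness, kept in access order

    def record(self, solution, fitness):
        key = tuple(solution)
        if key in self.entries:
            self.entries.move_to_end(key)   # revisit: mark as recently used
            return
        if len(self.entries) >= self.capacity:
            self._prune()                   # free one unit before storing
        self.entries[key] = fitness

    def _prune(self):
        if self.strategy == "LRU":
            self.entries.popitem(last=False)            # drop least recently used
        else:
            victim = random.choice(list(self.entries))  # drop a random entry
            del self.entries[victim]


# Usage: the archive never grows past its capacity, however many
# fitness evaluations are performed.
archive = ConstantMemoryArchive(capacity=1000, strategy="LRU")
for _ in range(10_000):
    x = [random.uniform(-5, 5) for _ in range(10)]
    archive.record(x, sum(v * v for v in x))   # toy sphere fitness
assert len(archive.entries) <= 1000
```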


Relevant Papers:

1. Yang Lou and Shiu Yin Yuen, "Non-revisiting Genetic Algorithm with Constant Memory", in Proc. IEEE SMC '15, Hong Kong, pp. 1714-1719, 2015.
2. Yang Lou and Shiu Yin Yuen, "Non-revisiting Genetic Algorithm with Adaptive Mutation Using Constant Memory", Memetic Computing, doi: 10.1007/s12293-015-0178-6, 2016. [Link] [Matlab Code]
