Computer Science Department: George Mason University


EC Lab Activities: Paper Discussion Group Suggestions

Suggesting papers

To suggest a paper for the discussion group, email me bibliographic information, a text abstract, and a link to a soft copy of the paper (if possible). Try to limit your suggestions to two per person. If you want to replace a suggestion you have already posted, indicate this in your email. If a vote is pending, I may elect to hold the suggestion until the vote is over; this eliminates the possibility that different people vote with different suggestion lists.

Voting for papers

Review the list provided in the section below. We will use a weighted voting scheme to decide which paper we read next. Each person should assign a grade to each paper: a score between 0 and 1 (inclusive). The grades do not have to sum to 1. The grades for each paper will be totaled, and the paper with the largest total will be selected. In the event of a tie, a paper will be selected at random. The voting results will be posted on the main paper discussion page.

Paper suggestions

The following selections were made by various members of the paper discussion group for consideration for reading.

Adrian's suggestions

[PDF] Abstract: In this paper we develop techniques based on evolvability statistics of the fitness landscape surrounding sampled solutions. Averaging the measures over a sample of equal fitness solutions allows us to build up fitness evolvability portraits of the fitness landscape, which we show can be used to compare both the ruggedness and neutrality in a set of tunably rugged and tunably neutral landscapes. We further show that the techniques can be used with solution samples collected through both random sampling of the landscapes, and online sampling during optimisation. Finally, we apply the techniques to two real evolutionary electronics search spaces, and highlight differences between the two search spaces, comparing with the time taken to find good solutions through search.
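As a concrete illustration of the weighted voting scheme described under "Voting for papers" above, here is a minimal Python sketch. The paper titles and scores are made-up examples, not actual group votes:

```python
import random

def select_paper(grades):
    """Pick the next paper from a table of per-member grades.

    grades: dict mapping paper title -> list of scores in [0, 1],
    one score per voting member.
    """
    # Total each paper's grades (they need not sum to 1 per member).
    totals = {paper: sum(scores) for paper, scores in grades.items()}
    best = max(totals.values())
    # Ties are broken by a uniform random draw among the top papers.
    winners = [paper for paper, total in totals.items() if total == best]
    return random.choice(winners)

# Example: three members grading two papers.
votes = {
    "Fitness evolvability portraits": [0.9, 0.4, 0.7],  # total 2.0
    "Neutral evolution in robotics":  [0.5, 0.8, 0.6],  # total 1.9
}
print(select_paper(votes))  # -> Fitness evolvability portraits
```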
[PDF] Abstract: Investigates the underlying search space of a difficult robotics problem. Previous work (P. Husbands et al., 1998) on the development of neural networks incorporating a model of gaseous neuromodulation (the GasNet) suggested that such networks are well-suited to evolutionary design for some problems. Networks that are allowed to use the gaseous signalling mechanism evolved significantly faster than networks with the mechanism disabled, implying a significant difference between the two search spaces. In this paper, we investigate this difference using a series of standard techniques for predicting the "difficulty" of searching in fitness landscapes. We show that, in this instance, measures based on random sampling do not discriminate between the two search spaces, due to the highly skewed nature of the fitness distributions, similar to those found in other difficult optimisation problems. It may be that such metrics are not useful as measures of difficulty for a class of complex problems.
[PDF] Abstract: Recent work has argued for the importance of nonadaptive neutral evolution in optimisation over difficult search landscapes. In this paper we show that the search process underlying a difficult evolutionary robotics problem does indeed show phases of neutral evolution. The noise in evaluated fitness of a single genotype is shown to be able to account for the variance in fitness across a long period of the evolutionary run. We further show that the population moves significantly in genotype space during this neutral phase, possibly increasing in divergence. Finally, we investigate the probabilities of mutating to a higher fitness, above the neutral plateau, and find no evidence for a significant upward trend in these probabilities before the crucial mutations actually occurred.
Alexei's suggestions

[PDF] Abstract: To construct a perpetual self-aware cognitive agent that can continuously operate with independence, an introspective machine must be produced. To assemble such an agent, it is necessary to perform a full integration of cognition (planning, understanding, and learning) and metacognition (control and monitoring of cognition) with intelligent behaviors. The failure to do this completely is why similar, more limited efforts have not succeeded in the past. I outline some key computational requirements of metacognition by describing a multistrategy learning system called Meta-AQUA and then discuss an integration of Meta-AQUA with a nonlinear state-space planning agent. I show how the resultant system, INTRO, can independently generate its own goals, and I relate this work to the general issue of self-awareness by machine.
Bill's suggestions

[HTML] Abstract: In this paper, we investigate mating network interactions in a genetic algorithm (GA). We approach this study using a computational method from the study of complex systems: the analysis of network interactions and network topology among basic components. Why should we study evolutionary algorithms (EAs) in this way? First, this approach is feasible and easily implemented. EAs are simulations of evolutionary processes, and as such they can readily produce any data required to perform virtually any data-driven analysis of their behavior, much in the same way that large genomic and proteomic databases are fueling systems biology research. Second, this approach can serve as a unifying framework for studying evolutionary systems, both natural and computational. The data needed for this type of analysis is independent of the implementation and of the particular flavor of the evolutionary system.
Jeff's suggestions

[PS] Abstract: An important goal of the theory of genetic algorithms is to build predictive models of how well genetic algorithms are expected to perform, given a representation, a fitness landscape, and a set of genetic operators. This paper attempts to provide pieces of such a theory, in the form of tools that predict the behavior of genetic algorithms based on assumptions concerning the fitness distribution of genetic operators. The fitness distribution of an operator describes the distribution of fitness values of individuals resulting from an operator application as a function of the fitness of the original individual. It is shown that in some cases, the mean of the fitness distribution for genetic operators may be described by simple functions of the fitness of the parents. For these cases, predictive models of population fitness can be derived.
[Hardcopy will be made available] Abstract: It has long been a mystery how Fisher (1930, 1941, 1958) derived his famous 'fundamental theorem of Natural Selection' and exactly what he meant by it. He stated the theorem in these words (1930, p. 35; 1958, p. 37): 'The rate of increase in fitness of any organism at any time is equal to its genetic variance in fitness at that time.' And also in these words (1930, p. 46; 1958, p. 50): 'The rate of increase of fitness of any species is equal to the genetic variance in fitness.' He compared this result to the second law of thermodynamics, and described it as holding 'the supreme position among the biological sciences'. Also, he spoke of the 'rigour' of his derivation of the theorem and of 'the ease of its interpretation'. But others have variously described his derivation as 'recondite' (Crow & Kimura, 1970), 'very difficult' (Turner, 1970), or 'entirely obscure' (Kempthorne, 1957). And no one has ever found any other way to derive the result that Fisher seems to state. Hence, many authors (not reviewed here) have maintained that the theorem holds only under very special conditions, while only a few (e.g. Edwards, 1967) have thought that Fisher may have been correct, if only we could understand what he meant! It will be shown here that this latter view is correct. Fisher's theorem does indeed hold with the generality he claimed for it. The mystery and the controversy result from incomprehensibility rather than error.
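The "fitness distribution of an operator" idea from the first of Jeff's suggestions can be made concrete with a small sketch. This is not the paper's model, just a toy illustration: a hypothetical bit-flip mutation operator on the OneMax problem, whose offspring-fitness distribution is sampled empirically for a fixed parent.

```python
import random

def onemax(bits):
    # Toy fitness function: the number of ones in the bitstring.
    return sum(bits)

def point_mutation(bits, rate=1/16):
    # Flip each bit independently with probability `rate`.
    return [b ^ (random.random() < rate) for b in bits]

def operator_fitness_distribution(parent, samples=1000):
    """Empirical fitness distribution of the mutation operator,
    conditioned on a fixed parent individual."""
    return [onemax(point_mutation(parent)) for _ in range(samples)]

random.seed(0)
parent = [1] * 12 + [0] * 4          # a 16-bit parent with fitness 12
dist = operator_fitness_distribution(parent)
mean_child = sum(dist) / len(dist)
print(mean_child)                    # roughly 11.5
```

For this toy case the mean of the distribution really is a simple function of parent fitness, as the abstract suggests can happen: with bit length n, parent fitness f, and per-bit flip rate p, the expected child fitness is f(1 - p) + (n - f)p = f(1 - 2p) + np, which is 12(7/8) + 1 = 11.5 here.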
 


Last updated: Wednesday September 10, 2008