Function Approximation With XCS: Hyperellipsoidal Conditions, Recursive Least Squares, and Compaction

Lanzi, Pier Luca
2008-01-01

Abstract

An important strength of learning classifier systems (LCSs) lies in the combination of genetic optimization techniques with gradient-based approximation techniques. The chosen approximation technique develops locally optimal approximations, such as accurate classification estimates, Q-value predictions, or linear function approximations. The genetic optimization technique is designed to distribute these local approximations efficiently over the problem space. Together, the two components develop a distributed, locally optimized problem solution in the form of a population of expert rules, often called classifiers. In function approximation problems, the XCSF classifier system develops a problem solution in the form of overlapping, piecewise-linear approximations. This paper shows that XCSF performance on function approximation problems benefits additively from: 1) improved representations; 2) improved genetic operators; and 3) improved approximation techniques. Additionally, this paper introduces a novel closest classifier matching mechanism for the efficient compaction of XCSF's final problem solution. The resulting compaction mechanism can reduce the population size by 90% on average while decreasing prediction accuracy only marginally. Performance evaluations show that the additional mechanisms enable XCSF to reliably, accurately, and compactly approximate even seven-dimensional functions. Performance comparisons with other heuristic function approximation techniques show that XCSF yields competitive, or even superior, noise-robust performance.
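
The title names the two main ingredient mechanisms evaluated in the paper: hyperellipsoidal classifier conditions and recursive least squares (RLS) prediction updates. As a rough illustration only, the sketch below shows how an XCSF-style classifier with an ellipsoidal condition and an RLS-updated linear prediction could be written in Python; the class layout, parameter names (e.g. rls_delta), and the toy target function are illustrative assumptions, not the implementation used in the paper.

import numpy as np

class Classifier:
    """Sketch of one XCSF-style classifier: ellipsoidal condition + RLS-updated linear prediction."""
    def __init__(self, center, shape, dim, rls_delta=1000.0):
        self.center = np.asarray(center, dtype=float)  # ellipsoid center c
        self.shape = np.asarray(shape, dtype=float)    # positive-definite shape matrix M
        self.w = np.zeros(dim + 1)                     # linear weights (including offset)
        self.P = np.eye(dim + 1) * rls_delta           # inverse correlation matrix for RLS

    def matches(self, x):
        # Hyperellipsoidal condition: input matches if its weighted distance
        # from the center is at most 1, i.e. (x - c)^T M (x - c) <= 1.
        d = x - self.center
        return d @ self.shape @ d <= 1.0

    def predict(self, x):
        # Piecewise-linear prediction: w0 + w . x
        phi = np.concatenate(([1.0], x))
        return self.w @ phi

    def rls_update(self, x, y):
        # Standard recursive-least-squares update of the linear weights toward target y.
        phi = np.concatenate(([1.0], x))
        g = self.P @ phi / (1.0 + phi @ self.P @ phi)  # gain vector
        self.w += g * (y - self.w @ phi)               # correct the prediction error
        self.P -= np.outer(g, phi @ self.P)            # update inverse correlation matrix

# Usage: one classifier covering a small region of a toy 2-D function
cl = Classifier(center=[0.5, 0.5], shape=np.eye(2) / 0.1**2, dim=2)
for _ in range(200):
    x = np.random.uniform(0.4, 0.6, size=2)
    if cl.matches(x):
        cl.rls_update(x, np.sin(2 * np.pi * x[0]) + x[1])
print(cl.predict(np.array([0.5, 0.5])))

In a full XCSF run, a population of such classifiers is evolved by the genetic algorithm so that their ellipsoids cover the input space; roughly speaking, the compaction mechanism described in the paper then keeps, for each input, only the closest-matching classifiers.
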
Files in this record:
  • FunctionApproximation.pdf — Post-Print (DRAFT or Author's Accepted Manuscript, AAM), Adobe PDF, 1.34 MB, restricted access
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/551133
Citations
  • PMC: not available
  • Scopus: 98
  • Web of Science (ISI): 72