Since its first edition in 1993, ESANN has become a reference event for research on the fundamentals and theoretical aspects of artificial neural networks, computational intelligence, machine learning and related topics. Each year, around 100 specialists attend ESANN to present their latest results and comprehensive surveys, and to discuss future developments in the field.
The ESANN'2010 conference will follow this tradition while adapting its scope to new developments in the field. The ESANN conferences cover artificial neural networks, machine learning, statistical information processing and computational intelligence, spanning mathematical foundations, algorithms and tools, and applications.
The following is a non-exhaustive list of machine learning, computational intelligence and artificial neural networks topics covered during the ESANN conferences:
THEORY and MODELS
Statistical and mathematical aspects of learning
Graphical models, EM and Bayesian learning
Vector quantization and self-organizing maps
Recurrent networks and dynamical systems
Blind signal processing
Nonlinear projection and data visualization
Fuzzy neural networks
INFORMATION PROCESSING and APPLICATIONS
Signal processing and modeling
Approximation and identification
Classification and clustering
Feature extraction and dimension reduction
Time series forecasting
Multimodal interfaces and multichannel processing
Vision and sensory systems
Papers will be presented orally (single track) and in poster sessions; all posters will be complemented by a short oral presentation during a plenary session. It is important to mention that whether a paper better fits an oral or a poster session is decided by its topic, not by its quality. Posters will be selected according to the same criteria as oral presentations, and both will be printed identically in the proceedings. Nevertheless, authors must indicate their preference for oral or poster presentation when submitting their paper.
Special Session: Machine learning techniques based on random projections
Benjamin Schrauwen (Ghent Univ., Belgium), Amaury Lendasse (Helsinki Univ. of Tech., Finland), Yoan Miche (I.N.P. Grenoble, France)
Machine learning techniques based on random projections have recently been widely used to perform regression, classification and time series prediction tasks. Among the most successful proposed methods are Reservoir Computing [1], the Extreme Learning Machine [2], Associative Neural Networks [3] and the Optimally-Pruned Extreme Learning Machine [4].
One of the reasons for the success of random projection based methods is their excellent trade-off between accuracy and computational time. Indeed, even though random projection based methods are often not the most accurate ones, their training (learning) time is usually orders of magnitude smaller than that of classical methods. Furthermore, these methods are easily parallelized and can thus benefit from the recent multi-core architectures of modern processors and video cards.
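The source of this speed can be illustrated with a minimal sketch of the Extreme Learning Machine idea [2]: the hidden layer is a fixed random projection, so training reduces to a single linear least-squares solve for the output weights. The code below is an illustrative sketch using NumPy; function names and hyperparameters are our own choices, not part of any of the cited toolboxes.

```python
import numpy as np

def elm_train(X, y, n_hidden=30, seed=0):
    """Minimal ELM-style regression: random hidden layer, linear readout.

    Only the output weights `beta` are learned (closed-form least squares);
    the input weights W and biases b stay random, hence the fast training.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = np.tanh(X @ W + b)                           # random nonlinear projection
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)     # single linear solve
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Usage: fit y = sin(x) from noisy samples
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
W, b, beta = elm_train(X, y)
print(np.mean((elm_predict(X, W, b, beta) - y) ** 2))  # small training error
```

Because the only learned parameters come from one `lstsq` call, training cost is essentially that of a single linear regression, regardless of the nonlinearity in the hidden layer.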
Recent developments in random projections have led to groundbreaking advances and new approaches in machine learning. For example, exhaustive and brute-force strategies that were previously computationally infeasible become tractable within a reasonable time.
This special session welcomes contributions on theoretical advances, new random projection methods, new learning or meta-learning strategies, and industrial applications for which the accuracy/computational-time ratio is crucial.
[1] D. Verstraeten, B. Schrauwen, M. D'Haene and D. Stroobandt: An experimental unification of reservoir computing methods, Neural Networks, vol. 20(3), pp. 391-403, 2007.
[2] G.-B. Huang, Q.-Y. Zhu and C.-K. Siew: Extreme Learning Machine: Theory and Applications, Neurocomputing, vol. 70, pp. 489-501, 2006.
[3] W. T. Miller, F. H. Glanz and L. G. Kraft: CMAC: An associative neural network alternative to backpropagation, Proceedings of the IEEE, vol. 70, pp. 1561-1567, October 1990.
[4] Y. Miche, P. Bas, C. Jutten, O. Simula and A. Lendasse: A Methodology for Building Regression Models using Extreme Learning Machine: OP-ELM, ESANN 2008, European Symposium on Artificial Neural Networks, Bruges (Belgium), pp. 457-462, April 2008.