Abstract
In this paper we present the implementation of a data-driven method called DTS (Divide To Simplify), which dynamically builds a Multi-Neural Network architecture. The proposed architecture solves a complex problem by splitting it into several easier sub-problems. We have previously presented a software version of the DTS multi-neural network architecture. The main idea of the DTS approach is to use a set of small, specialized mapping neural networks, or Slave Neural Networks (SNN), guided by a prototype-based neural network, or Master Neural Network (MNN). In this paper, the MNN manages a set of hardware digital neural networks. Learning is performed in a few milliseconds, and we obtain a very good classification rate on the two-spirals benchmark.
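The paper itself maps the slave networks onto hardware digital neural networks; as a purely illustrative software sketch of the divide-to-simplify idea (not the authors' implementation), the routine below uses a nearest-prototype master to route each input to a region and trains one tiny logistic slave per region on the two-spirals benchmark. All identifiers (two_spirals, DTSClassifier, Slave, n_prototypes) are hypothetical, and the logistic units stand in for the paper's mapping networks.

```python
# Illustrative sketch of the DTS "divide to simplify" scheme (hypothetical
# names, not the paper's hardware implementation). A prototype-based master
# routes each input to a region; a small slave classifier specialized on
# that region makes the final decision. Requires only NumPy.
import numpy as np

rng = np.random.default_rng(0)

def two_spirals(n_per_class=200, noise=0.02):
    """Generate the two-spirals benchmark (Lang and Witbrock, 1988)."""
    t = np.linspace(0.25, 3.0, n_per_class) * np.pi
    x1 = np.stack([t * np.cos(t), t * np.sin(t)], axis=1)
    x2 = -x1                                  # second spiral: first rotated by pi
    X = np.vstack([x1, x2]) / (3.0 * np.pi)   # scale roughly into [-1, 1]
    X += rng.normal(0.0, noise, X.shape)
    y = np.hstack([np.zeros(n_per_class), np.ones(n_per_class)])
    return X, y

class Slave:
    """A tiny logistic unit trained on one region only (stand-in for an SNN)."""
    def fit(self, X, y, epochs=500, lr=0.5):
        Xb = np.hstack([X, np.ones((len(X), 1))])   # add bias column
        self.w = np.zeros(Xb.shape[1])
        for _ in range(epochs):
            z = np.clip(Xb @ self.w, -30.0, 30.0)   # avoid exp overflow
            p = 1.0 / (1.0 + np.exp(-z))
            self.w -= lr * Xb.T @ (p - y) / len(y)  # logistic-loss gradient step
        return self

    def predict(self, X):
        Xb = np.hstack([X, np.ones((len(X), 1))])
        return (Xb @ self.w > 0).astype(int)

class DTSClassifier:
    """Master (nearest-prototype router) plus one slave per region."""
    def __init__(self, n_prototypes=24):
        self.n_prototypes = n_prototypes

    def fit(self, X, y):
        # Master: prototypes drawn from the training set, a crude stand-in
        # for the paper's prototype-based MNN.
        idx = rng.choice(len(X), self.n_prototypes, replace=False)
        self.prototypes = X[idx]
        region = self._route(X)
        self.slaves = {}
        for r in range(self.n_prototypes):
            mask = region == r
            if mask.any():
                self.slaves[r] = Slave().fit(X[mask], y[mask])
        return self

    def _route(self, X):
        d = ((X[:, None, :] - self.prototypes[None, :, :]) ** 2).sum(-1)
        return d.argmin(axis=1)

    def predict(self, X):
        region = self._route(X)
        out = np.zeros(len(X), dtype=int)        # regions with no slave default to 0
        for r, slave in self.slaves.items():
            mask = region == r
            if mask.any():
                out[mask] = slave.predict(X[mask])
        return out

X, y = two_spirals()
clf = DTSClassifier(n_prototypes=24).fit(X, y)
print("training accuracy:", (clf.predict(X) == y).mean())
```

With a couple of dozen prototypes, each slave only needs to separate the locally simple piece of the two classes falling in its region, which is the sense in which dividing simplifies the task.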
Copyright information
© 1997 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Chebira, A., Madani, K., Mercier, G. (1997). Multi-Neural Networks hardware and software architecture: Application of the divide to simplify paradigm DTS. In: Mira, J., Moreno-Díaz, R., Cabestany, J. (eds) Biological and Artificial Computation: From Neuroscience to Technology. IWANN 1997. Lecture Notes in Computer Science, vol 1240. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0032544
Print ISBN: 978-3-540-63047-0
Online ISBN: 978-3-540-69074-0
