Abstract
The problem of determining an optimal training schedule for a locally recurrent neural network is discussed. Specifically, we consider the proper choice of the most informative measurement data to guarantee reliable prediction of the network response. Using a scalar performance measure defined on the Fisher information matrix associated with the network parameters, the problem is formulated in terms of optimal experimental design; its solution can then be obtained by adapting efficient numerical algorithms based on convex optimization theory. Finally, illustrative experiments are provided to verify the presented approach.
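The paper's own algorithm is not reproduced on this page. As an illustration only, a D-optimum design over candidate training samples can be sketched with a multiplicative weight-update scheme of the Fedorov/Titterington type: given the sensitivities g_i = ∂y/∂θ of the network output with respect to its parameters at each candidate sample (assumed precomputed here), the weights that maximize det M(ξ) of the Fisher information matrix are iterated toward a fixed point. All function names below are hypothetical.

```python
import numpy as np

def fisher_information(sensitivities, weights):
    """Weighted FIM  M(xi) = sum_i w_i * g_i g_i^T,
    where g_i is the parameter sensitivity vector at candidate sample i."""
    p = sensitivities.shape[1]
    M = np.zeros((p, p))
    for g, w in zip(sensitivities, weights):
        M += w * np.outer(g, g)
    return M

def d_optimal_weights(sensitivities, n_iter=300):
    """Multiplicative update (Titterington-type) maximizing log det M.
    Illustrative sketch, not the algorithm used in the paper."""
    n, p = sensitivities.shape
    w = np.full(n, 1.0 / n)          # start from the uniform design
    for _ in range(n_iter):
        M = fisher_information(sensitivities, w)
        Minv = np.linalg.inv(M)
        # variance function phi_i = g_i^T M^{-1} g_i;
        # its design-weighted average equals p, so the update preserves sum(w) = 1
        phi = np.einsum('ij,jk,ik->i', sensitivities, Minv, sensitivities)
        w *= phi / p                 # fixed point satisfies the equivalence theorem
    return w
```

By the Kiefer–Wolfowitz equivalence theorem, at the D-optimum design max_i phi_i = p, which gives a practical stopping test; samples whose weight shrinks toward zero are the least informative and can be dropped from the training schedule.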
© 2009 Springer-Verlag Berlin Heidelberg
Patan, K., Patan, M. (2009). Optimal Training Sequences for Locally Recurrent Neural Networks. In: Alippi, C., Polycarpou, M., Panayiotou, C., Ellinas, G. (eds) Artificial Neural Networks – ICANN 2009. ICANN 2009. Lecture Notes in Computer Science, vol 5768. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04274-4_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-04273-7
Online ISBN: 978-3-642-04274-4