Page 100 - Neural Network Modeling and Identification of Dynamical Systems

88                2. DYNAMIC NEURAL NETWORKS: STRUCTURES AND TRAINING METHODS

is uniform in a given frequency range in order to exert a sufficient excitatory effect on the dynamical system.

Step 6 of the process described above provides an input perturbation signal added to the main control action selected, for example, for balancing an airplane or for performing a predetermined maneuver.
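A common way to obtain a perturbation signal with a roughly uniform amplitude spectrum over a prescribed frequency range is a multisine: a sum of equal-amplitude sinusoids on a frequency grid, with randomized phases to keep the peak factor low. The sketch below is illustrative only; the function name, frequency band, and amplitude are assumptions, not values taken from the text.

```python
import numpy as np

def multisine_perturbation(t, f_min, f_max, n_lines=20, amplitude=0.05, seed=0):
    """Band-limited excitation: equal-amplitude sinusoids on a uniform
    frequency grid over [f_min, f_max] Hz, with random phases."""
    rng = np.random.default_rng(seed)
    freqs = np.linspace(f_min, f_max, n_lines)           # uniform grid of spectral lines
    phases = rng.uniform(0.0, 2.0 * np.pi, n_lines)      # random phases reduce the crest factor
    # Broadcast to shape (n_lines, len(t)), then sum the sinusoids.
    return amplitude * np.sum(
        np.sin(2.0 * np.pi * freqs[:, None] * t + phases[:, None]), axis=0
    )

# Example: superimpose the perturbation on a (hypothetical) constant trim
# control input, as when balancing an airplane about an operating point.
t = np.linspace(0.0, 10.0, 1001)      # 10 s of samples at 100 Hz
u_trim = np.full_like(t, 2.0)         # assumed trim deflection, in degrees
u = u_trim + multisine_perturbation(t, f_min=0.1, f_max=5.0)
```

Because each spectral line carries the same amplitude, the excitation energy is spread evenly across the chosen band, which is what the uniformity requirement above asks for.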