THE SOLUTION OF THE APPROXIMATION PROBLEM OF NONLINEAR DEPENDENCES USING ARTIFICIAL NEURAL NETWORKS
https://doi.org/10.26467/2079-0619-2018-21-2-40-50
Abstract
The paper discusses issues connected with the use of an artificial neural network (ANN) to approximate experimental data. One of the problems in the development of an ANN is the choice of an appropriate activation function for the neurons of the hidden layer and the adjustment of that function's parameters during network training. The article considers a three-layer perceptron with one hidden layer, each neuron of which has an activation function in the form of a Gaussian curve. The choice of a radial basis activation function makes it possible to determine the weight coefficients directly, by the method of least squares, during network training. The quality of the approximation then depends on the correct choice of the activation-function parameter, which in this case is the width of the Gaussian bell curve. In practice, this parameter is usually determined by conducting numerical experiments, which is a rather time-consuming process. In this paper we propose to determine the value of this parameter from the training set, which represents the coordinates of points on a test curve with the desired properties. These properties are based on a priori knowledge about the approximated function (a linear, quadratic, logarithmic, or exponential relationship). Because the test curve is given in explicit form, the activation-function parameter is determined from the condition of minimizing the integral of the squared difference between the values of the test function and the output of the network. This approach guarantees an approximating curve with good properties; in particular, its graph is free of so-called "oscillations", that is, of numerous inflection points.
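The procedure described in the abstract can be sketched in code. The following is a minimal illustration, not the author's implementation: all function names, the choice of test curve, and the grid of candidate widths are assumptions. For each candidate width sigma, the output weights of a Gaussian radial-basis network are found directly by linear least squares, and sigma is then selected by minimizing a discretized version of the integral of the squared difference between the test curve and the network output.

```python
import numpy as np

def design_matrix(x, centers, sigma):
    """Gaussian radial basis activations of the hidden layer at points x."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * sigma ** 2))

def fit_weights(x, y, centers, sigma):
    """Direct least-squares solution for the output-layer weights."""
    G = design_matrix(x, centers, sigma)
    w, *_ = np.linalg.lstsq(G, y, rcond=None)
    return w

def predict(x, centers, sigma, w):
    """Network output: a weighted sum of Gaussian basis functions."""
    return design_matrix(x, centers, sigma) @ w

# Test curve with known desired properties (here: a quadratic dependence).
x_train = np.linspace(0.0, 1.0, 21)
y_train = x_train ** 2
centers = np.linspace(0.0, 1.0, 7)

# Dense grid for the discretized squared-error criterion.
x_dense = np.linspace(0.0, 1.0, 201)
y_dense = x_dense ** 2

# Scan candidate widths and keep the one minimizing the mean squared
# difference (proportional to the integral criterion on a uniform grid).
best_sigma, best_err = None, np.inf
for sigma in np.linspace(0.05, 1.0, 20):
    w = fit_weights(x_train, y_train, centers, sigma)
    err = np.mean((predict(x_dense, centers, sigma, w) - y_dense) ** 2)
    if err < best_err:
        best_sigma, best_err = sigma, err

print(f"selected sigma = {best_sigma:.3f}, mean squared error = {best_err:.2e}")
```

In a full treatment the candidate scan would be replaced by a proper one-dimensional minimization of the error criterion, but the structure is the same: the width parameter is the only quantity tuned, since the weights follow directly from least squares once the width is fixed.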
About the Author
V. N. Ageyev, Russian Federation
Doctor of Technical Sciences, Professor at the Applied Mathematics Chair
For citations:
Ageyev V.N. THE SOLUTION OF THE APPROXIMATION PROBLEM OF NONLINEAR DEPENDENCES USING ARTIFICIAL NEURAL NETWORKS. Civil Aviation High Technologies. 2018;21(2):40-50. (In Russ.) https://doi.org/10.26467/2079-0619-2018-21-2-40-50