Approximation by perturbed neural network operators

Volume 42 / 2015

George A. Anastassiou
Applicationes Mathematicae 42 (2015), 57-81
MSC: 41A17, 41A25, 41A30, 41A36
DOI: 10.4064/am42-1-5


This article deals with the determination of the rate of convergence to the unit operator of each of three newly introduced perturbed normalized neural network operators of one hidden layer. The rates are given via the modulus of continuity of the function involved, or of its high-order derivative, which appears on the right-hand side of the associated Jackson-type inequalities. The activation function is very general; in particular, it can derive from any sigmoid or bell-shaped function. The right-hand sides of our convergence inequalities do not depend on the activation function. The sample functionals are of Stancu, Kantorovich, or quadrature type. We give applications for the first derivative of the function involved.
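For orientation, here is a schematic (not taken from the paper) of a normalized one-hidden-layer neural network operator and of the kind of Jackson-type bound described above; the symbols $F_n$, $b$, $\alpha$, and the constant $c$ are placeholders for the paper's actual definitions:

$$F_n(f,x) \;=\; \frac{\displaystyle\sum_{k=-n^2}^{n^2} f\!\left(\tfrac{k}{n}\right)\, b\!\left(n^{1-\alpha}\!\left(x-\tfrac{k}{n}\right)\right)}{\displaystyle\sum_{k=-n^2}^{n^2} b\!\left(n^{1-\alpha}\!\left(x-\tfrac{k}{n}\right)\right)}, \qquad 0<\alpha<1,$$

$$\left|F_n(f,x)-f(x)\right| \;\le\; c\,\omega_1\!\left(f,\tfrac{1}{n^{\alpha}}\right),$$

where $b$ is the activation (e.g. derived from a sigmoid or bell-shaped function) and $\omega_1$ is the modulus of continuity. The perturbed operators of the paper replace the point evaluations $f(k/n)$ by Stancu, Kantorovich, or quadrature-type sample functionals.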


  • George A. Anastassiou
    Department of Mathematical Sciences
    University of Memphis
    Memphis, TN 38152, U.S.A.
