
Approximation by perturbed neural network operators

Volume 42 / 2015

George A. Anastassiou, Applicationes Mathematicae 42 (2015), 57-81. MSC: 41A17, 41A25, 41A30, 41A36. DOI: 10.4064/am42-1-5

Abstract

This article determines the rate of convergence to the unit operator of each of three newly introduced perturbed normalized neural network operators of one hidden layer. The estimates are given through the modulus of continuity of the function involved, or of its high-order derivative, which appears on the right-hand side of the associated Jackson-type inequalities. The activation function is very general; in particular, it can derive from any sigmoid or bell-shaped function. The right-hand sides of our convergence inequalities do not depend on the activation function. The sample functionals are of Stancu, Kantorovich or quadrature type. We give applications for the first derivative of the function involved.

Authors

  • George A. Anastassiou
    Department of Mathematical Sciences
    University of Memphis
    Memphis, TN 38152, U.S.A.
