In this paper, the technique of approximate partition of unity is used to construct a class of neural network operators with sigmoidal functions. Using the modulus of continuity as a metric, the errors of these operators in approximating continuous functions defined on a compact interval are estimated. Furthermore, Bochner-Riesz means of double Fourier series are used to construct network operators for approximating bivariate functions, and the corresponding approximation errors are estimated.
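As an illustration of the partition-of-unity construction (not the paper's exact operator), the following sketch builds a normalized sigmoidal network operator from the logistic sigmoid: the window ψ(x) = (σ(x+1) − σ(x−1))/2 sums to 1 over integer shifts, and the operator samples f at the nodes k/n. The function names and parameter choices here are illustrative assumptions.

```python
import math

def sigma(x):
    """Logistic sigmoid."""
    return 1.0 / (1.0 + math.exp(-x))

def psi(x):
    # Window generated by the sigmoid; its integer translates
    # form a partition of unity: sum_k psi(x - k) = 1.
    return 0.5 * (sigma(x + 1.0) - sigma(x - 1.0))

def nn_operator(f, n, x, a=0.0, b=1.0):
    """Normalized sigmoidal network operator on [a, b],
    sampling f at the nodes k/n (illustrative construction)."""
    ks = range(math.ceil(n * a), math.floor(n * b) + 1)
    weights = [psi(n * x - k) for k in ks]
    total = sum(weights)  # normalization handles boundary truncation
    return sum(f(k / n) * w for k, w in zip(ks, weights)) / total

f = lambda t: t * t
approx = nn_operator(f, 200, 0.5)  # should be close to f(0.5) = 0.25
```

As n grows, the approximation error at interior points is controlled by the modulus of continuity of f, which is the kind of estimate the abstract refers to.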
In this paper, the multivariate Bernstein polynomials defined on a simplex are viewed as sampling operators, and a generalization that allows sampling at scattered sites is studied from both stochastic and deterministic viewpoints. On the stochastic side, a Chebyshev-type estimate for the sampling operators is established. On the deterministic side, combining the theory of uniform distribution with the discrepancy method, the rate of approximation of continuous functions and the Lp convergence of these operators are studied, respectively.
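For concreteness, the classical univariate Bernstein operator (the special case underlying the multivariate, scattered-site generalization above) can be written as a sampling operator at the equispaced nodes k/n; this sketch is an illustrative special case, not the paper's construction.

```python
from math import comb

def bernstein(f, n, x):
    # Univariate Bernstein operator:
    # B_n(f)(x) = sum_{k=0}^{n} C(n,k) x^k (1-x)^(n-k) f(k/n),
    # i.e. f sampled at k/n with binomial weights.
    return sum(comb(n, k) * x**k * (1 - x)**(n - k) * f(k / n)
               for k in range(n + 1))

f = lambda t: t * t
# Known identity for this test function: B_n(f)(x) = x^2 + x(1-x)/n.
val = bernstein(f, 100, 0.5)
```

The scattered-site generalization replaces the equispaced nodes k/n by irregular sample points, which is where discrepancy bounds on the point set enter the error analysis.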
We establish a general oracle inequality for regularized risk minimizers with strongly mixing observations, and apply this inequality to support vector machine (SVM) type algorithms. The main results extend previously known results for independent and identically distributed samples to the case of exponentially strongly mixing observations.
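The regularized risk minimization at the heart of SVM-type algorithms can be sketched in its simplest form: minimize the empirical hinge loss plus a squared-norm penalty by subgradient descent. This toy one-dimensional example (data, regularization strength, and step sizes are all illustrative assumptions) shows the objective the oracle inequality controls; it says nothing about the mixing structure of the samples.

```python
# Toy 1-D linearly separable data (illustrative).
xs = [-2.0, -1.0, 1.0, 2.0]
ys = [-1.0, -1.0, 1.0, 1.0]
lam = 0.01  # regularization strength (assumed for illustration)

w = 0.0
for t in range(1, 501):
    # Subgradient of J(w) = (1/m) sum_i max(0, 1 - y_i w x_i) + lam * w^2.
    g = 2.0 * lam * w
    for x, y in zip(xs, ys):
        if y * w * x < 1.0:  # hinge loss active for this sample
            g -= y * x / len(xs)
    w -= (0.5 / t) * g  # diminishing step size
```

An oracle inequality bounds the excess risk of the minimizer of this regularized objective against the best attainable risk; the paper's contribution is that such a bound survives when the samples are exponentially strongly mixing rather than i.i.d.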