George A. Anastassiou
University of Memphis
Publications
Featured research published by George A. Anastassiou.
Fuzzy Mathematics: Approximation Theory | 2010
George A. Anastassiou
This monograph belongs to the broader area of Fuzzy Mathematics and is the first devoted to Fuzzy Approximation Theory. The chapters are self-contained, come with many applications, and can be used to teach several advanced courses; the topics covered are very diverse. An extensive background of Fuzziness and Fuzzy Real Analysis is given. The author covers Fuzzy Differentiation and Integration Theory, followed by Fuzzy Ostrowski inequalities. Then results on classical algebraic and trigonometric polynomial Fuzzy Approximation are presented. The author develops a complete theory of convergence with rates of Fuzzy positive linear operators to the Fuzzy unit operator, the so-called Fuzzy Korovkin Theory. The related Fuzzy Global Smoothness is included. Then follows the study of Fuzzy Wavelet type operators and their convergence with rates to the Fuzzy unit operator. Similarly, the Fuzzy Neural Network Operators are discussed, followed by Fuzzy Random Korovkin approximation theory and Fuzzy Random Neural Network approximations. The author continues with Fuzzy Korovkin approximations in the sense of Summability. Finally, differences of Fuzzy Wavelet type operators are estimated in the fuzzy sense. The monograph's approach is quantitative, and the main results are given via Fuzzy inequalities involving Fuzzy moduli of continuity, that is, Fuzzy Jackson type inequalities. The theory presented is expected to find applications to all aspects of Fuzziness, from theoretical to practical, in almost all sciences, technology, finance and industry; it is also of interest within Pure Mathematics. This monograph is suitable for researchers, graduate students and seminars in theoretical and applied mathematics, computer science, statistics and engineering.
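To convey the flavor of the Jackson-type estimates that drive this quantitative approach, here is a schematic classical prototype (not the monograph's fuzzy statement): for a sequence of positive linear operators \(L_n\) acting on continuous functions, a typical quantitative Korovkin estimate reads
\[ |L_n(f;x) - f(x)| \le c\,\omega_1\!\big(f, \delta_n(x)\big), \]
where \(\omega_1\) is the first modulus of continuity and \(\delta_n(x) \to 0\). The monograph establishes analogues of such inequalities with the Fuzzy modulus of continuity in place of \(\omega_1\).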
Mathematical and Computer Modelling | 2010
George A. Anastassiou
Here we define a Caputo-like discrete nabla fractional difference and, for the first time, produce discrete nabla fractional Taylor formulae. We estimate their remainders. We then derive related discrete nabla fractional Opial, Ostrowski, Poincaré and Sobolev type inequalities.
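For orientation, a commonly used set of definitions in the discrete (nabla) fractional calculus literature is sketched below; exact conventions for the starting point and index range vary, so treat this as an assumption rather than the paper's precise statement. With backward jump \(\rho(s) = s-1\) and rising factorial \(t^{\overline{\mu}} = \Gamma(t+\mu)/\Gamma(t)\), the \(\mu\)-th order nabla fractional sum and the Caputo-like nabla fractional difference take the form
\[ \nabla_a^{-\mu} f(t) = \frac{1}{\Gamma(\mu)} \sum_{s=a}^{t} \big(t - \rho(s)\big)^{\overline{\mu-1}}\, f(s), \qquad \nabla_{a*}^{\mu} f(t) = \nabla_a^{-(m-\mu)}\big(\nabla^{m} f\big)(t), \quad m = \lceil \mu \rceil . \]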
Journal of Mathematical Analysis and Applications | 1997
George A. Anastassiou
This chapter determines the rate of convergence to the unit operator of some neural network operators, namely the “normalized bell and squashing type operators”.
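A schematic form of such a normalized operator is shown below; the exact scaling and index range follow the paper, so this is only a sketch:
\[ (H_n f)(x) = \frac{\sum_{k=0}^{n} f\!\left(\tfrac{k}{n}\right)\, b\!\left(n^{1-\alpha}\!\left(x - \tfrac{k}{n}\right)\right)}{\sum_{k=0}^{n} b\!\left(n^{1-\alpha}\!\left(x - \tfrac{k}{n}\right)\right)}, \qquad 0 < \alpha < 1, \]
where \(b\) is a bell-shaped (or squashing) activation. The rate of convergence to the unit operator is then expressed through the modulus of continuity of \(f\).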
Archive | 2010
George A. Anastassiou
In this chapter we show that any 2π-periodic fuzzy continuous function from \(\mathbb R\) to the fuzzy number space \({\mathbb R}_{\mathcal F}\) can be uniformly approximated by fuzzy trigonometric polynomials. This chapter is based on [31].
Mathematical and Computer Modelling | 2011
George A. Anastassiou
Here we prove fractional representation formulae involving generalized fractional derivatives, Caputo fractional derivatives and Riemann-Liouville fractional derivatives.
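For reference, the standard left-sided definitions underlying such formulae (with \(m = \lceil \nu \rceil\)) are the Riemann-Liouville and Caputo fractional derivatives
\[ D_a^{\nu} f(x) = \frac{1}{\Gamma(m-\nu)} \left(\frac{d}{dx}\right)^{m} \int_a^{x} (x-t)^{m-\nu-1} f(t)\,dt, \qquad D_{*a}^{\nu} f(x) = \frac{1}{\Gamma(m-\nu)} \int_a^{x} (x-t)^{m-\nu-1} f^{(m)}(t)\,dt . \]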
Archive | 2011
George A. Anastassiou
1. Opial-type inequalities for balanced fractional derivatives (Introduction, Background, Main Results, References).
2. Univariate right Caputo fractional Ostrowski inequalities (Introduction, Main Results, References).
3. Multivariate right Caputo fractional Ostrowski inequalities (Introduction, Main Results, References).
4. Univariate mixed fractional Ostrowski inequalities (Introduction, Main Results, References).
5. Multivariate radial mixed fractional Ostrowski inequalities (Introduction, Main Results, References).
6. Shell mixed Caputo fractional Ostrowski inequalities (Introduction, Main Results, References).
7. Left Caputo fractional uniform Landau inequalities (Introduction, Main Results, References).
8. Left Caputo fractional Landau type inequalities (Introduction, Main Results, References).
9. Right Caputo fractional Landau type inequalities (Introduction, Main Results, References).
10. Mixed Caputo fractional Landau type inequalities (Introduction, Main Results, References).
11. Multivariate Caputo fractional Landau type inequalities (Introduction, Main Results, References).
Analysis | 1991
George A. Anastassiou; Claudia Cottin; Heinz H. Gonska
AMS 1980 Subject Classification (1985 Revision): 41A17, 26A15, 26A16
Computers & Mathematics With Applications | 2012
George A. Anastassiou
Here, we study the univariate fractional quantitative approximation of real valued functions on a compact interval by quasi-interpolation sigmoidal and hyperbolic tangent neural network operators. These approximations are derived by establishing Jackson type inequalities involving the moduli of continuity of the right and left Caputo fractional derivatives of the engaged function. The approximations are pointwise and with respect to the uniform norm. The related feed-forward neural networks have one hidden layer. Our fractional approximation results are of higher order and converge better than the ordinary ones.
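As an illustration of the quasi-interpolation construction, here is a minimal sketch under assumed conventions: the density \(\Psi(x) = \tfrac14\big(\tanh(x+1) - \tanh(x-1)\big)\) and the node set \(k/n\), \(\lceil na \rceil \le k \le \lfloor nb \rfloor\), are the common form of these operators and may differ in detail from the paper; the function names below are purely illustrative.

import numpy as np

def psi(x):
    # Assumed hyperbolic tangent density: Psi(x) = (tanh(x+1) - tanh(x-1)) / 4
    return 0.25 * (np.tanh(x + 1.0) - np.tanh(x - 1.0))

def quasi_interpolation(f, x, n, a=0.0, b=1.0):
    # F_n(f, x) = sum_k f(k/n) Psi(n*x - k) / sum_k Psi(n*x - k),
    # with k ranging over ceil(n*a), ..., floor(n*b).
    k = np.arange(np.ceil(n * a), np.floor(n * b) + 1)
    w = psi(n * x - k)
    return np.sum(f(k / n) * w) / np.sum(w)

# The pointwise error shrinks as n grows, in line with the Jackson type estimates.
for n in (10, 100, 1000):
    x0 = 0.37
    print(n, abs(quasi_interpolation(np.sin, x0, n) - np.sin(x0)))

The normalization by \(\sum_k \Psi(nx-k)\) makes the operator reproduce constants, which is the basic mechanism behind its convergence to the unit operator.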
Computers & Mathematics With Applications | 2010
George A. Anastassiou
Here we develop the nabla fractional calculus on time scales. We then produce related integral inequalities of Poincaré, Sobolev, Opial, Ostrowski and Hilbert-Pachpatte type. Finally we give applications of these inequalities on the time scales \(\mathbb R\) and \(\mathbb Z\).
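As a point of reference (the classical prototype on \(\mathbb R\), not the time-scale statement itself), the Ostrowski inequality for \(f \in C^1[a,b]\) reads
\[ \left| f(x) - \frac{1}{b-a}\int_a^b f(t)\,dt \right| \le \left[\frac{1}{4} + \frac{\big(x - \frac{a+b}{2}\big)^2}{(b-a)^2}\right](b-a)\,\|f'\|_{\infty}, \]
and the paper derives analogues of this and the other listed inequalities within the nabla fractional calculus on an arbitrary time scale.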
Computers & Mathematics With Applications | 2011
George A. Anastassiou
Here we study the multivariate quantitative approximation of real and complex valued continuous multivariate functions on a box or on \(\mathbb R^N\), \(N \in \mathbb N\), by the multivariate quasi-interpolation hyperbolic tangent neural network operators. This approximation is derived by establishing multidimensional Jackson type inequalities involving the multivariate modulus of continuity of the engaged function or its high order partial derivatives. Our multivariate operators are defined by using a multidimensional density function induced by the hyperbolic tangent function. The approximations are pointwise and uniform. The related feed-forward neural network has one hidden layer.
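A natural way to build the multidimensional density from the univariate hyperbolic tangent density \(\psi\) is the tensor product; this matches the usual construction for such operators, but is stated here only as a sketch:
\[ \Psi(x_1, \ldots, x_N) = \prod_{i=1}^{N} \psi(x_i), \qquad \psi(x) = \tfrac14\big(\tanh(x+1) - \tanh(x-1)\big) . \]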