Publication


Featured research published by Andrea Caponnetto.


Neural Computation | 2004

Are Loss Functions All the Same?

Lorenzo Rosasco; Ernesto De Vito; Andrea Caponnetto; Michele Piana; Alessandro Verri

In this letter, we investigate the impact of choosing different loss functions from the viewpoint of statistical learning theory. We introduce a convexity assumption, which is met by all loss functions commonly used in the literature, and study how the bound on the estimation error changes with the loss. We also derive a general result on the minimizer of the expected risk for a convex loss function in the case of classification. The main outcome of our analysis is that for classification, the hinge loss appears to be the loss of choice. Other things being equal, the hinge loss leads to a convergence rate practically indistinguishable from the logistic loss rate and much better than the square loss rate. Furthermore, if the hypothesis space is sufficiently rich, the bounds obtained for the hinge loss are not loosened by the thresholding stage.
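
For reference, the losses being compared are the standard ones. For a binary label y in {-1, +1} and a real-valued predictor f(x), the usual textbook definitions (a summary under standard conventions, not notation quoted from the letter) are

    \ell_{\mathrm{hinge}}(y, f(x)) = \max\{0,\, 1 - y f(x)\},
    \ell_{\mathrm{logistic}}(y, f(x)) = \log\bigl(1 + e^{-y f(x)}\bigr),
    \ell_{\mathrm{square}}(y, f(x)) = \bigl(1 - y f(x)\bigr)^{2}.

Each is a convex function of the margin y f(x), so all three satisfy the convexity assumption introduced in the letter.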


Analysis and Applications | 2006

Discretization Error Analysis for Tikhonov Regularization

Ernesto De Vito; Lorenzo Rosasco; Andrea Caponnetto

We study the discretization of inverse problems defined by a Carleman operator. In particular, we develop a discretization strategy for this class of inverse problems and give a convergence analysis. Both learning from examples and the discretization of integral equations can be analyzed in our setting.
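
A minimal numerical sketch of the underlying idea (an illustrative setup; the operator, noise level, and regularization parameter below are assumptions, not the paper's experiments): given a discretized linear operator K and noisy data g, Tikhonov regularization replaces the ill-posed system K f = g with the penalized least-squares problem min_f ||K f - g||^2 + lam ||f||^2, whose solution comes from a well-conditioned linear system.

    import numpy as np

    # Illustrative setup (not from the paper): a smoothing, hence
    # ill-conditioned, integral-type operator discretized on [0, 1].
    rng = np.random.default_rng(0)
    n = 100
    t = np.linspace(0.0, 1.0, n)
    K = np.exp(-50.0 * (t[:, None] - t[None, :]) ** 2) / n

    f_true = np.sin(2.0 * np.pi * t)                 # unknown signal
    g = K @ f_true + 1e-3 * rng.standard_normal(n)   # noisy data

    # Tikhonov solution: argmin_f ||K f - g||^2 + lam * ||f||^2
    #                  = (K^T K + lam I)^{-1} K^T g
    lam = 1e-4
    f_lam = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ g)

    # relative reconstruction error
    print(np.linalg.norm(f_lam - f_true) / np.linalg.norm(f_true))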


Analysis and Applications | 2011

A Note on Stability of Error Bounds in Statistical Learning Theory

Ming Li; Andrea Caponnetto

We consider a wide class of error bounds developed in the context of statistical learning theory which are expressed in terms of functionals of the regression function, for instance, its norm in a reproducing kernel Hilbert space or some other function space. These bounds are unstable in the sense that a small perturbation of the regression function can induce an arbitrarily large increase in the relevant functional and render the error bound useless. Using a known result involving Fano's inequality, we show how stability can be recovered.
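
To make the instability concrete (an illustrative construction in the spirit of the abstract, not a statement quoted from the paper), consider a bound of the form

    \mathcal{E}(\hat f_n) - \mathcal{E}(f_\rho) \;\le\; C \, \frac{\|f_\rho\|_{\mathcal{H}}^{2}}{\sqrt{n}},

where \|\cdot\|_{\mathcal{H}} denotes the norm of a reproducing kernel Hilbert space. A perturbation g that is uniformly small, \|g\|_{\infty} \le \delta, can nevertheless have arbitrarily large RKHS norm, so replacing f_\rho by f_\rho + g leaves the learning problem essentially unchanged while the right-hand side of the bound grows without limit.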


Journal of Machine Learning Research | 2005

Learning from Examples as an Inverse Problem

Ernesto De Vito; Lorenzo Rosasco; Andrea Caponnetto; Umberto De Giovannini; Francesca Odone


Journal of Machine Learning Research | 2008

Universal Multi-Task Kernels

Andrea Caponnetto; Charles A. Micchelli; Massimiliano Pontil; Yiming Ying


Journal of Machine Learning Research | 2004

Some Properties of Regularized Kernel Methods

Ernesto De Vito; Lorenzo Rosasco; Andrea Caponnetto; Michele Piana; Alessandro Verri


Analysis and Applications | 2010

Cross-Validation Based Adaptation for Regularization Operators in Learning Theory

Andrea Caponnetto; Yuan Yao


Archive | 2005

Fast Rates for Regularized Least-Squares Algorithm

Andrea Caponnetto; Ernesto De Vito


Journal of Machine Learning Research | 2006

Stability Properties of Empirical Risk Minimization over Donsker Classes

Andrea Caponnetto; Alexander Rakhlin


Archive | 2005

Some Properties of Empirical Risk Minimization Over Donsker Classes

Andrea Caponnetto; Alexander Rakhlin

Collaboration


Dive into Andrea Caponnetto's collaborations.

Top Co-Authors

Lorenzo Rosasco
Massachusetts Institute of Technology

Alexander Rakhlin
University of Pennsylvania

Charles A. Micchelli
State University of New York System