Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Shai Shalev-Shwartz is active.

Publication


Featured research published by Shai Shalev-Shwartz.


international conference on machine learning | 2007

Pegasos: Primal Estimated sub-GrAdient SOlver for SVM

Shai Shalev-Shwartz; Yoram Singer; Nathan Srebro

We describe and analyze a simple and effective iterative algorithm for solving the optimization problem cast by Support Vector Machines (SVM). Our method alternates between stochastic gradient descent steps and projection steps. We prove that the number of iterations required to obtain a solution of accuracy ε is Õ(1/ε). In contrast, previous analyses of stochastic gradient descent methods require Ω(1/ε²) iterations. As in previously devised SVM solvers, the number of iterations also scales linearly with 1/λ, where λ is the regularization parameter of SVM. For a linear kernel, the total run-time of our method is Õ(d/(λε)), where d is a bound on the number of non-zero features in each example. Since the run-time does not depend directly on the size of the training set, the resulting algorithm is especially suited for learning from large datasets. Our approach can seamlessly be adapted to employ non-linear kernels while working solely on the primal objective function. We demonstrate the efficiency and applicability of our approach by conducting experiments on large text classification problems, comparing our solver to existing state-of-the-art SVM solvers. For example, it takes less than 5 seconds for our solver to converge when solving a text classification problem from Reuters Corpus Volume 1 (RCV1) with 800,000 training examples.
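
As a rough illustration of the algorithm this abstract describes, here is a minimal Python sketch of a Pegasos-style update, assuming a linear kernel and hinge loss; the function name, parameters, and interface are illustrative choices, not the paper's.

```python
import numpy as np

def pegasos(X, y, lam=0.1, n_iters=1000, seed=0):
    """Pegasos-style stochastic sub-gradient solver for a linear SVM.

    Minimizes (lam/2) * ||w||^2 + (1/m) * sum_i max(0, 1 - y_i * <w, x_i>),
    where X is an (m, d) feature matrix and y holds labels in {-1, +1}.
    """
    rng = np.random.default_rng(seed)
    m, d = X.shape
    w = np.zeros(d)
    for t in range(1, n_iters + 1):
        i = rng.integers(m)            # draw a single random training example
        eta = 1.0 / (lam * t)          # decaying step size 1/(lambda * t)
        margin = y[i] * (w @ X[i])
        w *= 1.0 - eta * lam           # shrink: gradient of the regularizer
        if margin < 1.0:               # hinge loss active: sub-gradient step
            w += eta * y[i] * X[i]
        norm = np.linalg.norm(w)       # optional projection onto the ball
        radius = 1.0 / np.sqrt(lam)    # of radius 1/sqrt(lambda)
        if norm > radius:
            w *= radius / norm
    return w
```

Each iteration touches only a single training example, which is why the run-time does not depend directly on the size of the training set.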


international conference on machine learning | 2008

Efficient projections onto the ℓ1-ball for learning in high dimensions

John C. Duchi; Shai Shalev-Shwartz; Yoram Singer; Tushar Deepak Chandra

We describe efficient algorithms for projecting a vector onto the ℓ1-ball. We present two methods for projection. The first performs exact projection in O(n) expected time, where n is the dimension of the space. The second works on vectors, k of whose elements are perturbed outside the ℓ1-ball, projecting in O(k log(n)) time. This setting is especially useful for online learning in sparse feature spaces such as text categorization applications. We demonstrate the merits and effectiveness of our algorithms in numerous batch and online learning tasks. We show that variants of stochastic gradient projection methods augmented with our efficient projection procedures outperform interior point methods, which are considered state-of-the-art optimization techniques. We also show that in online settings gradient updates with ℓ1 projections outperform the exponentiated gradient algorithm while obtaining models with high degrees of sparsity.
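
To make the projection step concrete, here is a minimal Python sketch of Euclidean projection onto the ℓ1-ball, using the simple sort-based O(n log n) variant rather than the O(n) expected-time randomized-pivot algorithm; the function name and interface are illustrative.

```python
import numpy as np

def project_onto_l1_ball(v, z=1.0):
    """Euclidean projection of v onto the l1-ball of radius z (sort-based)."""
    if np.abs(v).sum() <= z:
        return v.copy()                          # already inside the ball
    u = np.sort(np.abs(v))[::-1]                 # magnitudes, descending
    cumsum = np.cumsum(u)
    ranks = np.arange(1, len(u) + 1)
    # Largest index rho (0-based) with u[rho] > (cumsum[rho] - z) / (rho + 1).
    rho = np.nonzero(u > (cumsum - z) / ranks)[0][-1]
    theta = (cumsum[rho] - z) / (rho + 1)        # soft-threshold level
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)
```

The projection amounts to soft-thresholding: coordinates whose magnitude falls below θ are zeroed out, which is the source of the sparsity the abstract mentions.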


Foundations and Trends® in Machine Learning | 2012

Online Learning and Online Convex Optimization

Shai Shalev-Shwartz

Online learning is a well-established learning paradigm with both theoretical and practical appeal. The goal of online learning is to make a sequence of accurate predictions given knowledge of the correct answers to previous prediction tasks and possibly additional available information. Online learning has been studied in several research fields, including game theory, information theory, and machine learning. It has also become of great interest to practitioners due to the recent emergence of large-scale applications such as online advertisement placement and online web ranking. In this survey we provide a modern overview of online learning. Our goal is to give the reader a sense of some of the interesting ideas and, in particular, to underscore the centrality of convexity in deriving efficient online learning algorithms. We do not mean to be comprehensive but rather to give a high-level survey that is rigorous yet easy to follow.
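
As a concrete instance of the convexity-based algorithms the survey emphasizes, here is a minimal Python sketch of online gradient descent over a Euclidean ball, a standard online convex optimization procedure; the interface (a stream of gradient oracles) is an illustrative choice, not the survey's notation.

```python
import numpy as np

def online_gradient_descent(gradient_oracles, dim, radius=1.0):
    """Online gradient descent over the Euclidean ball of the given radius.

    gradient_oracles yields, for each round t, a function returning a
    (sub)gradient of that round's convex loss at the current point.
    """
    w = np.zeros(dim)
    predictions = []
    for t, grad in enumerate(gradient_oracles, start=1):
        predictions.append(w.copy())     # commit to a prediction for round t
        eta = radius / np.sqrt(t)        # standard 1/sqrt(t) step size
        w = w - eta * grad(w)            # step against the observed loss
        norm = np.linalg.norm(w)         # project back into the feasible ball
        if norm > radius:
            w *= radius / norm
    return predictions
```

With a step size proportional to 1/√t, this scheme attains O(√T) regret against the best fixed point in the ball, a standard result in online convex optimization.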


Mathematical Programming | 2011

Pegasos: primal estimated sub-gradient solver for SVM

Shai Shalev-Shwartz; Yoram Singer; Nathan Srebro; Andrew Cotter

We describe and analyze a simple and effective stochastic sub-gradient descent algorithm for solving the optimization problem cast by Support Vector Machines (SVM). We prove that the number of iterations required to obtain a solution of accuracy ε is Õ(1/ε).


international conference on machine learning | 2004

Online and batch learning of pseudo-metrics

Shai Shalev-Shwartz; Yoram Singer; Andrew Y. Ng


international conference on machine learning | 2008

SVM optimization: inverse dependence on training set size

Shai Shalev-Shwartz; Nathan Srebro


international conference on machine learning | 2014

Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization

Shai Shalev-Shwartz; Tong Zhang


SIAM Journal on Computing | 2008

The Forgetron: A Kernel-Based Perceptron on a Budget

Ofer Dekel; Shai Shalev-Shwartz; Yoram Singer


international conference on machine learning | 2009

Stochastic methods for ℓ1 regularized loss minimization

Shai Shalev-Shwartz; Ambuj Tewari


SIAM Journal on Optimization | 2010

Trading Accuracy for Sparsity in Optimization Problems with Sparsity Constraints

Shai Shalev-Shwartz; Nathan Srebro; Tong Zhang


Collaboration


Dive into Shai Shalev-Shwartz's collaborations.

Top Co-Authors

Ohad Shamir (Weizmann Institute of Science)
Yoram Singer (Hebrew University of Jerusalem)
Alon Gonen (Hebrew University of Jerusalem)
Amnon Shashua (Hebrew University of Jerusalem)
Amit Daniely (Hebrew University of Jerusalem)
Nathan Srebro (Toyota Technological Institute at Chicago)