Jakub Konečný
University of Edinburgh
Publications
Featured research published by Jakub Konečný.
Frontiers in Applied Mathematics and Statistics | 2017
Jakub Konečný; Peter Richtárik
In this paper we study the problem of minimizing the average of a large number ($n$) of smooth convex loss functions. We propose a new method, S2GD (Semi-Stochastic Gradient Descent), which runs for one or several epochs, in each of which a single full gradient and a random number of stochastic gradients are computed, following a geometric law. The total work needed for the method to output an $\varepsilon$-accurate solution in expectation, measured in the number of passes over data, or equivalently, in units equivalent to the computation of a single gradient of the loss, is $O((\kappa/n)\log(1/\varepsilon))$, where $\kappa$ is the condition number. This is achieved by running the method for $O(\log(1/\varepsilon))$ epochs.

Optimization Methods & Software | 2017
Jakub Konečný; Zheng Qu; Peter Richtárik

Optimization Methods & Software | 2017
Chenxin Ma; Jakub Konečný; Martin Jaggi; Virginia Smith; Michael I. Jordan; Peter Richtárik; Martin Takáč

Neural Information Processing Systems | 2015
Reza Babanezhad; Mohamed Osama Ahmed; Alim Virani; Mark W. Schmidt; Jakub Konečný; Scott Sallinen

arXiv: Learning | 2015
Jakub Konečný; H. Brendan McMahan; Daniel Ramage

arXiv: Learning | 2018
Jakub Konečný; H. Brendan McMahan; Felix X. Yu; Ananda Theertha Suresh; Dave Bacon; Peter Richtárik

arXiv: Optimization and Control | 2016
Sashank J. Reddi; Jakub Konečný; Peter Richtárik; Barnabás Póczos; Alexander J. Smola

arXiv: Learning | 2016
Jakub Konečný; H. Brendan McMahan; Daniel Ramage; Peter Richtárik

Archive | 2014
Jakub Konečný; Zheng Qu; Peter Richtárik

arXiv: Optimization and Control | 2014
Jakub Konečný; Peter Richtárik
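The S2GD epoch structure described in the abstract above (one full gradient per epoch, then a random, geometrically distributed number of variance-reduced stochastic steps) can be sketched as below. This is a minimal illustration, not the paper's exact specification: the function names (`s2gd`, `grad_i`) and the truncated-geometric draw for the inner-loop length are assumptions made for the sketch.

```python
import numpy as np

def s2gd(grad_i, n, x0, h=0.05, epochs=50, seed=0):
    """Minimal sketch of Semi-Stochastic Gradient Descent (S2GD).

    grad_i(x, i) returns the gradient of the i-th loss at x.
    Each epoch computes one full gradient at an anchor point x, then
    performs a random number of variance-reduced stochastic steps.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(epochs):
        # One full gradient of the average loss at the epoch's anchor point.
        mu = sum(grad_i(x, i) for i in range(n)) / n
        # Inner-loop length: a geometric draw truncated at n
        # (a simplification of the paper's exact geometric law).
        t = min(rng.geometric(p=1.0 / n), n)
        y = x.copy()
        for _ in range(t):
            i = rng.integers(n)
            # Variance-reduced stochastic step: cheap per iteration,
            # yet unbiased for the full gradient at y.
            y = y - h * (grad_i(y, i) - grad_i(x, i) + mu)
        x = y  # the inner iterate becomes the next anchor point
    return x
```

For instance, on the toy problem of averaging quadratics $f_i(x) = \tfrac{1}{2}(x - b_i)^2$, the minimizer is the mean of the $b_i$, and the iterates contract toward it geometrically.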