Publication


Featured research published by Daniel Golovin.


Knowledge Discovery and Data Mining | 2013

Ad click prediction: a view from the trenches

H. Brendan McMahan; Gary Holt; D. Sculley; Michael Young; Dietmar Ebner; Julian Paul Grady; Lan Nie; Todd Phillips; Eugene Davydov; Daniel Golovin; Sharat Chikkerur; Dan Liu; Martin Wattenberg; Arnar Mar Hrafnkelsson; Tom Boulos; Jeremy Kubica

Predicting ad click-through rates (CTR) is a massive-scale learning problem that is central to the multi-billion dollar online advertising industry. We present a selection of case studies and topics drawn from recent experiments in the setting of a deployed CTR prediction system. These include improvements in the context of traditional supervised learning based on an FTRL-Proximal online learning algorithm (which has excellent sparsity and convergence properties) and the use of per-coordinate learning rates. We also explore some of the challenges that arise in a real-world system that may appear at first to be outside the domain of traditional machine learning research. These include useful tricks for memory savings, methods for assessing and visualizing performance, practical methods for providing confidence estimates for predicted probabilities, calibration methods, and methods for automated management of features. Finally, we also detail several directions that did not turn out to be beneficial for us, despite promising results elsewhere in the literature. The goal of this paper is to highlight the close relationship between theoretical advances and practical engineering in this industrial setting, and to show the depth of challenges that appear when applying traditional machine learning methods in a complex dynamic system.
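The per-coordinate FTRL-Proximal update mentioned above is compact enough to sketch. The following is a minimal, self-contained illustration of the update rule for logistic regression; the class name, parameter defaults, and sparse-feature representation are our own choices, not code from the paper.

```python
import math

class FTRLProximal:
    """Sketch of per-coordinate FTRL-Proximal for logistic regression.

    alpha and beta control the per-coordinate learning rates; l1 induces
    sparsity and l2 adds standard shrinkage. Defaults are illustrative.
    """

    def __init__(self, dim, alpha=0.1, beta=1.0, l1=0.1, l2=1.0):
        self.alpha, self.beta, self.l1, self.l2 = alpha, beta, l1, l2
        self.z = [0.0] * dim  # accumulated (adjusted) gradients
        self.n = [0.0] * dim  # accumulated squared gradients

    def weight(self, i):
        z = self.z[i]
        if abs(z) <= self.l1:
            return 0.0  # L1 threshold zeroes weak coordinates -> sparse model
        sign = 1.0 if z > 0 else -1.0
        return -(z - sign * self.l1) / (
            (self.beta + math.sqrt(self.n[i])) / self.alpha + self.l2)

    def predict(self, x):
        # x: sparse example as a dict of feature index -> value.
        wx = sum(self.weight(i) * v for i, v in x.items())
        return 1.0 / (1.0 + math.exp(-max(min(wx, 35.0), -35.0)))

    def update(self, x, y):
        p = self.predict(x)
        for i, v in x.items():
            g = (p - y) * v  # gradient of the logistic loss
            # Per-coordinate learning-rate adjustment.
            s = (math.sqrt(self.n[i] + g * g) - math.sqrt(self.n[i])) / self.alpha
            self.z[i] += g - s * self.weight(i)
            self.n[i] += g * g
        return p
```

The L1 threshold on the accumulated z values is what produces sparse models: coordinates whose accumulated signal stays below l1 keep a weight of exactly zero, which matters at ad-serving scale.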


Knowledge Discovery and Data Mining | 2017

Google Vizier: A Service for Black-Box Optimization

Daniel Golovin; Benjamin Solnik; Subhodeep Moitra; Greg Kochanski; John Elliot Karro; D. Sculley

Any sufficiently complex system acts as a black box when it becomes easier to experiment with than to understand. Hence, black-box optimization has become increasingly important as systems have become more complex. In this paper we describe Google Vizier, a Google-internal service for performing black-box optimization that has become the de facto parameter tuning engine at Google. Google Vizier is used to optimize many of our machine learning models and other systems, and also provides core capabilities to Google's Cloud Machine Learning HyperTune subsystem. We discuss our requirements, infrastructure design, underlying algorithms, and advanced features such as transfer learning and automated early stopping that the service provides.
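As a rough illustration of the suggest/evaluate/complete loop such a service exposes, here is a toy study object using random search as the suggestion algorithm; all names here are illustrative and do not reflect the actual Vizier API.

```python
import random

class Study:
    """Toy sketch of a Vizier-style black-box optimization loop.

    The real service separates suggestion, evaluation, and trial storage;
    here one class with random search stands in for all three.
    """

    def __init__(self, param_space, seed=0):
        self.space = param_space  # name -> (low, high) bounds
        self.rng = random.Random(seed)
        self.trials = []          # completed (params, objective) pairs

    def suggest(self):
        # A real service would pick among algorithms (e.g. Bayesian
        # optimization); uniform random search is the simplest stand-in.
        return {k: self.rng.uniform(lo, hi) for k, (lo, hi) in self.space.items()}

    def complete(self, params, objective):
        self.trials.append((params, objective))

    def best(self):
        return min(self.trials, key=lambda t: t[1])

# A black-box objective: the optimizer never inspects its internals.
def objective(p):
    return (p["x"] - 0.3) ** 2 + (p["y"] + 0.1) ** 2

study = Study({"x": (-1, 1), "y": (-1, 1)})
for _ in range(100):
    params = study.suggest()
    study.complete(params, objective(params))
best_params, best_val = study.best()
```

A production service would swap the suggestion strategy behind the same interface and persist trials, which is what lets features like transfer learning and automated early stopping plug in without changing client code.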


Mathematical Programming | 2015

Improved approximations for two-stage min-cut and shortest path problems under uncertainty

Daniel Golovin; Vineet Goyal; Valentin Polishchuk; R. Ravi; Mikko Sysikaski

In this paper, we study the robust and stochastic versions of the two-stage min-cut and shortest path problems introduced in Dhamdhere et al. (in How to pay, come what may: approximation algorithms for demand-robust covering problems. In: FOCS, pp 367–378, 2005), and give approximation algorithms with improved approximation factors. Specifically, we give a 2-approximation for the robust min-cut problem and a 4-approximation for the stochastic version. For the two-stage shortest path problem, we give a 3.39-approximation for the robust version and a 6.78-approximation for the stochastic version.
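To make the two-stage robust objective concrete, here is a brute-force evaluator on a toy directed instance (our own example, using exponential-time enumeration rather than the paper's approximation algorithms): edges bought before the scenario is revealed cost c(e), edges bought afterwards cost an inflated sigma * c(e), and the adversary picks the worst scenario.

```python
from itertools import combinations

# Toy two-stage robust min-cut instance: cut s from the revealed terminal.
edges = {("s", "a"): 1, ("a", "t1"): 3, ("s", "b"): 2, ("b", "t2"): 1}
scenarios = ["t1", "t2"]
sigma = 2  # second-stage cost inflation factor

def separated(removed, target):
    # DFS from s over edges not yet bought/cut; True if target is unreachable.
    stack, seen = ["s"], {"s"}
    while stack:
        u = stack.pop()
        if u == target:
            return False
        for (x, y) in edges:
            if (x, y) not in removed and x == u and y not in seen:
                seen.add(y)
                stack.append(y)
    return True

def powerset(items):
    for r in range(len(items) + 1):
        yield from (set(c) for c in combinations(items, r))

def objective(first):
    # First-stage cost plus the worst-case cheapest second-stage completion.
    worst = 0
    for t in scenarios:
        best = min((sigma * sum(edges[e] for e in second)
                    for second in powerset([e for e in edges if e not in first])
                    if separated(first | second, t)), default=None)
        if best is None:
            return None  # no completion cuts this scenario
        worst = max(worst, best)
    return sum(edges[e] for e in first) + worst

opt = min(v for v in (objective(f) for f in powerset(list(edges))) if v is not None)
```

On this instance the brute force finds an optimum of 2, achieved for example by buying nothing up front and paying the inflated price for the cheapest single-edge cut in whichever scenario arrives.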


AI Magazine | 2014

Sequential decision making in computational sustainability via adaptive submodularity

Andreas Krause; Daniel Golovin; Sarah J. Converse


Archive | 2014

Machine Learning: The High Interest Credit Card of Technical Debt

D. Sculley; Gary Holt; Daniel Golovin; Eugene Davydov; Todd Phillips; Dietmar Ebner; Vinay Chaudhary; Michael Young



Neural Information Processing Systems | 2015

Hidden technical debt in Machine learning systems

D. Sculley; Gary Holt; Daniel Golovin; Eugene Davydov; Todd Phillips; Dietmar Ebner; Vinay Chaudhary; Michael Young; Jean-François Crespo; Dan Dennison


International Conference on Machine Learning | 2013

Large-Scale Learning with Less RAM via Randomization

Daniel Golovin; D. Sculley; H. Brendan McMahan; Michael Young



Archive | 2012

Bayesian Rapid Optimal Adaptive Design (BROAD): Method and Application Distinguishing Models of Risky Choice

Debajyoti Ray; Daniel Golovin; Andreas Krause; Colin F. Camerer


Archive | 2017

Bayesian Optimization for a Better Dessert

Benjamin Solnik; Daniel Golovin; Greg Kochanski; John Elliot Karro; Subhodeep Moitra; D. Sculley



Archive | 2017

Black Box Optimization via a Bayesian-Optimized Genetic Algorithm

Daniel Golovin; Greg Kochanski; John Elliot Karro
