
Publication


Featured research published by Rudy Bunel.


International Conference on Robotics and Automation | 2016

Detection of pedestrians at far distance

Rudy Bunel; Franck Davoine; Philippe Xu

Pedestrian detection is a well-studied problem. Even though many datasets contain challenging cases, the performance of new methods is often reported only on cases of reasonable difficulty. In particular, the issue of small-scale pedestrian detection is seldom considered. In this paper, we focus on the detection of small-scale pedestrians, i.e., those that are far from the camera. We show that the classical features used for pedestrian detection are not well suited to this setting. Instead, we propose a convolutional neural network based method that learns the features with an end-to-end approach. Experiments on the Caltech Pedestrian Detection Benchmark showed that we outperformed existing methods by more than 10% in terms of log-average miss rate.
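The log-average miss rate reported above is the standard Caltech benchmark metric: miss rates are sampled at nine log-spaced false-positives-per-image (FPPI) reference points between 10^-2 and 10^0 and averaged in log space. A minimal sketch of that computation, assuming a precomputed miss-rate-vs-FPPI curve (the function name and the nearest-point sampling choice are illustrative, not from the paper):

```python
import numpy as np

def log_average_miss_rate(fppi, miss_rate, ref_points=9):
    """Caltech-style log-average miss rate.

    fppi      -- sorted, ascending false-positives-per-image values
    miss_rate -- miss rate at each FPPI value
    """
    # Nine log-spaced reference points between 1e-2 and 1e0.
    refs = np.logspace(-2.0, 0.0, ref_points)
    sampled = []
    for r in refs:
        # Take the miss rate at the largest FPPI <= r (fall back to the first point).
        idx = np.searchsorted(fppi, r, side="right") - 1
        sampled.append(miss_rate[max(idx, 0)])
    sampled = np.maximum(np.asarray(sampled), 1e-10)  # guard against log(0)
    # Average in log space, then map back.
    return float(np.exp(np.mean(np.log(sampled))))
```

A curve that sits at a constant 0.5 miss rate yields a log-average miss rate of exactly 0.5, which is a quick sanity check on the implementation.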


European Conference on Computer Vision | 2016

Efficient Continuous Relaxations for Dense CRF

Alban Desmaison; Rudy Bunel; Pushmeet Kohli; Philip H. S. Torr; M. Pawan Kumar

Dense conditional random fields (CRF) with Gaussian pairwise potentials have emerged as a popular framework for several computer vision applications such as stereo correspondence and semantic segmentation. By modeling long-range interactions, dense CRFs provide a more detailed labelling compared to their sparse counterparts. Variational inference in these dense models is performed using a filtering-based mean-field algorithm in order to obtain a fully-factorized distribution minimising the Kullback-Leibler divergence to the true distribution. In contrast to the continuous relaxation-based energy minimisation algorithms used for sparse CRFs, the mean-field algorithm fails to provide strong theoretical guarantees on the quality of its solutions. To address this deficiency, we show that it is possible to use the same filtering approach to speed up the optimisation of several continuous relaxations. Specifically, we solve a convex quadratic programming (QP) relaxation using the efficient Frank-Wolfe algorithm. This also allows us to solve difference-of-convex relaxations via the iterative concave-convex procedure, where each iteration requires solving a convex QP. Finally, we develop a novel divide-and-conquer method to compute the subgradients of a linear programming relaxation that provides the best theoretical bounds for energy minimisation. We demonstrate the advantage of continuous relaxations over the widely used mean-field algorithm on publicly available datasets.
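The Frank-Wolfe algorithm mentioned above only needs a linear minimisation oracle over the feasible set, which for per-pixel label distributions is a product of simplices whose vertices are one-hot labellings. A toy sketch on a single simplex with the standard 2/(t+2) step size (the filtering trick that makes the gradient cheap for dense CRFs is not shown, and the function name is mine, not the paper's):

```python
import numpy as np

def frank_wolfe_simplex(Q, c, iters=500):
    """Minimise 0.5 x'Qx + c'x over the probability simplex by Frank-Wolfe."""
    n = len(c)
    x = np.full(n, 1.0 / n)               # start at the simplex barycentre
    for t in range(iters):
        grad = Q @ x + c
        # Linear minimisation oracle: the best vertex of the simplex is the
        # one-hot vector on the coordinate with the smallest gradient entry.
        s = np.zeros(n)
        s[np.argmin(grad)] = 1.0
        gamma = 2.0 / (t + 2.0)           # standard diminishing step size
        x = (1.0 - gamma) * x + gamma * s  # stays feasible: convex combination
    return x
```

Every iterate is a convex combination of simplex vertices, so feasibility never needs to be projected for, which is precisely what makes Frank-Wolfe attractive for structured feasible sets like these.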


Computer Vision and Pattern Recognition | 2017

Efficient Linear Programming for Dense CRFs

Thalaiyasingam Ajanthan; Alban Desmaison; Rudy Bunel; Mathieu Salzmann; Philip H. S. Torr; M. Pawan Kumar

The fully connected conditional random field (CRF) with Gaussian pairwise potentials has proven popular and effective for multi-class semantic segmentation. While the energy of a dense CRF can be minimized accurately using a linear programming (LP) relaxation, the state-of-the-art algorithm is too slow to be useful in practice. To alleviate this deficiency, we introduce an efficient LP minimization algorithm for dense CRFs. To this end, we develop a proximal minimization framework, where the dual of each proximal problem is optimized via block coordinate descent. We show that each block of variables can be efficiently optimized. Specifically, for one block, the problem decomposes into significantly smaller subproblems, each of which is defined over a single pixel. For the other block, the problem is optimized via conditional gradient descent. This has two advantages: 1) the conditional gradient can be computed in time linear in the number of pixels and labels, and 2) the optimal step size can be computed analytically. Our experiments on standard datasets provide compelling evidence that our approach outperforms all existing baselines, including the previous LP-based approach for dense CRFs.
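The analytic step size noted in advantage 2) is possible because a quadratic objective restricted to the segment between the current iterate x and the conditional-gradient vertex s is a one-dimensional quadratic in the step gamma, minimised in closed form. A hedged sketch of that computation (the function name and argument layout are mine, not the paper's):

```python
import numpy as np

def exact_fw_step(Q, grad, x, s):
    """Exact line search for f(x) = 0.5 x'Qx + c'x along d = s - x.

    grad must be the gradient Q x + c at the current iterate x.
    g(gamma) = f(x + gamma d) is quadratic, so the unconstrained minimiser is
    gamma* = -grad'd / d'Qd, clipped to the feasible interval [0, 1].
    """
    d = s - x
    curv = float(d @ Q @ d)
    if curv <= 0.0:
        # Non-positive curvature along d: the objective decreases all the way,
        # so the endpoint gamma = 1 is optimal.
        return 1.0
    return float(np.clip(-(grad @ d) / curv, 0.0, 1.0))
```

For example, with Q the 2x2 identity, x = (1, 0) and s = (0, 1), the optimal step is 0.5, landing on the segment midpoint (0.5, 0.5) that minimises the quadratic on that segment.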


Neural Information Processing Systems | 2016

Adaptive Neural Compilation

Rudy Bunel; Alban Desmaison; M. Pawan Kumar; Pushmeet Kohli; Philip H. S. Torr


arXiv: Artificial Intelligence | 2018

Piecewise Linear Neural Networks verification: A comparative study

Rudy Bunel; Ilker Turkaslan; Philip H. S. Torr; Pushmeet Kohli; M. Pawan Kumar


International Conference on Learning Representations | 2017

Learning to superoptimize programs

Rudy Bunel; Alban Desmaison; M. Pawan Kumar; Philip H. S. Torr; Pushmeet Kohli


Neural Information Processing Systems | 2017

Neural Program Meta-Induction

Jacob Devlin; Rudy Bunel; Rishabh Singh; Matthew J. Hausknecht; Pushmeet Kohli


International Conference on Learning Representations | 2018

Leveraging Grammar and Reinforcement Learning for Neural Program Synthesis

Rudy Bunel; Matthew J. Hausknecht; Jacob Devlin; Rishabh Singh; Pushmeet Kohli


Neural Information Processing Systems | 2018

A Unified View of Piecewise Linear Neural Network Verification

Rudy Bunel; Ilker Turkaslan; Philip H. S. Torr; Pushmeet Kohli; M. Pawan Kumar


arXiv: Computer Vision and Pattern Recognition | 2018

Efficient Relaxations for Dense CRFs with Sparse Higher Order Potentials

Thomas Joy; Alban Desmaison; Thalaiyasingam Ajanthan; Rudy Bunel; Mathieu Salzmann; Pushmeet Kohli; Philip H. S. Torr; M. Pawan Kumar

Collaboration


Dive into Rudy Bunel's collaborations.

Top Co-Authors

Matthew J. Hausknecht

University of Texas at Austin
