Publications


Featured research published by Dilan Görür.


International Conference on Machine Learning | 2006

A choice model with infinitely many latent features

Dilan Görür; Frank Jäkel; Carl Edward Rasmussen

Elimination by aspects (EBA) is a probabilistic choice model describing how humans decide between several options. The options from which the choice is made are characterized by binary features and associated weights. For instance, when choosing which mobile phone to buy, the features to consider may include a long-lasting battery, a color screen, etc. Existing methods for inferring the parameters of the model assume pre-specified features. However, the features that lead to the observed choices are not always known. Here, we present a non-parametric Bayesian model to infer the features of the options and the corresponding weights from choice data. We use the Indian buffet process (IBP) as a prior over the features. Inference using Markov chain Monte Carlo (MCMC) in conjugate IBP models has been previously described. The main contribution of this paper is an MCMC algorithm for the EBA model that can also be used for inference in other non-conjugate IBP models; this may broaden the use of IBP priors considerably.
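For the two-alternative case the EBA choice rule has a simple closed form, and the minimal sketch below illustrates it. The feature vectors, weights, and function name are made up for illustration; the paper's actual contribution, inferring the features themselves under an IBP prior with MCMC, is not shown here.

```python
import numpy as np

def eba_binary_choice_prob(f_x, f_y, w):
    """Elimination-by-aspects probability of choosing option x over option y.

    For two alternatives, EBA reduces to comparing the total weight of the
    aspects (binary features) unique to each option; shared aspects cancel.

    f_x, f_y : binary feature vectors (1 = option has the aspect)
    w        : positive weights associated with each aspect
    """
    f_x, f_y, w = map(np.asarray, (f_x, f_y, w))
    unique_x = (f_x == 1) & (f_y == 0)        # aspects only x has
    unique_y = (f_y == 1) & (f_x == 0)        # aspects only y has
    u_x, u_y = w[unique_x].sum(), w[unique_y].sum()
    if u_x + u_y == 0:                        # no distinguishing aspects
        return 0.5                            # choose at random
    return u_x / (u_x + u_y)

# Toy example: phones described by (long-lasting battery, color screen, cheap).
w   = np.array([2.0, 1.0, 0.5])
f_a = np.array([1, 1, 0])                     # phone A
f_b = np.array([0, 1, 1])                     # phone B
print(eba_binary_choice_prob(f_a, f_b, w))    # 2.0 / (2.0 + 0.5) = 0.8
```

Because the shared color-screen aspect cancels, only the battery and price aspects determine the choice probability in this toy example.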


Journal of Computational and Graphical Statistics | 2011

Concave-Convex Adaptive Rejection Sampling

Dilan Görür; Yee Whye Teh

We describe a method for generating independent samples from univariate density functions using adaptive rejection sampling without the log-concavity requirement. The method makes use of the fact that many functions can be expressed as a sum of concave and convex functions. Using a concave-convex decomposition, we bound the log-density by separately bounding the concave and convex parts using piecewise linear functions. The upper bound can then be used as the proposal distribution in rejection sampling. We demonstrate the applicability of the concave-convex approach on a number of standard distributions and describe an application to the efficient construction of sequential Monte Carlo proposal distributions for inference over genealogical trees. Computer code for the proposed algorithms is available online.
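As a rough illustration of the bounding idea, the sketch below upper-bounds a toy log-density that splits into a concave and a convex part: the concave part is bounded above by a tangent line at a chosen abscissa, the convex part by the chord over the interval, and their sum is a piecewise-linear upper bound whose exponential could serve as a rejection-sampling envelope. The decomposition, interval, and abscissa are invented for illustration and are not taken from the paper; the adaptive refinement of the envelope is omitted.

```python
import numpy as np

# Unnormalized log-density with an explicit concave-convex split (illustrative):
#   log p(x) = f_cc(x) + f_cv(x),  f_cc(x) = -x**4 (concave), f_cv(x) = x**2 (convex)
f_cc  = lambda x: -x**4
df_cc = lambda x: -4 * x**3
f_cv  = lambda x: x**2

def upper_bound(x, x0, a, b):
    """Piecewise-linear upper bound on log p(x) over [a, b].

    Concave part: bounded above by its tangent at the abscissa x0.
    Convex part:  bounded above by the chord joining (a, f_cv(a)) and (b, f_cv(b)).
    """
    tangent = f_cc(x0) + df_cc(x0) * (x - x0)
    chord   = f_cv(a) + (f_cv(b) - f_cv(a)) * (x - a) / (b - a)
    return tangent + chord

# Check the bound numerically on a grid inside the interval.
a, b, x0 = -2.0, 2.0, 0.5
xs = np.linspace(a, b, 401)
log_p = f_cc(xs) + f_cv(xs)
bound = upper_bound(xs, x0, a, b)
assert np.all(bound >= log_p - 1e-12)
print("max gap between bound and log-density:", np.max(bound - log_p))
```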


Joint Pattern Recognition Symposium | 2004

Modelling Spikes with Mixtures of Factor Analysers

Dilan Görür; Carl Edward Rasmussen; A. S. Tolias; Fabian H. Sinz; N. K. Logothetis

Identifying the action potentials of individual neurons from extracellular recordings, known as spike sorting, is a challenging problem. We consider the spike sorting problem using a generative model, the mixture of factor analysers, which concurrently performs clustering and feature extraction. The most important advantage of this method is that it quantifies the certainty with which the spikes are classified. This can be used as a means of evaluating the quality of clustering and therefore of spike isolation. Using this method, nearly simultaneously occurring spikes can also be modelled, which is a hard task for many spike sorting methods. Furthermore, modelling the data with a generative model allows us to generate simulated data.
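A minimal sketch of the mixture-of-factor-analysers generative model, and of the posterior responsibilities that quantify how confidently each spike waveform is assigned to a cluster, is given below. All parameters (weights, means, loadings, noise) and the dimensions are invented for illustration, and the EM fitting step the paper relies on is omitted.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
D, q, K = 6, 2, 3                  # observed dim, latent factors, components

# Illustrative (made-up) MFA parameters: weights, means, loadings, diagonal noise.
pi  = np.array([0.5, 0.3, 0.2])
mu  = rng.normal(size=(K, D))
Lam = rng.normal(size=(K, D, q))
psi = 0.1 * np.ones(D)

def component_cov(k):
    # Marginal covariance of component k: Lambda_k Lambda_k^T + Psi
    return Lam[k] @ Lam[k].T + np.diag(psi)

def sample(n):
    """Draw n observations from the MFA generative model."""
    z = rng.choice(K, size=n, p=pi)                    # cluster assignments
    f = rng.normal(size=(n, q))                        # latent factors
    eps = rng.normal(size=(n, D)) * np.sqrt(psi)       # sensor noise
    x = mu[z] + np.einsum('ndq,nq->nd', Lam[z], f) + eps
    return x, z

def responsibilities(x):
    """Posterior probability of each component given x: the soft cluster
    assignment that quantifies how confidently a spike is classified."""
    log_r = np.column_stack([
        np.log(pi[k]) + multivariate_normal.logpdf(x, mu[k], component_cov(k))
        for k in range(K)
    ])
    log_r -= log_r.max(axis=1, keepdims=True)          # stabilise before exp
    r = np.exp(log_r)
    return r / r.sum(axis=1, keepdims=True)

x, z_true = sample(5)
print(np.round(responsibilities(x), 3))                # rows sum to 1
```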


Signal Processing and Communications Applications Conference | 2009

Nonparametric mixtures of factor analyzers

Dilan Görür; Carl Edward Rasmussen

The mixtures of factor analyzers (MFA) model allows data to be modeled as a mixture of Gaussians with a reduced parametrization. We present the formulation of a nonparametric form of the MFA model, the Dirichlet process MFA (DPMFA). The proposed model can be used for density estimation or clustering of high-dimensional data. We utilize the DPMFA for clustering the action potentials of different neurons from extracellular recordings, a problem known as spike sorting. The DPMFA model is compared to the Dirichlet process mixture of Gaussians model (DPGMM), which has a higher computational complexity. We show that DPMFA has modeling performance similar to that of DPGMM in lower dimensions, and is able to work in higher dimensions.
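There is no off-the-shelf DPMFA implementation to show here, but the DPGMM baseline mentioned in the abstract can be sketched with scikit-learn's truncated stick-breaking variational approximation. The toy two-dimensional data below stand in for low-dimensional spike features and are not from the paper; the point is only that the Dirichlet process prior lets the data determine how many components carry appreciable weight.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(1)

# Toy "spike feature" data: three well-separated clusters in 2-D.
X = np.vstack([
    rng.normal(loc=c, scale=0.3, size=(200, 2))
    for c in ([0, 0], [3, 0], [0, 3])
])

# Truncated Dirichlet process mixture of Gaussians (variational inference):
# n_components is only an upper bound; the DP prior drives the weights of
# unneeded components toward zero.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=1.0,
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(X)

active = dpgmm.weights_ > 0.01
print("effective number of clusters:", active.sum())
print("mixture weights:", np.round(dpgmm.weights_[active], 3))
```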


International Conference on Artificial Intelligence and Statistics | 2007

Stick-breaking Construction for the Indian Buffet Process

Yee Whye Teh; Dilan Görür; Zoubin Ghahramani


Journal of Computer Science and Technology | 2010

Dirichlet process Gaussian mixture models: choice of the base distribution

Dilan Görür; Carl Edward Rasmussen


Neural Information Processing Systems | 2008

Dependent Dirichlet Process Spike Sorting

Jan Gasthaus; Frank D. Wood; Dilan Görür; Yee Whye Teh


Neural Information Processing Systems | 2008

An Efficient Sequential Monte Carlo Algorithm for Coalescent Clustering

Dilan Görür; Yee Whye Teh


International Conference on Artificial Intelligence and Statistics | 2009

Infinite Hierarchical Hidden Markov Models

Katherine A. Heller; Yee Whye Teh; Dilan Görür


Doctoral thesis | 2007

Nonparametric Bayesian Discrete Latent Variable Models for Unsupervised Learning

Dilan Görür

Collaboration


Dive into Dilan Görür's collaborations.

Top Co-Authors

A. S. Tolias (Baylor College of Medicine)

Frank Jäkel (University of Osnabrück)

Jan Gasthaus (University College London)