Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Debarghya Ghoshdastidar is active.

Publication


Featured research published by Debarghya Ghoshdastidar.


International Symposium on Information Theory | 2012

q-Gaussian based Smoothed Functional algorithms for stochastic optimization

Debarghya Ghoshdastidar; Ambedkar Dukkipati; Shalabh Bhatnagar

The q-Gaussian distribution results from maximizing certain generalizations of Shannon entropy under some constraints. The importance of q-Gaussian distributions stems from the fact that they exhibit power-law behavior and also generalize Gaussian distributions. In this paper, we propose a smoothed functional (SF) scheme for gradient estimation using the q-Gaussian distribution, and we also propose an optimization algorithm based on this scheme. Convergence results for the algorithm are presented, and its performance is demonstrated through simulation results on a queuing model.
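As a rough illustration of the smoothed functional idea (in its simplest, q → 1 Gaussian form, not the paper's q-Gaussian estimator), the gradient can be estimated from function evaluations alone. The quadratic objective and all constants below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def sf_gradient(f, theta, beta=0.1, n_samples=4000):
    """One-sided smoothed functional gradient estimate:
    grad f(theta) ~= E[eta * (f(theta + beta*eta) - f(theta))] / beta,
    with eta drawn from the smoothing kernel (standard Gaussian here,
    i.e. the q -> 1 member of the q-Gaussian family)."""
    eta = rng.standard_normal((n_samples, theta.shape[0]))
    diffs = np.array([f(theta + beta * e) - f(theta) for e in eta])
    return (eta * diffs[:, None]).mean(axis=0) / beta

f = lambda x: float(x @ x)          # f(x) = ||x||^2, true gradient 2x
theta = np.array([1.0, -2.0])
grad_est = sf_gradient(f, theta)    # should land near [2, -4]
```

Note that the estimator never touches the analytic gradient, which is why SF schemes suit stochastic systems where only noisy function evaluations are available.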


Annals of Statistics | 2017

Consistency of spectral hypergraph partitioning under planted partition model

Debarghya Ghoshdastidar; Ambedkar Dukkipati

Hypergraph partitioning lies at the heart of a number of problems in machine learning and network sciences. Many algorithms for hypergraph partitioning have been proposed that extend standard approaches for graph partitioning to the case of hypergraphs. However, theoretical aspects of such methods have seldom received attention in the literature as compared to the extensive studies on the guarantees of graph partitioning. For instance, consistency results of spectral graph partitioning under the stochastic block model are well known. In this paper, we present a planted partition model for sparse random non-uniform hypergraphs that generalizes the stochastic block model. We derive an error bound for a spectral hypergraph partitioning algorithm under this model using matrix concentration inequalities. To the best of our knowledge, this is the first consistency result related to partitioning non-uniform hypergraphs.
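The pipeline such spectral methods analyze can be sketched on a toy hypergraph: reduce hyperedges to pairwise weights (a clique expansion), then cut along the Fiedler vector of the Laplacian. This is a minimal sketch, not the specific algorithm whose consistency the paper proves; the hyperedges below are made up:

```python
import itertools

import numpy as np

def clique_expand(n, hyperedges):
    """Reduce a hypergraph to a weighted graph: each hyperedge
    contributes weight 1 to every pair of its vertices."""
    A = np.zeros((n, n))
    for e in hyperedges:
        for u, v in itertools.combinations(e, 2):
            A[u, v] += 1.0
            A[v, u] += 1.0
    return A

def spectral_bipartition(A):
    """Split vertices by the sign of the Fiedler vector (eigenvector of
    the second-smallest eigenvalue of the unnormalized Laplacian)."""
    L = np.diag(A.sum(axis=1)) - A
    _, vecs = np.linalg.eigh(L)     # eigh returns ascending eigenvalues
    return (vecs[:, 1] > 0).astype(int)

# Two planted communities {0,1,2} and {3,4,5}, one cross hyperedge.
hyperedges = [(0, 1, 2), (0, 1, 2), (3, 4, 5), (3, 4, 5), (2, 3)]
A = clique_expand(6, hyperedges)
labels = spectral_bipartition(A)
```

On this toy instance the sign cut recovers the two planted blocks exactly; the paper's contribution is to bound the error of such a recovery under a planted partition model for sparse non-uniform hypergraphs.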


ACM Transactions on Modeling and Computer Simulation | 2014

Smoothed Functional Algorithms for Stochastic Optimization Using q-Gaussian Distributions

Debarghya Ghoshdastidar; Ambedkar Dukkipati; Shalabh Bhatnagar

Smoothed functional (SF) schemes for gradient estimation are known to be efficient in stochastic optimization algorithms, especially when the objective is to improve the performance of a stochastic system. However, the performance of these methods depends on several parameters, such as the choice of a suitable smoothing kernel. Different kernels have been studied in the literature, which include Gaussian, Cauchy, and uniform distributions, among others. This article studies a new class of kernels based on the q-Gaussian distribution, which has gained popularity in statistical physics over the last decade. Though the importance of this family of distributions is attributed to its ability to generalize the Gaussian distribution, we observe that this class encompasses almost all existing smoothing kernels. This motivates us to study SF schemes for gradient estimation using the q-Gaussian distribution. Using the derived gradient estimates, we propose two-timescale algorithms for optimization of a stochastic objective function in a constrained setting with a projected gradient search approach. We prove the convergence of our algorithms to the set of stationary points of an associated ODE. We also demonstrate their performance numerically through simulations on a queuing model.
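A minimal sketch of the two-timescale idea, assuming a box constraint set and Gaussian perturbations standing in for the q-Gaussian kernel (the step-size exponents, noise level, and objective below are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(1)

def project(theta, lo=-5.0, hi=5.0):
    """Euclidean projection onto a box constraint set."""
    return np.clip(theta, lo, hi)

def two_timescale_sf(f, theta0, iters=4000, beta=0.1):
    """Two-timescale projected SF optimization sketch: the faster
    timescale averages noisy smoothed-functional gradient estimates,
    the slower timescale updates the parameter by projected descent."""
    theta = theta0.copy()
    g_avg = np.zeros_like(theta)
    for k in range(1, iters + 1):
        eta = rng.standard_normal(theta.shape)
        noise = 0.1 * rng.standard_normal()          # noisy simulation
        g = eta * (f(theta + beta * eta) + noise - f(theta)) / beta
        g_avg += (1.0 / k**0.6) * (g - g_avg)        # fast timescale
        theta = project(theta - (1.0 / k**0.9) * g_avg)  # slow timescale
    return theta

f = lambda x: float((x - 1.0) @ (x - 1.0))   # minimum at (1, 1)
theta_opt = two_timescale_sf(f, np.array([4.0, -4.0]))
```

Separating the averaging and update timescales is what lets the ODE-based convergence analysis treat the gradient estimate as quasi-static relative to the parameter.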


International Joint Conference on Neural Networks | 2016

Mixture modeling with compact support distributions for unsupervised learning

Ambedkar Dukkipati; Debarghya Ghoshdastidar; Jinu Krishnan

The importance of the q-Gaussian distributions is attributed to their power-law nature and the fact that they generalize the Gaussian distributions (q → 1 retrieves the Gaussian distributions). While for q > 1 a q-Gaussian distribution is nothing but a Student's t-distribution, which is a long-tailed distribution, for q < 1 it is a distribution with compact support. Though mixture modeling with t-distributions has been studied, mixture modeling with compact support distributions has not been explored in the literature. The main aim of this paper is to study mixture modeling using q-Gaussian distributions that have compact support. We study estimation of the parameters of this model by maximum likelihood via the Expectation Maximization (EM) algorithm. We further study applications of these compact support distributions to clustering and anomaly detection. To the best of our knowledge, this is the first work that studies compact support distributions in statistical modeling for unsupervised learning problems.
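The compact support behavior follows directly from the q-exponential that defines the q-Gaussian. The sketch below (unnormalized densities, hypothetical parameter values) shows the support collapsing for q < 1 and the Gaussian limit as q → 1:

```python
import numpy as np

def q_exponential(x, q):
    """Tsallis q-exponential e_q(x) = [1 + (1-q) x]_+^{1/(1-q)};
    e_1(x) = exp(x) in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
    return base ** (1.0 / (1.0 - q))

def q_gaussian_unnormalized(x, q):
    """Unnormalized q-Gaussian density e_q(-x^2): compact support
    |x| < 1/sqrt(1-q) for q < 1, heavy (Student-t-like) tails for
    1 < q < 3, and the usual exp(-x^2) shape as q -> 1."""
    return q_exponential(-np.asarray(x, dtype=float) ** 2, q)
```

For q = 0.5, for instance, the density is identically zero outside |x| < √2, which is exactly what makes likelihood maximization delicate: the support itself depends on the parameters.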


IEEE Transactions on Neural Networks | 2016

Learning With Jensen-Tsallis Kernels

Debarghya Ghoshdastidar; Ajay P. Adsul; Ambedkar Dukkipati

Jensen-type [Jensen-Shannon (JS) and Jensen-Tsallis] kernels were first proposed by Martins et al. (2009). These kernels are based on JS divergences, which originated in information theory. In this paper, we extend the Jensen-type kernels on probability measures to define positive-definite kernels on Euclidean space. We show that special cases of these kernels include dot-product kernels. Since Jensen-type divergences are multidistribution divergences, we propose their multipoint variants and study spectral clustering and kernel methods based on them. We also provide experimental studies on a benchmark image database and a gene expression database that show the benefits of the proposed kernels compared with existing kernels. The clustering experiments also demonstrate the usefulness of constructing multipoint similarities.
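A minimal sketch of the underlying Jensen-Shannon kernel of Martins et al. on probability vectors (the paper's extension to Euclidean space and its Tsallis-entropy generalizations are not reproduced here):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats; zero-probability entries contribute 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -float(np.sum(p[nz] * np.log(p[nz])))

def js_divergence(p, q):
    """Jensen-Shannon divergence H((p+q)/2) - (H(p) + H(q)) / 2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mid = shannon_entropy((p + q) / 2)
    return mid - (shannon_entropy(p) + shannon_entropy(q)) / 2

def js_kernel(p, q):
    """JS kernel k(p, q) = ln 2 - JS(p, q): since JS is symmetric,
    nonnegative, and at most ln 2, this gives a bounded similarity
    that is maximal when p = q."""
    return float(np.log(2.0) - js_divergence(p, q))
```

Identical distributions attain the maximal similarity ln 2, while distributions with disjoint support attain 0, which is the boundedness property that makes these kernels convenient drop-in similarities.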


Automatica | 2014

Newton-based stochastic optimization using q-Gaussian smoothed functional algorithms

Debarghya Ghoshdastidar; Ambedkar Dukkipati; Shalabh Bhatnagar

We present the first q-Gaussian smoothed functional (SF) estimator of the Hessian and the first Newton-based stochastic optimization algorithm that estimates both the Hessian and the gradient of the objective function using q-Gaussian perturbations. Our algorithm requires only two system simulations (regardless of the parameter dimension) and estimates both the gradient and the Hessian at each update epoch using these. We also present a proof of convergence of the proposed algorithm. In related recent work (Ghoshdastidar, Dukkipati, & Bhatnagar, 2014), we presented gradient SF algorithms based on q-Gaussian perturbations. Our work extends prior work on SF algorithms by generalizing the class of perturbation distributions, as most distributions reported in the literature for which SF algorithms are known to work turn out to be special cases of the q-Gaussian distribution. Besides studying the convergence properties of our algorithm analytically, we also show the results of numerical simulations on a model of a queuing network that illustrate the significance of the proposed method. In particular, we observe that our algorithm performs better in most cases, over a wide range of q-values, in comparison to Newton SF algorithms with Gaussian and Cauchy perturbations, as well as the gradient q-Gaussian SF algorithms.
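For intuition, a one-sided Gaussian-perturbation Hessian estimator (the q → 1 case, and not the paper's exact two-simulation q-Gaussian scheme) can be sketched as follows; the quadratic test function and constants are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

def sf_hessian(f, theta, beta=0.1, n_samples=20000):
    """Smoothed-functional Hessian estimate with Gaussian perturbations:
        H ~= E[(eta eta^T - I) (f(theta + beta*eta) - f(theta))] / beta^2,
    which is unbiased for quadratic f (the linear term averages out)."""
    d = len(theta)
    acc = np.zeros((d, d))
    I = np.eye(d)
    for _ in range(n_samples):
        eta = rng.standard_normal(d)
        diff = f(theta + beta * eta) - f(theta)
        acc += (np.outer(eta, eta) - I) * diff
    return acc / (n_samples * beta**2)

A = np.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda x: float(x @ A @ x)        # Hessian of f is 2A
H_est = sf_hessian(f, np.array([0.1, -0.1]))
```

The appeal of such schemes is that curvature information comes from the same function evaluations as the gradient, so a Newton-type update needs no extra simulations as the dimension grows.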


Neural Information Processing Systems | 2014

Consistency of Spectral Partitioning of Uniform Hypergraphs under Planted Partition Model

Debarghya Ghoshdastidar; Ambedkar Dukkipati


National Conference on Artificial Intelligence | 2015

Spectral clustering using multilinear SVD: analysis, approximations and applications

Debarghya Ghoshdastidar; Ambedkar Dukkipati


International Conference on Machine Learning | 2015

A Provable Generalized Tensor Spectral Method for Uniform Hypergraph Partitioning

Debarghya Ghoshdastidar; Ambedkar Dukkipati


International Conference on Artificial Intelligence and Statistics | 2017

Comparison-Based Nearest Neighbor Search

Siavash Haghiri; Debarghya Ghoshdastidar; Ulrike von Luxburg

Collaboration


Dive into Debarghya Ghoshdastidar's collaborations.

Top Co-Authors

Ambedkar Dukkipati, Indian Institute of Science
Shalabh Bhatnagar, Indian Institute of Science
Ajay P. Adsul, Indian Institute of Science
Aparna S. Vijayan, Indian Institute of Science
Gaurav Pandey, Icahn School of Medicine at Mount Sinai
Jinu Krishnan, Indian Institute of Science