Publications


Featured research published by Abolfazl Hashemi.


IEEE Global Conference on Signal and Information Processing | 2016

Sparse linear regression via generalized orthogonal least-squares

Abolfazl Hashemi; Haris Vikalo

The Orthogonal Least-Squares (OLS) algorithm sequentially selects columns of the coefficient matrix to greedily find an approximate sparse solution to an underdetermined system of linear equations. In this paper, we state conditions under which OLS recovers sparse signals from a small number of random linear measurements with probability arbitrarily close to one. Moreover, we propose a computationally efficient generalization of OLS that relies on a recursive relation between the components of the optimal solution to select L columns at each step and solve the resulting overdetermined system of equations. This generalized OLS algorithm is empirically shown to outperform existing greedy algorithms widely used in the literature.
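The greedy column-selection loop the abstract describes can be sketched as follows. This is an illustrative reimplementation, not the paper's code: the function name and defaults are ours, and it re-solves the least-squares problem from scratch at each step instead of using the recursive update the paper derives to keep costs low.

```python
import numpy as np

def generalized_ols(A, y, k, L=2, tol=1e-10):
    """Greedy sparse recovery sketch: at each step, rank unselected
    columns by the standard OLS criterion and take the L best of them
    (L=1 reduces to plain OLS). `k` caps the support size."""
    m, n = A.shape
    support = []
    r = y.copy()
    x_s = np.zeros(0)
    while len(support) < k and np.linalg.norm(r) > tol:
        scores = np.full(n, -np.inf)
        if support:
            As = A[:, support]
            P = As @ np.linalg.pinv(As)    # projector onto span of selection
        else:
            P = np.zeros((m, m))
        for j in range(n):
            if j in support:
                continue
            # OLS score: correlation with the residual after projecting
            # the candidate column out of the current selection's span.
            aj_perp = A[:, j] - P @ A[:, j]
            nrm = np.linalg.norm(aj_perp)
            if nrm > tol:
                scores[j] = abs(aj_perp @ y) / nrm
        best = np.argsort(scores)[::-1][:L]
        chosen = [int(j) for j in best if np.isfinite(scores[j])]
        if not chosen:
            break
        support.extend(chosen)
        # Re-solve the overdetermined LS problem on the enlarged support.
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ x_s
    x = np.zeros(n)
    x[support] = x_s
    return x, sorted(support)
```

With L = 1 each pass of the score loop is exactly one OLS iteration; larger L trades a few extra selected columns for fewer (expensive) projection rounds, which is the source of the speedup the abstract claims.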


International Conference on Acoustics, Speech, and Signal Processing | 2017

Recovery of sparse signals via Branch and Bound Least-Squares

Abolfazl Hashemi; Haris Vikalo

We present an algorithm, referred to as Branch and Bound Least-Squares (BBLS), for the recovery of sparse signals from a few linear combinations of their entries. Sparse signal reconstruction is readily cast as the problem of finding a sparse solution to an underdetermined system of linear equations. To solve it, BBLS employs an efficient search strategy of traversing a tree whose nodes represent the columns of the coefficient matrix, selecting a subset of those columns by relying on the Orthogonal Least-Squares (OLS) procedure. We state sufficient conditions under which, in noise-free settings, BBLS with high probability constructs a tree path that corresponds to the true support of the unknown sparse signal. Moreover, we empirically demonstrate that BBLS provides performance superior to that of existing algorithms in terms of accuracy, running time, or both. In scenarios where the columns of the coefficient matrix are highly correlated, BBLS is particularly beneficial and significantly outperforms existing methods.
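The tree-search idea can be illustrated with a much-simplified sketch: each node is a partial support, children extend it with the most promising columns, and branches whose least-squares residual already exceeds the best complete solution are pruned. The function name, the `branch` parameter, and the correlation-based child scoring are our simplifications; the paper's BBLS uses OLS-derived selection and bounds.

```python
import heapq
import numpy as np

def bbls_sketch(A, y, k, branch=3, tol=1e-10):
    """Best-first branch-and-bound over column subsets (illustrative).
    Pops the partial support with the smallest LS residual, expands it
    with its `branch` best candidate columns, and prunes any branch
    whose residual cannot beat the incumbent solution."""
    n = A.shape[1]
    best_res, best_supp = np.inf, None
    heap = [(float(np.linalg.norm(y)), ())]   # (residual, support tuple)
    while heap:
        res, supp = heapq.heappop(heap)
        if res >= best_res:
            continue                          # bound: cannot improve
        if len(supp) == k or res < tol:
            best_res, best_supp = res, supp   # complete (or exact) node
            continue
        # Residual of the current partial support.
        if supp:
            x_s, *_ = np.linalg.lstsq(A[:, list(supp)], y, rcond=None)
            r = y - A[:, list(supp)] @ x_s
        else:
            r = y
        cand = [j for j in range(n) if j not in supp]
        scores = np.abs(A[:, cand].T @ r)
        for idx in np.argsort(scores)[::-1][:branch]:
            j = cand[int(idx)]
            child = tuple(sorted(supp + (j,)))
            xs, *_ = np.linalg.lstsq(A[:, list(child)], y, rcond=None)
            child_res = float(np.linalg.norm(y - A[:, list(child)] @ xs))
            if child_res < best_res:
                heapq.heappush(heap, (child_res, child))
    x = np.zeros(n)
    if best_supp:
        xs, *_ = np.linalg.lstsq(A[:, list(best_supp)], y, rcond=None)
        x[list(best_supp)] = xs
    return x, list(best_supp or ())
```

Unlike a purely greedy method, a bad early column choice is not fatal here: sibling branches stay in the queue and are revisited whenever their residual bound is still competitive, which is why this style of search helps when columns are highly correlated.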


Digital Signal Processing | 2018

Accelerated orthogonal least-squares for large-scale sparse reconstruction

Abolfazl Hashemi; Haris Vikalo

We study the problem of inferring a sparse vector from random linear combinations of its components. We propose the Accelerated Orthogonal Least-Squares (AOLS) algorithm that improves the performance of the well-known Orthogonal Least-Squares (OLS) algorithm while requiring significantly lower computational costs. While OLS greedily selects columns of the coefficient matrix that correspond to non-zero components of the sparse vector, AOLS employs a novel computationally efficient procedure that speeds up the search by anticipating future selections, choosing L columns in each step, where L is an adjustable hyper-parameter. We analyze the performance of AOLS and establish lower bounds on the probability of exact recovery for both noiseless and noisy random linear measurements. In the noiseless scenario, it is shown that when the coefficients are samples from a Gaussian distribution, AOLS with high probability recovers a k-sparse m-dimensional vector using O(k log(m/(k+L−1))) measurements. A similar result is established for the bounded-noise scenario, where an additional condition on the smallest nonzero element of the unknown vector is required. The asymptotic sampling complexity of AOLS is lower than that of existing sparse reconstruction algorithms. In simulations, AOLS is compared to state-of-the-art sparse recovery techniques and shown to provide better performance in terms of accuracy, running time, or both. Finally, we consider an application of AOLS to clustering high-dimensional data lying on a union of low-dimensional subspaces and demonstrate its superiority over existing methods.
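A quick back-of-the-envelope comparison shows why the O(k log(m/(k+L−1))) bound improves on the classical O(k log(m/k)) requirement as L grows. This is illustrative arithmetic only: the helper name is ours, we use the natural log, and all constants hidden by the O(·) notation are dropped.

```python
import math

def aols_bound(k, m, L):
    """k * log(m / (k + L - 1)), the AOLS measurement bound up to
    constants; L = 1 recovers the classical k * log(m / k)."""
    return k * math.log(m / (k + L - 1))

k, m = 10, 10_000
for L in (1, 2, 5, 10):
    # The bound shrinks monotonically in L (roughly 69 down to 63
    # for these k and m), since the log's argument m/(k+L-1) shrinks.
    print(L, aols_bound(k, m, L))
```

The gain per unit of L is modest (logarithmic), which matches the abstract's framing: the main win of choosing L columns per step is computational, with the sampling complexity improving as a side effect.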


Advances in Computing and Communications | 2018

A Randomized Greedy Algorithm for Near-Optimal Sensor Scheduling in Large-Scale Sensor Networks

Abolfazl Hashemi; Mahsa Ghasemi; Haris Vikalo; Ufuk Topcu


arXiv: Machine Learning | 2016

Sampling Requirements and Accelerated Schemes for Sparse Linear Regression with Orthogonal Least-Squares

Abolfazl Hashemi; Haris Vikalo


International Conference on Acoustics, Speech, and Signal Processing | 2018

Sampling and Reconstruction of Graph Signals via Weak Submodularity and Semidefinite Relaxation

Abolfazl Hashemi; Rasoul Shafipour; Haris Vikalo; Gonzalo Mateos


arXiv: eess.SP | 2018

A Novel Scheme for Support Identification and Iterative Sampling of Bandlimited Graph Signals

Abolfazl Hashemi; Rasoul Shafipour; Haris Vikalo; Gonzalo Mateos


arXiv: eess.SP | 2018

Efficient Sampling of Bandlimited Graph Signals

Abolfazl Hashemi; Rasoul Shafipour; Haris Vikalo; Gonzalo Mateos


IEEE Journal of Selected Topics in Signal Processing | 2018

Evolutionary Self-Expressive Models for Subspace Clustering

Abolfazl Hashemi; Haris Vikalo


International Conference on Bioinformatics | 2017

Sparse Tensor Decomposition for Haplotype Assembly of Diploids and Polyploids

Abolfazl Hashemi; Banghua Zhu; Haris Vikalo

Collaboration


Dive into Abolfazl Hashemi's collaborations.

Top Co-Authors

Haris Vikalo

University of Texas at Austin

Mahsa Ghasemi

University of Texas at Austin

Ufuk Topcu

University of Texas at Austin