
Publications


Featured research published by Anna Choromanska.


Frontiers in Neural Circuits | 2012

Automatic Reconstruction of Neural Morphologies with Multi-Scale Tracking

Anna Choromanska; Shih-Fu Chang; Rafael Yuste

Neurons have complex axonal and dendritic morphologies that are the structural building blocks of neural circuits. The traditional method to capture these morphological structures using manual reconstructions is time-consuming and partly subjective, so it appears important to develop automatic or semi-automatic methods to reconstruct neurons. Here we introduce a fast algorithm for tracking neural morphologies in 3D with simultaneous detection of branching processes. The method is based on existing tracking procedures, adding the machine vision technique of multi-scaling. Starting from a seed point, our algorithm tracks axonal or dendritic arbors within a sphere of a variable radius, then moves the sphere center to the point on its surface with the shortest Dijkstra path, detects branching points on the surface of the sphere, scales it until branches are well separated, and then continues tracking each branch. We evaluate the performance of our algorithm on preprocessed data stacks obtained by manual reconstructions of neural cells, corrupted with different levels of artificial noise, and on unprocessed data sets, achieving 90% precision and 81% recall in branch detection. We also discuss limitations of our method, such as reconstructing highly overlapping neural processes, and suggest possible improvements. Multi-scaling techniques, well suited to detect branching structures, appear to be a promising strategy for automatic neuronal reconstruction.
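The scale-adaptive branch-detection idea can be illustrated with a toy 2D sketch (the paper operates on 3D stacks and uses Dijkstra-path tracking; the function names, the angular-gap clustering, and all thresholds here are illustrative assumptions, not the paper's implementation): foreground pixels crossing the sphere surface are clustered, and the radius grows until the parent neurite and its daughter branches appear as separate crossings.

```python
import numpy as np

def ring_crossings(stack, center, radius, tol=0.7):
    """Count foreground pixels on the circle of the given radius,
    grouped by angular gaps; each group is one neurite crossing."""
    ys, xs = np.nonzero(stack)
    d = np.hypot(ys - center[0], xs - center[1])
    on_ring = np.abs(d - radius) < tol
    if not on_ring.any():
        return 0
    ang = np.sort(np.arctan2(ys[on_ring] - center[0], xs[on_ring] - center[1]))
    gaps = np.diff(np.concatenate([ang, [ang[0] + 2 * np.pi]]))
    return max(1, int((gaps > 0.5).sum()))

def detect_branches(stack, seed, r0=2.0, r_max=12.0):
    """Grow the sphere until the branches crossing its surface separate."""
    r = r0
    while r <= r_max:
        n = ring_crossings(stack, seed, r)
        if n >= 3:  # parent neurite plus two daughter branches
            return r, n
        r += 1.0
    return r_max, ring_crossings(stack, seed, r_max)

# toy Y-shaped neurite: a vertical trunk splitting into two diagonals
img = np.zeros((40, 40), dtype=bool)
img[0:20, 20] = True                  # trunk
for i in range(20):
    img[20 + i, 20 - i] = True        # left daughter branch
    img[20 + i, 20 + i] = True        # right daughter branch
radius, n = detect_branches(img, seed=(20, 20))
```

Seeding at the bifurcation point, the smallest tested radius already shows three well-separated crossings, so the sketch reports a branch point there.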


Algorithmic Learning Theory | 2013

Fast spectral clustering via the Nyström method

Anna Choromanska; Tony Jebara; Hyungtae Kim; Mahesh Mohan; Claire Monteleoni

We propose and analyze a fast spectral clustering algorithm with computational complexity linear in the number of data points that is directly applicable to large-scale datasets. The algorithm combines two powerful techniques in machine learning: spectral clustering algorithms and Nyström methods commonly used to obtain good-quality low-rank approximations of large matrices. The proposed algorithm applies the Nyström approximation to the graph Laplacian to perform clustering. We provide theoretical analysis of the performance of the algorithm, show the error bound it achieves, and discuss the conditions under which the algorithm's performance is comparable to spectral clustering with the original graph Laplacian. We also present empirical results.
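A minimal numpy sketch of the generic Nyström idea behind this approach (the paper applies the approximation to the graph Laplacian and proves error bounds; this sketch only shows the landmark-based eigenvector extension on a kernel matrix, and the data, landmark count, and median-split clustering step are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# toy data: two well-separated Gaussian blobs, n = 200 points
X = np.vstack([rng.normal(0.0, 0.3, (100, 2)),
               rng.normal(4.0, 0.3, (100, 2))])
n, m, sigma = len(X), 20, 1.0

def rbf(A, B):
    """RBF affinities between two point sets."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# Nystrom: compute only n*m kernel entries instead of all n*n
idx = rng.choice(n, size=m, replace=False)   # landmark sample
C = rbf(X, X[idx])                           # n x m cross-affinities
W = C[idx]                                   # m x m landmark block

# eigendecompose the small landmark block, then extend to all points
eigvals, U = np.linalg.eigh(W)                   # ascending order
U_ext = C @ U[:, [-1, -2]] / eigvals[[-1, -2]]   # top-2 Nystrom eigenvectors

# with two blobs, a median split on the 2nd spectral coordinate clusters them
labels = (U_ext[:, 1] > np.median(U_ext[:, 1])).astype(int)
```

The cost is dominated by the n-by-m kernel evaluation and an m-by-m eigendecomposition, i.e. linear in n for fixed m, which is the source of the claimed scalability.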


Algorithmic Learning Theory | 2013

Differentially-Private Learning of Low Dimensional Manifolds

Anna Choromanska; Krzysztof Choromanski; Geetha Jagannathan; Claire Monteleoni

In this paper, we study the problem of differentially-private learning of low dimensional manifolds embedded in high dimensional spaces. The problems one faces in learning in high dimensional spaces are compounded in differentially-private learning. We achieve the dual goals of learning the manifold while maintaining the privacy of the dataset by constructing a differentially-private data structure that adapts to the doubling dimension of the dataset. Our differentially-private manifold learning algorithm extends random projection trees of Dasgupta and Freund. A naive construction of differentially-private random projection trees could involve queries with high global sensitivity that would affect the usefulness of the trees. Instead, we present an alternate way of constructing differentially-private random projection trees that uses low sensitivity queries that are precise enough for learning the low dimensional manifolds. We prove that the size of the tree depends only on the doubling dimension of the dataset and not its extrinsic dimension.
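The role of low-sensitivity queries can be sketched with a toy differentially-private random projection tree (a loose illustration, not the paper's construction: the noisy-median split below is a simplistic stand-in for the low-sensitivity queries the paper designs, the privacy budget is not accounted across queries, and all names and parameters are assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

EPS = 1.0  # per-query privacy budget (illustrative; no composition accounting)

def noisy_count(n_true):
    """Laplace mechanism: a counting query has global sensitivity 1,
    so adding Lap(1/eps) noise gives eps-differential privacy."""
    return n_true + rng.laplace(scale=1.0 / EPS)

def dp_rp_tree(X, depth=0, max_depth=4, min_leaf=10):
    """Toy DP random projection tree: the stopping rule consults only a
    noisy count, never the raw count."""
    if depth >= max_depth or noisy_count(len(X)) < min_leaf:
        return {"leaf": True}
    u = rng.normal(size=X.shape[1])
    u /= np.linalg.norm(u)            # random split direction
    proj = X @ u
    t = np.median(proj) + rng.laplace(scale=1.0 / EPS)  # crude noisy split
    left, right = X[proj <= t], X[proj > t]
    if len(left) == 0 or len(right) == 0:
        return {"leaf": True}
    return {"leaf": False, "dir": u, "thresh": t,
            "children": (dp_rp_tree(left, depth + 1, max_depth, min_leaf),
                         dp_rp_tree(right, depth + 1, max_depth, min_leaf))}

def count_leaves(node):
    if node["leaf"]:
        return 1
    return sum(count_leaves(c) for c in node["children"])

# points near a 1-D manifold (a line) embedded in 20 dimensions
t = rng.uniform(-1.0, 1.0, (300, 1))
X = np.hstack([t, 0.01 * rng.normal(size=(300, 19))])
tree = dp_rp_tree(X)
```

Because the splits adapt to where the data actually lie, the resulting tree depth reflects the intrinsic (doubling) dimension of the point set rather than the ambient dimension, which is the property the paper proves for its construction.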


Intelligent Robots and Systems | 2017

Sensor modality fusion with CNNs for UGV autonomous driving in indoor environments

Naman Patel; Anna Choromanska; P. Krishnamurthy; Farshad Khorrami

We present a novel end-to-end learning framework to enable ground vehicles to autonomously navigate unknown environments by fusing raw pixels from cameras and depth measurements from a LiDAR. A deep neural network architecture is introduced to effectively perform modality fusion and reliably predict steering commands even in the presence of sensor failures. The proposed network is trained on our own dataset, collected from a LiDAR and a camera mounted on a UGV in an indoor corridor environment. Comprehensive experimental evaluation is performed to show that the proposed deep neural network is able to autonomously navigate in the corridor environment and to demonstrate the robustness of our network architecture. Furthermore, we demonstrate that the fusion of the camera and LiDAR modalities provides further benefits beyond robustness to sensor failures. Specifically, the multimodal fused system shows a potential to navigate around static and dynamic obstacles and to handle changes in environment geometry without being trained for these tasks.
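The failure-robust fusion idea can be sketched in a few lines of numpy (the real system fuses CNN feature maps and is trained end-to-end; here random weights stand in for trained encoders, and the dimensions, function names, and zeroing-out of a failed modality are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical feature sizes for the two modalities
CAM_DIM, LIDAR_DIM, HID = 64, 32, 16

# random weights as stand-ins for trained per-modality encoders and the head
W_cam = rng.normal(0.0, 0.1, (CAM_DIM, HID))
W_lid = rng.normal(0.0, 0.1, (LIDAR_DIM, HID))
W_out = rng.normal(0.0, 0.1, 2 * HID)

def steering(cam_feat, lidar_feat, cam_ok=True, lidar_ok=True):
    """Mid-level fusion: encode each modality separately, zero out a
    failed sensor's embedding, concatenate, and regress a steering value."""
    h_cam = np.tanh(cam_feat @ W_cam) if cam_ok else np.zeros(HID)
    h_lid = np.tanh(lidar_feat @ W_lid) if lidar_ok else np.zeros(HID)
    return float(np.concatenate([h_cam, h_lid]) @ W_out)

cam = rng.normal(size=CAM_DIM)
lid = rng.normal(size=LIDAR_DIM)
full = steering(cam, lid)
degraded = steering(cam, lid, lidar_ok=False)  # LiDAR failure still yields a command
```

Because each modality has its own embedding before concatenation, the head can still produce a steering command when one embedding is zeroed out, which mimics how dropout-style modality masking during training yields robustness at test time.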


Theoretical Computer Science | 2016

Differentially-private learning of low dimensional manifolds

Anna Choromanska; Krzysztof Choromanski; Geetha Jagannathan; Claire Monteleoni

In this paper, we study the problem of differentially-private learning of low dimensional manifolds embedded in high dimensional spaces. The problems one faces in learning in high dimensional spaces are compounded in differentially-private learning. We achieve the dual goals of learning the manifold while maintaining the privacy of the dataset by constructing a differentially-private data structure that adapts to the doubling dimension of the dataset. Our differentially-private manifold learning algorithm extends random projection trees of Dasgupta and Freund. A naive construction of differentially-private random projection trees could involve queries with high global sensitivity that would affect the usefulness of the trees. Instead, we present an alternate way of constructing differentially-private random projection trees that uses low sensitivity queries that are precise enough for learning the low dimensional manifolds. We prove that the size of the tree depends only on the doubling dimension of the dataset and not its extrinsic dimension.


Archive | 2014

Selected machine learning reductions

Anna Choromanska


International Conference on Artificial Intelligence and Statistics | 2015

The Loss Surfaces of Multilayer Networks

Anna Choromanska; Mikael Henaff; Michael Mathieu; Gérard Ben Arous; Yann LeCun


Neural Information Processing Systems | 2015

Deep learning with elastic averaging SGD

Sixin Zhang; Anna Choromanska; Yann LeCun


International Conference on Learning Representations | 2017

Entropy-SGD: Biasing Gradient Descent Into Wide Valleys

Pratik Chaudhari; Anna Choromanska; Stefano Soatto; Yann LeCun; Carlo Baldassi; Christian Borgs; Jennifer T. Chayes; Levent Sagun; Riccardo Zecchina


Archive | 2014

The Loss Surfaces of Multilayer Networks

Anna Choromanska; Mikael Henaff; Michael Mathieu; Gérard Ben Arous; Yann LeCun

Collaboration


Dive into Anna Choromanska's collaboration.

Top Co-Authors

Claire Monteleoni

George Washington University
