Ngoc Mai Tran
University of Texas at Austin
Publications
Featured research published by Ngoc Mai Tran.
Bulletin of The London Mathematical Society | 2013
Bernd Sturmfels; Ngoc Mai Tran
The map which takes a square matrix to its tropical eigenvalue-eigenvector pair is piecewise linear. We determine the cones of linearity of this map. They are simplicial but they do not form a fan. Motivated by statistical ranking, we also study the restriction of that cone decomposition to the subspace of skew-symmetric matrices.
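To make the tropical eigenvalue concrete: in the min-plus convention, the tropical eigenvalue of a square matrix is the minimum cycle mean of its weighted digraph, computable by Karp's dynamic-programming formula. The sketch below is illustrative only (the function name and pure-Python setup are assumptions, not taken from the paper), and it assumes all entries are finite so the digraph is strongly connected:

```python
def tropical_eigenvalue(A):
    """Min-plus tropical eigenvalue of a square matrix A: the minimum
    cycle mean of the complete weighted digraph with arc weights A[u][v],
    computed via Karp's dynamic-programming formula.
    Assumes all entries of A are finite (strong connectivity)."""
    n = len(A)
    INF = float("inf")
    # D[k][v] = minimum weight of a walk of length exactly k from node 0 to v
    D = [[INF] * n for _ in range(n + 1)]
    D[0][0] = 0.0
    for k in range(1, n + 1):
        for v in range(n):
            D[k][v] = min(D[k - 1][u] + A[u][v] for u in range(n))
    # Karp: eigenvalue = min over v of max over k of (D[n][v] - D[k][v]) / (n - k)
    best = INF
    for v in range(n):
        if D[n][v] == INF:
            continue
        worst = max((D[n][v] - D[k][v]) / (n - k)
                    for k in range(n) if D[k][v] < INF)
        best = min(best, worst)
    return best
```

For example, for A = [[3, 2], [1, 4]] the cycles have means 3, 4, and (2+1)/2 = 1.5, so the tropical eigenvalue is 1.5. An eigenvector can then be read off from columns of the Kleene star of the matrix with the eigenvalue subtracted from every entry.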
Bernoulli | 2015
Jim Pitman; Ngoc Mai Tran
This paper focuses on the size-biased permutation of n independent and identically distributed (i.i.d.) positive random variables. This is a finite-dimensional analogue of the size-biased permutation of ranked jumps of a subordinator studied in Perman-Pitman-Yor (PPY) [Probab. Theory Related Fields 92 (1992) 21-39], as well as a special form of induced order statistics [Bull. Inst. Internat. Statist. 45 (1973) 295-300; Ann. Statist. 2 (1974) 1034-1039]. This intersection grants us different tools for deriving distributional properties. Their comparisons lead to new results, as well as simpler proofs of existing ones. Our main contribution, Theorem 25 in Section 6, describes the asymptotic distribution of the last few terms in a finite i.i.d. size-biased permutation via a Poisson coupling with its few smallest order statistics.

Journal of Mathematical Neuroscience | 2018
Christopher J. Hillar; Ngoc Mai Tran
The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch–Pitts binary neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store reoccurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size. Here, we discover such networks by minimizing probability flow, a recently proposed objective for estimating parameters in discrete maximum entropy models. By descending the gradient of the convex probability flow, our networks adapt synaptic weights to achieve robust exponential storage, even when presented with vanishingly small numbers of training patterns. In addition to providing a new set of low-density error-correcting codes that achieve Shannon’s noisy channel bound, these networks also efficiently solve a variant of the hidden clique problem in computer science, opening new avenues for real-world applications of computational models originating from biology.

Discrete and Computational Geometry | 2014
Ngoc Mai Tran
The map which takes a square matrix A to its polytrope is piecewise linear. We show that the cones of linearity of this map form a polytopal fan partition of \mathbb{R}^{n \times n}, whose face lattice is anti-isomorphic to the lattice of complete sets of connected relations. This fan refines the non-fan partition of \mathbb{R}^{n \times n} corresponding to cones of linearity of the eigenvector map. Our results answer open questions in a previous work with Sturmfels (Bull. Lond. Math. Soc. 54:27–36, 2013) and lead to a new combinatorial classification of polytropes and tropical eigenspaces.

Mathematics of Operations Research | 2016
Ngoc Mai Tran

Linear Algebra and its Applications | 2013
Ngoc Mai Tran

arXiv: Computer Science and Game Theory | 2015
Ngoc Mai Tran; Josephine Yu

Journal of Multivariate Analysis | 2018
Ngoc Mai Tran; Petra Burdejova; Maria Osipenko; Wolfgang Karl Härdle

arXiv: Neurons and Cognition | 2012
Christopher J. Hillar; Ngoc Mai Tran; Kilian Koepsell

arXiv: Computer Science and Game Theory | 2016
Robert Alexander Crowell; Ngoc Mai Tran
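The polytrope construction described in the Discrete and Computational Geometry abstract above can be made concrete: in the min-plus convention, the polytrope of A is the tropical convex hull of the columns of the Kleene star A* = I ⊕ A ⊕ A² ⊕ ⋯, which is well defined when A has no negative cycles and is computable by a Floyd-Warshall-style recurrence. A minimal sketch under those assumptions (the helper name is illustrative, not from the paper):

```python
def kleene_star(A):
    """Min-plus Kleene star A* = I (+) A (+) A^2 (+) ... of a square
    matrix A, via the Floyd-Warshall recurrence. Defined when A has no
    negative cycles; the columns of A* tropically span the polytrope of A."""
    n = len(A)
    S = [row[:] for row in A]
    for i in range(n):
        # Tropical identity contributes 0 (the min-plus unit) on the diagonal
        S[i][i] = min(S[i][i], 0.0)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                # Relax path i -> k -> j (tropical product is +, sum is min)
                S[i][j] = min(S[i][j], S[i][k] + S[k][j])
    return S
```

Each entry of the result is the min-plus shortest-path weight between the corresponding nodes, so the star can also be read as the all-pairs shortest-path matrix of the digraph of A.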