Daniel D. Lee
Alcatel-Lucent
Publications
Featured researches published by Daniel D. Lee.
Physical Review Letters | 2001
Jonathan E. Rubin; Daniel D. Lee; Haim Sompolinsky
A theory of temporally asymmetric Hebb rules, which depress or potentiate synapses depending upon whether the postsynaptic cell fires before or after the presynaptic one, is presented. Using the Fokker-Planck formalism, we show that the equilibrium synaptic distribution induced by such rules is highly sensitive to the manner in which bounds on the allowed range of synaptic values are imposed. In a biologically plausible multiplicative model, the synapses in asynchronous networks reach a distribution that is invariant to the firing rates of either the presynaptic or postsynaptic cells. When these cells are temporally correlated, the synaptic strength varies smoothly with the degree and phase of their synchrony.
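The multiplicative model can be illustrated with a minimal simulation. The sketch below is not the paper's Fokker-Planck analysis; it assumes hypothetical amplitudes (`A_plus`, `A_minus`) and a simplified rule in which potentiation scales with the remaining headroom and depression scales with the current weight, so the soft bounds are multiplicative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not from the paper): potentiation amplitude,
# multiplicative depression rate, and number of plasticity events.
A_plus, A_minus, n_loops = 0.005, 0.01, 2000

w = np.full(1000, 0.5)  # population of synapses, normalized to [0, 1]
for _ in range(n_loops):
    pre_before_post = rng.random(w.size) < 0.5  # random pairing order
    dw = np.where(pre_before_post,
                  A_plus * (1.0 - w),   # potentiation scaled by headroom
                  -A_minus * w)         # multiplicative depression
    w = np.clip(w + dw, 0.0, 1.0)

print(w.mean())  # ≈ A_plus / (A_plus + A_minus) = 1/3
```

With symmetric pairing probabilities the drift vanishes at w* = A_plus / (A_plus + A_minus), so the population settles into a unimodal distribution around that point rather than saturating at the bounds, consistent with the multiplicative picture described in the abstract.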
Physical Review Letters | 2004
Olivia L. White; Daniel D. Lee; Haim Sompolinsky
We study the ability of linear recurrent networks obeying discrete time dynamics to store long temporal sequences that are retrievable from the instantaneous state of the network. We calculate this temporal memory capacity for both distributed shift register and random orthogonal connectivity matrices. We show that the memory capacity of these networks scales with system size.
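The shift-register case is easy to verify directly. In the sketch below (an illustration, not the paper's capacity calculation), the connectivity is a subdiagonal shift matrix and the scalar input drives the first unit; the instantaneous network state then stores the last N inputs exactly.

```python
import numpy as np

N = 8
W = np.diag(np.ones(N - 1), k=-1)  # shift-register connectivity matrix
v = np.zeros(N)
v[0] = 1.0                         # input feeds the first unit

rng = np.random.default_rng(1)
s = rng.standard_normal(50)        # scalar input sequence

x = np.zeros(N)
for t in range(len(s)):
    x = W @ x + v * s[t]           # discrete-time linear dynamics

# The instantaneous state holds the last N inputs in order:
print(np.allclose(x, s[-1:-N-1:-1]))  # True
```

Here the memory capacity trivially equals the system size N; the interesting comparison in the paper is how close random orthogonal connectivity comes to this bound.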
Physical Review E | 2016
SueYeon Chung; Daniel D. Lee; Haim Sompolinsky
Objects are represented in sensory systems by continuous manifolds due to sensitivity of neuronal responses to changes in physical features such as location, orientation, and intensity. What makes certain sensory representations better suited for invariant decoding of objects by downstream networks? We present a theory that characterizes the ability of a linear readout network, the perceptron, to classify objects from variable neural responses. We show how the readout perceptron capacity depends on the dimensionality, size, and shape of the object manifolds in its input neural representation.
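The classification task can be sketched numerically. The toy setup below is an assumption-laden illustration, not the paper's capacity theory: each "manifold" is a small point cloud of M samples around a random center in N dimensions, all sharing one label, and a classic perceptron must place every sample of every manifold on the correct side.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: P object manifolds, each an M-point cloud of small
# radius around a random unit-norm center in N dimensions.
N, P, M, radius = 50, 20, 10, 0.05
centers = rng.standard_normal((P, N))
centers /= np.linalg.norm(centers, axis=1, keepdims=True)
X = (centers[:, None, :]
     + radius * rng.standard_normal((P, M, N))).reshape(-1, N)
y = np.repeat(rng.choice([-1.0, 1.0], size=P), M)  # one label per manifold

# Classic perceptron learning on every sampled point of every manifold.
w = np.zeros(N)
for _ in range(500):
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:
            w += yi * xi
            errors += 1
    if errors == 0:
        break

print(np.all(np.sign(X @ w) == y))  # True for small, low-dimensional clouds
```

Growing `radius`, `M`, or the intrinsic dimensionality of the clouds makes separation harder, which is the qualitative dependence of perceptron capacity on manifold size and shape that the theory makes quantitative.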
Neural Computation | 2018
SueYeon Chung; Uri Cohen; Haim Sompolinsky; Daniel D. Lee
We consider the problem of classifying data manifolds where each manifold represents invariances that are parameterized by continuous degrees of freedom. Conventional data augmentation methods rely on sampling large numbers of training examples from these manifolds. Instead, we propose an iterative algorithm, MCP, based on a cutting plane approach that efficiently solves a quadratic semi-infinite programming problem to find the maximum margin solution. We provide a proof of convergence as well as a polynomial bound on the number of iterations required for a desired tolerance in the objective function. The efficiency and performance of MCP are demonstrated in high-dimensional simulations and on image manifolds generated from the ImageNet data set. Our results indicate that MCP rapidly learns good classifiers and shows superior generalization performance compared with conventional maximum margin classifiers trained on augmented data.
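The cutting-plane idea can be sketched on a toy problem. The code below is a simplified illustration, not the paper's MCP solver: each manifold is assumed to be a 1-D line segment x(a) = c + a·d with a in [-1, 1], the worst-margin point of a segment is found analytically (for a linear segment it sits at an endpoint), and the working set is re-fit with plain perceptron updates instead of the quadratic semi-infinite program.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical problem: P labeled line-segment "manifolds" in N dimensions.
N, P = 30, 10
C = rng.standard_normal((P, N))         # segment centers
D = 0.1 * rng.standard_normal((P, N))   # segment directions (small extent)
y = rng.choice([-1.0, 1.0], size=P)

w = np.zeros(N)
pts, labels = [], []
for _ in range(100):
    # Worst point of segment i minimizes y_i * w.(c_i + a d_i) over a in
    # [-1, 1]; for a linear segment that is the endpoint a* = -sign(y w.d).
    a_star = -np.sign(y * (D @ w) + 1e-12)
    worst = C + a_star[:, None] * D
    if np.all(y * (worst @ w) > 0):
        break                           # every manifold correctly classified
    pts.extend(worst)
    labels.extend(y)
    # Re-fit on the working set of cutting-plane points (perceptron updates
    # stand in for the max-margin subproblem solved in MCP).
    Xa, ya = np.array(pts), np.array(labels)
    for _ in range(200):
        errors = 0
        for xi, yi in zip(Xa, ya):
            if yi * (w @ xi) <= 0:
                w += yi * xi
                errors += 1
        if errors == 0:
            break

a_final = -np.sign(y * (D @ w) + 1e-12)
print(np.all(y * ((C + a_final[:, None] * D) @ w) > 0))  # True
```

The key efficiency point mirrors the abstract: rather than densely sampling each manifold (data augmentation), the loop only ever trains on the finitely many worst-case points it discovers, and terminates once no manifold contains a violating point.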
Neural Information Processing Systems | 2000
Daniel D. Lee; H. Sebastian Seung
Science | 2000
H. Sebastian Seung; Daniel D. Lee
Neural Information Processing Systems | 1997
Nicholas D. Socci; Daniel D. Lee; H. Sebastian Seung
Neural Information Processing Systems | 1996
Daniel D. Lee; H. Sebastian Seung
Neural Information Processing Systems | 2000
Oren Shriki; Haim Sompolinsky; Daniel D. Lee
Archive | 1998
Daniel D. Lee; Hyunjune Sebastian Seung