Edward W. Kairiss
Yale University
Publications
Featured research published by Edward W. Kairiss.
Neural Models of Plasticity: Experimental and Theoretical Approaches | 1989
Thomas H. Brown; Alan H. Ganong; Edward W. Kairiss; Claude L. Keenan; Stephen R. Kelso
This chapter focuses on some recent developments that bear on the current and projected understanding of long-term potentiation (LTP) in two synaptic systems. LTP is a use-dependent synaptic strengthening that can be induced by seconds or less of the appropriate activity. The chapter explores some testable models of LTP and presents the notion that the underlying mechanisms and activity-modification relationships may not be identical in different regions of the hippocampus. It also illustrates a schematic representation of the trisynaptic circuit elements within a transverse hippocampal slice. The problem of understanding the mechanisms responsible for LTP can be divided into three parts: (1) the induction of LTP, (2) the expression of LTP, and (3) the coupling of the early induction events to the final expression of the synaptic enhancement. The induction of LTP refers to the initial sequence of events that triggers or sets into motion the process of synaptic modification. The expression of LTP refers to those neurophysiological and biophysical changes that represent the ultimate consequence of this modification process and constitute the proximal cause of the observed synaptic enhancement.
Brain Research | 1994
John M. Beggs; Edward W. Kairiss
The intrinsic membrane properties of perirhinal cortical neurons were studied by intracellular recording in rat brain slices in vitro. Gross morphology was also examined through injection of the fluorescent dye carboxyfluorescein. The cells encountered displayed a diversity of electrophysiological properties, and were similar to cells reported in other neocortical areas with regard to spiking patterns, afterpotentials, and morphology. However, very few (4%) intrinsically bursting neurons were encountered. Two pyramidal cells with thick apical dendrites were filled, and both fired doublets of action potentials for their first suprathreshold events. Of the filled pyramidal cells with thin apical dendrites, most (9/11) fired single action potentials for their first suprathreshold events. A variety of classification schemes were used to group the data, and several schemes were found to be equally successful. According to one of the schemes, cells recorded with carboxyfluorescein-filled electrodes had significantly greater action potential widths at half-amplitude and more depolarized resting potentials than cells recorded without this dye.
Archive | 1992
Edward W. Kairiss; Zachary F. Mainen; Brenda J. Claiborne; Thomas H. Brown
In his influential 1949 work, The Organization of Behavior, the Canadian psychologist Donald Hebb expressed the following postulate for synaptic modification: When an axon of cell A is near enough to excite cell B, and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased. The original idea underlying this proposal was that use-dependent synaptic modifications could form the substrate for cognitive learning and memory. More recently, it has been suggested that a similar process may also contribute to the self-organization of perceptual systems during development. Hebbian synaptic plasticity is interesting for at least two reasons. First, the idea has inspired many provocative theories and computational models of learning and self-organization (Linsker, 1990). Second, a process similar to what Hebb predicted has been shown to exist in the mammalian central nervous system (Kelso et al., 1986).
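Hebb's postulate is often formalized as a correlational weight-update rule. The following is a minimal illustrative sketch of that formalization (the function name and learning rate are assumptions for illustration, not taken from the chapter):

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.01):
    """Increase each weight in proportion to the product of
    presynaptic and postsynaptic activity (Hebb's rule)."""
    return w + lr * np.outer(post, pre)

pre = np.array([1.0, 0.0, 1.0])   # activity of presynaptic cells ("A")
post = np.array([0.0, 1.0])       # activity of postsynaptic cells ("B")
w = np.zeros((2, 3))              # synaptic weights, post x pre
w = hebbian_update(w, pre, post)  # only co-active pairs are strengthened
```

Note that this caricature captures only the strengthening clause of the postulate; it says nothing about the growth processes or metabolic changes Hebb proposed as its substrate.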
Biological Cybernetics | 1998
Edward W. Kairiss; Willard L. Miranker
Biological memories have a number of unique features, including (1) hierarchical, reciprocally interacting layers, (2) lateral inhibitory interactions within layers, and (3) Hebbian synaptic modifications. We incorporate these key features into a mathematical and computational model in which we derive and study Hebbian learning dynamics and recall dynamics. Introducing the construct of a feasible memory (a memory that formally responds correctly to a specified collection of noisy cues that are known in advance), we study stability and convergence of the two kinds of dynamics by both analytical and computational methods. A conservation law for memory feasibility under Hebbian dynamics is derived. An infomax net is one where the synaptic weights resolve the most uncertainty about a neural input based on knowledge of the output. The infomax notion is described and is used to grade memories and memory performance. We characterize the recall dynamics of the most favorable solutions from an infomax perspective. This characterization includes the dynamical behavior when the net is presented with external stimuli (noisy cues) and a description of the accuracy of recall. The observed richness of dynamical behavior, such as its initial state sensitivity, provides some hints for possible biological parallels to this model.
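The interplay of Hebbian learning dynamics and recall dynamics from noisy cues can be illustrated with a toy single-layer associative net. This is a Hopfield-style caricature for intuition only, not the hierarchical, laterally inhibited model analyzed in the paper:

```python
import numpy as np

def store(patterns):
    """Build a Hebbian weight matrix from +/-1 patterns
    (sum of outer products, no self-connections)."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, cue, steps=10):
    """Iterate the recall dynamics from a (possibly noisy) cue."""
    s = cue.copy().astype(float)
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1.0   # break ties deterministically
    return s

p = np.array([1, -1, 1, -1, 1, -1])
W = store([p])
noisy = p.copy()
noisy[0] = -noisy[0]          # corrupt one element of the cue
restored = recall(W, noisy)   # dynamics converge back to p
```

In this toy setting the stored pattern is "feasible" in a loose sense: the recall dynamics respond correctly to the one-bit-corrupted cue, recovering the original pattern.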
CNS '96: Proceedings of the Annual Conference on Computational Neuroscience: Trends in Research, 1997 | 1997
Sean D. Murphy; Edward W. Kairiss
Common features of activity are seen in biological neural networks in different species and different neural structures, such as partially synchronized oscillations. More detailed distinct categories (or modes) of biological neural network activity may be identifiable by combining traditional techniques in analyzing dynamical systems with additional techniques that are particular to parameterizing biological neural networks. Once the basic dynamical modes present in biological networks are identified, physiological data on network and multi-network scales may be more easily classified and understood in relation to behavior. Some dynamical modes might be better suited to implementing spatial filters, such as might be expected in cortical area V1 of the visual system. Other dynamical modes might be suited to replaying or identifying spatio-temporal signals, such as in the auditory cortex. Other modes might be useful for storing and retrieving associative mnemonic representations, such as those that might be found in the hippocampus.
SIGUCCS: User Services Conference | 2002
Philip E. Long; Jonathan Lizee; Ann G. Green; Edward W. Kairiss; Charles Powell
This paper presents a design philosophy for the targeted and cost-effective delivery of information technology services that balances innovation and infrastructure needs within an academic institution. Drawing on case studies from Yale University, the model describing this philosophy is presented as generalizable, accommodating academic populations of different sizes, cultures, and budgets. A key element is the ability to readily adapt to new technology through appropriate innovation while maintaining core commitments to production services, both of which are necessary for the effective infusion of IT in an academic environment.
Archive | 1995
Sean D. Murphy; Edward W. Kairiss
Understanding how the mammalian cortex processes information is one of the central goals of neuroscience. Progress in this area depends not only on modeling biophysical processes of neural substrates, but also on modeling at higher levels of abstraction, where neural elements are implemented as computing units with mathematically formalized input/output functions. Currently popular paradigms (e.g., Hopfield nets, backprop nets) utilize computing units that ignore the spatio-temporal dynamical properties of biological neurons. Our modeling approach at the network level (which will not be discussed here in detail) is to consider cortical networks to have a basic set of features likely to influence information processing. These features include neurons with dynamical properties at different time scales, sparse connectivity, intrinsic inhibition, non-linear dendritic input summation, and axonal propagation delay.
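Two of the listed features, intrinsic dynamics and axonal propagation delay, can be sketched with a leaky integrate-and-fire unit whose spikes pass through a delay line. All class and parameter names here are illustrative assumptions, not the authors' model:

```python
from collections import deque

class DelayedLIF:
    """A leaky integrate-and-fire unit whose output spikes
    are delayed by a fixed number of time steps (axonal delay)."""

    def __init__(self, tau=10.0, threshold=1.0, delay_steps=3):
        self.v = 0.0                              # membrane potential
        self.tau = tau                            # leak time constant
        self.threshold = threshold
        self.buffer = deque([0] * delay_steps)    # axonal delay line

    def step(self, input_current, dt=1.0):
        # Leaky integration of the input current
        self.v += dt * (-self.v / self.tau + input_current)
        spike = 1 if self.v >= self.threshold else 0
        if spike:
            self.v = 0.0                          # reset after a spike
        self.buffer.append(spike)
        return self.buffer.popleft()              # emerges after the delay

neuron = DelayedLIF()
outputs = [neuron.step(0.5) for _ in range(10)]
# The first threshold crossing occurs on the third step, but the
# spike appears at the output three steps later.
```

Unlike the static units of a Hopfield or backprop net, this unit's response to a constant input unfolds over time, which is precisely the kind of spatio-temporal behavior the passage argues should not be abstracted away.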
Annual Review of Neuroscience | 1990
Thomas H. Brown; Edward W. Kairiss; Claude L. Keenan
Science | 1988
Thomas H. Brown; Paul F. Chapman; Edward W. Kairiss; Claude L. Keenan
Synapse | 1990
Paul F. Chapman; Edward W. Kairiss; Claude L. Keenan; Thomas H. Brown