Steven Hampson
University of California, Irvine
Publications
Featured research published by Steven Hampson.
Bioinformatics | 2002
Pierre-François Baisnée; Steven Hampson; Pierre Baldi
MOTIVATION: Over sufficiently long windows, complementary strands of DNA tend to have the same base composition. A few reports have indicated that this first-order parity rule extends at higher orders to oligonucleotide composition, at least in some organisms or taxa. However, the scientific literature falls short of providing a comprehensive study of reverse-complement symmetry at multiple orders and across the kingdom of life. It also lacks a characterization of this symmetry and a convincing explanation or clarification of its origin. RESULTS: We develop methods to measure and characterize symmetry at multiple orders, and analyze a wide set of genomes, encompassing single- and double-stranded RNA and DNA viruses, bacteria, archaea, mitochondria, and eukaryota. We quantify symmetry at orders 1 to 9 for contiguous sequences and pools of coding and non-coding upstream regions, compare the observed symmetry levels to those predicted by simple statistical models, and factor out the effect of lower-order distributions. We establish the universality and variability range of first-order strand symmetry, as well as of its higher-order extensions, and demonstrate the existence of genuine high-order symmetric constraints. We show that ubiquitous reverse-complement symmetry does not result from a single cause, such as point mutation or recombination, but rather emerges from the combined effects of a wide spectrum of mechanisms operating at multiple orders and length scales.
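The core comparison behind order-k strand symmetry can be sketched as follows. This is a minimal illustrative metric (k-mer counts compared against their reverse complements), not the paper's exact statistic:

```python
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(s):
    """Reverse complement of a DNA string."""
    return s.translate(COMP)[::-1]

def kmer_counts(seq, k):
    """Count all overlapping k-mers in seq."""
    counts = {}
    for i in range(len(seq) - k + 1):
        w = seq[i:i + k]
        counts[w] = counts.get(w, 0) + 1
    return counts

def symmetry_score(seq, k):
    """1.0 = perfect order-k reverse-complement symmetry, 0.0 = none.
    Compares each k-mer's count with its reverse complement's count."""
    c = kmer_counts(seq, k)
    total = sum(c.values())
    words = set(c) | {revcomp(w) for w in c}
    diff = sum(abs(c.get(w, 0) - c.get(revcomp(w), 0)) for w in words)
    return 1.0 - diff / (2 * total)
```

At k=1 this reduces to comparing A with T and C with G counts; raising k probes the higher-order extensions the abstract describes.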
Biological Cybernetics | 1986
Steven Hampson; D. J. Volper
Three different representations for a thresholded linear equation are developed. For binary input they are shown to be representationally equivalent though their training characteristics differ. A training algorithm for linear equations is discussed. The similarities between its simplest mathematical representation (perceptron training), a formal model of animal learning (Rescorla-Wagner learning), and one mechanism of neural learning (Aplysia gill withdrawal) are pointed out. For d input features, perceptron training is shown to have a lower bound of 2^d and an upper bound of d^d adjustments. It is possible that the true upper bound is 4^d, though this has not been proved. Average performance is shown to have a lower bound of 1.4^d. Learning time is shown to increase linearly with the number of irrelevant or replicated features. The (X of N) function (a subset of linearly separable functions containing OR and AND) is shown to be learnable in d^3 time. A method of utilizing conditional probability to accelerate learning is proposed. This reduces the observed growth rate from 4^d to the theoretical minimum (for the unmodified version) of 2^d. A different version reduces the growth rate to about 1.7^d. The linear effect of irrelevant features can also be eliminated. Whether such an approach can be made provably convergent is not known.
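The perceptron training rule discussed above, applied to an (X of N) function, can be sketched as follows (the update rule is the standard perceptron procedure; the helper names are illustrative):

```python
from itertools import product

def perceptron_train(examples, d, max_epochs=1000):
    """Classic perceptron training on {0,1}^d inputs.
    Returns (weights, threshold) once every example is classified, else None."""
    w = [0.0] * d
    theta = 0.0
    for _ in range(max_epochs):
        mistakes = 0
        for x, y in examples:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > theta else 0
            if pred != y:
                mistakes += 1
                delta = y - pred  # +1 on a miss, -1 on a false alarm
                for i in range(d):
                    w[i] += delta * x[i]
                theta -= delta    # move the threshold the opposite way
        if mistakes == 0:
            return w, theta
    return None

# An (X of N) function: fire iff at least 2 of the 3 inputs are on (linearly separable).
examples = [(x, 1 if sum(x) >= 2 else 0) for x in product((0, 1), repeat=3)]
model = perceptron_train(examples, 3)
```

Because (X of N) is linearly separable, the perceptron convergence theorem guarantees this terminates with a correct separating surface.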
Biological Cybernetics | 1983
Steven Hampson; Dennis F. Kibler
A multi-layered neural assembly is developed which has the capability of learning arbitrary Boolean functions. Though the model neuron is more powerful than those previously considered, assemblies of neurons are needed to detect non-linearly separable patterns. Algorithms for learning at the neuron and assembly level are described. The model permits multiple output systems to share a common memory. Learned evaluation allows sequences of actions to be organized. Computer simulations demonstrate the capabilities of the model.
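As a concrete illustration of why an assembly of neurons is needed for non-linearly separable patterns, XOR can be computed by two threshold units feeding a third. The weights here are hand-set for clarity, not learned by the paper's algorithms:

```python
def tlu(weights, theta, x):
    """Threshold logic unit: fires (1) when the weighted input sum exceeds theta."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) > theta else 0

def xor_assembly(x1, x2):
    """XOR via a two-layer assembly: no single TLU can compute it."""
    h1 = tlu([1, 1], 0.5, (x1, x2))   # first unit: OR
    h2 = tlu([1, 1], 1.5, (x1, x2))   # second unit: AND
    return tlu([1, -1], 0.5, (h1, h2))  # output: OR and not AND
```

Each hidden unit handles a linearly separable subproblem; only their combination captures the non-separable function.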
Circulation Research | 2003
Tao Li; Yung-Hsiang Chen; Tsun-Jui Liu; Jia Jia; Steven Hampson; Yue-Xin Shan; Dennis F. Kibler; Ping H. Wang
High throughput gene expression profiling with DNA microarray provides an opportunity to analyze transcriptional regulation of hundreds or thousands of similarly regulated genes. Transcriptional regulation of gene expression plays an important role in myocardial remodeling. We have studied cardiac muscle gene expression with DNA microarray and used a computational strategy to identify common promoter motifs that respond to insulin-like growth factor 1 (IGF-1) stimulation in cardiac muscle cells. The analysis showed that the Sp1 binding site is a likely target of IGF-1 action. Further experiments with gel shift assay indicated that IGF-1 regulated the Sp1 site in cardiomyocytes, by increasing the abundance of Sp1 and Sp3 proteins. Using firefly luciferase as reporter gene, additional experiments showed that IGF-1 activated the promoter of cyclin D3 and Glut1. Both promoters contain one Sp1 site. The effect of IGF-1 on these two promoters was abolished with siRNA for Sp1. Thus, the transcriptional activation of these two promoters by IGF-1 requires the induction of Sp1 protein. These experiments suggest that the global transcriptional regulatory actions of IGF-1 involve activation of the Sp1 site in cardiac muscle. The computational model we have developed is a prototypical method that may be further developed to identify unique cis- and trans-acting elements in response to hormonal stimulation during cardiac muscle growth, repair, and remodeling in normal and abnormal cardiac muscle.
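The kind of promoter-motif screen described above can be sketched in a few lines. This is a toy version only: the GC-box core GGGCGG is the commonly cited Sp1 consensus, the promoter strings are placeholders, and a real pipeline would score k-mers against a background model rather than raw shared counts:

```python
from collections import Counter

def count_motif(seq, motif="GGGCGG"):
    """Count possibly overlapping single-strand occurrences of a motif."""
    m = len(motif)
    return sum(1 for i in range(len(seq) - m + 1) if seq[i:i + m] == motif)

def shared_kmers(promoters, k=6):
    """Rank k-mers by how many promoter sequences contain them
    (a toy over-representation screen)."""
    tally = Counter()
    for seq in promoters:
        for w in {seq[i:i + k] for i in range(len(seq) - k + 1)}:
            tally[w] += 1  # count promoters containing w, not raw hits
    return tally.most_common()
```

A motif that tops the shared-k-mer ranking across a set of co-regulated promoters would be a candidate binding site, analogous to the Sp1 site identified in the study.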
Biological Cybernetics | 1987
Steven Hampson; D. J. Volper
Four connectionistic/neural models which are capable of learning arbitrary Boolean functions are presented. Three are provably convergent, but of differing generalization power. The fourth is not necessarily convergent, but its empirical behavior is quite good. The time and space characteristics of the four models are compared over a diverse range of functions and testing conditions. These include the ability to learn specific instances, to effectively generalize, and to deal with irrelevant or redundant information. Trade-offs between time and space are demonstrated by the various approaches.
Biological Cybernetics | 1987
D. J. Volper; Steven Hampson
A biologically plausible method for rapidly learning specific instances is described. It is contrasted with a formal model of classical conditioning (Rescorla-Wagner learning/perceptron training), which is shown to be relatively good for learning generalizations, but correspondingly poor for learning specific instances. A number of behaviorally relevant applications of specific instance learning are considered. For category learning, various combinations of specific instance learning and generalization are described and analyzed. Two general approaches are considered: the simple inclusion of Specific Instance Detectors (SIDs) as additional features during perceptron training, and specialized treatment in which SID-based categorization takes precedence over generalization-based categorization. Using the first approach, analysis and empirical results demonstrate a potential problem in representing feature presence and absence in a symmetric fashion when the frequencies of feature presence and absence are very different. However, it is shown that by using the proper representation, the addition of SIDs can only improve the convergence rate of perceptron training, the greatest improvement being achieved when SIDs are preferentially allocated for peripheral positive and negative instances. Some further improvement is possible if SIDs are treated in a specialized manner.
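A specific instance detector can itself be realized as a single threshold unit that fires only on one exact binary vector. The +1/-1 weighting below is one standard construction, not necessarily the paper's:

```python
def make_sid(instance):
    """Specific Instance Detector: a threshold unit that fires only on
    the exact binary vector `instance`. Weight +1 where the feature is
    present, -1 where absent; threshold just below the perfect-match sum."""
    weights = [1 if b else -1 for b in instance]
    theta = sum(instance) - 0.5
    def sid(x):
        return 1 if sum(w * xi for w, xi in zip(weights, x)) > theta else 0
    return sid
```

Any mismatched bit lowers the weighted sum by at least 1, so only the stored instance clears the threshold; such a unit can then be fed to perceptron training as one extra feature.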
Current Biology | 2004
Robert E. Steele; Steven Hampson; Nicholas A. Stover; Dennis F. Kibler; Hans R. Bode
The Hydra EST project is being carried out in collaboration with the Genome Sequencing Center at Washington University. The project is generously supported by grant IBN-0120591 from the National Science Foundation.
IEEE Transactions on Systems, Man, and Cybernetics | 1990
Steven Hampson; Dennis J. Volper
An analysis and empirical measurement of threshold linear functions of multivalued features is presented. The number of thresholded linear functions, maximum weight size, training speed, and the number of nodes necessary to represent arbitrary Boolean functions are all shown to increase polynomially with the number of distinct values the input features can assume and exponentially with the number of features. Two network training algorithms, focusing and back propagation, are described. Empirically, they are capable of learning arbitrary Boolean functions of multivalued features in a two-level net. Focusing is proved to converge to a correct classification and permits some time-space complexity analysis. Training time for this algorithm is polynomial in the number of values a feature can assume, and exponential in the number of features. Back propagation is not necessarily convergent, but for randomly generated Boolean functions, the empirical behavior of the implementation is similar to that of the focusing algorithm.
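One common way to hand multivalued features to a thresholded linear unit is to expand each feature into indicator inputs, reducing the multivalued case to the binary one. One-hot encoding here is an illustrative choice, not necessarily the representation used in the paper:

```python
def one_hot(value, n_values):
    """Encode one feature taking n_values distinct values as n_values binary inputs."""
    return [1 if i == value else 0 for i in range(n_values)]

def encode(x, n_values):
    """Flatten a multivalued feature vector into binary inputs for a threshold unit."""
    out = []
    for v in x:
        out.extend(one_hot(v, n_values))
    return out
```

With d features of v values each, the unit sees d*v binary inputs, which is one way the polynomial dependence on the number of values can arise.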
Biological Cybernetics | 1986
D. J. Volper; Steven Hampson
Several distinct connectionistic/neural representations capable of computing arbitrary Boolean functions are described and discussed in terms of possible tradeoffs between time, space, and expressive clarity. It is suggested that the ability of a threshold logic unit (TLU) to represent prototypical groupings has significant advantages for representing real world categories. Upper and lower bounds on the number of nodes needed for Boolean completeness are demonstrated. The necessary number of nodes is shown to increase exponentially with the number of input features, the exact rate of increase depending on the representation scheme. In addition, in non-recurrent networks, connection weights are shown to increase exponentially with a linear reduction in the number of nodes below approximately 2^d. This result suggests that optimum memory efficiency may require unacceptable learning time. Finally, two possible extensions to deal with non-Boolean values are considered.
The Archaeology of Frontiers and Boundaries | 1985
John Justeson; Steven Hampson
Cultural systems are open systems. Semipermeable boundaries separate them from one another and from noncultural systems. In contrast, any explicit model must be closed, as it must incorporate a definite number of variables and must specify the relationships among them. An anthropologist has to deal with the problem of effectively modeling an open system as though it were closed. This chapter reviews the consequences of such modeling in terms of system and model boundaries—a crucial focus inasmuch as the impermeability of its boundary is what closes a systems model. This focus suggests the methodological importance of a topical focus on the frontiers of social systems. While models focused on stability and change are useful for investigating the effects of particular processes on system structure, models in which stability is assumed are useful for investigating the effects of system structure on system behavior.