Publication


Featured research published by Jim-Shih Liaw.


Hippocampus | 1996

Dynamic synapse: A new concept of neural representation and computation

Jim-Shih Liaw

Presynaptic mechanisms influencing the probability of neurotransmitter release from an axon terminal, such as facilitation, augmentation, and presynaptic feedback inhibition, are fundamental features of biological neurons and are cardinal physiological properties of synaptic connections in the hippocampus. The consequence of these presynaptic mechanisms is that the probability of release becomes a function of the temporal pattern of action potential occurrence, and hence, the strength of a given synapse varies upon the arrival of each action potential invading the terminal region. From the perspective of neural information processing, the capability of dynamically tuning the synaptic strength as a function of the level of neuronal activation gives rise to a significant representational and processing power of temporal spike patterns at the synaptic level. Furthermore, there is an exponential growth in such computational power when the specific dynamics of presynaptic mechanisms varies quantitatively across axon terminals of a single neuron, a recently established characteristic of hippocampal synapses. During learning, alterations in the presynaptic mechanisms lead to different pattern transformation functions, whereas changes in the postsynaptic mechanisms determine how the synaptic signals are to be combined. We demonstrate the computational capability of dynamic synapses by performing speech recognition from unprocessed, noisy raw waveforms of words spoken by multiple speakers with a simple neural network consisting of a small number of neurons connected with synapses incorporating dynamically determined probability of release. The dynamics included in the model are consistent with available experimental data on hippocampal neurons in that parameter values were chosen so as to be consistent with time constants of facilitative and inhibitory processes governing the dynamics of hippocampal synaptic transmission studied using nonlinear systems analytic procedures.
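
As a concrete (and deliberately simplified) illustration of how such presynaptic dynamics make synaptic strength a function of spike timing, here is a minimal sketch assuming exponentially decaying facilitation and feedback-inhibition traces; the names, constants, and specific update rule are illustrative, not taken from the paper:

```python
import numpy as np

def release_probabilities(spike_times, p0=0.3, k_fac=0.2, k_inh=0.15,
                          tau_fac=50.0, tau_inh=200.0):
    """Release probability at each spike of one axon terminal.

    Each spike increments a facilitation trace and, in proportion to
    the release it causes, a feedback-inhibition trace; both decay
    exponentially between spikes (times in ms). The probability of
    release is baseline plus facilitation minus inhibition, so it
    depends on the whole temporal pattern of the train. All
    parameter values are illustrative placeholders.
    """
    fac = inh = 0.0
    last_t = None
    probs = []
    for t in spike_times:
        if last_t is not None:
            dt = t - last_t
            fac *= np.exp(-dt / tau_fac)
            inh *= np.exp(-dt / tau_inh)
        p = float(np.clip(p0 + fac - inh, 0.0, 1.0))
        probs.append(p)
        fac += k_fac            # facilitation from this spike
        inh += k_inh * p        # presynaptic feedback inhibition
        last_t = t
    return np.array(probs)

# The same terminal answers different temporal patterns differently:
print(release_probabilities([0, 5, 10, 15, 20]))       # fast burst
print(release_probabilities([0, 100, 200, 300, 400]))  # slow train
```

Because each terminal of the same axon can carry different parameter values, one spike train maps to many distinct release patterns, which is the source of the representational power the abstract describes.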


Neurocomputing | 1999

Dynamic synapse: Harnessing the computing power of synaptic dynamics

Jim-Shih Liaw

A major unresolved issue in neuroscience is the emergence of functional capability from underlying molecular and cellular mechanisms. The concept of a dynamic synapse is extended to provide a formalism for incorporating synaptic mechanisms into a general scheme of neural information processing: a synapse is composed of two distinct functional units, presynaptic terminals that transform the sequence of action potentials into multiple sequences of discrete release events, and postsynaptic components that combine such synaptic signals. The complex interaction of various cellular and molecular processes in synapses can be concisely expressed and interpreted in these two synaptic terms. Learning involves modifying the synaptic dynamics such that each axon terminal performs the proper transformation function.
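
Read as a processing pipeline, the two functional units suggest a sketch like the following; the per-terminal dynamics and the weighted-sum combination are stand-ins chosen for brevity, not the paper's equations:

```python
import numpy as np

rng = np.random.default_rng(0)

def presynaptic_transform(spikes, gains):
    """One spike train fans out into several release-event trains,
    one per terminal, because each terminal has its own dynamics.
    Here the 'dynamics' is just a per-terminal facilitation gain
    applied to a running spike count (a placeholder)."""
    counts = np.cumsum(spikes)                 # crude activity trace
    events = []
    for g in gains:                            # one gain per terminal
        p = np.clip(0.2 + g * counts / (1 + counts), 0, 1)
        events.append((rng.random(len(spikes)) < p) & (spikes > 0))
    return np.stack(events)                    # (terminals, time)

def postsynaptic_combine(events, weights):
    """The postsynaptic unit decides how the release trains are
    combined; the simplest choice is a weighted sum."""
    return weights @ events.astype(float)

spikes = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1])
events = presynaptic_transform(spikes, gains=[0.1, 0.3, 0.5])
trace = postsynaptic_combine(events, weights=np.array([0.5, 1.0, -0.3]))
```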


international symposium on neural networks | 2001

A new dynamic synapse neural network for speech recognition

Hassan H. Namarvar; Jim-Shih Liaw

A new version of the dynamic synapse neural network (DSNN) has been applied to recognize noisy raw waveforms of words spoken by multiple speakers. The new architecture is based on the original DSNN and a wavelet filter bank, which decomposes speech signals into multiresolution frequency bands. In this study we applied a genetic algorithm (GA) to optimize the neural network; the advantage of the GA is that it facilitates finding a semi-optimal parameter set in the search space. To speed up training, a new discrete-time implementation of the DSNN was introduced based on the impulse invariant transformation. The network was tested under difficult discrimination conditions.
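
The impulse-invariant step can be illustrated on the kind of first-order exponential kernel that typically governs a synaptic trace; assuming h(t) = exp(-t/tau), which the abstract does not spell out, the discretization reduces to a one-pole recursion:

```python
import numpy as np

def impulse_invariant_decay(x, tau, T):
    """Discrete-time version of the continuous kernel h(t) = exp(-t/tau).

    Impulse invariance samples the continuous impulse response,
    h[n] = T * h_c(n * T), which for a one-pole kernel gives the
    recursion y[n] = a * y[n-1] + T * x[n] with a = exp(-T / tau).
    One multiply-add per sample replaces summing exponentials over
    all past spikes, which is where the training speed-up comes from.
    """
    a = np.exp(-T / tau)          # pole of the discretized system
    y = np.zeros(len(x))
    acc = 0.0
    for n, xn in enumerate(x):
        acc = a * acc + T * xn
        y[n] = acc
    return y

# A spike at t = 0 decays along the sampled exponential envelope:
trace = impulse_invariant_decay([1, 0, 0, 0, 0], tau=5.0, T=1.0)
```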


Neurocomputing | 1999

Consequence of morphological alterations on synaptic function

Taraneh Ghaffari-Farazi; Jim-Shih Liaw

Structural modification of the synapse is a pervasive phenomenon; however, its relation to the functional dynamics of the system remains an unexplored research area due to technical difficulties, both experimental and computational. The purpose of this study was to mathematically model a hippocampal synapse, incorporating detailed morphological characteristics, in order to study the functional consequences of structural modifications. We showed that subsynaptic morphological alterations, as in partitioned synapses, influence synaptic transmission by altering calcium diffusion within the axon terminal, and hence the probability of release. Furthermore, the model revealed a novel concept: that functional dynamics can emerge from structural alterations.
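
For orientation, the generic reaction-diffusion form behind such models is shown below; the paper's specific geometry, buffer kinetics, and boundary conditions are not reproduced here, so this is only a schematic:

```latex
\frac{\partial C}{\partial t} \;=\; D_{\mathrm{Ca}} \nabla^{2} C \;-\; f(C, B),
\qquad p_{\mathrm{release}} \propto C_{\mathrm{sensor}}^{\,n}, \quad n \approx 4
```

Here C is the intraterminal calcium concentration, D_Ca its diffusion coefficient, and f(C, B) lumps together buffering and extrusion. A subsynaptic partition changes the domain on which the Laplacian is solved, so the same calcium influx yields a different concentration profile at the release sensor; since release probability is commonly modeled as a steep (roughly fourth-power) function of that local concentration, small structural changes can produce large functional ones.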


international symposium on neural networks | 1997

Computing with dynamic synapses: a case study of speech recognition

Jim-Shih Liaw

A novel concept of dynamic synapse is presented which incorporates fundamental features of biological neurons including presynaptic mechanisms influencing the probability of neurotransmitter release from an axon terminal. The consequence of the presynaptic mechanisms is that the probability of release becomes a function of the temporal pattern of action potential occurrence, and hence, the strength of a given synapse varies upon the arrival of each action potential invading the terminal region. From the perspective of neural information processing, the capability of dynamically tuning the synaptic strength as a function of the level of neuronal activation gives rise to a significant representational and processing power at the synaptic level. Furthermore, there is an exponential growth in such computational power when the specific dynamics of presynaptic mechanisms varies quantitatively across axon terminals of a single neuron. A dynamic learning algorithm is developed in which alterations of the presynaptic mechanisms lead to different pattern transformation functions while changes in the postsynaptic mechanisms determine how the synaptic signals are to be combined. We demonstrate the computational capability of dynamic synapses by performing speech recognition from unprocessed, noisy raw waveforms of words spoken by multiple speakers with a simple neural network consisting of a small number of neurons connected with synapses incorporating dynamically determined probability of release.


Adaptive Behavior | 1993

Neural mechanisms underlying direction-selective avoidance behavior

Jim-Shih Liaw; Michael A. Arbib

Avoiding looming objects (possible predators) is essential for animals' survival. This article presents a neural network model to account for the detection of and response to a looming stimulus. The generation of an appropriate response includes five tasks: detection of a looming stimulus, localization of the stimulus position, computation of the direction of the stimulus movement, determination of escape direction, and selection of a proper motor action. The detection of a looming stimulus is achieved based on the expansion of the retinal image and depth information. The spatial location of the stimulus is encoded by a population of neurons. The direction of the looming stimulus is computed by monitoring the shift of the peak of neuronal activity in this population. The signal encoding the stimulus location is gated by the direction-selective neurons onto a motor heading map, which specifies the escape direction. The selection of a proper action is achieved through competition among different groups of motor neurons. The model is based on the analysis of predator-avoidance in frog and toad but leads to a comparative analysis of mammalian visual systems.
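
A hedged sketch of the direction readout described above: stimulus position is decoded from a population of tuned neurons, and the shift of the decoded peak between two time steps gives the movement direction. The tuning widths, azimuth grid, and population-vector decoder are all assumptions made for the example:

```python
import numpy as np

preferred = np.linspace(-90, 90, 37)      # preferred azimuths (degrees)

def respond(stim_pos, sigma=15.0):
    """Gaussian-tuned population response to a stimulus at stim_pos."""
    return np.exp(-0.5 * ((preferred - stim_pos) / sigma) ** 2)

def decode_peak(activity):
    """Population-vector readout: activity-weighted mean of the
    neurons' preferred locations."""
    return np.sum(activity * preferred) / np.sum(activity)

# A looming stimulus drifting to the right: the decoded peak shifts
# with it, and the sign of the shift gives the movement direction,
# which then gates the signal onto the motor heading map.
shift = decode_peak(respond(-12.0)) - decode_peak(respond(-20.0))
direction = "rightward" if shift > 0 else "leftward"
print(direction, round(float(shift), 2))
```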


Biological Cybernetics | 2005

Schema-based learning of adaptable and flexible prey-catching in anurans I. The basic architecture

Fernando J. Corbacho; Kiisa C. Nishikawa; Ananda Weerasuriya; Jim-Shih Liaw; Michael A. Arbib

A motor action often involves the coordination of several motor synergies and requires flexible adjustment of the ongoing execution based on feedback signals. To elucidate the neural mechanisms underlying the construction and selection of motor synergies, we study prey-capture in anurans. Experimental data demonstrate the intricate interaction between different motor synergies, including the interplay of their afferent feedback signals (Weerasuriya 1991; Anderson and Nishikawa 1996). Such data provide insights into the general issues concerning two-way information flow between sensory centers, motor circuits and periphery in motor coordination. We show how different afferent feedback signals about the status of the different components of the motor apparatus play a critical role in motor control as well as in learning. This paper, along with its companion paper, extends the model of Liaw et al. (1994) by integrating a number of different motor pattern generators (MPGs), different types of afferent feedback, and the corresponding control structure within an adaptive framework we call Schema-Based Learning. We develop a model of the different MPGs involved in prey-catching as a vehicle to investigate the following questions: What are the characteristic features of the activity of a single muscle? How can these features be controlled by the premotor circuit? What are the strategies employed to generate and synchronize motor synergies? What is the role of afferent feedback in shaping the activity of an MPG? How can several MPGs share the same underlying circuitry and yet give rise to different motor patterns under different input conditions? In the companion paper we also extend the model by incorporating learning components that give rise to more flexible, adaptable and robust behaviors. To show these aspects, we incorporate studies of lesion experiments and of the learning processes that allow the animal to recover its proper functioning.


Visual structures and integrated functions | 1991

A neural network model for response to looming objects by frog and toad

Jim-Shih Liaw; Michael A. Arbib

Toads exhibit a wide variety of avoidance patterns depending on the stimulus situation. The analysis of this situation is achieved through interaction between the optic tectum and pretectum. The retinal signal is first received and processed by tectal neurons and their outputs then converge onto pretectal neurons. Based on these converging inputs, neurons in thalamic pretectum are able to analyze the stimulus situation and determine an appropriate avoidance action. The spatial location of the stimulus is encoded in the topography of tectal neurons. This signal is projected onto a motor heading map which specifies the direction of the avoidance movement. We develop a neural network model to account for the toad’s detection of and response to a looming stimulus.


international symposium on neural networks | 1998

Robust speech recognition with dynamic synapses

Jim-Shih Liaw

We have developed a speech recognition system employing the concept of dynamic synapses. A dynamic synapse incorporates fundamental features of biological neurons, including presynaptic mechanisms influencing the probability of neurotransmitter release from an axon terminal. With these mechanisms, the probability of neurotransmitter release becomes a function of the temporal pattern of action potential occurrence; the synapse thereby transforms a spike train into a sequence of discrete release events. When presynaptic mechanisms vary quantitatively across the axon terminals of a single neuron, an array of spatially distributed temporal patterns can be generated. In other words, information is coded in the spatio-temporal patterns of release events, which provides an exponential growth of coding capacity for the output signals of a single neuron. A dynamic learning algorithm is developed in which alterations of the presynaptic mechanisms lead to different pattern transformation functions, while changes in the postsynaptic mechanisms determine how the synaptic signals are to be combined. We demonstrate the computational capability of dynamic synapses by performing speech recognition from unprocessed, noisy raw waveforms of words spoken by multiple speakers with a simple neural network consisting of a small number of neurons connected with dynamic synapses. The system is highly robust against noise and outperformed human listeners under some conditions.


Biological Cybernetics | 2005

Schema-based learning of adaptable and flexible prey-catching in anurans II. Learning after lesioning

Fernando J. Corbacho; Kiisa C. Nishikawa; Ananda Weerasuriya; Jim-Shih Liaw; Michael A. Arbib

The previous companion paper describes the initial (seed) schema architecture that gives rise to the observed prey-catching behavior. In this second paper in the series we describe the fundamental adaptive processes required during learning after lesioning. Following bilateral transections of the hypoglossal nerve, anurans lunge toward mealworms with no accompanying tongue or jaw movement. Nevertheless, anurans with permanent hypoglossal transections eventually learn to catch their prey by first learning to open their mouth again and then lunging their body further and increasing their head angle.

In this paper we present a new learning framework, called schema-based learning (SBL). SBL emphasizes the importance of the currently existing structure (schemas) that defines a functioning system for the incremental and autonomous construction of ever more complex structure, achieving ever more complex levels of functioning. We may rephrase this statement in the language of Schema Theory (see Arbib 1992 for a comprehensive review) as the learning of new schemas based on the stock of current schemas. SBL emphasizes a fundamental principle of organization called coherence maximization, which deals with the maximization of congruence between the results of an interaction (external or internal) and the expectations generated for that interaction. A central hypothesis is the existence of a hierarchy of predictive internal models (predictive schemas) throughout the agent's control center, the brain. Hence, we will include predictive models in the perceptual, sensorimotor, and motor components of the autonomous agent architecture. We will then show that predictive models are fundamental for structural learning. In particular, we will show how a system can learn a new structural component (augmenting the overall network topology) after being lesioned, in order to recover (or even improve) its original functionality. Learning after lesioning is a special case of structural learning, but it clearly shows that solutions cannot be known or hardwired a priori, since it cannot be known in advance which substructure is going to break down.
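
The coherence-maximization loop can be caricatured in a few lines of code: each schema carries a predictive model, prediction-outcome mismatch accumulates, and a persistently incoherent schema triggers the construction of new structure from the existing stock. The thresholds, the mismatch measure, and the way a new schema is built are all invented for this illustration; they are not the paper's algorithm:

```python
class Schema:
    """A behavioral unit paired with a predictive internal model."""
    def __init__(self, name, predict):
        self.name = name
        self.predict = predict        # expectation for the interaction
        self.incoherence = 0.0        # running prediction-outcome mismatch

def update(schema, outcome, repertoire, rate=0.1, threshold=0.5):
    """Compare expectation with outcome; persistent mismatch recruits
    a new schema built on the current one (structural learning)."""
    error = abs(schema.predict() - outcome)
    schema.incoherence = (1 - rate) * schema.incoherence + rate * error
    if schema.incoherence > threshold:
        # e.g. after hypoglossal transection: tongue-based capture
        # keeps failing, so a modified variant is constructed.
        repertoire.append(Schema(schema.name + "-variant", schema.predict))
        schema.incoherence = 0.0

tongue = Schema("tongue-capture", predict=lambda: 1.0)  # expects success
repertoire = [tongue]
for outcome in [0.0] * 12:        # after the lesion, capture always fails
    update(tongue, outcome, repertoire)
```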

Collaboration


Dive into Jim-Shih Liaw's collaborations.

Top Co-Authors

Michael A. Arbib (University of Southern California)
Michel Baudry (Western University of Health Sciences)
Hassan H. Namarvar (University of Southern California)
Alireza A. Dibazar (University of Southern California)
Xiaping Xie (University of Southern California)
Ying Shu (University of Southern California)
Bing J. Sheu (University of Southern California)
Fernando J. Corbacho (University of Southern California)
Irwin King (The Chinese University of Hong Kong)