Robert H. Fujii
University of Aizu
Publications
Featured research published by Robert H. Fujii.
Theoretical Computer Science | 1998
Lothar M. Schmitt; Chrystopher L. Nehaniv; Robert H. Fujii
DOI: 10.1016/S0304-3975(98)00004-8. Original article: http://www.sciencedirect.com/science/journal/03043975 (Copyright Elsevier B.V.).
midwest symposium on circuits and systems | 2005
Hesham H. Amin; Robert H. Fujii
An algorithm for spike train mapping and learning using a spiking neural network (SNN) is introduced. Analysis of the proposed spike train mapping and learning algorithm provides a method for selecting appropriate mapping and learning parameters, such as the neuron threshold and the mapping time window. An application demonstrates the robustness of the learning algorithm. A comparison of the proposed SNN learning algorithm with the back-propagation learning algorithm on a classification problem is presented.
computer and information technology | 2007
Taiki Ichishita; Robert H. Fujii
The performance of a temporal sequence learning spiking neural network was evaluated. Characteristics evaluated included recognition of long temporal sequences, factors affecting the size of the neural network, and network robustness against random input noise. Music melodies of various lengths were used as temporal sequential input data. Results showed that the spiking neural network can learn inter-spike time sequences composed of as many as 900 inter-spike times. The size of the neural network was influenced by the amount and type of random noise used during the supervised learning phase. The system recognized sequences with approximately 90% accuracy even in the presence of various types of random noise.
IEICE Transactions on Information and Systems | 2005
Hesham H. Amin; Robert H. Fujii
Information transmission among biological neurons is carried out by complex series of spike signals. The inter-spike arrival times of a neuron's inputs are believed to carry information which the neuron utilizes to carry out a task. In this paper, a new scheme that utilizes the input inter-spike intervals (ISIs) for decoding an input spike train is proposed. A spike train consists of a sequence of input spikes with various inter-spike times. The decoding scheme can also be used for neurons that have multiple synaptic inputs but for which each synapse receives a single spike within one input time window. The ISI decoding neural network requires only a few neurons. Example applications show the usefulness of the decoding scheme.
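The first step of any ISI-based scheme like the one above is converting a spike train's arrival times into its inter-spike intervals. A minimal sketch (the function name and example times are illustrative, not from the paper):

```python
def inter_spike_intervals(spike_times):
    """Return the intervals between consecutive spike arrival times."""
    return [t2 - t1 for t1, t2 in zip(spike_times, spike_times[1:])]

# A spike train given as arrival times (ms) within one input time window:
train = [2.0, 5.5, 11.0, 12.5]
print(inter_spike_intervals(train))  # [3.5, 5.5, 1.5]
```

The resulting ISI vector, rather than the absolute spike times, is what a decoding network of this kind would operate on.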
international symposium on circuits and systems | 1992
Y. Cheng; Robert H. Fujii
The SAUCES (Sensitivity Analysis Using Circuit Elements in Symbolic format) program is described. SAUCES computes the sensitivity of an analog circuit either symbolically or numerically by utilizing the results from symbolic circuit analysis. A simple method to derive the sensitivity of a symbolic circuit transfer function with respect to each circuit element is presented. A method to derive the sensitivities of the zeros and poles of the circuit transfer function symbolically is also presented. The advantages of the symbolic representation of the pole and zero sensitivities are discussed. An example shows how SAUCES can be used to improve a circuit design.
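The relative sensitivity SAUCES computes is the standard quantity S = (x/H)·dH/dx for a transfer function H and element value x. The sketch below estimates it numerically for a first-order RC low-pass magnitude response; the function names and component values are illustrative assumptions, not SAUCES code (which works symbolically):

```python
import math

def relative_sensitivity(H, x, rel_step=1e-6):
    """Estimate (x / H(x)) * dH/dx via a central finite difference."""
    h = x * rel_step
    dH = (H(x + h) - H(x - h)) / (2 * h)
    return (x / H(x)) * dH

# |H(jw)| of an RC low-pass filter as a function of R (C and w held fixed):
C, w = 1e-6, 2 * math.pi * 1e3                    # 1 uF, 1 kHz
H_mag = lambda R: 1 / math.sqrt(1 + (w * R * C) ** 2)

R = 100.0
est = relative_sensitivity(H_mag, R)
# Closed form for this circuit: -(wRC)^2 / (1 + (wRC)^2)
exact = -(w * R * C) ** 2 / (1 + (w * R * C) ** 2)
```

A symbolic tool like SAUCES would return the closed-form expression directly instead of the numeric estimate.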
international conference on natural computation | 2005
Hesham H. Amin; Robert H. Fujii
Spiking Neural Networks (SNNs) use inter-spike time coding to process input data. In this paper, a new learning algorithm for SNNs that uses the inter-spike times within a spike train is introduced. The learning algorithm utilizes the spatio-temporal pattern produced by the spike train input mapping unit and adjusts synaptic weights during learning. The approach was applied to classification problems.
international conference on intelligent computing | 2005
Hesham H. Amin; Robert H. Fujii
The capabilities and robustness of a new spiking neural network (SNN) learning algorithm are demonstrated with sound classification and function approximation applications. The proposed SNN learning algorithm and the radial basis function (RBF) learning method for function approximation are compared. The complexity of the learning algorithm is analyzed.
midwest symposium on circuits and systems | 2001
Robert H. Fujii; R. Nemoto; N. Satou
A transistor-level analog circuit design of a spiking neuron is proposed. The circuit was simulated using BSIM3 0.8 μm geometry MOS transistor parameters provided by MOSIS. Most of the circuits operate in the MOS sub-threshold region to achieve very low power consumption. The supply voltage was set at 2.8 V. As examples of neural networks, feed-forward and feed-back neural networks capable of recognizing black-and-white patterns were simulated using a transistor-level circuit simulator. Static power dissipation of the proposed neuron was estimated to be approximately 180 pW for the dendrite and 56 pW for the soma. In the dynamic mode, energy consumption was estimated to be 5.1 pJ (dendrite) and 1.2 pJ (soma) per activation. An analog HDL simulator was also used to simulate the behavior of the larger neural network examples.
international conference on communications | 2012
Fuyuko Watanabe; Robert H. Fujii
A new neural network that uses the ReSuMe supervised learning algorithm for generating a desired spike output sequence in response to a given input spike sequence is proposed. Possible advantages of the proposed new neural network system compared to the Liquid State Machine based ReSuMe network system include better learning convergence and a smaller neural network size.
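The ReSuMe rule mentioned above adjusts a synapse toward producing desired output spikes and away from erroneous ones, weighting each adjustment by how recently that synapse's input fired. A rough, simplified sketch of such an update for one synapse (the exponential learning window and the constants a, A, tau are illustrative assumptions, not the published parameterization):

```python
import math

def resume_update(w, input_spikes, desired_spikes, output_spikes,
                  a=0.05, A=0.1, tau=5.0):
    """Return an updated synaptic weight: desired spikes strengthen the
    synapse, actual output spikes weaken it, each scaled by the recency
    of the input spikes preceding them."""
    def window(t):
        # Learning-window contribution of input spikes arriving before t.
        return a + A * sum(math.exp(-(t - ti) / tau)
                           for ti in input_spikes if ti <= t)
    dw = sum(window(td) for td in desired_spikes) \
       - sum(window(to) for to in output_spikes)
    return w + dw

# When the actual output already matches the desired output, dw is zero:
print(resume_update(0.5, [1.0], [3.0], [3.0]))  # 0.5
```

Convergence of a rule in this family is what the proposed network aims to improve relative to a Liquid State Machine front end.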
international conference on machine learning and applications | 2007
Robert H. Fujii; Taiki Ichishita
Three synaptic weight assignment schemes were proposed and their effect on the behavior of a spiking neuron was analyzed. To evaluate the proposed synaptic assignment schemes, a feed-forward Spiking Neural Network that can learn to recognize temporal sequences was proposed. This spiking neural network uses two of the proposed synaptic weight assignment schemes in conjunction with a spiking neuron model that uses a simplified linear soma potential function. The robustness and reliability of the proposed spiking neural network system were shown by the high (approximately 99%) temporal sequence recognition rate achieved during testing. Practical hardware implementation issues were also discussed.
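The simplified linear soma potential referenced above can be illustrated with a small sketch: each input spike contributes a linearly rising term w·(t − t_spike), and the neuron fires when the summed potential first crosses a threshold. The function, time step, and example values here are hypothetical, not the paper's model:

```python
def first_fire_time(spikes_and_weights, threshold, t_max, dt=0.01):
    """Scan time in steps of dt; return the first time the linearly rising
    soma potential reaches `threshold`, or None if it never does by t_max."""
    t = 0.0
    while t <= t_max:
        # Sum the linear ramp contribution of every spike seen so far.
        u = sum(w * (t - ts) for ts, w in spikes_and_weights if t >= ts)
        if u >= threshold:
            return round(t, 2)
        t += dt
    return None

# Two unit-weight spikes at t=0 and t=1; u(t) = 2t - 1 for t >= 1,
# so the neuron fires near t = 2.
print(first_fire_time([(0.0, 1.0), (1.0, 1.0)], threshold=3.0, t_max=5.0))
```

Because the potential is piecewise linear, firing times under such a model can also be solved in closed form, which is part of what makes this soma function attractive for hardware.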