Publication


Featured research published by Jousuke Kuroiwa.


Physics Letters A | 2002

Completely reproducible description of digital sound data with cellular automata

Masato Wada; Jousuke Kuroiwa; Shigetoshi Nara

A novel method of compressive and completely reproducible description of digital sound data by means of rule dynamics of cellular automata (CA) is proposed. Digital data of spoken words and music, recorded in the standard compact-disc format, are reproduced completely by this method using only two rules in a one-dimensional CA, without loss of information.
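
To make the idea of rule dynamics concrete, here is a minimal Python sketch. It is purely illustrative and not the paper's actual encoding: the CA rule applied at each time step is drawn from a small set (a hypothetical pair of elementary rules below), and the pair (initial configuration, rule sequence) reproduces the whole state sequence bit-exactly.

import numpy as np

def eca_step(state, rule):
    # One synchronous update of an elementary (binary, radius-1) CA with periodic
    # boundaries, using Wolfram rule number `rule`.
    table = [(rule >> i) & 1 for i in range(8)]
    left, right = np.roll(state, 1), np.roll(state, -1)
    idx = 4 * left + 2 * state + right                 # neighbourhood code 0..7
    return np.array([table[i] for i in idx], dtype=np.uint8)

rng = np.random.default_rng(0)
x0 = rng.integers(0, 2, size=64, dtype=np.uint8)       # initial configuration
rules = rng.choice([90, 150], size=100)                # per-step rule choice (hypothetical rule pair)

# "Encoding": run the CA once; keep only (x0, rules) as the description of the data.
states = [x0]
for r in rules:
    states.append(eca_step(states[-1], r))

# "Decoding": rerun from the same description; the reproduction is bit-exact.
x = x0
for t, r in enumerate(rules, start=1):
    x = eca_step(x, r)
    assert np.array_equal(x, states[t])
print("losslessly reproduced", len(states), "configurations")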


International Journal of Bifurcation and Chaos | 2004

Sensitive Response of a Chaotic Wandering State to Memory Fragment Inputs in a Chaotic Neural Network Model

Jousuke Kuroiwa; Naoki Masutani; Shigetoshi Nara; Kazuyuki Aihara

Dynamical properties of a chaotic neural network model in a chaotically wandering state are studied with respect to its sensitivity to the weak input of a memory fragment. In certain parameter regions, the network shows weakly chaotic wandering, meaning that the orbits of the network dynamics in the state space stay localized around several memory patterns. In other parameter regions, the network shows highly developed chaotic wandering; that is, the orbits become itinerant among the ruins of all the memory patterns. In the latter case, once an external input consisting of a memory fragment is applied to the network, the orbit moves within several iteration steps to the vicinity of the memory pattern that contains the fragment. Thus, the chaotic dynamics of the model is effective for instantaneous search among memory patterns.
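
The following is a minimal Python sketch of one common formulation of a chaotic neural network with Hebbian auto-associative connections (Aihara-type neurons); the parameter values and the way the memory-fragment input is injected are illustrative assumptions, not the settings used in the paper.

import numpy as np

rng = np.random.default_rng(1)
N, P = 100, 4
patterns = rng.choice([-1.0, 1.0], size=(P, N))        # stored memory patterns
W = (patterns.T @ patterns) / N                        # Hebbian connection matrix
np.fill_diagonal(W, 0.0)

kf, kr, alpha, eps = 0.2, 0.9, 1.0, 0.015              # illustrative parameters

def f(u):                                              # sigmoidal output in [-1, 1]
    return np.tanh(u / eps)

def step(x, eta, zeta, ext):
    eta = kf * eta + W @ x + ext                       # feedback term plus external (fragment) input
    zeta = kr * zeta - alpha * x                       # refractoriness term
    x = f(eta + zeta)
    return x, eta, zeta

x = rng.choice([-1.0, 1.0], size=N)
eta = np.zeros(N)
zeta = np.zeros(N)

fragment = np.zeros(N)                                 # weak input: first 20 bits of pattern 0
fragment[:20] = 0.5 * patterns[0, :20]

for t in range(200):
    ext = fragment if t >= 100 else 0.0                # fragment switched on at t = 100
    x, eta, zeta = step(x, eta, zeta, ext)
    if t % 20 == 0:
        overlaps = patterns @ np.sign(x) / N           # proximity of the orbit to each stored pattern
        print(t, np.round(overlaps, 2))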


International Journal of Bifurcation and Chaos | 2001

Response Properties of a Single Chaotic Neuron to Stochastic Inputs

Jousuke Kuroiwa; Shigetoshi Nara; Kazuyuki Aihara

Response properties of a single chaotic neuron to stochastic inputs are investigated by means of numerical simulations, in the context of a nonlinear dynamical approach to analyzing the chaotic behavior of a neuron. We apply six kinds of stochastic inputs with the same mean rate but different interspike-interval correlations, whose spike timings are generated by Markovian processes and Gaussian/Poisson random processes. Numerical evaluation of the entropy and conditional entropies of the output interspike intervals shows that the output interspike intervals represent the dynamical structure of each input. Calculations of Lyapunov exponents, dynamical trajectories and return plots of the internal state reveal meaningful differences in the dynamical properties of the model depending on the input, even when the mean output interspike intervals are almost the same. To extract the dynamical features of the outputs, we compute a time-delayed phase-space representation of the output responses; the resulting trajectories differ between inputs and reflect higher-order statistical features of the inputs, amplifying the differences between them. For inputs containing noise, the behavior of the model does not degrade, showing robustness to input noise. In conclusion, our results show that the dynamical properties of the inputs can be extracted from clearly different response properties of the model; that is, the model produces a variety of output amplitudes and interspike intervals depending on the input. In other words, the model can realize dynamical sampling of inputs, with responses that are sensitive to the inputs and robust to noise.
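
Below is a minimal Python sketch of a single chaotic neuron in the Aihara-style reduced form, driven by two spike trains with the same mean rate but different interspike-interval correlations; the input processes, parameter values and the output-spike threshold are illustrative assumptions rather than the paper's settings.

import numpy as np

rng = np.random.default_rng(2)
T = 20000
k, alpha, a, eps, amp = 0.7, 1.0, 0.8, 0.02, 0.35      # illustrative parameters

def sigmoid(y):
    return 1.0 / (1.0 + np.exp(-y / eps))

def poisson_train(rate):                               # independent interspike intervals
    return (rng.random(T) < rate).astype(float)

def markov_train(rate, stay=0.9):                      # correlated (bursty) intervals, same mean rate
    p01 = rate * (1 - stay) / (1 - rate)               # chosen so the stationary firing probability equals `rate`
    s, out = 0, np.zeros(T)
    for t in range(T):
        s = (rng.random() < stay) if s else (rng.random() < p01)
        out[t] = float(s)
    return out

def run(spikes):
    y, xs = 0.0, np.empty(T)
    for t in range(T):
        y = k * y - alpha * sigmoid(y) + a + amp * spikes[t]
        xs[t] = sigmoid(y)
    return xs

for name, train in [("poisson", poisson_train(0.1)), ("markov", markov_train(0.1))]:
    x = run(train)
    out_spikes = np.flatnonzero(x > 0.5)
    isi = np.diff(out_spikes)
    mean_isi = float(isi.mean()) if isi.size else float("nan")
    # Time-delayed embedding of the output: points (x(t), x(t - tau)) trace
    # input-dependent trajectories even when mean output ISIs are similar.
    tau = 1
    emb = np.stack([x[tau:], x[:-tau]], axis=1)
    print(name, "mean output ISI:", round(mean_isi, 2),
          "embedding spread:", np.round(emb.std(axis=0), 3))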


Systems, Man and Cybernetics | 1999

Functional possibility of chaotic behaviors in a single chaotic neuron model for dynamical signal processing elements

Jousuke Kuroiwa; Shigetoshi Nara; Kazuyuki Aihara

Dynamical behaviors of a single chaotic neuron model are studied by numerical methods in the context of dynamical signal processing. As external signals, six kinds of temporal spiking inputs with the same mean rate but different correlations of spiking intervals are employed. A decay effect on the internal state of the neuron and a relative refractoriness play important roles in producing complex output dynamics, which fall into three categories: (i) 1-or-0 responses, (ii) weakly complex dynamical responses, and (iii) highly developed complex dynamical responses. In categories (ii) and (iii) we find that, typically, the temporal structure of the input interspike intervals is reflected in the dynamical properties of the outputs, even though the mean output interspike intervals are almost equal for all the inputs. By embedding the outputs in a two-dimensional space, higher-order statistical features of the input spiking intervals are extracted, with the differences between them amplified, as in the sketch below. Our results show that (i) the single chaotic neuron can work as a dynamical sampling element with responses that are sensitive to the input and robust to noise, and (ii) it can extract dynamical structure of the inputs, for instance second- or higher-order statistical features contained in spike trains.
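
As a small illustration of the second-order interspike-interval structure mentioned above, the Python sketch below constructs two toy interval sequences with identical interval distributions (hence equal mean rates) but very different serial correlations, which a two-dimensional (ISI_n, ISI_{n+1}) embedding separates; the construction is purely illustrative.

import numpy as np

rng = np.random.default_rng(3)
isis = np.sort(rng.integers(2, 40, size=2000))         # a pool of interspike intervals

shuffled = rng.permutation(isis)                       # renewal-like: no serial correlation
ordered = isis.copy()                                  # strongly correlated: long intervals follow long intervals

for name, seq in [("shuffled", shuffled), ("ordered", ordered)]:
    # Lag-1 serial correlation of consecutive intervals: the statistic a 2D embedding exposes.
    r = np.corrcoef(seq[:-1], seq[1:])[0, 1]
    print(name, "mean ISI:", round(float(seq.mean()), 2), "lag-1 ISI correlation:", round(float(r), 3))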


Neural Networks | 2000

Self-organization of orientation maps in a formal neuron model using a cluster learning rule

Jousuke Kuroiwa; Sakari Inawashiro; Shogo Miyake; Hirotomo Aso

Self-organization of orientation maps due to external stimuli in the primary visual area of the cerebral cortex is studied in a two-layered neural network consisting of formal neuron models with a sigmoidal output function. A cluster learning rule is proposed as an extended Hebbian learning rule, in which the modification of synaptic connections is influenced by the activation of neighboring output neurons. Using a self-consistent Monte Carlo method, we evaluate the output responses of the neurons to explicit inputs after learning. An orientation map calculated from the output responses reproduces characteristic features of biological maps. Moreover, quantitative analyses of our results are consistent with experimental results. It is shown that the cluster learning rule plays an important role in forming smooth changes of preferred orientations.
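
Here is a minimal Python sketch of one way to read the cluster learning rule, namely a Hebbian update driven by the activity of a neighbourhood of output neurons on a small output sheet; the neighbourhood kernel, learning rate and normalisation are illustrative assumptions, not the paper's definitions.

import numpy as np

rng = np.random.default_rng(4)
n_in, n_out, eta, sigma = 16, 25, 0.05, 1.0            # 5x5 output sheet, illustrative sizes

W = rng.normal(scale=0.1, size=(n_out, n_in))

# Gaussian neighbourhood kernel on the 5x5 output grid.
coords = np.array([(i, j) for i in range(5) for j in range(5)], dtype=float)
d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2 / (2 * sigma ** 2))

def cluster_hebb_step(W, x):
    y = np.tanh(W @ x)                                 # sigmoidal output responses
    y_cluster = K @ y                                  # each neuron also "feels" its neighbours' activity
    W = W + eta * np.outer(y_cluster, x)               # Hebbian update driven by the cluster activity
    W = W / np.linalg.norm(W, axis=1, keepdims=True)   # keep weight vectors bounded
    return W

for _ in range(500):
    stimulus = rng.normal(size=n_in)                   # stand-in for an oriented visual stimulus
    W = cluster_hebb_step(W, stimulus)
print("weight matrix shape:", W.shape)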


Journal of the Physical Society of Japan | 2000

Mean Field Theory and Self-Consistent Monte Carlo Method for Self-Organization of Formal Neuron Model

Jousuke Kuroiwa; Sakari Inawashiro; Shogo Miyake; Hirotomo Aso

An Ising spin system with strong self-fields and two types of long-range antiferromagnetic interactions is investigated by a mean field theory and a self-consistent Monte Carlo (SCMC) method. The spin averages of the Ising system correspond to the output responses of neurons in the self-organization of a formal neuron model with a sigmoidal output function. Since the strong self-fields and the long-range antiferromagnetic interactions produce a wide variety of spin averages, it is difficult to know an appropriate distribution of spin averages beforehand. An iterative procedure for the mean field equations often falls into an oscillation between two sets of iterative variables and fails to converge, depending on the parameters and the initial distribution of spin averages. The SCMC method, on the other hand, gives spin averages independently of the parameters and initial conditions, even where the iterative procedure fails to converge. The SCMC method is thus effective and practical for finding a variety of distributions of spin averages.
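
For reference, the mean field self-consistency equations being compared have the form m_i = tanh(beta * (h_i + sum_j J_ij m_j)), iterated from an initial guess; the Python sketch below does this on toy couplings and fields. The SCMC sampling scheme itself is not reproduced here, and all numerical values are illustrative.

import numpy as np

rng = np.random.default_rng(5)
n, beta = 50, 2.0
h = rng.normal(scale=2.0, size=n)                      # strong site-dependent self-fields
J = -0.5 / n * np.ones((n, n))                         # toy long-range antiferromagnetic coupling
np.fill_diagonal(J, 0.0)

m = rng.uniform(-1, 1, size=n)                         # initial guess for the spin averages
for it in range(1000):
    m_new = np.tanh(beta * (h + J @ m))                # plain mean field update
    if np.max(np.abs(m_new - m)) < 1e-10:
        break
    m = m_new
print("iterations:", it, "max |m|:", round(float(np.abs(m).max()), 3))
# Depending on J, beta and the initial guess, the plain iteration can oscillate between
# two sets of values instead of converging; damping (m <- (1 - a) * m + a * m_new) or a
# sampling-based scheme such as SCMC is then needed.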


Neural Computing and Applications | 2002

Formation of an Orientation Preference Map – Gradual Inhibition Method

Sakari Inawashiro; Jousuke Kuroiwa; Hisashi Nakamura; Shogo Miyake

The formation of an orientation preference map due to external visual stimuli in the primary visual area of the cerebral cortex is investigated in a two-layered neural network model, where a cluster learning rule is used together with an ordinary Hebbian learning rule. We succeed in directly solving a set of simultaneous equations by numerical iteration using a gradual inhibition method. Previously, an alternative solution was obtained by applying a self-consistent Monte Carlo (SCMC) method. The gradual inhibition method gives a more precise solution with less computational cost than the SCMC method.
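
One plausible reading of the gradual inhibition method is a continuation scheme in which the inhibitory coupling strength is ramped up gradually while the fixed-point equations are re-solved at each step starting from the previous solution; the Python sketch below illustrates this reading on a toy system and is not the paper's actual formulation.

import numpy as np

rng = np.random.default_rng(6)
n, g_target = 30, 1.5
h = rng.normal(size=n)
J = -np.abs(rng.normal(scale=1.0 / np.sqrt(n), size=(n, n)))   # purely inhibitory toy couplings
np.fill_diagonal(J, 0.0)

y = np.zeros(n)
for g in np.linspace(0.0, g_target, 30):               # switch the inhibition on gradually
    for _ in range(200):                               # re-solve y = tanh(h + g * J y) at this inhibition level
        y_new = np.tanh(h + g * (J @ y))
        if np.max(np.abs(y_new - y)) < 1e-12:
            break
        y = y_new
print("final inhibition strength:", g_target, "solution norm:", round(float(np.linalg.norm(y)), 3))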


International Conference on Neural Information Processing | 1999

A hippocampal CA3 model for temporal sequences

Makoto Ito; Shogo Miyake; Sakari Inawashiro; Jousuke Kuroiwa; Yasuji Sawada

We propose a pulse-neuron model with transmission delays for field CA3 of the hippocampus, together with a new learning rule. We use temporal sequences of patterns consisting of trains of bursts. Simulations show that the model successfully learns and recalls the temporal sequences. The new learning rule works much more effectively than the Hebbian learning rule for learning temporal sequences of patterns.
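
A heavily hedged Python sketch of the general idea of sequence learning with transmission delays: a connection from neuron j to neuron i with delay d is strengthened when j fires d steps before i, so the delayed recurrent input can replay a stored sequence. The delay, threshold and learning rule below are illustrative stand-ins, not the paper's model.

import numpy as np

rng = np.random.default_rng(7)
N, L, K, delay, theta = 60, 5, 12, 1, 2.0              # theta sits between the crosstalk level (~K*K/(N*L)) and the signal (~K/L)
seq = np.zeros((L, N))
for t in range(L):
    seq[t, rng.choice(N, size=K, replace=False)] = 1.0 # sequence of sparse burst patterns

# Delay-aware Hebbian-style rule: pre-activity `delay` steps earlier times post-activity now.
W = np.zeros((N, N))
for t in range(delay, L):
    W += np.outer(seq[t], seq[t - delay])
W /= L

# Recall: cue with the first pattern and let the delayed recurrent input drive the rest.
x = seq[0].copy()
for t in range(1, L):
    x = (W @ x > theta).astype(float)                  # pulse (0/1) output with threshold theta
    overlap = (x * seq[t]).sum() / K
    print("recall step", t, "overlap with stored pattern:", round(float(overlap), 2))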


Physical Review E | 2003

Errorless reproduction of given pattern dynamics by means of cellular automata

Teruhiko Tamura; Jousuke Kuroiwa; Shigetoshi Nara


International Conference on Neural Information Processing | 1997

Recognition of rotated patterns using neocognitron

Shunji Satoh; Jousuke Kuroiwa; Hirotomo Aso; Shogo Miyake

Collaboration


Dive into Jousuke Kuroiwa's collaboration.

Top Co-Authors

Shunji Satoh

University of Electro-Communications
