Masafumi Oizumi
University of Tokyo
Publications
Featured research published by Masafumi Oizumi.
The Journal of Neuroscience | 2010
Masafumi Oizumi; Toshiyuki Ishii; Kazuya Ishibashi; Toshihiko Hosoya; Masato Okada
“How is information decoded in the brain?” is one of the most difficult and important questions in neuroscience. We have developed a general framework for investigating to what extent the decoding process in the brain can be simplified. First, we hierarchically constructed simplified probabilistic models of neural responses that ignore correlations higher than Kth order using the maximum entropy principle. We then computed how much information is lost when information is decoded using these simplified probabilistic models (i.e., “mismatched decoders”). To evaluate the information obtained by mismatched decoders, we introduced an information-theoretic quantity, I*, derived by extending the mutual information in terms of the communication rate across a channel. We showed that I* gives results consistent with the minimum mean-square error as well as with the mutual information, and demonstrated that a previously proposed measure quantifying the importance of correlations in decoding substantially deviates from I* when many cells are analyzed. We then applied the proposed framework to spike data from the vertebrate retina, using short natural scene movies of 100 ms duration as the set of stimuli, and computed the information contained in the neural activities. Although significant correlations were observed in the population activities of ganglion cells, the information loss was negligibly small even if all orders of correlation were ignored in decoding. We also found that if stationarity is inappropriately assumed over long durations in the information analysis of dynamically changing stimuli, such as natural scene movies, correlations appear to carry a large proportion of the total information regardless of their actual importance.
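As a toy illustration of the machinery described above (not the authors' code; the distributions and numbers are invented), the sketch below builds a joint response distribution of two binary neurons for two stimuli, constructs the first-order (independent) maximum-entropy approximation from the single-neuron marginals, and compares the mutual information under the true model with that under the mismatched independent model. In this deliberately extreme example the stimulus is encoded purely in the correlations, so the independent model carries no information at all.

```python
import numpy as np

def mutual_info(p_s, p_r_given_s):
    # I(S;R) = sum_s p(s) sum_r p(r|s) log2[ p(r|s) / p(r) ]
    p_r = p_s @ p_r_given_s            # marginal response distribution
    I = 0.0
    for s, ps in enumerate(p_s):
        for r, pr_s in enumerate(p_r_given_s[s]):
            if pr_s > 0:
                I += ps * pr_s * np.log2(pr_s / p_r[r])
    return I

# Two stimuli, two binary neurons -> 4 joint states (00, 01, 10, 11).
p_s = np.array([0.5, 0.5])
p_r_given_s = np.array([
    [0.40, 0.10, 0.10, 0.40],   # stimulus 0: positively correlated firing
    [0.10, 0.40, 0.40, 0.10],   # stimulus 1: negatively correlated firing
])

def independent_approx(joint4):
    # First-order maximum-entropy model: product of single-neuron marginals.
    p1 = joint4[2] + joint4[3]  # P(neuron 1 fires)
    p2 = joint4[1] + joint4[3]  # P(neuron 2 fires)
    return np.array([(1-p1)*(1-p2), (1-p1)*p2, p1*(1-p2), p1*p2])

p_ind = np.array([independent_approx(row) for row in p_r_given_s])

I_full = mutual_info(p_s, p_r_given_s)
I_ind = mutual_info(p_s, p_ind)
print(f"I(full) = {I_full:.3f} bits, I(independent model) = {I_ind:.3f} bits")
```

Note that this toy case is the opposite of the paper's retina result (where ignoring correlations lost almost no information); it is meant only to show what the simplified-model comparison measures.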
PLOS ONE | 2010
Ryota Satoh; Masafumi Oizumi; Hokto Kazama; Masato Okada
We examined the presence of maximum information preservation, which may be a fundamental principle of information transmission in all sensory modalities, in the Drosophila antennal lobe using an experimentally grounded network model and physiological data. Recent studies have shown a nonlinear firing rate transformation between olfactory receptor neurons (ORNs) and second-order projection neurons (PNs). As a result, PNs can use their dynamic range more uniformly than ORNs in response to a diverse set of odors. Although this firing rate transformation is thought to assist the decoder in discriminating between odors, there are no comprehensive, quantitatively supported studies examining this notion. Therefore, we quantitatively investigated the efficiency of this firing rate transformation from the viewpoint of information preservation by computing the mutual information between odor stimuli and PN responses in our network model. In the Drosophila olfactory system, all ORNs and PNs are divided into unique functional processing units called glomeruli. The nonlinear transformation between ORNs and PNs is formed by intraglomerular transformation and interglomerular interaction through local neurons (LNs). By exploring possible nonlinear transformations produced by these two factors in our network model, we found that mutual information is maximized when a weak ORN input is preferentially amplified within a glomerulus and the net LN input to each glomerulus is inhibitory. It is noteworthy that this is the very combination observed experimentally. Furthermore, the shape of the resultant nonlinear transformation is similar to that observed experimentally. These results imply that information related to odor stimuli is almost maximally preserved in the Drosophila olfactory circuit. We also discuss how intraglomerular transformation and interglomerular inhibition combine to maximize mutual information.
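A standard way to see why preferential amplification of weak inputs can preserve information is the infomax / histogram-equalization argument: for a scalar channel with low noise and a fixed output range, output entropy (and hence transmitted information) is maximized when the transfer function matches the cumulative distribution of the inputs. The sketch below is purely illustrative (it is not the paper's network model): it compares output entropies of a linear transfer function and a CDF-matched one on a skewed, mostly-weak input ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)  # skewed "ORN" inputs: mostly weak

def entropy_bits(samples, bins=32):
    counts, _ = np.histogram(samples, bins=bins, range=(0.0, 1.0))
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Linear transfer: rescale to [0, 1]; most outputs crowd near zero.
y_lin = x / x.max()

# CDF-matched transfer: strongly amplifies weak inputs, compresses strong
# ones, and makes the output approximately uniform over its range.
y_cdf = np.searchsorted(np.sort(x), x) / x.size

print(f"linear: {entropy_bits(y_lin):.2f} bits, CDF-matched: {entropy_bits(y_cdf):.2f} bits")
```

The CDF-matched output uses the 32 response bins nearly uniformly (close to the 5-bit maximum), while the linear mapping wastes most of the dynamic range, mirroring the ORN-to-PN transformation's effect described above.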
Journal of the Physical Society of Japan | 2010
Yasuhiko Igarashi; Masafumi Oizumi; Masato Okada
We investigated the effects of synaptic depression on the macroscopic behavior of stochastic neural networks. Dynamical mean-field equations were derived for such networks by averaging over two stochastic variables: a firing-state variable and a synaptic variable. In these equations, the average of their product decouples into the product of their averages because the two stochastic variables are independent. We proved the independence of these two stochastic variables under the assumption that the synaptic weight is of order 1/N with respect to the number of neurons N. Using these equations, we derived macroscopic steady-state equations for a network with uniform connections and for a ring attractor network with Mexican-hat-type connectivity, and investigated the stability of the steady-state solutions. An oscillatory uniform state was observed in the network with uniform connections due to a Hopf instability. In the ring network, high-frequency perturbations were shown not to affect system stability. Two mechanisms destabilize the inhomogeneous steady state, leading to two oscillatory states: a Turing instability leads to a rotating bump state, while a Hopf instability leads to an oscillatory bump state, which was previously unreported. Various oscillatory states thus arise in a network with synaptic depression, depending on the strength of the interneuron connections.
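Mean-field dynamics of this general type can be integrated directly. The sketch below is a minimal illustration, not the paper's equations: the equation forms and all parameter values are invented. It couples a mean firing rate m to a mean synaptic-resource (depression) variable x and integrates them with the Euler method.

```python
import numpy as np

def g(u):
    # Sigmoidal gain function for the mean firing rate.
    return 1.0 / (1.0 + np.exp(-u))

# Illustrative mean-field equations for a uniformly connected network:
#   dm/dt = -m + g(J * x * m + h)        (rate driven by depressed recurrent input)
#   dx/dt = (1 - x)/tau - U * x * m      (resource recovery vs. use-dependent depletion)
J, h, tau, U, dt = 8.0, -3.0, 5.0, 0.5, 0.01
m, x = 0.1, 1.0
traj = []
for _ in range(20_000):
    dm = -m + g(J * x * m + h)
    dx = (1.0 - x) / tau - U * x * m
    m += dt * dm
    x += dt * dx
    traj.append(m)
traj = np.array(traj)
print(f"late-time m range: [{traj[-5000:].min():.3f}, {traj[-5000:].max():.3f}]")
```

Depending on the connection strength and depression parameters, the steady state of such equations can lose stability (e.g., via a Hopf bifurcation), producing the oscillatory states discussed in the abstract; the parameters above merely demonstrate the integration scheme.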
Journal of the Physical Society of Japan | 2010
Yosuke Otsubo; Kenji Nagata; Masafumi Oizumi; Masato Okada
We investigated how the stability of macroscopic states in the associative memory model is affected by synaptic depression. We applied to this model the dynamical mean-field theory recently developed for stochastic neural network models with synaptic depression. By introducing a sublattice method, we derived macroscopic equations for the firing-state and depression variables. Using these macroscopic equations, we obtained the phase diagram as the strength of synaptic depression and the correlation level among stored patterns were varied. We found that there is an unstable region in which neither the memory state nor the mixed state can be stable, and that various switching phenomena occur in this region.
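The destabilizing effect of depression on retrieval can be seen in a much-reduced caricature. The sketch below is not the paper's sublattice equations: it tracks only a single macroscopic overlap m with one stored pattern and one mean resource variable x, with invented equations and parameters, and shows that strong depression collapses an otherwise stable memory state.

```python
import numpy as np

def retrieve(beta=2.0, tau=5.0, U=0.0, steps=4000, dt=0.01):
    """Overlap m with a stored pattern plus a mean synaptic-resource
    variable x (illustrative equations, not the paper's)."""
    m, x = 0.8, 1.0
    for _ in range(steps):
        r = 0.5 * (1.0 + m)               # mean activity of pattern-aligned neurons
        dm = -m + np.tanh(beta * x * m)   # overlap dynamics with depressed gain
        dx = (1.0 - x) / tau - U * x * r  # resource recovery vs. depletion
        m += dt * dm
        x += dt * dx
    return m

m_no_dep = retrieve(U=0.0)   # no depression: memory state persists
m_strong = retrieve(U=2.0)   # strong depression: effective gain drops below 1
print(f"overlap without depression: {m_no_dep:.3f}, with strong depression: {m_strong:.3f}")
```

With strong depletion the resource variable settles low enough that the effective gain beta*x falls below 1, so the retrieval fixed point disappears; in the full model this kind of destabilization underlies the switching phenomena mentioned above.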
Journal of the Physical Society of Japan | 2007
Masafumi Oizumi; Yoichi Miyawaki; Masato Okada
We propose a systematic method of rate reduction for a Hodgkin–Huxley type neural network model. In this context, Shriki et al. assumed that the threshold of the f–I curve for the reduced rate model depends linearly on the leak conductance of the Hodgkin–Huxley equation, while its gain remains constant. First, we show that the threshold and gain have a second-order dependence on the leak conductance. Second, we show that, based on this finding, the Hodgkin–Huxley type network with second-order interactions can be naturally reduced to an analog neural network model with higher-order interactions. Finally, we construct a statistical mechanics of the Hodgkin–Huxley type network with Mexican-hat interactions through our rate-reduction technique.
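The fitting step behind the second-order claim can be illustrated on synthetic data (the coefficients and noise level below are invented, not measurements): when the threshold genuinely has a quadratic dependence on the leak conductance, a linear fit of the kind assumed by Shriki et al. leaves systematic residuals that a second-order fit removes.

```python
import numpy as np

rng = np.random.default_rng(1)
gL = np.linspace(0.05, 0.5, 20)   # leak conductance values (arbitrary units)

# Hypothetical "measured" f-I thresholds with a genuine quadratic term:
I_th = 1.0 + 4.0 * gL + 6.0 * gL**2 + rng.normal(0.0, 0.01, gL.size)

lin_fit = np.polyfit(gL, I_th, 1)     # linear-threshold assumption
quad_fit = np.polyfit(gL, I_th, 2)    # second-order dependence

res_lin = np.sum((np.polyval(lin_fit, gL) - I_th) ** 2)
res_quad = np.sum((np.polyval(quad_fit, gL) - I_th) ** 2)
print(f"SSE linear fit: {res_lin:.4f}, SSE quadratic fit: {res_quad:.4f}")
```

The quadratic fit drives the residual down to the noise floor, which is the kind of evidence that motivates replacing the linear-threshold assumption with a second-order one.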
Frontiers in Computational Neuroscience | 2011
Masafumi Oizumi; Masato Okada; Shun-ichi Amari
We consider two types of causes leading to information loss as neural activities are transmitted and processed in the brain. One is that the responses of upstream neurons to stimuli are imperfectly observed by downstream neurons. The other is that downstream neurons non-optimally decode the stimulus information contained in the activities of upstream neurons. To investigate the importance of neural correlation in information processing in the brain, we specifically consider two situations. One is when neural responses are not simultaneously observed, i.e., information about neural correlations is lost; in this situation, stimulus information is decoded without any specific assumption about neural correlations. The other is when stimulus information is decoded with a wrong statistical model in which neural responses are assumed to be independent even when they are not. We provide an information-geometric interpretation of these two types of information loss and clarify their relationship. We then concretely evaluate these types of information loss in some simple examples. Finally, we discuss how these evaluations of information loss can be used to elucidate the importance of correlation in neural information processing.
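One geometric fact underlying this kind of analysis can be checked numerically (an illustrative sketch, not the paper's derivation): the point of the independent family closest, in KL divergence, to a correlated joint distribution is the product of its marginals (the m-projection), and the remaining gap equals the mutual information between the two neurons' responses.

```python
import numpy as np

def kl(p, q):
    # KL divergence in bits.
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# A correlated joint distribution p(r1, r2) of two binary neurons.
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])

# m-projection onto the independent submanifold = product of marginals.
q_star = np.outer(p.sum(axis=1), p.sum(axis=0))

# Sanity check the minimality claim against random product distributions.
rng = np.random.default_rng(0)
best_random = min(
    kl(p, np.outer([a, 1 - a], [b, 1 - b]))
    for a, b in rng.uniform(0.01, 0.99, size=(2000, 2))
)
print(f"KL to product of marginals: {kl(p, q_star):.4f} bits "
      f"(best random product tried: {best_random:.4f})")
```

Here the gap, about 0.278 bits, is exactly the mutual information between r1 and r2, i.e., the amount of structure an independence-assuming description discards.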
arXiv: Optimization and Control | 2018
Shun-ichi Amari; Ryo Karakida; Masafumi Oizumi
Two geometrical structures have been extensively studied for a manifold of probability distributions. One is based on the Fisher information metric, which is invariant under reversible transformations of random variables, while the other is based on the Wasserstein distance of optimal transportation, which reflects the structure of the distance between underlying random variables. Here, we propose a new information-geometrical theory that provides a unified framework connecting the Wasserstein distance and the Kullback–Leibler (KL) divergence. We primarily considered a discrete case consisting of n elements and studied the geometry of the probability simplex S_{n-1}.
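The contrast the abstract draws can be made concrete on the simplex S_{2} (three outcomes). The sketch below (illustrative only; the distributions are invented) computes the KL divergence, which depends only on probability values and so is unchanged when the outcomes of both distributions are relabeled, and the 1-Wasserstein distance, which depends on the ground metric between outcomes and therefore changes under the same relabeling.

```python
import numpy as np

def kl(p, q):
    return float(np.sum(p * np.log(p / q)))

def w1(p, q):
    # 1-Wasserstein distance on the ordered support {0,...,n-1} with
    # ground metric |i - j|: equals the L1 distance between the CDFs.
    return float(np.sum(np.abs(np.cumsum(p) - np.cumsum(q))))

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.2, 0.1, 0.7])
perm = [1, 0, 2]   # swap the labels of outcomes 0 and 1

# KL is invariant under relabeling both distributions the same way...
kl_orig, kl_perm = kl(p, q), kl(p[perm], q[perm])
# ...while W1 changes, because relabeling changes distances between outcomes.
w1_orig, w1_perm = w1(p, q), w1(p[perm], q[perm])
print(f"KL: {kl_orig:.4f} -> {kl_perm:.4f}, W1: {w1_orig:.4f} -> {w1_perm:.4f}")
```

This is the basic tension a unified framework must reconcile: one geometry sees only the probability values, the other also sees where the mass sits.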
Frontiers in Computational Neuroscience | 2012
Masafumi Oizumi; Ryota Satoh; Hokto Kazama; Masato Okada
Journal of the Physical Society of Japan | 2011
Yosuke Otsubo; Kenji Nagata; Masafumi Oizumi; Masato Okada
Journal of Physics: Conference Series | 2009
Yasuhiko Igarashi; Masafumi Oizumi; Yosuke Otsubo; Kenji Nagata; Masato Okada
Collaboration
Masafumi Oizumi's collaborating institutions include:
National Institute of Information and Communications Technology