
Publication


Featured research published by Joseph T. Lizier.


Journal of Computational Neuroscience | 2011

Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity

Joseph T. Lizier; Jakob Heinzle; Annette Horstmann; John-Dylan Haynes; Mikhail Prokopenko

The human brain undertakes highly sophisticated information processing facilitated by the interaction between its sub-regions. We present a novel method for interregional connectivity analysis, using multivariate extensions to the mutual information and transfer entropy. The method allows us to identify the underlying directed information structure between brain regions, and how that structure changes according to behavioral conditions. This method is distinguished in using asymmetric, multivariate, information-theoretical analysis, which captures not only directional and non-linear relationships, but also collective interactions. Importantly, the method is able to estimate multivariate information measures from relatively little data. We demonstrate the method by analyzing functional magnetic resonance imaging (fMRI) time series to establish the directed information structure between brain regions involved in a visuo-motor tracking task. Importantly, this results in a tiered structure, with known movement planning regions driving visual and motor control regions. We also examine the changes in this structure as the difficulty of the tracking task is increased. We find that task difficulty modulates the coupling strength between regions of a cortical network involved in movement planning and between motor cortex and the cerebellum, which is involved in the fine-tuning of motor control. These methods are likely to find utility in identifying interregional structure (and experimentally induced changes in this structure) in other cognitive tasks and data modalities.
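
The multivariate, directed measures used here can be illustrated with a minimal sketch. The snippet below estimates a conditional mutual information I(X;Y|Z) under a Gaussian assumption, from covariance log-determinants, which is one standard way such quantities are computed for fMRI-like data; the toy time series and variable names are illustrative assumptions, not the authors' analysis pipeline.

```python
import numpy as np

def gaussian_cmi(x, y, z):
    """Conditional mutual information I(X;Y|Z) in nats, assuming the variables
    are jointly Gaussian. x, y, z are (samples, dims) arrays."""
    def logdet(*parts):
        cov = np.atleast_2d(np.cov(np.hstack(parts), rowvar=False))
        return np.linalg.slogdet(cov)[1]
    # I(X;Y|Z) = 0.5 * (log|C_xz| + log|C_yz| - log|C_z| - log|C_xyz|)
    return 0.5 * (logdet(x, z) + logdet(y, z) - logdet(z) - logdet(x, y, z))

rng = np.random.default_rng(0)
n = 2000
z = rng.normal(size=(n, 2))                                    # "past of the target region"
y = rng.normal(size=(n, 2)) + 0.5 * z                          # "source region"
x = 0.8 * y[:, :1] + 0.3 * z[:, :1] + rng.normal(size=(n, 1))  # "next state of the target"

# A transfer-entropy-like quantity: information the source carries about the
# target's next state beyond what the target's own past already provides.
print("I(X;Y|Z) =", gaussian_cmi(x, y, z))
```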


PLOS ONE | 2013

Measuring Information-Transfer Delays

Michael Wibral; Nicolae Pampu; Viola Priesemann; Felix Siebenhühner; Hannes Seiwert; Michael Lindner; Joseph T. Lizier; Raul Vicente

In complex networks such as gene networks, traffic systems or brain circuits, it is important to understand how long it takes for the different parts of the network to effectively influence one another. In the brain, for example, axonal delays between brain areas can amount to several tens of milliseconds, adding an intrinsic component to any timing-based processing of information. Inferring neural interaction delays is thus needed to interpret the information transfer revealed by any analysis of directed interactions across brain structures. However, a robust estimation of interaction delays from neural activity faces several challenges if modeling assumptions on interaction mechanisms are wrong or cannot be made. Here, we propose a robust estimator for neuronal interaction delays rooted in an information-theoretic framework, which allows a model-free exploration of interactions. In particular, we extend transfer entropy to account for delayed source-target interactions, while crucially retaining the conditioning on the embedded target state at the immediately previous time step. We prove that this particular extension is indeed guaranteed to identify interaction delays between two coupled systems and is the only relevant option in keeping with Wiener’s principle of causality. We demonstrate the performance of our approach in detecting interaction delays on finite data using numerical simulations of stochastic and deterministic processes, as well as local field potential recordings. We also show the ability of the extended transfer entropy to detect the presence of multiple delays, as well as feedback loops. While evaluated on neuroscience data, we expect the estimator to be useful in other fields dealing with network dynamics.
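
A minimal sketch of the delay-scanning idea described above: estimate transfer entropy from the source at each candidate lag u to the target's next value, conditioning on the target's immediately preceding value, and report the lag that maximises it. The discrete plug-in estimator and the binary toy data below are assumptions for illustration, not the authors' estimator (which also handles embedded states and continuous signals).

```python
import numpy as np
from collections import Counter

def plugin_te(source, target, u):
    """Transfer entropy (bits) from source at lag u to the target's next value,
    conditioned on the target's previous value (discrete plug-in estimator)."""
    triples = Counter()
    for t in range(u, len(target) - 1):
        triples[(target[t + 1], target[t], source[t + 1 - u])] += 1
    n = sum(triples.values())
    pair_sy, pair_tt, single = Counter(), Counter(), Counter()
    for (x1, x0, y), c in triples.items():
        pair_sy[(x0, y)] += c
        pair_tt[(x1, x0)] += c
        single[x0] += c
    te = 0.0
    for (x1, x0, y), c in triples.items():
        p_cond_full = c / pair_sy[(x0, y)]            # p(x_{t+1} | x_t, y_{t+1-u})
        p_cond_self = pair_tt[(x1, x0)] / single[x0]  # p(x_{t+1} | x_t)
        te += (c / n) * np.log2(p_cond_full / p_cond_self)
    return te

rng = np.random.default_rng(1)
src = rng.integers(0, 2, 5000)
true_delay = 3
tgt = np.roll(src, true_delay)                 # target copies the source after 3 steps
tgt[:true_delay] = rng.integers(0, 2, true_delay)

scan = {u: plugin_te(src, tgt, u) for u in range(1, 7)}
print("TE by candidate delay:", scan)
print("estimated interaction delay:", max(scan, key=scan.get))
```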


Frontiers in Neuroinformatics | 2014

Local active information storage as a tool to understand distributed neural information processing

Michael Wibral; Joseph T. Lizier; Sebastian Vögler; Viola Priesemann; Ralf A. W. Galuske

Every act of information processing can in principle be decomposed into the component operations of information storage, transfer, and modification. Yet, while this is easily done for today's digital computers, the application of these concepts to neural information processing has been hampered by the lack of proper mathematical definitions of these operations on information. Recently, definitions were given for the dynamics of these information processing operations on a local scale in space and time in a distributed system, and the specific concept of local active information storage (LAIS) was successfully applied to the analysis and optimization of artificial neural systems. However, no attempt to measure the space-time dynamics of local active information storage in neural data has been made to date. Here we measure local active information storage on a local scale in time and space in voltage-sensitive dye imaging data from area 18 of the cat. We show that storage reflects neural properties such as stimulus preferences and surprise upon unexpected stimulus change, and in area 18 reflects the abstract concept of an ongoing stimulus despite the locally random nature of this stimulus. We suggest that LAIS will be a useful quantity to test theories of cortical function, such as predictive coding.
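
A minimal sketch of local active information storage (LAIS) on discretised data: at each time step, compare the probability of the next value given its recent past with its unconditioned probability. The binarisation threshold and the history length k below are arbitrary illustrative choices, not the settings used on the voltage-sensitive dye recordings.

```python
import numpy as np
from collections import Counter

def local_ais(x, k=2):
    """Local active information storage (bits) at each step of a discrete
    series x, with history length k (plug-in estimator)."""
    pasts = [tuple(x[t - k:t]) for t in range(k, len(x))]
    nexts = [x[t] for t in range(k, len(x))]
    n = len(nexts)
    c_joint = Counter(zip(pasts, nexts))
    c_past = Counter(pasts)
    c_next = Counter(nexts)
    return np.array([
        np.log2((c_joint[(p, v)] / c_past[p]) / (c_next[v] / n))
        for p, v in zip(pasts, nexts)
    ])

rng = np.random.default_rng(2)
trace = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.3 * rng.normal(size=4000)
binary = (trace > 0).astype(int)      # crude binarisation of an "imaging" trace

lais = local_ais(binary, k=3)
print("mean storage (bits):", lais.mean())
print("most surprising step (most negative LAIS):", lais.argmin() + 3)  # +3 restores the history offset
```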


Information Sciences | 2012

Local measures of information storage in complex distributed computation

Joseph T. Lizier; Mikhail Prokopenko; Albert Y. Zomaya

Information storage is a key component of intrinsic distributed computation. Despite the existence of appropriate measures for it (e.g. excess entropy), its role in interacting with information transfer and modification to give rise to distributed computation is not yet well-established. We explore how to quantify information storage on a local scale in space and time, so as to understand its role in the dynamics of distributed computation. To assist these explorations, we introduce the active information storage, which quantifies the information storage component that is directly in use in the computation of the next state of a process. We present the first profiles of local excess entropy and local active information storage in cellular automata, providing evidence that blinkers and background domains are dominant information storage processes in these systems. This application also demonstrates the manner in which these two measures of information storage are distinct but complementary. It also reveals other information storage phenomena, including the misinformative nature of local storage when information transfer dominates the computation, and demonstrates that the local entropy rate is a useful spatiotemporal filter for information transfer structure.
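
To illustrate the kind of spatiotemporal profiling described here, the sketch below evolves elementary cellular automaton rule 110 and computes local active information storage at every cell and time step; in line with the paper's observations, strongly positive values tend to sit in regular domains and blinkers. The rule, lattice size and history length are illustrative assumptions.

```python
import numpy as np
from collections import Counter

def run_eca(rule, width=200, steps=600, seed=3):
    """Evolve an elementary cellular automaton with periodic boundaries."""
    table = [(rule >> i) & 1 for i in range(8)]
    rng = np.random.default_rng(seed)
    ca = np.zeros((steps, width), dtype=int)
    ca[0] = rng.integers(0, 2, width)
    for t in range(1, steps):
        left, mid, right = np.roll(ca[t - 1], 1), ca[t - 1], np.roll(ca[t - 1], -1)
        ca[t] = [table[4 * l + 2 * m + r] for l, m, r in zip(left, mid, right)]
    return ca

def local_ais_field(ca, k=4):
    """Local active information storage (bits) per cell and time step,
    pooling the probability estimates over all cells (plug-in estimator)."""
    steps, width = ca.shape
    obs = [(tuple(ca[t - k:t, i]), ca[t, i])
           for i in range(width) for t in range(k, steps)]
    n = len(obs)
    c_joint = Counter(obs)
    c_past = Counter(p for p, _ in obs)
    c_next = Counter(v for _, v in obs)
    ais = np.zeros((steps - k, width))
    idx = 0
    for i in range(width):
        for t in range(k, steps):
            p, v = obs[idx]
            ais[t - k, i] = np.log2((c_joint[(p, v)] / c_past[p]) / (c_next[v] / n))
            idx += 1
    return ais

ca = run_eca(rule=110)
ais = local_ais_field(ca)
print("average active information storage (bits):", ais.mean())
print("fraction of cells with strongly positive storage:", (ais > 1).mean())
```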


Chaos | 2010

Information modification and particle collisions in distributed computation

Joseph T. Lizier; Mikhail Prokopenko; Albert Y. Zomaya

Distributed computation can be described in terms of the fundamental operations of information storage, transfer, and modification. To describe the dynamics of information in computation, we need to quantify these operations on a local scale in space and time. In this paper we extend previous work regarding the local quantification of information storage and transfer, to explore how information modification can be quantified at each spatiotemporal point in a system. We introduce the separable information, a measure which locally identifies information modification events where separate inspection of the sources to a computation is misleading about its outcome. We apply this measure to cellular automata, where it is shown to be the first direct quantitative measure to provide evidence for the long-held conjecture that collisions between emergent particles therein are the dominant information modification events.
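
A sketch of the separable-information idea: add the local active information storage to the local (apparent) transfer entropy from each source, and flag time steps where the sum is negative, i.e. where inspecting the sources separately is misleading about the outcome. The toy system below (a target computing the XOR of two sparse random sources) is an invented illustration, not the cellular-automaton analysis of the paper.

```python
import numpy as np
from collections import Counter

def local_separable_information(target, sources, k=1):
    """Local active information storage plus local apparent transfer entropy
    from each source, per time step, in bits (plug-in estimators)."""
    T = len(target)
    past = [tuple(target[t - k:t]) for t in range(k, T)]
    nxt = [target[t] for t in range(k, T)]
    n = len(nxt)
    c_next, c_past, c_pn = Counter(nxt), Counter(past), Counter(zip(past, nxt))
    sep = np.array([np.log2((c_pn[(p, v)] / c_past[p]) / (c_next[v] / n))
                    for p, v in zip(past, nxt)])       # local AIS
    for src in sources:
        hist = [src[t - 1] for t in range(k, T)]
        c_ps, c_psn = Counter(zip(past, hist)), Counter(zip(past, hist, nxt))
        sep += np.array([
            np.log2((c_psn[(p, s, v)] / c_ps[(p, s)]) / (c_pn[(p, v)] / c_past[p]))
            for p, s, v in zip(past, hist, nxt)
        ])                                             # plus local apparent TE
    return sep

rng = np.random.default_rng(4)
n_steps = 20000
y1 = (rng.random(n_steps) < 0.1).astype(int)   # sparse source 1
y2 = (rng.random(n_steps) < 0.1).astype(int)   # sparse source 2
x = np.zeros(n_steps, dtype=int)
x[1:] = y1[:-1] ^ y2[:-1]                      # target = XOR of the previous source values

sep = local_separable_information(x, [y1, y2], k=1)
events = sep < 0                               # candidate information-modification events
print("fraction of steps flagged:", events.mean())  # roughly the rate of source coincidences
```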


Theory in Biosciences | 2012

Information processing in echo state networks at the edge of chaos

Joschka Boedecker; Oliver Obst; Joseph T. Lizier; N. Michael Mayer; Minoru Asada

We investigate information processing in randomly connected recurrent neural networks. It has been shown previously that the computational capabilities of these networks are maximized when the recurrent layer is close to the border between a stable and an unstable dynamical regime, the so-called edge of chaos. The reasons, however, for this maximized performance are not completely understood. We adopt an information-theoretical framework and are, for the first time, able to quantify the computational capabilities between elements of these networks directly as they undergo the phase transition to chaos. Specifically, we present evidence that both information transfer and storage in the recurrent layer are maximized close to this phase transition, providing an explanation for why guiding the recurrent layer toward the edge of chaos is computationally useful. As a consequence, our study suggests self-organized ways of improving performance in recurrent neural networks, driven by input data. Moreover, the networks we study share important features with biological systems, such as feedback connections and online computation on input streams. A key example is the cerebral cortex, which has been shown to also operate close to the edge of chaos. Consequently, the behavior of model systems as studied here is likely to shed light on the reasons why biological systems are tuned into this specific regime.
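
The "edge of chaos" in such reservoirs is commonly approached by rescaling the recurrent weight matrix to a spectral radius near 1. The sketch below builds a small random reservoir and probes its stability by tracking the divergence of two nearby state trajectories (a crude Lyapunov-style indicator rather than the information-theoretic measures of the paper); all parameters are illustrative assumptions.

```python
import numpy as np

def make_reservoir(n_units, spectral_radius, density=0.1, seed=5):
    """Sparse random recurrent weight matrix rescaled to a target spectral radius."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(n_units, n_units)) * (rng.random((n_units, n_units)) < density)
    return W * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W))))

def divergence_rate(W, input_scale=0.5, steps=400, eps=1e-8, seed=6):
    """Average log growth rate of a tiny state perturbation while the reservoir
    is driven by random input; > 0 suggests chaotic, < 0 ordered dynamics."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    w_in = rng.normal(size=n) * input_scale
    x = np.zeros(n)
    x_pert = x + eps * rng.normal(size=n)
    rates = []
    for _ in range(steps):
        u = rng.normal()                        # the same input drives both trajectories
        x = np.tanh(W @ x + w_in * u)
        x_pert = np.tanh(W @ x_pert + w_in * u)
        d = np.linalg.norm(x_pert - x)
        rates.append(np.log(d / eps))
        x_pert = x + eps * (x_pert - x) / d     # renormalise the perturbation
    return float(np.mean(rates))

for rho in (0.5, 0.9, 1.0, 1.2, 1.5):
    print(f"spectral radius {rho}: divergence rate {divergence_rate(make_reservoir(200, rho)):+.3f}")
```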


Frontiers in Robotics and AI | 2014

JIDT: an information-theoretic toolkit for studying the dynamics of complex systems

Joseph T. Lizier

Complex systems are increasingly being viewed as distributed information processing systems, particularly in the domains of computational neuroscience, bioinformatics and Artificial Life. This trend has resulted in a strong uptake in the use of (Shannon) information-theoretic measures to analyse the dynamics of complex systems in these fields. We introduce the Java Information Dynamics Toolkit (JIDT): a Google Code project which provides a standalone, GNU GPL v3 licensed, open-source implementation for empirical estimation of information-theoretic measures from time-series data. While the toolkit provides classic information-theoretic measures (e.g. entropy, mutual information, conditional mutual information), it ultimately focusses on implementing higher-level measures for information dynamics. That is, JIDT focusses on quantifying information storage, transfer and modification, and the dynamics of these operations in space and time. For this purpose, it includes implementations of the transfer entropy and active information storage, their multivariate extensions and local or pointwise variants. JIDT provides implementations for both discrete and continuous-valued data for each measure, including various types of estimator for continuous data (e.g. Gaussian, box-kernel and Kraskov-Stoegbauer-Grassberger), which can be swapped at run-time thanks to Java's object-oriented polymorphism. Furthermore, while written in Java, the toolkit can be used directly in MATLAB, GNU Octave, Python and other environments. We present the principles behind the code design, and provide several examples to guide users.
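
A typical way to drive JIDT from Python is through a Java bridge such as JPype, pointing it at the infodynamics.jar shipped with the toolkit. The class and method names below follow the toolkit's published demos as best I recall them and should be checked against the JIDT documentation; the jar path and the toy coupled time series are assumptions.

```python
# Assumes JPype (pip install JPype1) and a local copy of JIDT's infodynamics.jar.
import numpy as np
import jpype

JAR = "/path/to/infodynamics.jar"   # adjust to your JIDT installation
jpype.startJVM(jpype.getDefaultJVMPath(), "-Djava.class.path=" + JAR)

# Kraskov (KSG) estimator of transfer entropy for continuous-valued data.
TeCalc = jpype.JClass(
    "infodynamics.measures.continuous.kraskov.TransferEntropyCalculatorKraskov")
calc = TeCalc()
calc.setProperty("k", "4")          # nearest-neighbour count for the KSG estimator
calc.initialise(1)                  # target history length of 1

rng = np.random.default_rng(7)
source = rng.normal(size=2000)
target = np.empty_like(source)
target[0] = rng.normal()
target[1:] = 0.6 * source[:-1] + 0.4 * rng.normal(size=1999)  # lag-1 coupling

calc.setObservations(jpype.JArray(jpype.JDouble)(source.tolist()),
                     jpype.JArray(jpype.JDouble)(target.tolist()))
print("Transfer entropy (nats):", calc.computeAverageLocalOfObservations())

jpype.shutdownJVM()
```

Because JIDT's calculators for a given measure share a common interface, the Kraskov class above could in principle be swapped for a Gaussian or discrete estimator with essentially the same surrounding code, which is the run-time polymorphism the abstract refers to.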


Artificial Life | 2011

Information dynamics in small-world Boolean networks

Joseph T. Lizier; Siddharth Pritam; Mikhail Prokopenko

Small-world networks have been one of the most influential concepts in complex systems science, partly due to their prevalence in naturally occurring networks. It is often suggested that this prevalence is due to an inherent capability to store and transfer information efficiently. We perform an ensemble investigation of the computational capabilities of small-world networks as compared to ordered and random topologies. To generate dynamic behavior for this experiment, we imbue the nodes in these networks with random Boolean functions. We find that the ordered phase of the dynamics (low activity in dynamics) and topologies with low randomness are dominated by information storage, while the chaotic phase (high activity in dynamics) and topologies with high randomness are dominated by information transfer. Information storage and information transfer are somewhat balanced (crossed over) near the small-world regime, providing quantitative evidence that small-world networks do indeed have a propensity to combine comparably large information storage and transfer capacity.
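
The experimental setup can be sketched as a random Boolean network on a Watts-Strogatz-style ring: build a regular ring lattice, rewire each input edge with probability p, give every node a random Boolean lookup table, and iterate. The sketch below only generates such dynamics (measuring storage and transfer over them would proceed as in the other examples); all sizes and probabilities are illustrative assumptions.

```python
import numpy as np

def small_world_rbn(n_nodes=100, k_in=4, p_rewire=0.1, steps=200, seed=8):
    """Random Boolean network on a ring with Watts-Strogatz-style rewiring;
    returns the (steps, n_nodes) array of node states."""
    rng = np.random.default_rng(seed)
    # Regular ring lattice: each node listens to its k_in nearest neighbours.
    offsets = [o for o in range(-(k_in // 2), k_in // 2 + 1) if o != 0]
    inputs = [[(i + o) % n_nodes for o in offsets] for i in range(n_nodes)]
    # Rewire each input edge to a random node with probability p_rewire.
    for i in range(n_nodes):
        for j in range(len(inputs[i])):
            if rng.random() < p_rewire:
                inputs[i][j] = int(rng.integers(n_nodes))
    # Random Boolean function per node: a lookup table over its 2^k_in input patterns.
    tables = rng.integers(0, 2, size=(n_nodes, 2 ** k_in))
    state = rng.integers(0, 2, n_nodes)
    history = [state.copy()]
    for _ in range(steps - 1):
        patterns = [int("".join(str(b) for b in state[inp]), 2) for inp in inputs]
        state = tables[np.arange(n_nodes), patterns]
        history.append(state.copy())
    return np.array(history)

for p in (0.0, 0.1, 1.0):            # ordered ring, small-world, fully random rewiring
    states = small_world_rbn(p_rewire=p)
    print(f"p_rewire={p}: mean activity {states.mean():.2f}")
```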


IEEE Photonics Technology Letters | 2001

Splice losses in holey optical fibers

Joseph T. Lizier; G.E. Town

Splice losses between standard step-index fiber and holey optical fibers were calculated for a range of fiber parameters and wavelengths using finite-difference time-domain simulations. The optimal holey fiber parameters for minimum splice loss were determined. It was found that the optimal parameters could also be predicted using analytical approximations incorporating the effective index model.
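
For roughly Gaussian modes, splice loss is often approximated by the textbook mode-field mismatch formula sketched below; this is not the finite-difference time-domain calculation or the effective index model used in the paper, and the mode-field radii are invented for illustration.

```python
import numpy as np

def gaussian_splice_loss_db(w1, w2):
    """Splice loss (dB) between two fibres with Gaussian mode-field radii w1 and w2,
    assuming perfect alignment: loss = -20 * log10(2*w1*w2 / (w1**2 + w2**2))."""
    return -20 * np.log10(2 * w1 * w2 / (w1**2 + w2**2))

# Hypothetical mode-field radii (micrometres): a standard step-index fibre versus
# holey fibres whose effective mode size depends on hole diameter and pitch.
w_smf = 5.2
for w_holey in (2.0, 3.5, 5.2, 7.0):
    loss = gaussian_splice_loss_db(w_smf, w_holey)
    print(f"holey-fibre mode radius {w_holey} um: splice loss {loss:.2f} dB")
```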


PLOS ONE | 2012

Quantifying and Tracing Information Cascades in Swarms

X. Rosalind Wang; Jennifer M. Miller; Joseph T. Lizier; Mikhail Prokopenko; Louis F. Rossi

We propose a novel, information-theoretic, characterisation of cascades within the spatiotemporal dynamics of swarms, explicitly measuring the extent of collective communications. This is complemented by dynamic tracing of collective memory, as another element of distributed computation, which represents capacity for swarm coherence. The approach deals with both global and local information dynamics, ultimately discovering diverse ways in which an individual’s spatial position is related to its information processing role. It also allows us to contrast cascades that propagate conflicting information with waves of coordinated motion. Most importantly, our simulation experiments provide the first direct information-theoretic evidence (verified in a simulation setting) for the long-held conjecture that the information cascades occur in waves rippling through the swarm. Our experiments also exemplify how features of swarm dynamics, such as cascades’ wavefronts, can be filtered and predicted. We observed that maximal information transfer tends to follow the stage with maximal collective memory, and principles like this may be generalised in wider biological and social contexts.
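
One way to operationalise the "collective communication" measured here is local transfer entropy from a neighbour's past heading change to an agent's next heading change, with positive local values tracing a cascade. The two-agent toy model and the discretisation below are illustrative assumptions, not the swarm model analysed in the paper.

```python
import numpy as np
from collections import Counter

def local_te(source, target, k=1):
    """Local (pointwise) transfer entropy, in bits, from source to target at each
    time step, conditioning on k steps of target history (plug-in estimator)."""
    T = len(target)
    past = [tuple(target[t - k:t]) for t in range(k, T)]
    hist = [source[t - 1] for t in range(k, T)]
    nxt = [target[t] for t in range(k, T)]
    c_pn, c_p = Counter(zip(past, nxt)), Counter(past)
    c_psn, c_ps = Counter(zip(past, hist, nxt)), Counter(zip(past, hist))
    return np.array([
        np.log2((c_psn[(p, s, v)] / c_ps[(p, s)]) / (c_pn[(p, v)] / c_p[p]))
        for p, s, v in zip(past, hist, nxt)
    ])

rng = np.random.default_rng(9)
n = 5000
leader_turn = rng.normal(size=n)                 # a "leader" agent's heading change per step
follower_turn = np.zeros(n)
# The follower tends to repeat its neighbour's previous turn (alignment) plus noise.
follower_turn[1:] = 0.7 * leader_turn[:-1] + 0.5 * rng.normal(size=n - 1)

discretise = lambda turns: np.digitize(turns, [-0.5, 0.5])   # left / straight / right
te_local = local_te(discretise(leader_turn), discretise(follower_turn), k=1)
print("mean transfer entropy (bits):", te_local.mean())
print("strongly informative steps (local TE > 1 bit):", int((te_local > 1).sum()))
```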

Collaboration


Joseph T. Lizier's top co-authors and their affiliations.

Top Co-Authors

X. Rosalind Wang (Commonwealth Scientific and Industrial Research Organisation)

Michael Wibral (Goethe University Frankfurt)

Oliver Obst (Commonwealth Scientific and Industrial Research Organisation)

Patricia Wollstadt (Goethe University Frankfurt)