Publications


Featured research published by Mikhail Prokopenko.


Journal of Computational Neuroscience | 2011

Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity

Joseph T. Lizier; Jakob Heinzle; Annette Horstmann; John-Dylan Haynes; Mikhail Prokopenko

The human brain undertakes highly sophisticated information processing facilitated by the interaction between its sub-regions. We present a novel method for interregional connectivity analysis, using multivariate extensions to the mutual information and transfer entropy. The method allows us to identify the underlying directed information structure between brain regions, and how that structure changes according to behavioral conditions. This method is distinguished by its asymmetric, multivariate, information-theoretic analysis, which captures not only directional and non-linear relationships but also collective interactions. Importantly, the method can estimate multivariate information measures from relatively little data. We demonstrate the method by analyzing functional magnetic resonance imaging time series to establish the directed information structure between brain regions involved in a visuo-motor tracking task. This reveals a tiered structure, with known movement planning regions driving visual and motor control regions. We also examine the changes in this structure as the difficulty of the tracking task is increased, and find that task difficulty modulates the coupling strength between regions of a cortical network involved in movement planning, and between motor cortex and the cerebellum, which is involved in the fine-tuning of motor control. These methods are likely to find utility in identifying interregional structure (and experimentally induced changes in this structure) in other cognitive tasks and data modalities.
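For readers unfamiliar with the underlying quantity, a minimal plug-in estimator of transfer entropy for discrete time series can be sketched as follows. This is an illustration only, not the authors' estimator (the paper works with continuous fMRI data and multivariate extensions); the function name and toy data are ours.

```python
import numpy as np
from collections import Counter

def transfer_entropy(source, target, k=1):
    """Plug-in estimate (in bits) of transfer entropy from a discrete
    source series to a discrete target series, with target history k."""
    n = len(target)
    # One observation per step: (next target value, target past, source past)
    obs = [(target[t], tuple(target[t - k:t]), source[t - 1])
           for t in range(k, n)]
    joint = Counter(obs)
    past_src = Counter((p, s) for _, p, s in obs)
    past_next = Counter((x, p) for x, p, _ in obs)
    past = Counter(p for _, p, _ in obs)
    total = len(obs)
    te = 0.0
    for (x, p, s), c in joint.items():
        # log-ratio of p(x | past, source) to p(x | past)
        num = c / past_src[(p, s)]
        den = past_next[(x, p)] / past[p]
        te += (c / total) * np.log2(num / den)
    return te

# Toy check: the target copies the source with a one-step delay, so the
# source should transfer close to 1 bit per step, while the reverse
# direction should be near zero.
rng = np.random.default_rng(0)
src = rng.integers(0, 2, 10000)
tgt = np.roll(src, 1)
print(round(transfer_entropy(src, tgt, k=1), 2))
```

Plug-in counting like this only suits discrete, well-sampled data; continuous signals such as fMRI require kernel or nearest-neighbour estimators.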


Information Sciences | 2012

Local measures of information storage in complex distributed computation

Joseph T. Lizier; Mikhail Prokopenko; Albert Y. Zomaya

Information storage is a key component of intrinsic distributed computation. Despite the existence of appropriate measures for it (e.g. excess entropy), its role in interacting with information transfer and modification to give rise to distributed computation is not yet well-established. We explore how to quantify information storage on a local scale in space and time, so as to understand its role in the dynamics of distributed computation. To assist these explorations, we introduce the active information storage, which quantifies the information storage component that is directly in use in the computation of the next state of a process. We present the first profiles of local excess entropy and local active information storage in cellular automata, providing evidence that blinkers and background domains are dominant information storage processes in these systems. This application also demonstrates the manner in which these two measures of information storage are distinct but complementary. It also reveals other information storage phenomena, including the misinformative nature of local storage when information transfer dominates the computation, and demonstrates that the local entropy rate is a useful spatiotemporal filter for information transfer structure.
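The active information storage introduced above has a simple local form: a(t) = log2 p(x_t | k past values) - log2 p(x_t). A minimal plug-in sketch for a discrete series follows; it is our illustration, not the paper's cellular-automata implementation.

```python
import numpy as np
from collections import Counter

def local_active_info_storage(x, k=2):
    """Local active information storage (bits) at each step t >= k:
    a(t) = log2 p(x_t | k past values) - log2 p(x_t), via plug-in counts."""
    pairs = [(x[t], tuple(x[t - k:t])) for t in range(k, len(x))]
    joint = Counter(pairs)
    past = Counter(p for _, p in pairs)
    nxt = Counter(v for v, _ in pairs)
    total = len(pairs)
    return np.array([np.log2((joint[(v, p)] / past[p]) / (nxt[v] / total))
                     for v, p in pairs])

# A period-2 "blinker" is perfectly predictable from its past, so its
# local storage sits at 1 bit per step (the entropy of the next value).
blinker = np.array([0, 1] * 500)
a = local_active_info_storage(blinker, k=2)
print(round(float(a.mean()), 2))
```

Negative local values are possible and meaningful: they mark the "misinformative" storage events the abstract describes, where the past misleads about the next state.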


Chaos | 2010

Information modification and particle collisions in distributed computation

Joseph T. Lizier; Mikhail Prokopenko; Albert Y. Zomaya

Distributed computation can be described in terms of the fundamental operations of information storage, transfer, and modification. To describe the dynamics of information in computation, we need to quantify these operations on a local scale in space and time. In this paper we extend previous work regarding the local quantification of information storage and transfer, to explore how information modification can be quantified at each spatiotemporal point in a system. We introduce the separable information, a measure which locally identifies information modification events where separate inspection of the sources to a computation is misleading about its outcome. We apply this measure to cellular automata, where it is shown to be the first direct quantitative measure to provide evidence for the long-held conjecture that collisions between emergent particles therein are the dominant information modification events.


IEEE/ACM Transactions on Computational Biology and Bioinformatics | 2012

Assortative Mixing in Directed Biological Networks

Mahendra Piraveenan; Mikhail Prokopenko; Albert Y. Zomaya

We analyze assortative mixing patterns of biological networks which are typically directed. We develop a theoretical background for analyzing mixing patterns in directed networks before applying them to specific biological networks. Two new quantities are introduced, namely the in-assortativity and the out-assortativity, which are shown to be useful in quantifying assortative mixing in directed networks. We also introduce the local (node level) assortativity quantities for in- and out-assortativity. Local assortativity profiles are the distributions of these local quantities over node degrees and can be used to analyze both canonical and real-world directed biological networks. Many biological networks, which have been previously classified as disassortative, are shown to be assortative with respect to these new measures. Finally, we demonstrate the use of local assortativity profiles in analyzing the functionalities of particular nodes and groups of nodes in real-world biological networks.
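A simplified, edge-wise reading of these quantities is the Pearson correlation, across directed edges, between the chosen degree of an edge's source and of its target. The sketch below is our illustration and may differ in normalisation from the paper's exact definitions; the toy network is ours.

```python
import numpy as np

def degree_assortativity(edges, n, mode="in"):
    """Pearson correlation, over directed edges, between the chosen
    degree ('in' or 'out') of each edge's source and target node.
    A simplified edge-wise reading of in-/out-assortativity."""
    indeg = np.zeros(n)
    outdeg = np.zeros(n)
    for u, v in edges:
        outdeg[u] += 1
        indeg[v] += 1
    deg = indeg if mode == "in" else outdeg
    src = np.array([deg[u] for u, v in edges])
    dst = np.array([deg[v] for u, v in edges])
    return np.corrcoef(src, dst)[0, 1]

# Toy directed network: a hub (node 0) exchanging links with low-degree
# leaves, which this measure scores as perfectly disassortative.
edges = [(0, i) for i in range(1, 6)] + [(i, 0) for i in range(1, 6)]
print(round(degree_assortativity(edges, 6, mode="in"), 2))
```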


Artificial Life | 2011

Information dynamics in small-world boolean networks

Joseph T. Lizier; Siddharth Pritam; Mikhail Prokopenko

Small-world networks have been one of the most influential concepts in complex systems science, partly due to their prevalence in naturally occurring networks. It is often suggested that this prevalence is due to an inherent capability to store and transfer information efficiently. We perform an ensemble investigation of the computational capabilities of small-world networks as compared to ordered and random topologies. To generate dynamic behavior for this experiment, we imbue the nodes in these networks with random Boolean functions. We find that the ordered phase of the dynamics (low activity in dynamics) and topologies with low randomness are dominated by information storage, while the chaotic phase (high activity in dynamics) and topologies with high randomness are dominated by information transfer. Information storage and information transfer are somewhat balanced (crossed over) near the small-world regime, providing quantitative evidence that small-world networks do indeed have a propensity to combine comparably large information storage and transfer capacity.
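A random Boolean network over a small-world topology, of the kind investigated here, can be set up in a few lines. This sketch is ours (the parameters and the loosely Watts-Strogatz-style rewiring scheme are assumptions); it only generates and runs the dynamics, without computing the information measures.

```python
import numpy as np

rng = np.random.default_rng(1)

def small_world_rbn(n=50, k=4, p_rewire=0.1):
    """Ring lattice of n nodes, each reading k neighbours, with each link
    rewired to a random node with probability p_rewire (self-links and
    duplicates allowed for simplicity); every node gets a random Boolean
    lookup table over its k inputs."""
    inputs = []
    for i in range(n):
        nbrs = [(i + off) % n for off in range(1, k + 1)]
        nbrs = [rng.integers(n) if rng.random() < p_rewire else j
                for j in nbrs]
        inputs.append(nbrs)
    tables = rng.integers(0, 2, size=(n, 2 ** k))
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update: each node applies its Boolean function to
    the current states of its k inputs."""
    new = np.empty_like(state)
    for i, nbrs in enumerate(inputs):
        idx = 0
        for j in nbrs:
            idx = (idx << 1) | state[j]
        new[i] = tables[i, idx]
    return new

inputs, tables = small_world_rbn()
state = rng.integers(0, 2, 50)
for _ in range(20):
    state = step(state, inputs, tables)
print(state.sum())  # number of active nodes after 20 steps
```

Sweeping p_rewire from 0 toward 1 moves the topology from ordered through small-world to random, which is the axis along which the storage/transfer crossover above is observed.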


PLOS ONE | 2012

Quantifying and Tracing Information Cascades in Swarms

X. Rosalind Wang; Jennifer M. Miller; Joseph T. Lizier; Mikhail Prokopenko; Louis F. Rossi

We propose a novel, information-theoretic characterisation of cascades within the spatiotemporal dynamics of swarms, explicitly measuring the extent of collective communications. This is complemented by dynamic tracing of collective memory, another element of distributed computation, which represents the capacity for swarm coherence. The approach deals with both global and local information dynamics, ultimately discovering diverse ways in which an individual's spatial position is related to its information processing role. It also allows us to contrast cascades that propagate conflicting information with waves of coordinated motion. Most importantly, our simulation experiments provide the first direct information-theoretic evidence for the long-held conjecture that information cascades occur in waves rippling through the swarm. Our experiments also exemplify how features of swarm dynamics, such as cascades' wavefronts, can be filtered and predicted. We observed that maximal information transfer tends to follow the stage with maximal collective memory, and principles like this may generalise to wider biological and social contexts.


International Conference on Embedded Wireless Systems and Networks | 2008

Spatiotemporal anomaly detection in gas monitoring sensor networks

X. Rosalind Wang; Joseph T. Lizier; Oliver Obst; Mikhail Prokopenko; Peter Wang

In this paper, we use Bayesian networks as a means for unsupervised learning and anomaly (event) detection in gas monitoring sensor networks for underground coal mines. We show that the Bayesian network model can learn cyclical baselines for gas concentrations, thus reducing the false alarms usually caused by flatline thresholds. Further, we show that the system can learn dependencies between changes of concentration in different gases and at multiple locations. We define and identify new types of events that can occur in a sensor network. In particular, we analyse joint events in a group of sensors based on learning the Bayesian model of the system, contrasting these events with merely aggregating single events. We demonstrate that anomalous events in individual gas data might be explained if considered jointly with the changes in other gases. Vice versa, a network-wide spatiotemporal anomaly may be detected even if individual sensor readings are within their thresholds. The presented Bayesian approach to spatiotemporal anomaly detection is applicable to a wide range of sensor networks.
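The cyclical-baseline idea can be illustrated without a full Bayesian network. The sketch below is our simplification, substituting a per-phase mean/std baseline for the paper's Bayesian model, on synthetic data of our own: a reading that is far from its hour's baseline is caught even though it never crosses a flat global threshold.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic gas-concentration series with a daily cycle (24 readings/day)
# plus noise, and one injected anomaly that stays under the global maximum.
days, period = 30, 24
t = np.arange(days * period)
baseline = 5.0 + 2.0 * np.sin(2 * np.pi * t / period)
series = baseline + rng.normal(0, 0.2, t.size)
series[700] = 5.0   # anomalously low for its hour, yet unremarkable overall

# Learn an hourly baseline: mean and std per phase of the cycle.
phase = t % period
mu = np.array([series[phase == h].mean() for h in range(period)])
sd = np.array([series[phase == h].std() for h in range(period)])

# Flag readings far from their hour's baseline.
z = np.abs(series - mu[phase]) / sd[phase]
flat_alarm = series > series.max() * 0.99   # flatline threshold misses it
cyclic_alarm = z > 3                        # cyclical baseline catches it
print(bool(flat_alarm[700]), bool(cyclic_alarm[700]))
```

A Bayesian network additionally conditions each gas on the other gases and locations, which is what lets the paper explain away some single-sensor anomalies and detect joint ones.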


PLOS ONE | 2013

Percolation Centrality: Quantifying Graph-Theoretic Impact of Nodes during Percolation in Networks

Mahendra Piraveenan; Mikhail Prokopenko; Liaquat Hossain

A number of centrality measures are available to determine the relative importance of a node in a complex network, and betweenness is prominent among them. However, the existing centrality measures are not adequate in network percolation scenarios (such as during infection transmission in a social network of individuals, spreading of computer viruses on computer networks, or transmission of disease over a network of towns) because they do not account for the changing percolation states of individual nodes. We propose a new measure, percolation centrality, that quantifies relative impact of nodes based on their topological connectivity, as well as their percolation states. The measure can be extended to include random walk based definitions, and its computational complexity is shown to be of the same order as that of betweenness centrality. We demonstrate the usage of percolation centrality by applying it to a canonical network as well as simulated and real world scale-free and random networks.
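On small unweighted graphs, percolation centrality can be computed by brute force over all source/target pairs. The sketch below is our illustration of our reading of the definition (the toy graph and percolation states are ours): PC(v) averages, over pairs (s, r), the fraction of s-r shortest paths through v, weighted by the source's percolation state.

```python
import numpy as np
from collections import deque

def bfs_counts(adj, s):
    """Distances and shortest-path counts from source s (unweighted)."""
    n = len(adj)
    dist = [np.inf] * n
    sigma = [0] * n
    dist[s], sigma[s] = 0, 1
    q = deque([s])
    while q:
        u = q.popleft()
        for w in adj[u]:
            if dist[w] == np.inf:
                dist[w] = dist[u] + 1
                q.append(w)
            if dist[w] == dist[u] + 1:
                sigma[w] += sigma[u]
    return dist, sigma

def percolation_centrality(adj, x):
    """PC(v) = 1/(n-2) * sum over s != v != r of
    (sigma_sr(v) / sigma_sr) * x_s / (sum(x) - x_v)."""
    n = len(adj)
    dist, sigma = zip(*[bfs_counts(adj, s) for s in range(n)])
    pc = np.zeros(n)
    for v in range(n):
        for s in range(n):
            for r in range(n):
                if len({s, v, r}) < 3 or sigma[s][r] == 0:
                    continue
                # v lies on an s-r shortest path iff distances add up
                if dist[s][v] + dist[v][r] == dist[s][r]:
                    frac = sigma[s][v] * sigma[v][r] / sigma[s][r]
                    pc[v] += frac * x[s] / (sum(x) - x[v])
        pc[v] /= (n - 2)
    return pc

# Path graph 0-1-2: node 1 lies on the only 0<->2 shortest path, and only
# the fully percolated source (node 0, x=1) contributes to its score.
adj = [[1], [0, 2], [1]]
pc = percolation_centrality(adj, x=[1.0, 0.0, 0.0])
print(np.round(pc, 2))
```

With all states x_s equal, the weighting cancels and the score reduces to (normalised) betweenness, matching the claimed complexity relationship.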


Entropy | 2013

On Thermodynamic Interpretation of Transfer Entropy

Mikhail Prokopenko; Joseph T. Lizier; Don Price

We propose a thermodynamic interpretation of transfer entropy near equilibrium, using a specialisation of Boltzmann's principle. The approach relates conditional probabilities to the probabilities of the corresponding state transitions. This, in turn, characterises transfer entropy as a difference of two entropy rates: the rate for a resultant transition, and another rate for a possibly irreversible transition within the system affected by an additional source. We then show that this difference, the local transfer entropy, is proportional to the external entropy production, possibly due to irreversibility. Near equilibrium, transfer entropy is also interpreted as the difference in equilibrium stabilities with respect to two scenarios: a default case and the case with an additional source. Finally, we demonstrate that such a thermodynamic treatment is not applicable to information flow, a measure of causal effect.
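In the standard (Schreiber) form assumed here, transfer entropy from a source Y to a destination X with history length k is

```latex
T_{Y \to X} = \sum_{x_{n+1},\, x_n^{(k)},\, y_n}
  p\bigl(x_{n+1}, x_n^{(k)}, y_n\bigr)\,
  \log_2 \frac{p\bigl(x_{n+1} \mid x_n^{(k)}, y_n\bigr)}
              {p\bigl(x_{n+1} \mid x_n^{(k)}\bigr)}
= H\bigl(X_{n+1} \mid X_n^{(k)}\bigr)
  - H\bigl(X_{n+1} \mid X_n^{(k)}, Y_n\bigr),
```

which is precisely a difference of two conditional entropy rates, the quantity the thermodynamic interpretation above decomposes.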


Artificial Life | 2005

Self-Organizing Hierarchies in Sensor and Communication Networks

Mikhail Prokopenko; Peter Wang; Philip Valencia; Don Price; Mark Foreman; Anthony Farmer

We consider a hierarchical multicellular sensing and communication network, embedded in an ageless aerospace vehicle that is expected to detect and react to multiple impacts and damage over a wide range of impact energies. In particular, we investigate self-organization of impact boundaries enclosing critically damaged areas, and impact networks connecting remote cells that have detected noncritical impacts. Each level of the hierarchy is shown to have distinct higher-order emergent properties, desirable in self-monitoring and self-repairing vehicles. In addition, cells and communication messages are shown to need memory (hysteresis) in order to retain desirable emergent behavior within and between various hierarchical levels. Spatiotemporal robustness of self-organizing hierarchies is quantitatively measured with graph-theoretic and information-theoretic techniques, such as the Shannon entropy. This allows us to clearly identify phase transitions separating chaotic dynamics from ordered and robust patterns.
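Shannon entropy, used above to measure spatiotemporal robustness of the self-organized patterns, is straightforward to compute over an empirical distribution of cell states. A minimal sketch (ours, on toy state arrays):

```python
import numpy as np

def shannon_entropy(states):
    """Shannon entropy (bits) of the empirical distribution of a
    discrete array of cell states."""
    _, counts = np.unique(states, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum() + 0.0)

# An ordered pattern (all cells in one state) has zero entropy; a
# maximally mixed pattern over 4 equiprobable states has two bits.
print(shannon_entropy(np.zeros(100, dtype=int)))     # -> 0.0
print(shannon_entropy(np.repeat([0, 1, 2, 3], 25)))  # -> 2.0
```

Tracking this quantity over time is one simple way to detect the order/chaos phase transitions the abstract mentions.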

Collaboration


Dive into Mikhail Prokopenko's collaborations.

Top Co-Authors

Peter Wang, Commonwealth Scientific and Industrial Research Organisation
X. Rosalind Wang, Commonwealth Scientific and Industrial Research Organisation
Don Price, Commonwealth Scientific and Industrial Research Organisation
Oliver Obst, Commonwealth Scientific and Industrial Research Organisation
Philip Valencia, Commonwealth Scientific and Industrial Research Organisation
Daniel Polani, University of Hertfordshire